3-Langchain Series-Production Grade Deployment LLM As API With Langchain And FastAPI

Captions
Hello guys, we are going to continue the LangChain series. In the previous videos we already saw how to create chatbots with both the OpenAI API and open-source LLM models like Llama 2, we saw what Ollama is used for and how you can run these open-source models locally on your system, and along with that we created multiple end-to-end projects using both. In this video we take one more step that is very important for production-grade deployment: creating APIs. We will create APIs for these LLM models, and through this you will also be able to do the deployment in a very efficient manner.

How are we going to create these APIs? There is a very important component in LangChain called LangServe, and we are going to use it along with FastAPI. Not only that, we will also get a Swagger UI, which the LangServe library already provides. This step is important because tomorrow, if you are developing any application, you will obviously want to deploy it, and creating the API layer for that application is the first task that will be required. So let's continue: first I will show you the theoretical intuition of how we are going to develop it, and then we will start the coding part.

Let me quickly share my screen and show what we are going to do. Here I have written "APIs for deployment", and this is the most simplistic diagram I could draw. At the end of the day, in companies there are different applications, obviously created by software engineers: a mobile app, a desktop app, a web app, and so on. If I want to integrate any foundation model or any fine-tuned foundation model (an LLM) into such an app, I need to integrate it in the form of APIs. On one side are my LLM models, which can be fine-tuned LLMs or foundation models, and I want to use their functionality from my web app or mobile app. So we will create these APIs, and the APIs will have routes. The routes decide whether we interact with OpenAI, with another LLM model like Claude 3, or with an open-source model like Llama 2. Any number of LLM models, whether open source or paid API models, can be plugged in. That is what we are going to do in this video: we will create this API layer separately, and at the end I will also show you how, through routes, you can integrate multiple LLM models. Whichever model is suitable for you, you can use it for different functionality.
The reason I am making this video is this: LLMs also differ on performance metrics. Some models are very good at certain benchmarks, like MMLU, while other models do better on other metrics, so it is useful to have the option to use multiple LLM models.

Now let me quickly open my code. In the previous video you already saw that I had developed up to this point: we created the first folder, chatbot, and inside it app.py and the local Llama script. As I said, in every tutorial I will keep creating folders and developing our own project. Now let's create my second folder, and this time it will be api. With respect to this api folder, as I said, whatever we did with the local Llama script and app.py (in one I used the OpenAI API key, in the other an open-source model), we will integrate both of them in the form of routes so that we can expose them as an API. So let me create app.py, which will be responsible for creating all the APIs, and a second file, client.py. In the diagram, client.py plays the role of the web app or mobile app, because we are going to integrate these APIs with that front end.

Before I start writing code in app.py, we have to update requirements.txt. Almost all the libraries are already installed, but I am going to add three more: langserve, fastapi and uvicorn, since I am going to generate the entire Swagger documentation of the API using FastAPI. So first I will install these libraries. Let me open my terminal, go up one directory with cd .., clear the screen, and run pip install -r requirements.txt.
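As a rough sketch, the additions to requirements.txt described here would look like the following (unpinned versions are an assumption; sse_starlette is listed as well because it comes up later in the video as a missing LangServe dependency):

    langserve
    fastapi
    uvicorn
    sse_starlette

With these lines added, pip install -r requirements.txt from the project root pulls in everything the API server needs.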
The installation now starts, and the extra packages I listed, langserve and the rest, get installed as part of it. While the installation is going on, let me write the code.

I start with from fastapi import FastAPI; that is the first import. Along with it I need my chat prompt template, so from langchain.prompts import ChatPromptTemplate, since we are going to build the prompts for these APIs here. Then from langchain.chat_models import ChatOpenAI, because I am building a chat application, which is why I use the chat models. Next comes LangServe, which is responsible for creating the APIs: from langserve import add_routes. Through this I will be able to add all the routes, for example one route to interact with the OpenAI API and one route to interact with Llama 2. Then import uvicorn, which is needed to run the server, and import os. (I could enable GitHub Copilot and let it write the code, but I don't think that is the better way here; I usually use an AI tool called Blackbox that helps me write code faster and also explains the code, and I will probably make a separate video about it.) One more thing I want to import is Ollama: from langchain_community.llms import Ollama. That covers all the libraries my code needs to create the OpenAI API route and the rest.

Now I set my OpenAI API key: os.environ["OPENAI_API_KEY"], loaded from os.getenv("OPENAI_API_KEY"). Before that, just as in the earlier app.py, I bring in load_dotenv and call it so that all my environment variables are initialized from the .env file.
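Put together, the top of app.py sketched in this walkthrough looks roughly like this (the older langchain.chat_models import path is kept exactly as described in the video, although newer releases move ChatOpenAI into the langchain_openai package):

    from fastapi import FastAPI
    from langchain.prompts import ChatPromptTemplate
    from langchain.chat_models import ChatOpenAI
    from langchain_community.llms import Ollama
    from langserve import add_routes
    from dotenv import load_dotenv
    import uvicorn
    import os

    # load variables from .env and expose the OpenAI key to the SDK
    load_dotenv()
    os.environ["OPENAI_API_KEY"] = os.getenv("OPENAI_API_KEY")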
Let me quickly copy that over, call load_dotenv(), and with that all my environment variables are initialized and my OpenAI API key is loaded. Perfect.

Now let's start with FastAPI. To create the FastAPI app I instantiate it with a title of "Langchain Server", version 1.0, and a description. After this I can take the app object and keep adding routes to it with add_routes. The first time we add a route we pass all the information: the app, the runnable (say ChatOpenAI), and the path, for example an /openai route, with ChatOpenAI as the model it uses. That is the simplest way to add a route, but let me add a bit more, because when we built our first application we combined prompt, LLM and output parser, so when we create routes we should integrate the prompt template as well.

So I write model = ChatOpenAI() to initialize that model, and then I create my other model with Ollama, since I want to use multiple models: llm = Ollama(model="llama2"), which gives me Llama 2. That is one model, and this is the other. Now let me create prompt 1 with ChatPromptTemplate.from_template. For the first interaction I want to use the OpenAI API and have it write an essay: "Write me an essay about {topic} with 100 words", so whatever topic I give, it writes a 100-word essay about it. That is my first prompt template. Then I create my second prompt template, which will be responsible for interacting with my open-source model: "Write me a poem about {topic} with 100 words". In short, prompt 1 will interact with the ChatOpenAI model and prompt 2 will interact with the LLM I defined, which is Llama 2.
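A minimal sketch of the app setup, models and prompts described so far (the exact description string is an assumption, since the video only mentions that a description is given):

    app = FastAPI(
        title="Langchain Server",
        version="1.0",
        description="A simple API server",  # assumed wording
    )

    # simplest form of add_routes: expose the raw chat model on its own path
    add_routes(app, ChatOpenAI(), path="/openai")

    # the two backing models: hosted OpenAI chat model and local Llama 2 via Ollama
    model = ChatOpenAI()
    llm = Ollama(model="llama2")

    # one prompt template per route
    prompt1 = ChatPromptTemplate.from_template("Write me an essay about {topic} with 100 words")
    prompt2 = ChatPromptTemplate.from_template("Write me a poem about {topic} with 100 words")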
Now let me add the routes. Again I call add_routes with the app, and this time I combine prompt and model: prompt1 | model, where the model is ChatOpenAI and prompt 1 is specifically for it, and I give the route the path /essay. That means this path is responsible for interacting with the OpenAI API, and the API URL you get will end with /essay. The other route (and I can keep adding as many routes as I want) uses prompt 2 with the llm, and I give it the path /poem. That is my other route, so in short I have created two APIs.

Finally I write the if __name__ == "__main__" block, the starting point of the application: uvicorn.run(app, host="localhost", port=8000). I am using localhost here, but you can run this application on any server you want, which is why I say this is the first step towards building a production-grade application. So this is done, and it looks really good: we have written the complete code, and app.py contains all the APIs. Looking back at the diagram, the API part is now built, routes together with prompt templates, and I have used two routes, one for OpenAI and one for Llama 2. Next it's time to create the web app or mobile app side; I will build a simple web app here, because it will be able to interact with these APIs.
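As a sketch based on the description above, the two chained routes and the entry point at the bottom of app.py would be:

    # each route exposes a prompt-plus-model chain under its own path
    add_routes(app, prompt1 | model, path="/essay")  # OpenAI-backed essay route
    add_routes(app, prompt2 | llm, path="/poem")     # Llama 2 (Ollama)-backed poem route

    if __name__ == "__main__":
        # localhost for now; any host or server works for deployment
        uvicorn.run(app, host="localhost", port=8000)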
Now let's run this and see whether everything works. I cd into the api folder and run python app.py. The first thing I hit is a small issue: the import should be langchain.prompts, plural, so let me fix that and execute again. Next I get an import error for sse_starlette, a dependency that LangServe needs, so I quickly update requirements.txt and run pip install sse_starlette. Once that installation finishes we shouldn't get the error anymore, and indeed python app.py is now running.

Let's open localhost. On the root path you see "detail not found", but if you want to see the entire API that has been created, just add /docs. Now you can see the whole Langchain server: the /openai route with its input schema, /essay is there, /poem is there, and for each one the input, the output and the output schema, everything that is required, is shown. This is the Swagger UI documentation, so the API part is done.

Now, how do we interact with these APIs? That is the most important part. For that, let me hide this terminal and create client.py. (Let me also disable the Blackbox extension for now so it doesn't get in the way; it is a very powerful extension and I will show it another time.) This client.py is my app; it plays the role of the front end. First I import requests, then import streamlit as st, so two libraries, requests and Streamlit. Remember, I am creating a web app, a front end that will interact with the API. Here I write two important functions. The first is get_openai_response, which takes an input_text; inside it I use the requests object to create the response.
To call the API I use requests.post. First I need the URL. The server is already running on localhost, and the Swagger UI shows the URL to use for each route. Say I want to call the essay route, which uses the OpenAI API: the URL is http://localhost:8000/essay/invoke, and that full API URL is responsible for calling it. We also know the route needs an input, so I pass json={"input": {"topic": input_text}}. Why "topic"? Because that is the variable in the prompt template, and whatever input text I give to this function becomes that topic. So the function posts that input to the URL and I get back a response. I convert the response to JSON; it is a dictionary where the output lives under the "output" key, and inside that I need "content", which holds the whole response text.

The next function is similar but for the Ollama response, using Llama 2: get_ollama_response takes the input text and again calls requests.post, but instead of /essay/invoke it hits /poem/invoke, because the /poem route is the one that interacts with Llama 2. Again I convert the response to JSON and return response.json()["output"]. Those are the two functions I have created to get the responses.

Now let me build the Streamlit part; from here everything is simple. I set st.title and create two text boxes: the first says "write an essay on" some topic, and the second "write a poem on" some other topic. The first one will interact with the OpenAI route and the second one with Llama 2.
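Putting the client together, a sketch of client.py under the assumptions above (the page title and the exact text-box labels are illustrative, since the precise strings are not spelled out in the video):

    import requests
    import streamlit as st


    def get_openai_response(input_text):
        # /essay/invoke is served by the prompt1 | ChatOpenAI chain
        response = requests.post(
            "http://localhost:8000/essay/invoke",
            json={"input": {"topic": input_text}},
        )
        # a chat model returns a message object, so the text sits under "content"
        return response.json()["output"]["content"]


    def get_ollama_response(input_text):
        # /poem/invoke is served by the prompt2 | Ollama (Llama 2) chain
        response = requests.post(
            "http://localhost:8000/poem/invoke",
            json={"input": {"topic": input_text}},
        )
        # a plain LLM returns a string, so "output" is the text itself
        return response.json()["output"]


    st.title("Langchain Demo With LLM APIs")         # illustrative title
    input_text = st.text_input("Write an essay on")  # OpenAI-backed route
    input_text1 = st.text_input("Write a poem on")   # Llama 2-backed route

    if input_text:
        st.write(get_openai_response(input_text))
    if input_text1:
        st.write(get_ollama_response(input_text1))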
Finally I wire the two functions up: if I write any text in the first text box, it calls the OpenAI response function and displays whatever response comes back; if I write in the second text box, it calls the Ollama response function, which talks to Llama 2 through that URL, and displays the output. So this is my entire application, interacting with the API that is already running.

To run client.py I open another command prompt, deactivate and reactivate my venv environment, and run it. Not python client.py; it should be streamlit run client.py. At first I get "file does not exist", so I cd into the api folder and run streamlit run client.py again. Now the Swagger UI and the API are running on one side, and this is my front-end application that will interact with those APIs. If I type in the first text box it goes to the OpenAI API, so let's try "write an essay on machine learning". Here I get "name get_openai_response is not defined": the function names I defined don't match the ones I call below, for the Ollama one as well, so let me fix the names and reload. Now it works: "machine learning is a powerful technology that enables computers..." and so on, so the first text box is talking to the OpenAI API. Then "write a poem on machine learning" with the same topic: this goes to Llama 2, and after a moment it responds with "machine learning is a field of artificial intelligence that allows computers to make decisions..."; it is still generating, so let me reload and check. I can also use my own custom prompt if I want; for example, the poem prompt can ask for the topic to be written for a 5-year-old child, and the input text flows through in exactly the same way. Let me change the topic and confirm it works: "write a poem on unicorn". It comes back with something like "shimmering coats in various colours, here are 100 words that describe a unicorn", so the output is coming through, and this is interacting with Llama 2 end to end.

I hope you have understood this. Just imagine that this server side is already deployed somewhere and you have the front end: whether it is a mobile app, a web front end or an edge device, it will be able to interact with these APIs, and that is the most amazing thing here. This is the first phase of deployment, where you mainly create the APIs; after that it is just a matter of choosing a cloud server and doing the deployment.
I hope you liked this video. That was it from my side; I'll see you in the next video. Have a great day, take care, bye-bye.
Info
Channel: Krish Naik
Views: 13,087
Keywords: yt:cc=on, langserve tutorials, langchain tutorials, deploy llm as api, krish naik langchain tutorials
Id: XWB5DXP-DO8
Length: 27min 11sec (1631 seconds)
Published: Wed Apr 03 2024