2-Langchain Series-Building Chatbot Using Paid And Open Source LLM's using Langchain And Ollama

Video Statistics and Information

Captions
Hello all, my name is Krish Naik, and welcome to my YouTube channel. Welcome to the fresh and updated LangChain series. In this video I will show you how to create chatbot applications with a paid API LLM, and we'll also see how you can integrate open-source LLMs. You should definitely know both ways of doing it. One way to integrate any open-source LLM is through Hugging Face, but as you know I'm focusing on the LangChain ecosystem, and I've already uploaded a lot of videos on my channel about calling those kinds of open-source LLMs through Hugging Face. Since we are working with the LangChain ecosystem, we will try to use the components that are available in LangChain.

As you all know, this is a fresh playlist, and my plan is to focus entirely on LangChain this month. Many more amazing videos are coming up, along with end-to-end applications, fine-tuning, and much more, so let's keep a like target for every video — for this one the target is 1,000 likes and at least 200 comments. Please watch this video till the end, because it is going to be completely practical-oriented. If you really want to support, please subscribe to the channel and take a membership plan; with those benefits I'll be able to create more videos.

Let me quickly share my screen. In the GitHub repo that you'll find in the description of this video, you'll see folders like this. Today is the second tutorial; in the first ones we just went over what we are going to learn, but this is the real practical implementation.

As usual, the first thing we do is create our virtual environment: conda create -p venv python==3.10 — you can take the 3.10 version; I've already shown how to create virtual environments in many videos. Then we'll use a .env file, which will hold our environment variables. In this file I put three important pieces of information: a LangChain API key, an OpenAI API key, and a LangChain project name. You might think the OpenAI API key shown here is real — it is not; I've changed some of the characters, so don't try it, it will be of no use. The third environment variable is my LangChain project name, which I've set to "tutorial1". The reason is that whenever I open LangSmith, I will be able to observe and monitor every call from the dashboard itself — how we use this, I'll discuss as we go. These three values will be needed throughout and will all be loaded as environment variables; I've already created my .env file.
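For reference, the .env file would look something along these lines (the values here are placeholders, not real keys, and the exact project name is up to you):

```
LANGCHAIN_API_KEY="your-langsmith-api-key"
OPENAI_API_KEY="your-openai-api-key"
LANGCHAIN_PROJECT="tutorial1"
```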
So let's go ahead and start the coding. Make sure you code along with me, because this is where AI engineering is heading; I'll start with a foundation model, and the complexity will keep increasing from here.

Now, what is our main aim? What are we trying to do in our first project? The first thing we will create is a normal chatbot application — I won't call it ChatGPT, just a normal chatbot. It will be important because it will help you build chatbots with both paid and open-source LLM models. One way is to use a paid LLM — I'll show one example with the OpenAI API; you could also use the Claude API from a company called Anthropic. And I'll also show one with an open-source LLM.

See, calling APIs is an easy task. The main thing is that we are going to use LangChain, as suggested, and LangChain has many modules — how do we use those modules for different calls, and what dependencies do we have whenever we develop a chatbot application? If you look at this diagram you'll see there are models, prompts, and output parsers. In this video I'm going to use some features of LangSmith, some features related to chains and agents, and some features from the model and output parser components. We'll use that combination, and that's how I'm going to build all the projects — the upcoming videos will be much more practical-oriented.

So let's start our first chatbot application. The first import is from langchain_openai import ChatOpenAI, since I'm going to use OpenAI — this is the chat model (how to call an open-source model, I'll discuss shortly; first we start with the OpenAI API itself). Then from langchain_core.prompts import ChatPromptTemplate — whenever you create a chatbot, this chat prompt template is super important; it's where you give the initial prompt that's required. The third import is from langchain_core.output_parsers import StrOutputParser. These three are very important. StrOutputParser is the default output parser for whatever response your LLM gives. You can also create a custom output parser — I'll show that in upcoming videos; with a custom parser you can do anything with the output that comes back: split it, convert it to capital letters, anything, by writing your own code. But by default, right now, I'm just going to use StrOutputParser.
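Put together, those three imports at the top of app.py look roughly like this:

```python
# Chat model, prompt template, and default string output parser from LangChain
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
```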
Along with this, the next thing I'll do is import streamlit as st, then import os, and, since I'm going to use the .env file, from dotenv import load_dotenv, so we can load all our environment variables.

Let's check that everything is working. I'll run python app.py just to make sure all the libraries import correctly. First I have to go into my chatbot folder with cd chatbot; then I'll clear my screen and run python app.py. Oops — I wrote "from streamlit as st"; it has to be import streamlit as st, which is why the errors were coming. There was also a spelling mistake in langchain_core, and output_parsers needed fixing — I'm leaving all these errors in so you can see them; my autocomplete wasn't working well, which is why. Now everything runs without errors, so let's continue.

We've imported everything. Now, as I mentioned, we are going to use three environment variables: the OpenAI API key, the LangChain API key, and, to capture all the monitoring results, LangChain tracing. So I'll set OPENAI_API_KEY, LANGCHAIN_TRACING_V2, and LANGCHAIN_API_KEY. The LangChain API key tells LangSmith where the monitoring results should be stored, so you'll see everything on that dashboard, and since tracing is set to "true" it will automatically trace any code I write — and not just for paid APIs; it works with open-source LLMs too. That's the second step.

Now let's define the prompt template. I'll write prompt = ChatPromptTemplate.from_messages(...) and pass the messages as a list. The first entry is the system message: "You are a helpful assistant. Please respond to the user queries." — a simple prompt, as you can see. The next entry, after the system prompt, is the user prompt, which will be whatever question I ask, so it will be ("user", "Question: {question}"). I could also pass context if I wanted, but right now it's just a question — a simple chatbot application so that you can start practicing building these chatbots.
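This part of app.py looks roughly like this, assuming the .env file from earlier is in place:

```python
import os

import streamlit as st
from dotenv import load_dotenv

# Load the variables defined in the .env file
load_dotenv()

# OpenAI key plus LangSmith tracing variables
os.environ["OPENAI_API_KEY"] = os.getenv("OPENAI_API_KEY")
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = os.getenv("LANGCHAIN_API_KEY")

# Prompt template: a system instruction plus the user's question
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant. Please respond to the user queries."),
        ("user", "Question: {question}"),
    ]
)
```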
Now let's define the Streamlit framework. The learning process here is that I keep building projects and using the functionality that's available, and that way you'll be able to work through it properly. So I'll write st.title("LangChain Demo With OpenAI API") and input_text = st.text_input("Search the topic you want").

Next, let's call the OpenAI LLM: llm = ChatOpenAI(model="gpt-3.5-turbo") — whenever we use the OpenAI API it's just ChatOpenAI with a model name. I'm using GPT-3.5 Turbo because the cost is lower; I've put $5 in my OpenAI account just to teach you, so please support the channel so I can keep exploring these tools and creating videos for you. And finally, the output parser — remember, LangChain provides features that you can attach in the form of a chain. We've created three main things: the chat prompt template, the LLM, and the output parser. The prompt is obviously the first thing we need, then we pass it to the LLM, and finally we get our output — the string output parser is responsible for extracting that output. So chain = prompt | llm | output_parser combines all of these; going forward I'll show how to customize the output parser.

Finally, if input_text: — whenever I type something and press Enter, I should get the response — st.write(chain.invoke({"question": input_text})), where the question placeholder is assigned my input text. That's it: a simple chatbot application, but along with it we've enabled LangSmith tracking, which is amazing to use and one of the recent updates; whatever code I'm writing here will carry forward into the things that are coming up.
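Continuing the same app.py sketch, the Streamlit UI, the model, and the chain come together roughly like this (the title and input label are just what I typed on screen):

```python
# Streamlit UI
st.title("LangChain Demo With OpenAI API")
input_text = st.text_input("Search the topic you want")

# GPT-3.5 Turbo via the paid OpenAI API, plus the default string output parser
llm = ChatOpenAI(model="gpt-3.5-turbo")
output_parser = StrOutputParser()

# Chain: prompt -> LLM -> output parser
chain = prompt | llm | output_parser

if input_text:
    # Fill the {question} placeholder with whatever was typed and show the answer
    st.write(chain.invoke({"question": input_text}))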
Now let's run it: streamlit run app.py — oops, a typo — streamlit run app.py, and allow access. In LangSmith you'll see an older project, "Langchain series test llm", but my project name is the one I set for this tutorial. If I type "hey hi" and press Enter, you'll see the response appear, and in my LangSmith dashboard — let me reload it — there's "tutorial1", and the first request has already been logged. You can expand the runnable sequence: the chat prompt template with its messages ("You are a helpful assistant, please respond to the user queries"), then the ChatOpenAI call, where you can see exactly what the call cost — everything is tracked — and finally the string output parser, which just returns the response ("How can I assist you today?") cleanly. Later, when I build my own custom output parser, I'll be able to track that too. So you are able to monitor each and every request that comes in.

Now let me ask "provide me a Python code to swap two numbers". Once I execute this, you can see I get the code and the explanation — everything is there — and for this one the cost is a little bit higher. If you don't believe me, look at tutorial one: this second request took 4.80 seconds — yes, a bit more time — and the cost shown was 0.00211. It's based on the token count; every token bears some cost.

Perfect — that was the first part of this tutorial. The second part is about how you can call open-source LLMs locally and how to use them. For this, first of all, I will go ahead and download Ollama. Ollama is an amazing tool because it lets you run large language models locally; the best thing about it is that it handles the model compression automatically, so you can run models on your own machine — if you have, say, 16 GB of RAM, you'll just have to wait a little longer for a response. You can use Llama 2 and Code Llama here, it supports a lot of other open-source LLM models, and yes, the integration is also provided inside the LangChain ecosystem.

So first, just go ahead and download it — it's available for macOS, Linux, and Windows. After downloading, install it; it's a simple installer (an exe on Windows, a package on macOS, and a different install method on Linux) — double-click it and install. Once installed, you'll see Ollama running in the system tray.

Now that the Ollama installation is done, I'll create another file inside my chatbot folder called locallama.py. The code will be almost the same — there too I'll be using the chat prompt template and the string output parser, so I'll copy those imports over. Along with that, I have to import Ollama, because that's what lets us call the models we download. It lives in langchain_community.llms — whenever we need a third-party integration it will be available inside langchain_community; Ollama is a third-party integration, and if you use third-party vector embeddings, for example, those live there too. So that's done: from langchain_community.llms import Ollama, plus the StrOutputParser and, from langchain_core.prompts, the ChatPromptTemplate — everything is there.
Now let's write import streamlit as st — I'm using Streamlit here as well — along with import os, and from dotenv import load_dotenv, and then initialize it with load_dotenv(). Once the environment variables are initialized, as usual I'll set the same tracking variables. And the prompt template we wrote in the previous code when using the OpenAI API? We write the same one here — we just repeat it, because the main thing you need to understand is how, with the help of Ollama, I can call any open-source model. The Streamlit framework code I'll also copy over — it's mostly copy-paste of what we already implemented.

But where the previous code called ChatOpenAI, I specifically don't want ChatOpenAI; instead I will call Ollama — the class we just imported — and point it at Llama 2. Before calling any model, check which models are supported: if you go to the Ollama GitHub page you'll see the list of everything it supports — Llama 2, Mistral, Dolphin-Phi, Phi-2, Neural Chat, Code Llama, and Gemma is there too; most of them are open source. Before using one, you first need to go to your command prompt and run ollama run followed by the model name, because it needs to download the model the first time — it gets pulled from wherever it is hosted, and the entire model has to come down. For example, if I write ollama run gemma, it will pull the entire Gemma model — you can see the pull happening; it's about 5.2 GB, and you only need to do this once. Since I'm writing the code against Llama 2 and I've already downloaded that model, I'm showing the pull with a different example, Gemma. Once the download finishes, you can use that model locally through Ollama. I hope that gives you the idea.

So here I've called the Ollama model with Llama 2, the output parser is the same, and I'm combining prompt, LLM, and output parser into a chain — everything else is almost the same, and that is the most amazing thing about LangChain: the code stays generic; you only need to swap in a paid or an open-source model, whichever you prefer. Again, the system I'm working on has 64 GB of RAM and an Nvidia Titan RTX, which was gifted by Nvidia itself, so on this machine it should run quite quickly, I feel. Let's run it — it's a Streamlit app, so: streamlit run locallama.py.
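Putting the pieces together, locallama.py would look roughly like this — the only real change from the paid version is swapping ChatOpenAI for the Ollama class pointed at the locally pulled llama2 model (the title text is just whatever you want to show):

```python
import os

import streamlit as st
from dotenv import load_dotenv
from langchain_community.llms import Ollama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

load_dotenv()

# Only the LangSmith tracing variables are needed here; no OpenAI key required
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = os.getenv("LANGCHAIN_API_KEY")

# Same prompt template as the paid-API version
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant. Please respond to the user queries."),
        ("user", "Question: {question}"),
    ]
)

# Streamlit UI
st.title("LangChain Demo With Llama 2")
input_text = st.text_input("Search the topic you want")

# Llama 2 served locally by Ollama (pull it once beforehand with: ollama run llama2)
llm = Ollama(model="llama2")
output_parser = StrOutputParser()
chain = prompt | llm | output_parser

if input_text:
    st.write(chain.invoke({"question": input_text}))
```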
Once I execute it — okay: "No module named langchain_community". Let's see — I have to make sure langchain-community is in my requirements.txt and installed, since I need that library; that's why I'm getting the error. So I'll do cd .. and then pip install -r requirements.txt, and you'll see the requirements get installed, including langchain-community; once that's done I can run my code. This will take a little time, so if you're liking this video please hit like — there are many more things coming up, and it'll be quite amazing when you learn all of it.

Once the install is done you can use any model you like. Also notice that in this file I don't need the OpenAI key at all — I only want the two LangSmith variables, and I'll still be able to track everything. Later I'll also show you how to expose these in the form of APIs. While it installs, let me know what you think of these LangChain tutorials — I see a lot of purpose for this library; the company is doing amazingly well in the open-source world and is building multiple things there.

Now I'll go back into my chatbot folder with cd chatbot and run it — not python, it should be streamlit run locallama.py. The page still shows the OpenAI title, so let me change that text to say Llama 2 instead; I've edited it, saved it, and rerun. I'll type "hey hi". It takes some time on my system even with 64 GB of RAM, but I get the output: the assistant says "Hello, how can I help you today?"

Now if I go back to the dashboard — under tutorial one — you'll see the count increase; there's one more run here. I've reloaded the page and you can see the new Ollama request: "hey hi", 4.89 seconds, 39 tokens, but no charges, because it is an open-source model. If I expand it you'll see the chat prompt template and Ollama, and this Ollama run is calling Llama 2 underneath. For any open-source model you want, calling it is very simple: just check the Ollama GitHub page, download the model first by writing ollama run with that model name, and once it's downloaded you're good to go.

Now I'll ask "provide me a Python code to swap two numbers". If you want a more coding-oriented chatbot you can directly use Code Llama. Here you can see the example code in the answer, and this was quite fast — so if you have the right kind of hardware this works well; you can see it took about 4 seconds, and the Ollama run is right there with all the information: the prompt, the completion, everything.
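As a quick recap on setup, since the missing langchain-community module is what tripped things up above: a requirements.txt along these lines would cover both scripts (these package names are my assumption of what's needed, not a copy of the repo's exact file):

```
langchain
langchain-core
langchain-openai
langchain-community
python-dotenv
streamlit
```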
So I hope you liked this video and were able to understand everything. As I said, if you're new to this channel, please subscribe — there are a lot of tutorials coming up — but here I've shown you multiple ways of creating a chatbot application using both OpenAI APIs and open-source models with the help of LangChain. That was it from my side. I'll see you in the next video. Have a great day, thank you, and take care. Bye-bye.
Info
Channel: Krish Naik
Views: 41,609
Keywords: yt:cc=on, langchain tutorials, chatbot using openai, ollama tutorials, building langchain application, krish naik generative ai
Id: 5CJA1Hbutqc
Length: 27min 0sec (1620 seconds)
Published: Mon Apr 01 2024