"How to give GPT my knowledge?" | OpenAI, LangChain & MongoDB | Knowledge embedding 101

Video Statistics and Information

Captions
One of the core use cases of large language models is chat. You might have seen how ChatGPT remembers your chat history and shapes its output accordingly. There are many tutorials online that show how to create a chatbot, but one problem I see is that the bot doesn't remember your chat history. Suppose you are creating a chatbot that talks to people, and those people can be one or many; in that case you would host the chat as an API, and multiple people could be talking to the bot at the same time. We then need to create a session for each user so that history is maintained for each individual user, just like ChatGPT does. I will show you an example of how to create a session for each user so that your bot remembers its chat history, just like ChatGPT does, and responds accordingly. So let's get started.

To maintain a session we'll be using MongoDB. Let's open a project folder in Visual Studio Code. There will be six steps. The first step is to set up a MongoDB client, which will store all the information about each user in the database according to the session it maintains. Then we'll connect the MongoDB client to LangChain. After that we'll set up a buffer memory, and we'll transfer all the chat messages held in the MongoDB client into that buffer memory. Then we'll connect the LLM and the buffer memory in LangChain, and finally we'll get the results for our queries.

First, let's create a virtual environment named env. To do that, hop into the terminal, select New Terminal, and run python -m venv env. This creates a virtual environment called env; you can name your virtual environment whatever you want. Once it is created, install all the required packages with pip: pip install followed by the package names. Here we need langchain, openai, pymongo, and fastapi for API creation. Now let's break down the project structure part by part. The first step is
to set up a MongoDB client. I won't be showing how to install MongoDB itself; you can learn that from other videos. But if you are following along, you can use the free tier of MongoDB's cloud service. Just type "mongodb cloud" in your browser; the first link that pops up is the MongoDB cloud service. You can try it for free and sign up with your Google account. After logging in to the MongoDB cloud service, simply create a cluster, choose the shared tier, which is free, and click Create Cluster. Since I have already created a cluster, I'll be using that one. After creating a cluster you will see a screen with a Connect button; click it and choose to connect to an application via Drivers. We are using Python, so select Python as the driver and use the connection string shown there as your MongoDB client URI; just paste your client URI like this.

The next step is to connect the MongoDB client to LangChain. This is quite straightforward: in LangChain we can find MongoDBChatMessageHistory, and the URI we just obtained is used as its connection string. There is one more field, session_id, which we'll talk about later.

The third step is to set up a buffer memory; for that, just use ConversationBufferMemory from LangChain. The next step is to add the message history from MongoDB to the buffer memory. The database connection we established earlier stores all the information of a user as message history. To pass this history into the memory we call memory.save_context, passing each stored user message as the input and the following stored response as the output.

Finally, we'll connect the LLM and the buffer memory in LangChain. For that we just set up OpenAI as the LLM and use a ConversationChain, which connects
the LLM and the memory together. The final step is to get the result: just call conversation.predict, where input is the user's query.

Now the final part: our main objective here is to create an API, so let's combine all these steps. Create a new file called api.py and import all the required libraries we installed and discussed earlier; here we need OpenAI, ConversationChain, and ConversationBufferMemory. For step one, set up the MongoDB client and define the URI we obtained earlier from MongoDB Cloud. At the same time, define the OpenAI LLM with temperature 0.5 and create an app object from FastAPI. You also need an OpenAI API key from OpenAI, which you put in here.

Now we define our route, /chat, as a POST request; this is done with @app.post("/chat"), and it is the route we hit to use the API. Then let's pack all the steps above into a function called chatbot. The chatbot function takes the input the user types and the session ID of the individual user. As our next step we use MongoDBChatMessageHistory from LangChain, which requires the session ID and the URI we defined above; with that, MongoDB is connected to LangChain.

If you want to use a prompt template, define a prompt variable from a PromptTemplate. The default template can be anything; for demo purposes let's write something like "You are the chatbot that is prepared by Whispering AI." With that, the prompt is also clear.

Next we set up a ConversationBufferMemory, which will hold all the message history the user types. Here we'll use k = 3, which means the buffer will keep only the last three conversations pulled from
MongoDB for that user.

After this comes saving the message history, but there is a twist: if the user is chatting on our platform for the first time, there will be no message history for that user, that is, no history stored in MongoDB for that particular user. We have to check for that before answering. We can do it with if message_history.messages: ... else: .... If there is history for the user in the MongoDB client, we simply add what the user said to the memory, saving the content into the memory from the message history stored in MongoDB. After that we connect the memory, the prompt, the message history, and OpenAI in LangChain through a ConversationChain, where we pass OpenAI as the llm, the prompt template as the prompt, and the memory we populated earlier.

Then comes the inference part. For inference, simply call conversation.predict and pass the input the user asked of OpenAI; we take the input from the API call, and the session ID is also passed in the API call. Finally we need to save the response the chatbot gave this particular user for his question: call message_history.add_user_message with the input, and add the conversation's answer as the AI message. Then your API is ready.

But if there is no message history for a particular user, there is no content to load from the message history into the buffer memory. In that case, simply use the ConversationChain and predict directly, then call add_user_message and add the AI message to save the exchange in the database. Now your API is ready. You also need to return your output: simply return what the conversation predicted, and you can print it here as well.

Then activate the environment you created earlier by running
source env/Scripts/activate. This activates the environment you created earlier. After that you need to install one more thing, uvicorn, which FastAPI requires for hosting. Simply run pip install uvicorn. Then, to host the API, run uvicorn api:app and hit Enter.

To try the API, go to the URL followed by /docs. This opens the UI that FastAPI generates, hosted in Swagger, for testing your API. We defined two fields in our function, input and session: the input is the user's query, and the session will normally be defined by the front end when it gets information from the user. For now, to test, simply click Try It Out, type "my name is Ashish" as the input, and give the session any value you like, such as "ash", then click Execute. You can see the response here, given by our chatbot: "Hi Ashish, nice to meet you. What can I do for you?"

Now the main part: the main objective of creating this API is for it to remember you, so let's test that. For the same session "ash" I can check whether it remembers me or not: let's write "what is my name" and see the result. It says "Your name is Ashish", so it remembers me. You can do this for anything, like "my name is Ravi and I live in Nepal" with a session of "rab", and execute; the response comes back. Now, to check that it remembers the history, just write "what is my name", and it should give "Your name is Ravi". It does give "Your name is Ravi", hence it remembers you.

That's all for this video. If you liked it, please like, share, and subscribe. Thanks for watching, and I hope you learned something new today.
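The whole flow the video walks through, per-user session lookup in MongoDB, a rolling buffer memory of the last k = 3 exchanges, a first-time-user branch, and saving both sides of each exchange back to the store, can be sketched in plain Python. This is a minimal stand-in, not the real stack: a dict replaces the MongoDB collection, echo_llm replaces the OpenAI call behind conversation.predict, and every name here is illustrative rather than LangChain's actual API.

```python
from collections import deque

# In-memory stand-in for the MongoDB collection: session_id -> full message
# history as alternating [human, ai, human, ai, ...] strings.
db = {}

K = 3  # like k=3 in the video: keep only the last three exchanges in memory


def echo_llm(prompt):
    """Placeholder for the OpenAI call behind conversation.predict().

    A real deployment would send `prompt` to the LLM; here we answer the two
    demo queries from the video by reading the history embedded in the prompt.
    """
    lines = prompt.splitlines()
    for line in lines:
        if line.startswith("Human: my name is"):
            name = line.split("my name is", 1)[1].strip().split()[0]
            if "what is my name" in lines[-2].lower():
                return f"Your name is {name}"
            return f"Hi {name}, nice to meet you!"
    return "Nice to meet you! What can I do for you?"


def build_prompt(history_pairs, user_input):
    # Mirrors the PromptTemplate: a system line, then the windowed history,
    # then the new user input awaiting the AI's turn.
    lines = ["You are the chatbot that is prepared by Whispering AI."]
    for human, ai in history_pairs:
        lines.append(f"Human: {human}")
        lines.append(f"AI: {ai}")
    lines.append(f"Human: {user_input}")
    lines.append("AI:")
    return "\n".join(lines)


def chatbot(user_input, session_id):
    """One /chat request: look up the session, answer, persist the exchange."""
    stored = db.get(session_id)
    memory = deque(maxlen=K)          # rolling window, drops oldest exchange
    if stored:                        # returning user: replay history pairwise,
        for i in range(0, len(stored) - 1, 2):   # like save_context in a loop
            memory.append((stored[i], stored[i + 1]))
    else:                             # first-time user: nothing to preload
        db[session_id] = stored = []
    answer = echo_llm(build_prompt(memory, user_input))
    # Persist both sides, like add_user_message / add_ai_message in the video.
    stored.extend([user_input, answer])
    return answer
```

Replaying the video's demo, chatbot("my name is Ravi and I live in Nepal", "rab") followed by chatbot("what is my name", "rab") shows the second call recovering the name from the stored session, while a different session_id would start from an empty history.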
Info
Channel: Whispering AI
Views: 4,343
Keywords: ai, artificial intelligence, embeddings, openai embeddings, word embeddings, how does word embedding work, embedding in neural networks, chatgpt, gpt, large language model, large language models, finetune an llm, finetune your own llm, finetune llm model, train gpt on your own data, train custom gpt, how to train gpt like model, how to train chat gpt 4 on your dataset, how to train chat gpt to write like you, chat with your data, machine learning, how to train gpt 3 on custom data
Id: srTiN30QwSY
Length: 17min 2sec (1022 seconds)
Published: Fri Sep 01 2023