How to add memory to your LLM to remember previous conversations. #llm #ai #chatgpt #datascience

Video Statistics and Information

Captions
Hello friends, welcome to my channel. In this video tutorial we will learn about one of the most important modules in LangChain: memory. We will answer two questions, what is memory in LangChain and why is memory needed in LangChain, and then we will develop a demo application that shows how to use memory in an LLM application.

Let's come to the first question: what is memory in LangChain? Memory is basically a buffer where you keep the history of your conversation with the LLM application, for example the messages the user sent to the application and the responses it sent back.

The second question: why is memory needed in LangChain? Suppose you are developing a conversational bot. A new question from the user may well refer back to previous questions and to the answers the LLM gave, so in order to respond to the current question the LLM has to understand the previous context of the conversation and shape its current answer accordingly. That is the main reason the memory module exists in LangChain.

Now let us develop the demo application. We start with "import os" and "from app_secrets import OPENAI_API_KEY"; app_secrets is a file I created where my API key is stored. First of all, set the environment variable, because we are going to use the OpenAI API. Next we import the LangChain classes: "from langchain.llms import OpenAI", "from langchain.prompts import PromptTemplate", and "from langchain.memory import ConversationBufferMemory".

Let us first create the memory instance: memory = ConversationBufferMemory(). To show what the data inside this memory object looks like, let us give it some hard-coded messages, starting with memory.chat_memory.add_user_message("Hello").
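As a plain-Python illustration of what this memory buffer does under the hood (a toy stand-in written for this sketch, not LangChain's actual implementation), the object simply accumulates tagged messages and hands them back as one string:

```python
# Toy stand-in for LangChain's ConversationBufferMemory, for illustration
# only: messages accumulate in order and come back under a "history" key.
class BufferMemory:
    def __init__(self):
        self.messages = []  # (speaker, text) pairs in arrival order

    def add_user_message(self, text):
        self.messages.append(("Human", text))

    def add_ai_message(self, text):
        self.messages.append(("AI", text))

    def load_memory_variables(self, inputs):
        # Like the real method, this takes a dict of input variables
        # (empty in our case) and returns the buffered conversation.
        history = "\n".join(f"{who}: {text}" for who, text in self.messages)
        return {"history": history}

memory = BufferMemory()
memory.add_user_message("Hello")
memory.add_ai_message("Hi, how are you?")
print(memory.load_memory_variables({}))
# {'history': 'Human: Hello\nAI: Hi, how are you?'}
```

The single "history" key in the returned dictionary is the detail to watch: it is the same key the prompt template below will consume.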
Then memory.chat_memory.add_ai_message("Hi, how are you?") adds a message from the AI side. If you want to see what is in your memory object, you can simply print memory.load_memory_variables({}). Notice that you have to pass an empty dictionary: a memory object can depend on input variables, and if so you would pass those variables here, but in our case it does not. If I save the file and run python demo.py, I can see that my messages are stored in the memory. One thing to note is that the whole conversation is returned under the key "history". This is very important, because we will need to use this key as-is in our prompt template.

Next, let us create the prompt template, with a comment above it. It has two input variables: "history" for the memory, and "input" for the user's message. In the template itself we first give our LLM a role: "You are a conversational bot. The user will communicate with you; maintain a formal tone in your responses." Then we add the conversation history, interpolating the "history" key there, then "Human:" for the user input, and we leave "AI:" blank for the model to complete. That is our template done.

Next we need to create the LLM: llm = OpenAI(temperature=0). Let us keep the temperature at 0 for now; you can use any value. Then we create an LLM chain, let us call it conversation_chain. We did not import LLMChain yet, so: from langchain.chains import LLMChain. To the chain we pass llm=llm, prompt=prompt_template, memory=memory (our memory object), and verbose=True so that we can see the final prompt that gets assembled and passed to the LLM. Now let us pass some messages.
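To make the wiring concrete, here is a toy simulation of the chain just described, with str.format standing in for PromptTemplate and a canned fake_llm standing in for the OpenAI call (both are illustrative assumptions, not the real LangChain objects). It shows how the history slot stays empty on the first call and carries the first exchange into the second prompt:

```python
# Toy simulation of PromptTemplate + LLMChain + memory. The template uses
# the same two variables as the real one: {history} and {input}.
TEMPLATE = (
    "You are a conversational bot. The user will communicate with you; "
    "maintain a formal tone in your responses.\n\n"
    "Conversation history:\n{history}\n"
    "Human: {input}\n"
    "AI:"
)

def fake_llm(prompt):
    # Canned stand-in for the OpenAI completion call, keyed off the
    # most recent human turn in the prompt.
    last_turn = prompt.rsplit("Human:", 1)[-1]
    if "Hello" in last_turn:
        return "Hello there, how can I help you today?"
    return "It is a beautiful day today!"

history_lines = []  # what ConversationBufferMemory would hold

def conversation_chain(user_input):
    # Format the prompt with the current history, call the LLM, then
    # save both sides of the exchange back into memory -- the sequence
    # verbose=True lets you watch happening in the real chain.
    prompt = TEMPLATE.format(history="\n".join(history_lines), input=user_input)
    reply = fake_llm(prompt)
    history_lines.append(f"Human: {user_input}")
    history_lines.append(f"AI: {reply}")
    return reply

print(conversation_chain("Hello"))
print(conversation_chain("How is the day today?"))
```

After the second call, history_lines holds all four turns, which is exactly what the "history" key would inject into the next prompt in the real chain.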
We print conversation_chain with the input "Hello", and then pass another message in sequence: conversation_chain with "How is the day today?". Let us run python demo.py. You can see that when our first message, "Hello", was passed, the prompt template was filled with no conversation history at all; there is only the human message and the AI response. When we pass our second message, you can see that after "Hello" the AI responded with "Hello there, how can I help you today?", that this exchange was added to the history, and that our next message, "How is the day today?", was appended after it, to which the LLM responded "It is a beautiful day today!".

So that is how you can implement the memory module within your LLM application and keep a record of your conversation with the LLM chain. This is all for this video. In the next video I will show you how to use the memory module of LangChain with a Streamlit application. Thank you guys, if you found this video helpful please like it and subscribe to my channel. See you in the next video.
Info
Channel: TechLycan
Views: 578
Id: SikRy3U7B4A
Length: 9min 45sec (585 seconds)
Published: Fri Sep 01 2023