LangChain Crash Course: Build an AutoGPT app in 25 minutes!

Video Statistics and Information

Captions
Large language models like those that power ChatGPT are taking over software. Every startup is rapidly moving to get some form of large language model powered machine learning into their stack: Wolfram Alpha is now plugged into ChatGPT, Khan Academy has their education AI, Salesforce has Einstein GPT, and Bloomberg took it one step further and fine-tuned their own GPT model. So my question is: why don't we all do it? I'll explain why in a second, but in this video I'm going to show you how to build your very own AutoGPT-style app using a framework that's taking the programming community by storm.

Up until now this has been difficult to do. Organizations have needed teams of machine learning engineers to build, train and productionize machine learning models, and it was too big an investment for most companies to take on. But now it's not, and frameworks like LangChain make it dramatically easier to get started. Think of LangChain like Spider-Man with a jetpack: it's super fast and it's connected to the web. Programmers can use it to leverage large language models from OpenAI, Hugging Face, Cohere and more, and where it gets really interesting is that it provides the ability to use agents. This means your app can reach out to the web or use your own documents inside your own GPT pipeline.

There are six key modules in LangChain: models, prompts, indexes, memory, chains and agents. Models give you access to large language models, prompts structure your prompts with templates, indexes prepare your documents for working with large language models, memory gives your LLM chain access to historical inputs (kind of like ChatGPT), chains allow you to string it all together, and agents give you tools like accessing Wikipedia or Google Search.

Now that you know a little about the LangChain framework, we're going to build an application with it. With Streamlit and LangChain we'll run through a bunch of the modules, take it end to end, and build an app that generates a YouTube video title and script. In the interest of time I'm going to set a 15-minute timer to try to get this information to you as quickly as possible.

The first thing we need to do is create two files: apikey.py and app.py. Inside apikey.py we'll create a new variable called apikey and set it equal to the API key available inside OpenAI. You can use different LLM service providers, so if you don't want to use the OpenAI service you definitely don't need to. Inside app.py we'll then import that: import os, and from apikey import apikey. We need to make the key available to the OpenAI service, so we run os.environ and, inside a set of square brackets, set the dictionary key OPENAI_API_KEY equal to that apikey.
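For reference, a minimal sketch of those two files might look like this (the placeholder key and the variable name apikey are illustrative, not real credentials):

```python
# apikey.py -- holds your OpenAI API key (placeholder shown, use your own)
apikey = "sk-..."
```

```python
# app.py -- expose the key to the OpenAI client via an environment variable
import os
from apikey import apikey

os.environ["OPENAI_API_KEY"] = apikey
```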
Next we need to install some dependencies to be able to leverage the LangChain service, so I'm going to open up a terminal and run pip install. We need streamlit, LangChain (the man of the hour), openai, wikipedia, chromadb and tiktoken, so six different dependencies: pip install streamlit langchain openai wikipedia chromadb tiktoken. If we run that, that's all of our dependencies installed.

Now we want to bring some of those in. We're going to import streamlit as st (Streamlit is the application framework that lets us build a UI around our different services), and then we'll bring in the OpenAI wrapper: from langchain.llms import OpenAI, which gives us the OpenAI service so we can leverage a large language model.

The next thing we'll do is start setting up a little bit of our app so we can see what it's going to look like. First up we'll create a title with st.title, grabbing some emojis from the LangChain documentation and calling it "YouTube GPT Creator" (you can probably think of a better title than me). Then we'll create a place to pass a prompt through to our LLM: a new variable called prompt set equal to st.text_input, with the label "Plug in your prompt here". That's our app framework.

Let's start the app by running streamlit run app.py. The application is running, and you can see the title and the text input that lets us plug in a prompt. If I wanted to, I could pass through a prompt like "what is the fastest car in the world?", but right now it's not going to do anything because we don't have it hooked up to our LLM.

So let's hook it up. We need to create an instance of the OpenAI service: a new variable called llm set equal to OpenAI, passing through a temperature keyword argument, which dictates how creative (or not) the large language model is going to be. Then we want a way to trigger our prompt: if prompt, we create a response variable by passing the prompt to the llm, and then we render it back to the screen with st.write(response). So if there's a prompt, the result is shown on screen.

Now if we go back to the app and rerun it, nothing renders until we type a prompt. If I ask "what is the fastest car in the world", it goes to the OpenAI API and, take a look, we've got our response: the fastest car in the world is the Bugatti Chiron Super Sport 300+, with a top speed of 304 miles per hour. We could also say "write me a YouTube video title about deep learning" and, all things holding equal, we get "Deep Learning Fundamentals: A Crash Course". Let me know if you want to see a video on that.
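Putting that first pass together, the app at this stage might look roughly like the sketch below. It's written against the early (0.0.x) LangChain API used in the video; the emoji title and the temperature value are just illustrative choices.

```python
# app.py -- first pass: a prompt box wired straight to the OpenAI LLM
import os
from apikey import apikey

import streamlit as st
from langchain.llms import OpenAI

os.environ["OPENAI_API_KEY"] = apikey

# App framework
st.title("🦜🔗 YouTube GPT Creator")
prompt = st.text_input("Plug in your prompt here")

# LLM -- temperature controls how creative the responses are
llm = OpenAI(temperature=0.9)

# Show the response on screen if there's a prompt
if prompt:
    response = llm(prompt)
    st.write(response)
```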
Regardless, let's jump back to our app. Right now we've had to write out this entire prompt ourselves. We don't want to have to do that; if we want a user-directed application, we'd ideally want the user to just pass through a topic and have the application decide what prompt should be written based on it. This is where prompt templates come in.

We're going to jump into our app and bring in a prompt template: from langchain.prompts import PromptTemplate. We're also going to import an LLM chain, which will allow us to run our topic through our prompt template and then generate output: from langchain.chains import LLMChain. So those two lines are now imported.

Then we want to create a prompt template. Our prompt template is purely going to take in a topic and write us a prompt that effectively says "write me a YouTube video title about" whatever that topic is, so our user doesn't need to write the whole thing going forward. We create a new variable called title_template and set it equal to PromptTemplate. We need to dictate an input variable, which is going to be topic, and a template, which is effectively just a string we can pass a variable into, almost like string formatting. It's going to be "write me a YouTube video title about" (let's grab what we wrote earlier) followed by our {topic} variable. This means when we use this prompt template we only pass through a topic, and it gets formatted into that prompt.

To use it we're going to use an LLM chain rather than the raw LLM, so let's chain it together. Add a comment for our prompt templates, then create our LLM chain - actually this is going to be our title chain - and set it equal to LLMChain. We pass through llm, set equal to the llm we created, and we set prompt equal to our title_template. Then, rather than running the llm directly down below, we grab our title chain and use the run method, passing our prompt through as the topic. So really what we're doing is setting our topic equal to whatever is in the prompt box, and when we run this it prompt-formats the topic into the template, so it should actually be running "write me a YouTube video title about" a specific topic. We can also set verbose=True so you can see the chain as it runs. Let me zoom out so you can see it a little bit better.
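A sketch of the title template and chain as described (the template wording follows the video; verbose is optional):

```python
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Prompt template: the user only supplies a topic, the template builds the full prompt
title_template = PromptTemplate(
    input_variables=["topic"],
    template="write me a YouTube video title about {topic}",
)

# LLM chain: formats the topic into the template and sends it to the LLM
title_chain = LLMChain(llm=llm, prompt=title_template, verbose=True)

if prompt:
    response = title_chain.run(prompt)
    st.write(response)
```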
Let's bring this over to the side. Now if we get rid of that full prompt and just pass through a topic, say "deep learning", when we run it we should see the chain run. Let's refresh - we've got an error about input variables. What's happened there? It's probably the input_variables argument; change that over there and rerun. That looks like it's running successfully, and you can see down there (I know it's a little bit small) that it's entering a new LLM chain, and you can see the prompt it's formatting: "write me a YouTube video title about deep learning". It's taken our raw topic and passed it through to our prompt template. This is the advantage of using prompt templates: they simplify things an absolute ton.

For now we've only got a YouTube title being generated; we want a script as well. It doesn't just stop there - we want to take this a little bit further, and this is where chains truly come in handy. Right now we're just using a single LLMChain, but we can actually chain a bunch of these together sequentially to do a bunch of stuff. So from the langchain.chains module we're also going to import SimpleSequentialChain, which will allow us to stack a bunch of these together to generate multiple outputs. We'll come back to that in a second, but multiple outputs are important to know about.

We're going to create another prompt template, and this one is all about generating a YouTube script, not just a title. We'll copy the title_template we've got and convert it to script_template. The input to our script template is going to be the YouTube title, so the input variable here is title, and the template is going to be "write me a YouTube video script based on this title", passing the {title} variable into the prompt. Do you see how these prompt templates come in handy? The prompt is already formatted; you just need to pass through the context-specific variables, which makes your life a ton easier, particularly when you've got to work with a bunch of documents (which we'll come to later on). So that's our script template done.

Now we need to create another chain, this time a script chain. We'll copy our title chain, convert it to script_chain (and how are we doing for time? Four minutes left - we're not going to make it), grab our script_template and set that as the prompt. So we've now got two templates - a title template and a script template - and two chains - a title chain and a script chain.
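A sketch of the second template and chain, mirroring the first (the exact template wording is as described above):

```python
# Second template: turn a generated title into a full video script
script_template = PromptTemplate(
    input_variables=["title"],
    template="write me a YouTube video script based on this title TITLE: {title}",
)

script_chain = LLMChain(llm=llm, prompt=script_template, verbose=True)
```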
Now we need some way to join these chains together, because right now they're operating independently; one is not going to interact with the other. This is where the SimpleSequentialChain we imported earlier comes in. We're going to create a new instance of it: a variable called sequential_chain set equal to SimpleSequentialChain. There's one positional argument we need to set, the chains argument, which is just a list of the chains to run in sequence. Order is really important here: we're basically specifying "run this chain first, then run the next chain". The first chain we want to run is our title chain, which generates our video title, and then the output of the title chain gets passed to our script chain - it's a simple sequential chain, one output goes to the next chain. (We can make this a little more sophisticated, which we'll do in a sec, and I'll show you why that's important.) So to our chains list we add the script chain as the second one, and we also specify verbose=True. What we've done now is we've got two templates, two chains, and a simple sequential chain that chains them all together - wow, how many times have I said chain? Then we grab the sequential chain and call sequential_chain.run, passing through our topic. This should run the title chain, generate the output, pass the title to the script chain, and generate the script.

Let's make sure we can see the terminal, because this is going to get interesting. If we refresh our app it should automatically run the chain - the first one is "write me a YouTube video title about deep learning" - and then, all things holding equal... have we saved this? We have not; let's save it, open up the terminal again and rerun. Missing some input keys - oh no, we've got an error.

If you're enjoying this video and you'd like to learn a little bit more about Python for machine learning, deep learning and data science, head on over to go.coursesfromnick.com/python, where you can take my entire tech fundamentals course from scratch for free. It takes you completely from the ground up and gets you ready for machine learning, deep learning and data science. Back to the video.

My bad - I realized I don't need the topic keyword over here, so let's get rid of that. Beautiful. Let's jump back over to our app, rerun, and open up the terminal. It's obviously going to take a while because it's actually generating now, and take a look: it runs our sequential chain, the first chain generates the title, and that title gets passed through to the next chain. If we now take a look, this is our script, generated based on deep learning. How awesome is that?

But notice that this only outputs the script itself; it's not actually outputting the title. This is where the simple sequential chain falls down a little bit: it only outputs the last output of the sequence. If we want to grab multiple outputs it gets a little trickier. We're out of time anyway, but I want to keep taking this further and show you just how powerful this is.
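For reference, a sketch of the SimpleSequentialChain version built up to this point:

```python
from langchain.chains import SimpleSequentialChain

# Run the title chain first, then feed its output into the script chain
sequential_chain = SimpleSequentialChain(chains=[title_chain, script_chain], verbose=True)

if prompt:
    response = sequential_chain.run(prompt)
    st.write(response)  # only the final output (the script) is returned
```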
So you saw that we used the SimpleSequentialChain, but keep in mind it will only output the last output. What we can do is sub it out for SequentialChain, which allows us to get multiple sets of outputs. What we do need to do, though, is specify output keys on each of our chains: the title chain's output key is going to be "title", and we also update the script chain so its output key is "script". We can then swap the SimpleSequentialChain out for a SequentialChain and specify that the input variables to the sequential chain are purely the topic (let me turn on word wrap so you can see it a little better), and the output variables are multiple: we're outputting the title and we're outputting the script. This should allow us to grab multiple outputs from our sequential chain rather than just the single one.

This is where I find the API a little bit weird, because you can't just use run here; you actually need to pass through a dictionary, and that dictionary takes topic as a key with our prompt as the value. So there is a little bit of nuance when it comes to using each of the different types of chains. Now we should effectively be able to get our title out as well as our script. When we write it to the screen we grab each of those keys from the response variable: we write out the title and we also write out the script, so we can access both of those values separately.

Okay, let's make sure our code is saved, jump back into our application, throw the terminal on this side so we can see it, and rerun. You can see the prompt formatting happening down there - "Beginner's Guide to Deep Learning: An Introduction to Neural Networks" - and then, all things holding equal, it should write out our title as well as our script. Take a look: there is our title, and right down here we've got our script separately. So now you're able to get multiple different sets of outputs.
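A sketch of that change, assuming the same templates and LLM as before (the output_key names and the dictionary-style call follow the description above):

```python
from langchain.chains import SequentialChain

# Each chain now declares an output key so both results can be collected
title_chain = LLMChain(llm=llm, prompt=title_template, verbose=True, output_key="title")
script_chain = LLMChain(llm=llm, prompt=script_template, verbose=True, output_key="script")

sequential_chain = SequentialChain(
    chains=[title_chain, script_chain],
    input_variables=["topic"],
    output_variables=["title", "script"],
    verbose=True,
)

if prompt:
    # SequentialChain is called with a dict rather than .run()
    response = sequential_chain({"topic": prompt})
    st.write(response["title"])
    st.write(response["script"])
```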
Now, a big part of what makes ChatGPT cool is its ability to incorporate memory: it knows what's happened previously. So far we haven't brought memory into our LangChain app, so let's do that now. To add memory we're going to access another LangChain class: from langchain.memory we import ConversationBufferMemory, and then we create an instance of it. We're going to be using the memory class, but not for prompting - we're purely using it to store the history. If you're building a chat-based application this would come in handy a ton. So we create a new variable called memory and set it equal to ConversationBufferMemory, setting two values: an input key, which is going to be topic (that's "deep learning" at the moment), and a memory key, which will be chat_history.

Now we can pass this memory class to both of our LLM chains (we're going to break this out once we start adding some tools, so stay tuned for that), so we add it to the title chain and to the script chain as well. Then we want to render the memory back to the screen so we can actually see it. Right down here, below where we're doing both of our writes, we'll use a Streamlit expander: with st.expander, with the title "Message History", and inside it an info box, st.info, passing through memory.buffer. So now if we rerun we should get an expander, which is almost like an accordion, and inside it an info box showing the memory coming from each of our chains. We've created our buffer, passed it to our chains, and we're writing it out to our expander.

If we refresh the app, right now there's nothing below, but if we rerun it just by hitting r on the keyboard and scroll down, take a look: we've now got our message history, and if we open it up you can see our output. Now, this is the one thing I don't necessarily like about it: maybe because I'm using a class that isn't ideal for sequential chains, you can see that our input over here is "deep learning" and the AI is outputting the title, and then it's taking the topic again and outputting the AI response there. If anyone knows how to do this a little bit better, let me know in the comments below. There might be a way to output the title, but we're going to break this out into two separate memory classes, and you'll be able to see that once we get onto tools.
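A sketch of the memory wiring described above, again against the 0.0.x-era API (the expander label follows the video):

```python
from langchain.memory import ConversationBufferMemory

# Memory: store the conversation history rather than feed it into prompts
memory = ConversationBufferMemory(input_key="topic", memory_key="chat_history")

# Attach the same buffer to both chains
title_chain = LLMChain(llm=llm, prompt=title_template, verbose=True,
                       output_key="title", memory=memory)
script_chain = LLMChain(llm=llm, prompt=script_template, verbose=True,
                        output_key="script", memory=memory)

# ...after writing the title and script to the screen:
with st.expander("Message History"):
    st.info(memory.buffer)
```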
Now, there's one thing that makes LangChain really, really interesting: its ability to leverage tools. The next thing we need to do is add some tools. We're on the home stretch, so stick with me here; we'll take this step by step. First, from langchain.utilities (wow, that was hard for me to type) we're going to import the WikipediaAPIWrapper. Perfect - this is going to allow us to make calls to the Wikipedia API.

Then we need to update our prompt template. The script template is no longer only going to take in a title; it's going to take in some Wikipedia research as well, so we'll be able to prompt with Wikipedia as a backup. Right now the prompt says "write me a YouTube video script based on this title", but we'll also add "while leveraging this Wikipedia research" and pass through a wikipedia_research variable, copying that and pasting it into the input variables too. Beautiful - we've brought in our Wikipedia API wrapper and updated our prompt.

Next we need to break out our memory. We're going to have a title_memory and a script_memory: the input key for the title memory is going to be our topic, and the input key for the script memory is going to be the title, which comes out of the title chain. Okay, cool. Then we need to update each of the chains: the title memory goes to our title chain (my head is probably covering that, let me bring this up a bit - and all this code is going to be available via GitHub as well, guys) and the script memory goes to our script chain. That's looking good.

Then we actually need to get rid of our sequential chain. I know it's served us well, but we're done with it now, which means we've got two LLM chains running independently. Next we bring in the Wikipedia API wrapper and create a new instance of it, which I'm going to call wiki. What's going to happen now is: we pass our topic to the title chain and it generates a title; we then take that title and the Wikipedia research and pass them through to our script chain. So we're no longer running the sequential chain - we're running three separate calls. The title now comes from the title chain, which takes in our topic (equal to our prompt). Then we do our Wikipedia research using the Wikipedia API wrapper, passing our prompt to that as well. And the last thing we want to do is generate our script: remember, the script chain now takes in two inputs, a title and the Wikipedia research. The title comes from up there, so title will be equal to title, and the wikipedia_research key gets the full value of the Wikipedia research. So now we get our title, our wiki research and our script.

We also need to update our buffer display, because we've got two memory buffers now. Let's create another expander: we'll have the title history, which comes from the title memory buffer, and the script history, which comes from the script memory buffer. Perfect. We'll also add another expander for the Wikipedia research, and right down here we'll just output the full Wikipedia research. And over here we need to change the writes: we're just going to write out the title from over there and the script from over there. Adding tools breaks it up a little bit - I know it's a bit unfortunate because we'd built it all up - but that's the easiest way I've found to do it based on the documentation, which is continuously evolving.

All things holding equal, we should now be able to run this, so let's give it a crack. If we go to our app and refresh, we've got an unexpected error. Inside our title chain let's get rid of the topic keyword, so let's remove that, and we should also be running the chains, so title_chain.run and script_chain.run. Let's refresh and take a look - it doesn't look like we've got any errors yet. Take a look: we've got our title being output, and we've also got our video script being output.
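Pulling the final version together, here is a rough sketch of the reworked app, building on the earlier snippets (title_template, llm and the imports from before). The variable names mirror the walkthrough; passing keyword arguments to script_chain.run is one way to supply the two inputs, assuming the chains are built as shown.

```python
from langchain.utilities import WikipediaAPIWrapper

# Updated script template: uses the title plus Wikipedia research
script_template = PromptTemplate(
    input_variables=["title", "wikipedia_research"],
    template="write me a YouTube video script based on this title TITLE: {title} "
             "while leveraging this wikipedia research: {wikipedia_research}",
)

# Separate memories for each chain
title_memory = ConversationBufferMemory(input_key="topic", memory_key="chat_history")
script_memory = ConversationBufferMemory(input_key="title", memory_key="chat_history")

title_chain = LLMChain(llm=llm, prompt=title_template, verbose=True,
                       output_key="title", memory=title_memory)
script_chain = LLMChain(llm=llm, prompt=script_template, verbose=True,
                        output_key="script", memory=script_memory)

wiki = WikipediaAPIWrapper()

# No sequential chain any more: three separate calls
if prompt:
    title = title_chain.run(prompt)
    wiki_research = wiki.run(prompt)
    script = script_chain.run(title=title, wikipedia_research=wiki_research)

    st.write(title)
    st.write(script)

    with st.expander("Title History"):
        st.info(title_memory.buffer)
    with st.expander("Script History"):
        st.info(script_memory.buffer)
    with st.expander("Wikipedia Research"):
        st.info(wiki_research)
```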
If we go to our title history we can see the human has passed through "deep learning" and the AI is responding with the title; if we go to our script history, the human is passing through the title and the video script is being output. That's a much better way of using the memory, particularly when you have multiple inputs. We can also see the Wikipedia research: take a look, it's bringing up deep reinforcement learning, and it's bringing up South Park episodes - have we incorporated anything to do with South Park? I think a previous run did. In any case, that is our video script done, and that is the LangChain crash course done as well. The only thing I didn't really cover here is indexes, but if you'd like to see a video on that, do let me know in the comments below. I'll catch you in the next one. So what did you think? What are you going to build with LangChain? If you'd like to see me build a machine learning model from scratch that does exercise detection, take a look here.
Info
Channel: Nicholas Renotte
Views: 213,900
Keywords: machine learning, python, ai
Id: MlK6SIjcjE8
Length: 27min 27sec (1647 seconds)
Published: Sun Apr 23 2023