Combining Langchain & Ollama | Local LLM/AI series

Video Statistics and Information

Captions
Hi everyone. In this video I will show you how you can use LangChain with Ollama. You might have already heard about LangChain; that's why you're in this video. As a really quick recap, LangChain is a framework for developing applications powered by language models like ChatGPT and other models out there. It makes working with large language models easier by giving you a bunch of templates, libraries, modules, agents, chains, and all those kinds of things.

But what about Ollama? Ollama is basically one of the easiest ways to run local, open-source large language models, and you'll see how simple it is in this video once I download a model and run it. For now, think of Ollama as a tool that lets you run open-source large language models, such as Llama 2, locally very easily. In this video we'll be combining these two and doing some cool stuff, and maybe in future videos we'll build on what we have here, if you'd like me to focus on this combo of Ollama and LangChain. But for now, let's follow this tutorial and do something with it.

So let's follow the instructions to set up and run a local Ollama instance. The thing is, I've already installed Ollama; it's very simple to install. By the way, I'm on Mac, and this tutorial is mostly focused on Mac, but you can also do this on Linux. For this tutorial I'm going to assume you already have Ollama installed on your machine. If you don't have it installed, you can follow the instructions right here, and I'll be posting this guide in the description below.

Let's say you have Ollama installed, which I have already done on my machine. The first thing you do to integrate LangChain with Ollama is fetch a model, basically by running a command like this. After you install Ollama you have access to a command line, and because I've already downloaded some models, I can type "ollama list" and you can see some of the AI models I've already downloaded. One very popular model right now is Mistral, so I can run "ollama pull mistral", pull meaning we're pulling a model from the registry. What this command does is download the model onto our machine, and the next time we run "ollama list" (obviously after the download finishes) we'll see that model available to us, and at that point we can start running it.

All right, it took me a while to download, but now if I do "ollama list" you'll see that I have the Mistral model, and I could start running it and talking to it directly. But what we want to do in this video is use LangChain to talk to these models through Ollama, so let's go back to the tutorial and start writing a Python script.

So let me install LangChain. Okay, the LangChain library is installed. Once the LangChain framework is installed using the pip command, we'll create a new file, langchain_ollama.py, and start importing everything we need for this tutorial: the callback manager, the streaming stdout callback handler, and the Ollama LLM class. The second thing we do is define which LLM we want to use, in this case Mistral, which we just downloaded.
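For reference, here is a minimal sketch of the script being put together at this point. It assumes the LangChain import paths from around the time of the video (newer releases expose the same class as langchain_community.llms.Ollama), and the file name langchain_ollama.py is my reading of what is typed on screen.

```python
# langchain_ollama.py -- minimal sketch, assuming LangChain's early-2024 layout
# and that `pip install langchain` plus `ollama pull mistral` have been run.
# The callback imports mirror the tutorial; they are only needed if you
# enable token streaming later on.
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.llms import Ollama

# Point LangChain at the locally pulled Mistral model served by Ollama.
llm = Ollama(model="mistral")

# invoke() returns the whole answer as a string, so wrap it in print()
# to actually see the output.
response = llm.invoke(
    "Tell me the history of AI and name some influential figures in the space of AI."
)
print(response)
```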
And that's one of the cool things about LangChain and Ollama: you can quickly swap out these models. I can start with Mistral, and if I want to experiment with a different model like Llama 2, I can just change the model name to llama2. But let's stick with Mistral for now. We can also optionally pass the streaming stdout callback handler to stream tokens, but let's skip that for now to keep things straightforward (a sketch of both tweaks appears after these captions). I'll just write the prompt to start talking to the Mistral model through Ollama: tell me the history of AI, and tell me the names of some influential figures in the space of AI.

So that's it. Let's see if this runs: python langchain_ollama.py. It's not printing anything, so we may need to wrap the call in print(). Let's see if that does the trick, and yes, it gave me a response once I printed the result. So this is how simple it is to combine LangChain and Ollama, and you can also do things like swap out the model, which we'll do really quickly. But before that, let me show you the output: "The concept of artificial intelligence as we understand it today can be traced back to the 20th century," and so on, and some of the influential people it lists are Geoffrey Hinton, Fei-Fei Li, and Demis Hassabis. So yes, we got a good response from the language model.

Because I have some other models here, including the Mixtral model, the mixture-of-experts model created by Mistral AI, which is very big at 26 GB compared to the Mistral model we're using, why don't we try the same prompt with that one and see the response? So I'll ask the same thing, but this time using the bigger model, the 26 GB Mixtral mixture-of-experts model, and see what we get.

As you can see, we got a response back, and this time it's a bit deeper. For the influential figures in the AI space I get a lot more names than just the three mentioned previously: Claude Shannon, Alan Turing, and a lot of other people. So this is just a quick demo of how easy it is to integrate your Ollama models with LangChain. Ollama is so easy to set up (it will take you probably five or ten minutes) and so easy to add new models to, and then LangChain makes it even easier to just start asking questions.

Hopefully this was useful and you learned something from this video. I'll be adding this code to the repo and updating it so you can find all the scripts and guidance on how to run this. If you have any questions, put them in the comments below. I also want to explore more ways to integrate Ollama with LangChain, especially with the Ollama API, so if you're interested in that video, please let me know in the comments. Thank you so much for watching, and I'll see you soon.
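As referenced above, here is a sketch of the two variations discussed in the video: swapping the model name to the larger Mixtral model and attaching the streaming callback handler that was skipped earlier. The callback_manager argument follows the LangChain documentation of that era (newer versions prefer a plain callbacks list), and the snippet assumes "ollama pull mixtral" has already completed.

```python
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.llms import Ollama

# Swap in the 26 GB Mixtral mixture-of-experts model and stream tokens to
# stdout as they are generated (assumes `ollama pull mixtral` was run).
llm = Ollama(
    model="mixtral",
    callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
)

# With the streaming handler attached, the answer prints token by token,
# so no explicit print() around the call is needed.
llm.invoke("Tell me the history of AI and name some influential figures in the space of AI.")
```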
Info
Channel: CloudYeti
Views: 3,157
Keywords: local llm, local ai, langchain, ollama, mistral, mixtral moe
Id: Mv2t505oHiM
Length: 9min 29sec (569 seconds)
Published: Tue Jan 23 2024