Local Function Calling with Llama3 using Ollama and Phidata

Video Statistics and Information

Captions
Hi guys. If you're going back and forth with your assistants and you're not able to get proper output from them, then you may be missing these three components: memory, knowledge, and tools. These three are essential for building a really good chatbot or AI assistant and integrating it into your daily life. I want to introduce you to Phidata, a library that helps you turn any raw LLM into an AI assistant: an assistant that has memory, has knowledge, and can use tools. For example, if you want to search the web, this is the code, and it's pretty simple. If you want to make an API call, which is a tool, this is the code. If you want to add knowledge for a RAG application, this is the code. If you want to add memory in the form of chat history, or you want to query a database, the code is again pretty simple.

Before I move on, I'd like to show you my channel. If you go over to my channel and look at the most popular videos, you can see that people really liked the AutoGen configuration video, because about five months ago it was quite extraordinary to have agents do everything automatically. Then we have MemGPT, which provided long-term context memory, and then CrewAI, which was a simpler implementation compared to AutoGen and MemGPT. Phidata is simpler still; it makes it much easier to build AI assistants very quickly.

I'll show you. We have memory: when we enable memory, we can enable long-term conversations, meaning chat history, summaries, entities, and facts. I was working with an organization that needed a RAG-style chatbot over the data in its database, and they wanted it to have long-term memory; this is what they could implement. Knowledge: we get a RAG-style interface for storing knowledge in vector DBs, so PDFs, HTML, JSON, docs, SQL, any knowledge base. And we can implement different tools,
for example running different queries, calling different APIs, sending mail, and integrating the LLM with your application. That is pretty cool.

Now, before we move on, here is the implementation using an Ollama LLM, which is an open-source setup. From phi.assistant we import the Assistant, we import Ollama, and we use Ollama; we will be testing with Llama 3 here. Then we give the system prompt and the user input, and that is it. It's pretty simple to get started, and not just to get started but to build high-end applications as well, which we'll see in this video.

So what do we have again? We have assistants: an assistant is simply a combination of a raw LLM plus memory plus knowledge plus tools, and then you have an assistant you can interact with. An LLM powered with memory can implement chat history, summaries, entities, and facts, and we store those in a database. LLMs can be improved with a knowledge base, for example PDFs, websites, docs, JSON, and text, which you put into a vector database. On vector databases, I have an entire RAG series going on separately on this channel, so please follow the channel. A vector DB is different from a normal database because it needs to capture semantic meaning: when we ask a query, it needs to fetch related chunks from the vector DB, and those chunks should be similar to the query, so the semantic meaning has to be embedded when the text is stored as numbers, that is, as vectors. Tools: we can implement different tools, like searching the web, SerpAPI, sending mail, running queries, calling different APIs (including your own personal APIs; if you have an API running somewhere, you can call it), and triggering workflows, starting a new sequence of steps once some event fires. It's powerful, and it might look complicated, but even if it feels complicated to get the whole thing running, you should know that Phidata has already done it.
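The semantic-retrieval idea above (store chunks as vectors, fetch the ones most similar to the query) can be sketched without any vector database at all. The tiny hand-made "embeddings" below are purely illustrative stand-ins for what a real embedding model would produce:

```python
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors: closer to 1.0 means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": in a real RAG app these come from an embedding model,
# and the chunks live in a vector DB rather than a dict.
chunks = {
    "invoices are due in 30 days": [0.9, 0.1, 0.2],
    "the office cat is named Milo": [0.1, 0.8, 0.3],
    "late payments incur a 2% fee": [0.8, 0.2, 0.1],
}

def retrieve(query_vec, k=2):
    """Return the k chunks whose embeddings are most similar to the query."""
    ranked = sorted(chunks, key=lambda c: cosine_similarity(chunks[c], query_vec),
                    reverse=True)
    return ranked[:k]

# A billing-related query vector should land near the two billing chunks.
print(retrieve([0.85, 0.15, 0.15]))
```

A normal database would need an exact keyword match; here the nearest-vector lookup is what gives the "semantic meaning" the video talks about.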
Now the only thing remaining is to select the appropriate memory, select the appropriate knowledge base you're going to use, select the appropriate tools, and think of your own use case. This is the first video on Phidata on this channel, and I will be coming up with more videos, maybe even ten. If you have a particular use case in mind, you can go ahead and read the documentation and build it, but if you're following me, I'm sure you will find many use cases relevant to your work.

This is the GitHub repo, and we are going to test out one or two examples. So what I'm going to do is copy this clone URL to the clipboard, go to a particular folder on my local system, open cmd, and run git clone with the pasted URL. While this is cloning, you need to get Ollama running as well. You can go to the Ollama site and download it; I am on Windows, so I download the Windows version, install it, and check whether Ollama is running. Once the app is installed, you click on it and you will have Ollama running in the taskbar. After that, open a cmd window and type ollama; we can see Ollama responding, so Ollama is running on my system, and we can close these windows.

Now, the clone has finished, so we have the folder where Phidata was cloned. I change directory into phidata (cd phidata) and then go further into a particular cookbook where we are going to test this example. We have so many other examples, so please follow my channel, where we will test out each of the tools and assistants created using Phidata. I've collaborated with the creators of Phidata, and they're happy to share the code and clear any doubts I have. So I changed directory to cookbook/llms/ollama/tools, and now here I'm
going to open the Visual Studio Code editor, so: code, space, dot (`code .`). This opens up the code editor, and now I open a new terminal here. It's always good to have a separate environment to work in, so I'm using conda. Running conda info --envs lists the environments I have. I'm going to work in a phidata environment; to create one like it, you run conda create -n phidata python=3.11 -y, and once that finishes you'll have a new environment called phidata on your system. Now we activate it: conda activate phidata. We can see it's active now.

So now I can go through the files we have in this cookbook. We have the requirements files here; they have been frozen into requirements.txt, with the versions pinned using this command. Now we need to install them: pip install -r requirements.txt. You can press enter; I've already pressed enter and installed everything, which is why we see "requirement already satisfied". Next, the readme.md file is just the readme, which you can read here as well; we can close it. Then we have the main Streamlit app and the assistant.

In the assistant file, what we have is a pretty simple implementation of an assistant. First I'm going to switch this environment to phidata. Then we import the libraries; the main thing is that from phi.
assistant we import Assistant, from the Ollama module we import the Ollama tools, and then we import the DuckDuckGo tools, the Tavily tools, and the YFinance tools. So these tools are imported; it's pretty easy. Then we define a new function, get_local_assistant, which takes these parameters: llm_model, a string whose default is kept as llama3 for now (we will select our model and can change it later); a boolean for the DuckDuckGo tools whose default is False; and similarly we define flags for the other tools we use. We also define some IDs here, and debug mode is kept as True. The function returns an Assistant.

Next we have this piece of code, which creates an empty tools list, and then checks for the three tools: if a given search tool is enabled, we append it to the list. Then we create the assistant: assistant equals Assistant with this name, the run ID, the LLM set to the model name, and the tools we collected. We also set show tool calls, which shows the tools it is using; this is mainly for debugging, and once you are live in a production environment, you can remove it. Markdown is True so you can see the outputs in markdown, we add the date and time to the instructions, and we set debug mode. Then we add an introduction: "Hi, I'm a local AI assistant that uses function calling to answer questions. Select the tools from the sidebar and ask me questions." And the function returns the assistant. So this entire function takes these inputs and returns an Assistant with these properties: the name, run ID, user ID, LLM, the tools, show tool calls, markdown, add-datetime-to-instructions, and debug mode. Whenever I call get_local_assistant, I get an assistant.

Okay, so this is the main app, which uses Streamlit as st. What I'm going to do is run it first and then explain it, because there are so many pieces in use that it will be easier for me to explain once we have it running.
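The flag-to-tool-list pattern in the assistant file is independent of Phidata, so here is a stdlib-only sketch of the same idea. The two stub functions are illustrative stand-ins for the real YFinance and DuckDuckGo tools, not Phidata's API:

```python
def get_stock_price(ticker: str) -> str:
    # Stub standing in for the YFinance tool.
    return f"{ticker}: 170.18"

def ddg_search(query: str) -> str:
    # Stub standing in for the DuckDuckGo search tool.
    return f"web results for {query!r}"

def build_tools(yfinance: bool = True, ddg: bool = False) -> list:
    """Mirror get_local_assistant: start with an empty list and
    append only the tools whose flags are enabled."""
    tools = []
    if yfinance:
        tools.append(get_stock_price)
    if ddg:
        tools.append(ddg_search)
    return tools

# Only YFinance enabled, as in the video's first run:
print([tool.__name__ for tool in build_tools(yfinance=True, ddg=False)])
```

The resulting list is what gets handed to the assistant; everything else (show tool calls, markdown, debug mode) is just configuration on top.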
To run a Streamlit app, what I can do is streamlit run app.py. Once I run this, we can see that it is hosted at the local URL here, plus a public URL; if I keep this running, you could access the app on my system via that URL. Cool, so we have this up; let me explain everything here.

We have imported the libraries: from phi we import the Assistant and the logger, and from the assistant.py file we import the get_local_assistant function. Then we apply nest_asyncio; this is used when you're working in a Jupyter-style environment, so we can have different asynchronous loops going. Next, this piece of code, st.set_page_config, sets the page title and the page icon, which you can see at the top of the tab: "Local function calling" and the orange heart (I said yellow heart; sorry, it's orange). Then we have the title, "Local Function Calling with Ollama", and we use st.markdown for the orange heart plus "built using phidata" with a link; it's clickable and goes to the Phidata GitHub page.

Then we create restart_assistant, which resets the different variables: it resets the DuckDuckGo search flag to False, resets the Tavily search flag to False, resets the YFinance tools flag to True, then deletes the session state and reruns. This is basically for restarting the assistant. Now comes the main function, which returns None. For the LLM model, we can see we have st.
sidebar.selectbox, and on the left you have the select box: "Select LLM" is written there, with options for selecting the model, so we have Llama 3 and OpenHermes, and you can add thousands of models here. Next we check if the chosen LLM is in the session state; if not, we set it and restart the assistant. Then in the sidebar we have "Select tools": there are three, YFinance, DuckDuckGo search, and Tavily search. If we select YFinance, we save the internal state yfinance_tools_enabled as True and then restart the assistant. This next part of the code adds ddg_search_enabled to the session state; this variable is set to True if you click DuckDuckGo search. And again here we have the Tavily search: if Tavily search is clicked, then tavily_search_enabled is set to True.

Okay, so this is the place where we call the assistant; we generate an assistant here with get_local_assistant, which, if you remember, is the function we saw earlier. We pass in the large language model you selected, say Llama 3, and the tools that are selected; if you select just YFinance, then yfinance_tools_enabled will be True and the other two will be False. Then we enable the chat history as well: the introduction is stored as a message with the role "assistant" and the content "ask me questions"; if you put in a message, it is saved with the role "user" and the content set to your prompt; and the output of the model is saved with the content set to the response. That is the history you can see. Now, on the sidebar, if we press "New run", it reruns and restarts the assistant. And at the end we call the main function, which starts everything up.
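The chat-history and reset bookkeeping described above boils down to a list of role/content dicts kept in the session state. Here is a minimal sketch of that pattern, with a plain dict standing in for Streamlit's st.session_state:

```python
session_state = {}  # plain dict standing in for st.session_state

def add_message(role: str, content: str) -> None:
    """Append one chat turn, creating the message list on first use."""
    session_state.setdefault("messages", []).append({"role": role, "content": content})

def restart_assistant() -> None:
    """Mirror the app's reset: clear everything, restore default tool flags."""
    session_state.clear()
    session_state["yfinance_tools_enabled"] = True
    session_state["ddg_search_enabled"] = False

restart_assistant()
add_message("assistant", "Ask me questions.")
add_message("user", "What is the stock price of Tesla?")
add_message("assistant", "The current stock price of Tesla is 170.18.")
print([m["role"] for m in session_state["messages"]])  # -> ['assistant', 'user', 'assistant']
```

In the real app, Streamlit replays this message list on every rerun to redraw the conversation, which is why clearing the session state is all "New run" has to do.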
Now we can test this out. Let's select Llama 3 and YFinance and give a prompt: what is the stock price of, let's say, Tesla? The answer: the current stock price of Tesla is 170.18. Let me check Tesla's stock price... yes, that's 170.18, so the answer is correct. Now, on the back end, if we go and look at the code it executed, we can see it starts from somewhere here: "You are a function calling AI model... you are provided with function signatures within XML tags", and so on. As you can see, the prompt is very detailed; this is pretty cool. You can just change the tools here, turn on DuckDuckGo search, put in a question, and that will be it.

Now, this is just one example. If you want to see different examples, you can go to the cookbook here, or you can go to the assistants; there are various assistants that you can try and run. But if you follow this channel, we will go through each and every one of them in the subsequent videos, and you won't have to go through the manual labor of understanding each assistant yourself; just follow this channel and we will delve deeply and study all the things necessary for getting Phidata started for your AI agents and for creating your AI assistant.

Having said that, I think this should be it. Please watch the next video on Phidata, and if you have any questions, please shoot them in the comments section and I will be happy to answer your queries. Also, please subscribe to this channel and join my Patreon if you want to support these efforts; also, like and share this video with your friends. Please watch the next video, and I wish you the best of luck in building your own AI assistants with memory, knowledge, and tools. Thank you.
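The backend prompt seen in the logs, with function signatures inside XML tags, is produced by serializing each tool's signature into the system prompt. This is a rough, hypothetical reconstruction of that step, not Phidata's actual code; the tool function and prompt wording are illustrative:

```python
import inspect
import json

def get_stock_price(ticker: str) -> str:
    """Get the latest stock price for a ticker symbol."""
    return "170.18"  # stub body; a real tool would query a market-data API

def tool_signature(fn) -> dict:
    """Describe a Python function so the model can emit a matching call."""
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": list(inspect.signature(fn).parameters),
    }

def build_system_prompt(tools) -> str:
    """Embed JSON tool signatures inside <tools> XML tags, as in the logged prompt."""
    signatures = json.dumps([tool_signature(t) for t in tools])
    return (
        "You are a function calling AI model. You are provided with "
        f"function signatures within <tools>{signatures}</tools> XML tags."
    )

print(build_system_prompt([get_stock_price]))
```

The model then answers with a structured call like {"name": "get_stock_price", "arguments": {"ticker": "TSLA"}}, which the framework dispatches to the matching Python function and feeds the result back, which is exactly what "show tool calls" exposes in the UI.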
Info
Channel: Prompt Engineer
Views: 4,781
Keywords: prompt, ai, llm, localllms, openai, promptengineer, promptengineering
Id: RfIXVlMEi4c
Length: 19min 30sec (1170 seconds)
Published: Fri Apr 26 2024