Private Chat with your Documents with Ollama and PrivateGPT | Use Case | Easy Set up

Video Statistics and Information

Captions
All right guys, in this video we are going to use Ollama and PrivateGPT to talk to our documents. This is a PDF file, the book Think and Grow Rich, a book about success and mindset, but it's a 200-plus-page book, and we are going to ask questions of this book using a technology known as PrivateGPT, with Ollama powering PrivateGPT. On my last video, the ChatGPT-like interface for Ollama (open source), I received a few comments where people wanted to ask questions of their own files, for example: how do I upload a CSV file and ask questions about it? Can I chat with docs through this interface? What are the limits? These were some of the questions I received on the last video, and that motivated me to put together this system, where we start up a large language model locally using Ollama, power PrivateGPT with the large language models from Ollama, and then put in our files, any number of files, and ask questions and chat with them. So let's get started.

First of all, we need to install Ollama. We go to the Ollama website and click on Download. There is a macOS download here, which we download and install; there is a Linux version as well, but it's still not available for Windows. For the Mac I have already downloaded it, and once you install it you will see the icon in the top bar showing that it is running. You can quit Ollama from there, and you can start it again from its icon, and the icon will appear in the top bar again. Now Ollama is running. How do we test it? Let's go to Applications and open up a terminal, and in this terminal it's very easy: we say "ollama run mistral", for example, for the Mistral model. If Mistral is present on your system, if it is already downloaded, then this directly starts up a chat. I have Mistral installed here, so: tell me a joke. I keep giving the same example again and again. Another one: why do programmers prefer dark mode? Because light attracts bugs. And another one, and you can see the speed; I've already shown this to you, this is a MacBook Air with 8 GB of memory. Now, since this is running, we can exit the chat and clear the terminal. So we have Ollama running.

Once we have Ollama running, we need to integrate it with PrivateGPT. Instead of creating separate repos on my GitHub profile to share the code with you, what I've done is this: if you go to Google and search for "PromptEngineer48 GitHub", the first link will be my GitHub profile. Open it, go to the pinned repos, and open the Ollama repo, and there you will find the different folders that I made. The first folder is already done; in the last videos we have already seen how to get it running. I'm just going to show it: that was the code for the LangChain integration, where we are just calling Mistral through LangChain. But the next folder, "Ollama privateGPT chat with docs", is the folder that this video is based on.
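For reference, the Ollama part of this step comes down to a couple of shell commands, roughly like this (a minimal sketch, assuming Ollama is already installed from the official download and that you want the Mistral model):

    # Check that the Ollama CLI and background service are available
    ollama --version
    ollama list                # shows the models already downloaded locally

    # Start an interactive chat with Mistral; the model is downloaded on first run
    ollama run mistral

    # Inside the chat, type /bye (or press Ctrl+D) to return to the shell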
I shall number the folders in the sequence of the videos, so in the next video we can expect a third folder on a different concept, but still based on Ollama. I shall publish everything here, and you can clone the repo from this link. Let's go to this folder: in it there are different files, and what I need you to do is read the README here, where I have listed the different steps that we need to take.

What we are going to do now is clone this. Let me go back: this is my main repo, PromptEngineer48/Ollama, and from there we're going to clone it, so click on the green button and copy the link. Let's go to the Visual Studio Code editor and open up a new folder. I'm going to go to my Documents and create a new folder here, let's say "chat with docs". This is my folder, and I'm just going to open it. Once you open this folder you have an empty workspace, so here we need to pull the GitHub code. Since we have already copied the link, we can create a new terminal here and say "git clone" followed by the link; you need to have Git installed for this to work. We press Enter, and this copies everything into an Ollama folder inside our folder. Now we change the directory into it: we can do "cd Ollama" and then cd into the subfolder, but instead of doing that you can just right-click the folder, copy its path, and change directory to that path. We can see that we are inside the Ollama PrivateGPT folder now. We clear the terminal and list the files that are here: the license is just a license, and the main file I wanted you to read is the README, so we open the README file. These are the different steps that you need to take. I am also testing this myself because I want you to have a seamless experience, so I would like to follow exactly what is written, and if there are any mistakes in the write-up I shall correct them so that you get the best version of the steps needed to get this running.

As a first step we need to set up a virtual environment. You can use conda or you can use venv, but I'm going to use conda this time. So we say "conda create"; the name of the environment is, let's say, "private", but I already have a "private", therefore I'm calling this one "private1", and the Python version that we're going to use is 3.11. It's as simple as that. It asks whether to proceed, we say yes, and the conda environment is created. Inside this conda environment we are going to do all the installations and all the running that this setup needs, but first we need to activate it. We can see that the prompt still shows "base", but we want "private1" to appear there so that we know we are working in the private1 conda environment, so we just say "conda activate private1". We can see that private1 has been activated now. We clear the terminal, and so we are inside the private1 conda environment and inside the Ollama folder. Now we need to do the installations. If you look at the requirements file, these are the packages we need to install, and we can install them with "pip install -r requirements.txt", where -r tells pip to read the requirements.txt file and install whatever is listed there.
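Put together, the setup from this part looks roughly like the following shell commands. This is only a sketch: the repository URL is assumed from the channel name shown in the video, and the exact folder layout may differ, so check the PromptEngineer48 GitHub page and the repo's README before copying anything.

    # Clone the repo described in the video (URL assumed from the GitHub profile shown on screen)
    git clone https://github.com/PromptEngineer48/Ollama.git
    cd Ollama        # then cd into the chat-with-docs subfolder if the code lives one level deeper

    # Create and activate an isolated conda environment with Python 3.11
    conda create -n private1 python=3.11
    conda activate private1

    # Install the Python dependencies listed by the repo
    pip install -r requirements.txt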
In my case the requirements are already satisfied inside private1, okay, this is done, so we clear the terminal and keep following the steps in the README. The second step is done, we did the installation. The third step is to pull the models, in case you don't have any models in Ollama yet; but we know we have models running in Ollama because we have already checked with Mistral. As an example, I'm just going to run "ollama pull mistral". Let's see: it's pulling the manifest, and since we already have the model this is fast, because it's already available, so it's a success, and we clear the terminal. So pulling is done.

Next, you want to put your files in the source documents folder, after creating that directory. We want to make a directory called source_documents, and the shortcut is "mkdir source_documents". Just press Enter and you'll see that we have this folder now. If I go back to my Downloads folder, I have this Think and Grow Rich book; I'm just going to copy it, go to Documents, then "chat with docs", then the Ollama folder, and paste it inside source_documents. So I have the book inside source_documents now. If I go back to my editor we can see that the Think and Grow Rich PDF has landed inside source_documents; I can check it, okay, we have the book.

Let's go to the next step. After placing the files in source_documents, what you need to do is ingest them, that is, take the files in. We have this ingest.py file here and we're going to run it, so I say "python3" (on Mac; on Windows you can just say "python") followed by "ingest.py" and press Enter, and you should see something like this, where the file is being ingested. It reads the file, it has loaded 235 new documents and split them into 1268 chunks, and it's now creating the embeddings. It says this may take a few minutes, but it didn't take that long; it's already completed. The video is not sped up or anything like that, there is no editing in this part. So we have ingested the file, and now what we need to do is run privateGPT.py.
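In command form, this part of the walkthrough looks roughly like this (a sketch; the PDF path and filename below are only an example, and the script names are the ones shown in the video's repo):

    # Make sure the model PrivateGPT will use is available locally
    ollama pull mistral

    # Create the folder the ingest script reads from, and drop your documents into it
    mkdir source_documents
    cp ~/Downloads/think-and-grow-rich.pdf source_documents/   # example filename, adjust to your file

    # Build the embeddings / vector store from everything in source_documents
    python3 ingest.py          # on Windows: python ingest.py

    # Start the interactive question-answering loop
    python3 privateGPT.py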
So we say "python3 privateGPT.py". Okay, let's wait. Okay, it's very fast. It asks us to enter a query, so let's say: why do most people fail in executing the strategies mentioned in the book Think and Grow Rich? Let's see the results. By the way, the kinds of files you can upload include CSV files, doc files, ePub, HTML and all sorts of documents; we have used a PDF here, but all sorts of documents work. Now it is running, and you can see the speed; I have not cut or done any editing here. Let's see the result: most people fail to execute the strategies mentioned in the book because one of the main reasons for failure is lack of decision making, which is revealed through an accurate analysis of over 25,000 men and women who have experienced failure; procrastination, which is the opposite of decision making, is a common enemy that many people must conquer in order to succeed. Next query: list five key learnings from this document. Let's see; it also gives us the sources where it found the answers. So these are the five learnings: reading this book is an indication that you are earnestly seeking knowledge; as you read, underline sentences that impress you favorably; if you only learn as a student, there may be much you didn't know; failure is possible if you choose not to follow the instructions; the document claims that the foregoing instructions will open the way for complete understanding and mastery of the principles of success. Not bad. It really depends on the model that you're using; we are using Mistral here, if you remember correctly. If you go to the Ollama site and open the models section, you can see the list of models that are available, Mistral, Llama and so on. Depending on what documents you are importing, you can choose whichever model you want. But yeah, this is how you integrate with your documents, read them, and ask questions of them using PrivateGPT, powered by Ollama.

Now I would like to summarize everything we have done. I have created a repo known as Ollama; if you go to the PromptEngineer48 GitHub you will find it, and inside that repository you'll find different folders. We are talking about the second folder here, which contains the integration of Ollama and PrivateGPT. We download the code, follow the different steps that I have mentioned in the README section, and then we are able to successfully ingest a file and ask questions of it. This is in response to the comments that I received on my video. If you want me to make videos on the questions you post in my comment section, I shall be happy to do so; I have many projects lined up, but I do find time to bring out the videos that are requested. The most important project that I'm currently working on is the integration of MemGPT, AutoGen and the Ollama APIs, or Ollama server calls. Having said that, I think this should be the end of my video. If you liked this video, please subscribe to my channel for interesting content like this. If you found it useful, share it, like it, and leave a comment. I also request you to join me if you like, and if not, that's okay, but please subscribe to my channel and I will see you next time. This is your host, Prompt Engineer, signing off. Bye-bye; check out the other videos on my channel. Thank you so much.
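As a closing reference for the model-choice point above: trying a different Ollama model is just another pull on the Ollama side, roughly as sketched below. The model name is only an example, and where the PrivateGPT script reads its model name from depends on this repo's code or README, so treat that part as an assumption to verify.

    # Download another model to experiment with (example model name)
    ollama pull llama2

    # Confirm which models are available locally
    ollama list

    # Then point the privateGPT script at the new model name, wherever this repo
    # configures it (check the script or the README for the exact setting).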
Info
Channel: Prompt Engineer
Views: 11,421
Keywords: prompt, ai, llm, localllms, openai, promptengineer, promptengineering
Id: lhQ8ixnYO2Y
Length: 15min 55sec (955 seconds)
Published: Sun Nov 19 2023