How to Use Ollama On Windows

Captions
Hello everyone, welcome back again. My name is Jesse, and in this tutorial we'll see how to install and use Ollama on Windows. Initially Ollama was only for Mac and Linux, but now there is a Windows preview, so let's see how to work with it.

First, go to ollama.com and download it; there we have the Windows preview, so I click on that one. It requires Windows 10 or later. Once it has downloaded, I open the installer from my Downloads folder and click Install. It installs into a particular location under AppData\Local\Programs, and that is also where the models are going to live. Once it finishes, you can see in the system tray that Ollama is running.

Now let's open the Windows Terminal and run ollama. The server is already running; in case you want to stop it, you can go back to the tray icon, where you can also view the logs or quit it. The installer has already added ollama to my PATH, and I can run ollama list to see whether I have any models. There are no models on my machine yet, so let's install one: with ollama I can use pull, or just go with run tinyllama. I don't have a lot of disk space, but TinyLlama is small enough to work with. To browse all the available models, go back to the official website, open the Models page, and search for one. What we need is TinyLlama; if I check the tags, this was the most recent one, so I copy the command and download it. You could also install Mistral and the others, but this one is not that big.

While it's downloading, the nice thing now is that there is also a Python library you can use: if you go back to the official Ollama repositories, there is ollama-python, the new library, which you can also install and work with.

Perfect, the model has finished downloading, so now I can send it a message: "What is a large language model?" It returns the result, which is very cool; I like this interface. "A large language model is a powerful tool that can be used for text processing..." and so on. That is how to use Ollama on Windows, very simple, and once the server is running you can do a lot more with it.

In case I want to see the models that were installed, I can open another tab and run ollama list, and you can see that I now have my model there. And in case I don't want a model anymore, I can just run ollama rm and specify its name. Well, it's still generating, perfect.

Let's see some other things we can do. By default the server listens on a particular port on localhost; let me paste the port here, and there it is, your Ollama server. So you can also query that endpoint from Postman or any other HTTP client, just as we are doing here in the terminal, and it works with the same kind of chat interface as if you were using OpenAI, which is very cool.

Inside the interactive prompt you can also enable verbose mode: with /set verbose we turn it on, and then when we type something in, it will also print a lot of extra information, such as timing statistics, after each response.

Now let's see the Python version. I go back to my terminal and run pip install ollama. (There is also a JavaScript library, which is very cool, but we'll use the Python one.) Once it's installed, it gives you a wonderful chat interface, like with ChatGPT and the OpenAI client: you call ollama.chat, specify your model and your message, and that's it. You can also stream the response in case you want the output one chunk at a time. And you can still communicate with the HTTP API directly, just as we did earlier on.

Back in the terminal, /set verbose is now giving me that extra information about the run. In case you want to stop Ollama from running, you can go back to the tray icon and quit it. There is also the logs option, where you can view the server logs and the app logs in case anything goes wrong, which is very cool. There are many other things you can do as well.

So that is how to work with Ollama on Windows. Thank you for watching this tutorial, I hope you have learned something. See you another time, stay blessed, bye.
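The HTTP endpoint mentioned in the video can be queried from any language, not just Postman. Here is a minimal sketch in Python using only the standard library, assuming the server is on Ollama's default port, 11434; the build_payload and ask helper names are my own, not part of Ollama.

```python
# Minimal sketch: querying a local Ollama server's /api/generate endpoint
# with only the Python standard library. Assumes the default address
# http://localhost:11434; build_payload/ask are illustrative helper names.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Encode a non-streaming request body for /api/generate."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")

def ask(model: str, prompt: str) -> str:
    """POST the prompt to the local server and return the full response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# With the server running, something like this should work:
# print(ask("tinyllama", "What is a large language model?"))
```

Because stream is set to False, the server returns a single JSON object whose response field holds the whole answer; leave streaming on and you instead get one JSON object per line as the tokens arrive.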
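The pip install ollama workflow described above can be sketched like this. The build_messages helper and the demo function are my own names, and the chat calls assume the package is installed and the local server is running, so they are kept inside a function that is defined but not called here.

```python
# Hedged sketch of the ollama Python client (pip install ollama).
# The message dicts follow the OpenAI-style chat schema the library uses.
# demo() is deliberately not invoked: it needs the package installed and a
# running local server. build_messages/demo are illustrative names.

def build_messages(prompt: str) -> list:
    """Return a single-turn chat message list in the expected format."""
    return [{"role": "user", "content": prompt}]

def demo() -> None:
    import ollama  # requires `pip install ollama` and a running Ollama server

    # Non-streaming: the whole reply comes back at once.
    reply = ollama.chat(model="tinyllama",
                        messages=build_messages("What is a large language model?"))
    print(reply["message"]["content"])

    # Streaming: the reply arrives one chunk at a time.
    stream = ollama.chat(model="tinyllama",
                         messages=build_messages("Hello there"),
                         stream=True)
    for chunk in stream:
        print(chunk["message"]["content"], end="", flush=True)
```

The streaming variant is what the video refers to as getting the output one character at a time: passing stream=True turns the call into an iterator of partial messages.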
Info
Channel: JCharisTech
Views: 3,747
Keywords: ollama, ollama windows, ollama on windows, how to install ollama on windows, ollama python, ollama nodejs, ollama desktop, llama2, mistral llm, openai, chatgpt, jcharistech, how to run llm locally
Id: qTR70UrrLIU
Length: 6min 56sec (416 seconds)
Published: Sun Feb 18 2024