How to run an LLM locally on Windows as a patent attorney

Captions
Hey, it's me, Bastian, and in this video I'm going to show you how to install a large language model locally on your computer; in this case, I'm on a Windows laptop. Lately, the tool I've been using is called Ollama, so what you want to do is open up a browser and go to the Ollama website. My main computer for my day-to-day work is actually a MacBook, where I've already installed all the tools and use them on a daily basis. But for this demo, and since many attendees of my PowerClaim seminar are using Windows, let's see if this runs on Windows as well. I haven't tried that before, so this is a first for me. Ollama says it's available for macOS, Linux, and Windows (preview), so let's click the download button and download the Windows version.

Okay, once this has downloaded, we can install it, so I'm just starting the setup. There we go, let's click "Install". While the Ollama tool is installing: what Ollama actually is, is essentially the user interface through which you can download all sorts of open-source LLMs, which you can then use through the local terminal on your computer. While it installs, I'm just going to fast-forward so that we don't waste your time.

Okay, now that Ollama is installed, you can see the little icon here which shows you that it's running in the background. As I said, you use it through your Windows terminal, so let's fire up a new terminal. The nice thing about Ollama is that you can download all sorts of different open-source models; there's a very large and growing list of them. For this demo, let's go for one of the most popular models, which is Llama 2. You can browse through all the models on the Ollama website, and then you just run the corresponding command in your terminal. So we say "ollama run llama2", and since this is the first time I'm requesting to start Llama 2, it now pulls the model from the Ollama repository. This should take some time because it's 3.8 GB in size, so again I'm fast-forwarding until the model is installed.

Okay, there we go, that's it: the model has been installed, so now I have Llama 2 locally on my laptop. Now let's try it out. To really show you that it runs offline, let me just remove my Wi-Fi connection and also go into airplane mode, so that there is no connection to the outside world. Now we are right in the terminal and can just chat with the running Llama 2 model. So let's, for example, ask it: "Have you been trained on patent documents?" Llama 2 is thinking, so let's see what it responds.

Okay, while Llama 2 is still typing, it has actually said that it was indeed trained on a large corpus of patent documents from around the world, including both granted patents and patent applications in various technical fields. Which is nice, because we want to use it to draft patents in my upcoming seminar.

A quick note on performance while the model is still typing: this is very, very slow. Up to this point, it took the model almost 5 minutes to come up with this answer; on my MacBook it runs much, much faster. The problem is probably that this laptop, this Windows machine (ah, now it's finished) is a very, very basic machine. Let me quickly show you what kind of computer this is: it's actually a Surface Book with a small i5 processor and only 8 gigabytes of RAM, so that's really among the smallest laptops you can get, and it's working quite a bit here. So you can see that running an LLM offline on your computer is hardware-intensive. This laptop would probably be too slow for me to use productively, so I will stick with my trusty MacBook. And if you're running Windows, you have now seen that it is possible in principle, but you should have a better hardware configuration.

Okay, so that's it for this video. Now you know how to run an LLM offline on your Windows machine. See you in the seminar, bye-bye!
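To recap, the terminal steps from the video can be sketched as a short shell session. The model tag `llama2` matches the entry in the Ollama model library at the time of recording, and `ollama list` / `ollama rm` are Ollama's standard subcommands for managing downloaded models; the install-check guard is an addition for safety, not something shown in the video:

```shell
#!/bin/sh
# Sketch of the workflow shown in the video. Assumes Ollama has been
# installed from the Ollama website and is running in the background.

if ! command -v ollama >/dev/null 2>&1; then
    echo "Ollama not found - install it first, then re-run this script."
    exit 0
fi

# The first invocation pulls the ~3.8 GB model from the Ollama registry,
# then drops you into an interactive chat prompt in the terminal:
ollama run llama2

# Useful housekeeping afterwards:
ollama list        # show downloaded models and their sizes
ollama rm llama2   # delete a model to free up disk space
```

Note that `ollama run` is interactive (type `/bye` to leave the chat), so the housekeeping commands above would normally be run in a separate terminal session.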
Info
Channel: Bastian Best
Views: 344
Id: Sgfd4EWUX5g
Length: 6min 8sec (368 seconds)
Published: Wed Mar 20 2024