LM Studio: Easiest Way To Run ANY Opensource LLMs Locally!

Video Statistics and Information

Captions
I have probably just found the easiest way to run local LLMs on your computer: introducing LM Studio. Basically, you're able to discover your favorite model, download it, and run the LLM on your local machine. It's quite a lightweight program that doesn't require much hardware to operate, which keeps it efficient while you use it. In my opinion this is quite a bit easier than installing text-generation-webui, and it has a sleek UI with a chatbot interface. It lets you run pretty much any open-source model you want, whether you're on Windows or macOS, which is quite awesome. Throughout today's video I'm going to explore what LM Studio is about by showcasing how you can run local LLMs on it, how inference works, and how to start a server within it. And that's just the tip of the iceberg; we're going to look at many more features of LM Studio, so make sure you stay tuned and let's get straight to it.

Hey, what is up guys, welcome back to another YouTube video at World of AI. As I mentioned at the start, we're going to take a look at LM Studio. You're able to discover, download, and run local LLMs within this application, and not only that, you're also able to run different types of inference, start servers, and so much more. So with that thought, let's get straight into it. If you want to install it, I'll leave the link in the description below; you can do this on macOS or on Windows. In this case I have Windows, so I'm going to move forward with the Windows installation. It's going to prompt you to download the installer, and once that has finished I'll get to the next step and showcase how you can actually install it.

If you guys would like to access our private Discord, in which you get exclusive subscriptions to AI tools, free giveaways, networking opportunities, consultation, and so much more, definitely take a look at the link in the description below. Also, please follow World of AI on Twitter so you can stay up to date with the latest AI news. Lastly, make sure you subscribe, turn on the notification bell, like this video, and check out our previous videos, because there's a lot of content you will definitely benefit from. So with that thought, let's get straight back into the video.

Once the installer has finished downloading, you can double-click on it and it will start installing. This might take a couple of seconds; my computer is quite slow, so it took a little longer than expected, but just like that we have it installed. It's as easy as that. The download is about 400 MB, so keep that in mind. Now that it has finished installing, you'll be able to start playing around with it, and that's what we'll cover next.

As mentioned at the start, it has quite a nice, sleek UI. On the homepage you're able to discover models by copy-pasting whatever Hugging Face repo you want to play with, or you can just search something up; for example, if you want to play around with OpenOrca, you can simply search for it.
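As a side note not shown in the video: since LM Studio's discovery page pulls models straight from Hugging Face repos, you can also preview which quantized files a repo offers before downloading anything in the app. Below is a minimal sketch using the huggingface_hub Python library; the repo id is only an illustrative example (not one taken from the video), and it assumes GGUF-format files, which is the format LM Studio typically runs.

```python
# A minimal sketch (not from the video): list the quantized GGUF files a
# Hugging Face repo offers, with their approximate sizes. The repo id below
# is only an example; swap in whatever repo you find through LM Studio's search.
from huggingface_hub import HfApi

api = HfApi()
info = api.model_info(
    "TheBloke/Mistral-7B-OpenOrca-GGUF",  # hypothetical example repo id
    files_metadata=True,                  # ask the Hub to include file sizes
)

for f in info.siblings:
    if f.rfilename.endswith(".gguf"):
        size_gb = (f.size or 0) / 1e9
        print(f"{f.rfilename}  ~{size_gb:.2f} GB")
```

The file sizes returned here correspond roughly to the download sizes LM Studio displays next to each quantization option.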
There are multiple different models you can play around with, and note that this is all open source. You're able to sort the results, including by compatibility, which filters based on what your computer can actually run, and you can also sort by most downloaded or least downloaded. If you go back to the most downloaded, you'll see there are different options you can install for each model, so it gives you good, customizable choices to work with, and that's one of the great things about LM Studio: you have so many options, which makes it easier to decide which model you want to work with.

If you search up OpenOrca once again, you'll see that you can download different versions that best fit your computer: you can see the GPTQ version, play around with other, smaller sizes, and even sort them down to whatever smaller size you want to use. For example, I searched up the new open-source model, Mistral, and sorted it based on the model's size; in this case the least hardware-intensive option is the 3.08 GB file versus the one that is 7 GB. You can download whatever model you want, and it shows the different categories for each model: whether it's instruct-tuned (in this case all of them are), whether it's quantized, the model size, and so on. If you want to download one, just click download and it will appear under your models. From there you can load it up and start playing around, so you can chat with it in the actual chat UI within LM Studio. You're able to chat with the large language model, send it different kinds of prompts, and set up a prompt template within the app, which is really unique and cool.

Now, say you want to start your own inference server: it's fairly easy, just a matter of a couple of clicks. You simply load up the model you just downloaded and start the server, as easy as that. They have made it quite simple to spin up a server you can connect other things to; for example, if you want to connect different types of models within your local server, you can do that and follow along in the server log as well as in the configuration settings.
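The video doesn't show any client-side code, but for context, LM Studio's local server speaks an OpenAI-style HTTP API. The sketch below is a minimal example of calling it from Python; the port (1234) and the /v1/chat/completions path are the usual defaults rather than something shown on screen, so adjust them to whatever the Server tab reports, and make sure a model is loaded and the server is started first.

```python
# A minimal sketch (not from the video): send a chat request to a running
# LM Studio local server. Assumes a model is loaded and the server is
# listening on its default port; change the URL if your setup differs.
import requests

response = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "In one sentence, what is a quantized model?"},
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
print(response.json()["choices"][0]["message"]["content"])
```

Because the request shape mirrors the OpenAI chat-completions format, most tools that can point at a custom OpenAI-compatible base URL should be able to talk to this server as well.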
Now, there are a couple of things to keep in mind before you install whatever model you want with LM Studio. It gives you a good rule of thumb for getting practical generation speeds, answering the question of which model you should choose: 7 billion, 13 billion, 15 billion, 30 billion parameters, and so on. It gives you a good picture of how much RAM you actually need to host these model sizes: with 8 GB of RAM you should stick to the 7-billion-parameter models or smaller; it recommends 16 GB of RAM if you're going to choose the 7-billion or 13-billion models; 32 GB of RAM will allow you to run the 15-billion and 30-billion models or smaller; and it keeps going on from there, so with 96 GB of RAM you can run pretty much anything you want. If you have anything lower than 8 GB of RAM, I highly recommend that you don't even play around with this, because there's no point; the models won't load and it's just going to cause a lot of headaches for you and your hardware.

In terms of data privacy, you might be wondering whether it's 100% private, and the answer is yes, because LM Studio has an encryption method as well as a statement that basically says the app only makes HTTP requests in three specific circumstances, which you can see listed here. It's fairly safe from what I've seen, so you shouldn't worry too much, but if you want to investigate and test it out yourself, do so at your own discretion. I highly recommend that you check out their Discord, which can give you a lot more information on the documentation for this software as well as a breakdown of certain tips that could help you out. There's also a feedback tab in which you can give constructive feedback for the application, request features, and help each other out, so if you're interested, definitely check that out. I'll leave the link to their Discord in the description below so you can get further ideas as well as recommendations for improving this app.

And folks, that basically concludes today's video. I hope you enjoyed this look at LM Studio; it's an easy, user-friendly way to run just about any open-source large language model on your local computer, and it runs efficiently even on lightweight hardware. There are a lot of customizable options and settings with which you can run your own server as well as play around with inference, so if you're interested, I'll leave all the links in the description below, the LM Studio link as well as their Discord link, so you can get a better idea of what you can do with it.

That's basically it for today's video. Thank you guys so much for watching, I really appreciate it. I want to mention one thing: we actually hit 25k subscribers yesterday, and I cannot fathom that; it blows my mind and I did not expect it. If you go on the About tab, you'll see I started this channel on February 2, which is insane; I did not expect it to blow up this fast. I didn't even start it in relation to AI; I just wanted to become a YouTuber ever since I was a kid, and I ended up fulfilling that dream. But I made it happen not because of myself, but because of you guys; without you, none of this is possible. So I just want to say thank you from the bottom of my heart, I really appreciate all the support and love you've been giving me. Just know that I'm going to keep working to provide you the best value and keep improving my skills so I can make the best content for you. So with that thought, thank you guys so much for supporting this channel; this is just the start, and I'm going to keep working to make sure you get the best content. Have an amazing day, spread positivity, check out the YouTube page, subscribe, turn on the notification bell, like this video, check out the Twitter page, and if you want access to our private Discord, I'll leave these links in the description below.
With that thought, thank you guys so much for watching, have an amazing day, and I'll see you guys fairly shortly. Peace out, fellas.
Info
Channel: WorldofAI
Views: 20,187
Keywords: lm studio, ai, llm, lmstudio, llama2, open-source, openai, lm studio ai, lm studio llm, download any opensource llm, opensource llm, mistral, easiest way to run llms locally, run llms locally, langflow, huggingface, LM Studio, Local Large Language Models, LLMs, Discover LLMs, Download LLMs, Run LLMs Offline, Hugging Face, Model Support, OpenAI, Privacy, Security, Technology, Research, Developers, Language Models, llama, llama 2, ai agents, chatgpt, run ai locally, local ai, opensource
Id: Z5_LvCwbgqg
Length: 10min 49sec (649 seconds)
Published: Wed Oct 18 2023