Run Ollama on Windows - Step By Step installation of WSL2 and Ollama

Video Statistics and Information

Captions
When you work with multi-agent frameworks like AutoGen, TaskWeaver, or CrewAI, you sometimes need to work with multiple LLMs locally on your machine, especially when tasks are run sequentially by the agents. In AutoGen or CrewAI we can use Ollama to switch between the LLMs so that different agents can use different LLMs. Ollama is available on macOS and Linux, but the Windows version is not ready yet, so to be able to run Ollama on Windows we need to turn on the Windows Subsystem for Linux (WSL) feature and then download Ollama for Linux. So let's dive in to see how we can install Ollama on Windows.

First, we search for "Turn Windows features on or off" in the Control Panel and open it. We scroll down, activate "Virtual Machine Platform" and "Windows Subsystem for Linux", and click OK. Windows applies the changes and needs to restart. After restarting, we navigate to the Microsoft support pages to see a step-by-step guide on how to install Linux on Windows with WSL. To do this, we open the Terminal as administrator and run the command "wsl --install -d" followed by the distribution, in this case Ubuntu. This takes a while, and then we can enter a username and password for the Linux user account. After WSL version 1 is installed, we get a message that a newer version is available, and we can update WSL using the command "wsl --update". Ollama does not work on WSL 1, so we need to update. The update process takes some time, and after WSL is updated to version 2 we need to set the default version of WSL to 2 by typing "wsl --set-default-version 2". Additionally, we use the command "wsl --set-version Ubuntu 2" to convert the existing Ubuntu install. After the conversion is done, we are ready to install Ollama on Windows as if we had a Linux machine.
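Collected in one place, the WSL setup steps described above look roughly like this. The two dism.exe lines are an assumption on my part: they are the command-line equivalent of ticking the two checkboxes in "Turn Windows features on or off", which the video does through the Control Panel instead.

```
REM Run in an elevated PowerShell/Terminal (assumed equivalent of the
REM Control Panel checkboxes shown in the video):
dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart
dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart
REM ...reboot Windows, then:
wsl --install -d Ubuntu        REM install the Ubuntu distribution
wsl --update                   REM update WSL itself (Ollama needs WSL 2)
wsl --set-default-version 2    REM make WSL 2 the default for new distros
wsl --set-version Ubuntu 2     REM convert the existing Ubuntu install to WSL 2
```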
We navigate to ollama.ai, open the download page, and click on the Linux version to see the command needed to install Ollama on Linux, and copy it. Back on our machine, we search for Ubuntu and run it as administrator. When we see the prompt, we paste the install command for Ollama and hit Enter; this may take a while. After Ollama is installed the prompt comes back, and depending on your hardware you may get an Nvidia GPU or a CPU-only installation; on this machine Ollama detected the Nvidia GPU. When we enter "ollama serve", we see it is already listening on port 11434. We can run Ollama with an LLM like Mistral. As it is the first time we run Ollama with Mistral, it first downloads the LLM to our machine; this will take some time depending on the size of the LLM file. After the LLM is downloaded we can send a prompt to the LLM, for example "tell me a story". The speed of the inference depends on your hardware, and after the process is done you will get the answer. We can use /? to get help, /show to see more information, /show info to see information about the LLM like the parameter size, and /bye to exit.

You may want to install additional models locally, for example for better local function calling we can download the LLM OpenHermes. We can type "ollama run openhermes", and as it is the first time Ollama uses OpenHermes, it first downloads the LLM to our machine. After the installation is finished we can type "tell me a joke" to test the LLM, and finally exit.

To wrap it up, Ollama is a very useful tool to work with multiple LLMs locally and switch between them quickly. You can download it on Mac and Linux, but on Windows, for the time being, we need to first install WSL 2 and then Ollama. After Ollama is installed we can download multiple LLMs to our machine and start testing our multi-agent frameworks like AutoGen, TaskWeaver, or CrewAI with open-source, free, private, local LLMs and save tokens and money. Good luck running Ollama on Windows!
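For reference, the Ollama steps from the walkthrough can be sketched as the following commands, run inside the Ubuntu (WSL 2) shell. The exact install one-liner is whatever the Linux download page shows at the time; the URL below is how it appeared when this video was published and may have changed since.

```
# Install Ollama with the official Linux script (copy the current command
# from the download page rather than trusting this URL):
curl -fsSL https://ollama.ai/install.sh | sh

ollama serve           # start the server; it listens on port 11434
ollama run mistral     # first run downloads the model, then opens a prompt

# At the interactive prompt:
#   /?           show help
#   /show        show more information
#   /show info   model details such as parameter size
#   /bye         exit

ollama run openhermes  # pull and chat with OpenHermes
```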
Info
Channel: business24_ai
Views: 5,483
Keywords: ollama, ollama windows, ollama windows install, ollama windows wsl2, openhermes, openhermes 2.5
Id: C7rFk-GbdCg
Length: 5min 29sec (329 seconds)
Published: Sun Jan 14 2024