Open Source ChatGPT with Gemini Pro & Local LLM | LibreChat

Video Statistics and Information

Captions
Our local ChatGPT is ready, so let's test it: "What LLM are you?" And there you go. It can also generate code, and it was able to read an image and parse the text. So yes, you can run a ChatGPT Plus-like system on your own local PC, with a locally hosted large language model or Google's Gemini Pro API, all for free and with much more control and capability in your own hands.

Hi, you're watching Kno2gether, where I try to simplify the latest tech and innovations for your future and your business. Let's get started.

Welcome back. I know many of you spend $20 a month on a ChatGPT Plus subscription but don't really use it enough to get the most out of your money; I'm certainly one of you. So I kept thinking: could we have a similar ChatGPT-like interface running on our own local system, with almost all the capabilities that ChatGPT Plus offers, without spending that much money? Especially now that open-source large language models are getting stronger and more competent, and models like Gemini Pro are becoming free for developers, we could definitely have a system running locally that behaves much like ChatGPT Plus, for free.

This video has four sections. In the first section we'll look at the open-source project called LibreChat and install it on our system. In the second section we'll expose the Gemini Pro API behind an OpenAI-compatible API so we can use it with LibreChat. In the third section we'll add Google search capabilities so LibreChat can use live data from the internet. Finally, we'll integrate LibreChat with our locally hosted large language models and see how they perform. Without much further ado, let's get started.

This is the LibreChat open-source project, which is essentially a clone of the ChatGPT interface, and it comes with many features: it supports multiple languages, GPT-4 and Gemini Vision, LangChain, and much more. In this video we will not be able to cover every
aspect and feature of it, but we'll quickly look at why you should reconsider your ChatGPT Plus subscription and instead opt for LibreChat running locally.

First things first: clone the repository. I've come to my root project directory and run git clone on the LibreChat .git URL. I have already cloned it, so I'm not going to clone it again, but please go ahead and do so. You can now see the LibreChat folder; it comes with a .env.example file, so copy it and rename the copy to .env.

Here is the installation procedure. We'll follow the Docker Compose installation guide, because we will run LibreChat as a Docker image. We've already cloned the repository, and next it says you have to do the AI setup. LibreChat supports almost all of the major AI API providers; it also supports LiteLLM, and you can set it up with OpenRouter or some free AI API providers out there like Naga AI.

With that said, let's proceed to our second section, where we'll configure Google's Gemini Pro model, integrate it with LibreChat, and use it for free. I have already covered how to serve Gemini models behind an OpenAI-compatible API in a previous video; you can check it out from the link at the top right. In that video I explained how to run a Docker command that starts a proxy which returns Gemini model responses in the OpenAI API format, so you can use it in different projects. I have a small home server where I've deployed this proxy running 24x7; you can see it's up, and I can access it on port 8824. I'll close this now.

I also spun up my LiteLLM instance because, as you already know, I always like to put an AI API proxy in front of any project I build. Here in my LiteLLM setup I have different proxy configurations already defined, and I'll quickly walk through them. If you really want to see more about LiteLLM and how to install it, configure it, and use it as your AI API proxy server, check the link at the top right, where I'll attach my previous video about LiteLLM.
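As a sketch of what such a LiteLLM proxy configuration can look like: the model aliases follow the video, but note that in the video the author points LiteLLM at his own Gemini-to-OpenAI proxy, whereas this sketch uses LiteLLM's built-in `gemini/` provider route directly, as an assumption. The `os.environ/` syntax is LiteLLM's way of reading a key from the environment.

```shell
# Hypothetical LiteLLM config: LibreChat asks for "gpt-3.5-turbo",
# LiteLLM routes the call to Gemini Pro. Written to /tmp for illustration.
cat > /tmp/litellm_config.yaml <<'EOF'
model_list:
  - model_name: gpt-3.5-turbo            # name LibreChat will request
    litellm_params:
      model: gemini/gemini-pro           # route to Google's Gemini Pro
      api_key: os.environ/GEMINI_API_KEY # read the key from the environment
  - model_name: gpt-4-vision-preview     # vision alias from the video
    litellm_params:
      model: gemini/gemini-pro-vision
      api_key: os.environ/GEMINI_API_KEY
EOF
# Start the proxy on port 8000:
#   litellm --config /tmp/litellm_config.yaml --port 8000
```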
We'll only focus on our Gemini model for now. My local home server is running on this IP, and the proxy is available on port 8082. This is my Gemini API key. When LibreChat connects to this LiteLLM instance and requests gpt-3.5-turbo, LiteLLM connects to my proxy, which also presents the model as gpt-3.5-turbo; the proxy then calls the Gemini Pro model and fetches the result. Similarly, I have set up the gpt-4-vision-preview model. Google is currently offering 60 queries per minute for free on both their vision model and their text model, so we're good; we should be able to run this system completely free.

I'll start my LiteLLM proxy server; it has started on port 8000. I've now opened Postman, and I'm going to create a temporary key to be used with LiteLLM. This lets you track spending and manage API keys locally, so you don't have to handle the provider's API keys every time you work on a new project. If you want to know more about these LiteLLM features, write in the comments and I'll make a whole new video about how to set up your AI API proxy for use in your production projects.

Now we come to our .env file. We've got the API key, which we'll set here, and we'll set the reverse proxy URL to host.docker.internal:8000.
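The two .env entries being set can be sketched like this. The exact variable names are an assumption based on LibreChat's .env.example from around this period, and depending on the LibreChat version the reverse proxy value may need the full /v1/chat/completions path; the key shown is a placeholder for the temporary LiteLLM key, not a real OpenAI key.

```shell
# Hypothetical LibreChat .env fragment, written to /tmp for illustration.
cat > /tmp/librechat_env_snippet <<'EOF'
OPENAI_API_KEY=sk-my-litellm-temp-key
OPENAI_REVERSE_PROXY=http://host.docker.internal:8000
EOF
```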
The reason we're using host.docker.internal is that we're going to run LibreChat as a Docker image, while my LiteLLM is running outside of that image, on the host PC, on port 8000. One quick thing to mention: LibreChat also supports a Google key directly, so if you're in a country where the Gemini API is available, you can specify your Google Gemini API key directly. I could put the same key I was using here, but because I'm in the UK and the Gemini API hasn't rolled out here yet, it won't work for me even if I set it.

Next, run the Docker image build. Because I've already done the build, I'm not going to run it again, but please go ahead and build the Docker image for LibreChat using this command. Once that's done, run docker compose up; this starts LibreChat, a local MongoDB, and Meilisearch. Meilisearch saves your chats as history so you can later go back and search through them. Now you can see my LibreChat is running, and I can access it on port 3080. Go to localhost:3080 and there you go: it looks exactly like the OpenAI ChatGPT registration interface, doesn't it?

We've logged in, and we can see the three models that we're running in our LiteLLM config, as shown earlier. Let's give it a quick test: "Who designed you?" And you see, it says it was designed by Google AI. Best of all, you can further tune the responses: you can set the temperature to be more deterministic, and you can set custom instructions. Let's see how well it can code; there you go, it generates the code you asked for. Let's create another chat and this time test the vision capabilities we'd get in ChatGPT Plus: with this image, we ask "What text is in the image?", and it's able to read the image and parse the text. Wow, this is a lot like ChatGPT Plus, isn't it?
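To recap, the installation steps from this section can be sketched as a small shell function; the repository URL is taken from the LibreChat GitHub project, and git and Docker are assumed to be installed.

```shell
# Sketch of the LibreChat install flow described above.
setup_librechat() {
  git clone https://github.com/danny-avila/LibreChat.git
  cd LibreChat || return 1
  cp .env.example .env   # then edit .env (API key, reverse proxy URL)
  docker compose build   # build the LibreChat image
  docker compose up -d   # starts LibreChat, MongoDB and Meilisearch
  # the UI is then served at http://localhost:3080
}
# Run with: setup_librechat
```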
In the next section, we'll integrate this ChatGPT Plus-like system with Google search. To connect LibreChat to live internet data, we can use SerpAPI or the Google Search API. For this demonstration we'll quickly use SerpAPI, but I'll also show you how to set up the Google Search API, which gives you a much higher free daily limit. I'm now on serpapi.com, and I have a private API key, which I'll copy. In LibreChat I can use plugins: in the plugin store, search for "serp", install the SerpAPI plugin, and paste the key there. Then, when I ask "What is the weather forecast for tomorrow in London?" with SerpAPI enabled, it can check the internet and fetch tomorrow's forecast for London. To integrate Google search using the Google Cloud Search API instead, you need a Google API key and a Google Search Engine ID: go to this link to create a search engine ID first, then create an API key from this other link, but please remember you need a Google Cloud account enabled to use that.

Now let's proceed and integrate LibreChat with our locally hosted large language model. I've already opened WSL and started Ollama, which means my locally hosted large language model served through Ollama is running. If you want to know more about using Ollama to run multiple large language models on your local system, please check out the video in the top right corner; I'll attach the link in the description as well. Coming back to my LiteLLM settings, I'll comment this entry out and serve the locally hosted OpenChat model as the gpt-4-1106-preview model, then close and restart LiteLLM. My LiteLLM has restarted, so when LibreChat requests the gpt-4-1106-preview model, LiteLLM will instead connect to my locally hosted large language model and respond; I don't have to change anything in LibreChat, or even restart it.
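The remap being described can be sketched as another hypothetical LiteLLM config entry: LibreChat keeps requesting the same model name, but LiteLLM now routes it to the local OpenChat model served by Ollama (the `ollama/` provider route and `api_base` parameter are LiteLLM conventions; 11434 is Ollama's default port).

```shell
# Hypothetical LiteLLM remap, written to /tmp for illustration.
cat > /tmp/litellm_local.yaml <<'EOF'
model_list:
  - model_name: gpt-4-1106-preview     # name LibreChat sends, unchanged
    litellm_params:
      model: ollama/openchat           # local model served by Ollama
      api_base: http://localhost:11434 # Ollama's default endpoint
EOF
# Restart the proxy with this config:
#   litellm --config /tmp/litellm_local.yaml --port 8000
```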
In a new chat, we select gpt-4-1106-preview and ask: "What LLM are you?" It takes a little time because Ollama is loading the OpenChat model, but then it responds, and there you go: it says it's OpenAI GPT-4, because the OpenChat model was trained on synthetic data created using OpenAI's GPT-4. But if you look at the LiteLLM log, you'll see that when we asked "What LLM are you?", it actually connected to Ollama on localhost with the OpenChat model.

The LibreChat project comes with a lot of other features: you can integrate it with Stable Diffusion on your local system to generate images, connect it with Wolfram, use different plugins, and even create your own plugins; the project has very good documentation on how to create plugins and use them with LibreChat. If you want me to do a deeper dive into LibreChat and show you all the features possible with this open-source project, please write in the comments and I'll definitely cover it. That's it, really, for this video. In upcoming videos I will bring more such cool open-source projects, so stay tuned, take care, subscribe to the channel, and I'll see you in the next one. Bye. [Music]
Info
Channel: Kno2gether
Views: 3,921
Keywords: open source, free software, chatgpt plus, Gemini Pro, open source software, open source chatgpt, open source projects, open source alternative, librechat, librechat install, librechat ai, chatgpt plus free, chatgpt 4 free, local llm model, local llm, openchat, SaveOnSubscriptions, open source ai, llm, gpt 4 free, llama, Ollama, free apis, large language models, langchain, chat gpt 4, openai, open source llm
Id: k61MJ2D4Tbw
Length: 10min 30sec (630 seconds)
Published: Sun Jan 07 2024