How To Use ChatGPT (and GPT-4) in Python!

Captions
Hello there, my name is Gary Sims and this is Gary Explains. I'm really excited about this video because it joins together two of my favourite topics: programming and large language models. A little while ago I did a video about GPT-4, and in that video I asked who would like an explanation of how you can connect to GPT-4 or GPT-3.5 Turbo through Python. The advantage of connecting through Python is that you have complete programmable control over the input and the output. For example, you can go and fetch something off the internet, include that in the input, and then when you get back the output you can do something with it: display it, store it in a database, store it in a file, whatever you want to do, all because you can use Python.

So in this video we're going to create a smart search agent. Let's say we're looking for something about custard pies. Rather than just asking GPT-4 about custard pies, we could go onto the web, download some web pages about custard pies, get GPT-4 to summarize those pages, maybe ask GPT-4 for its own opinion about custard pies, and then amalgamate all that information together into some great executive summary, or some longer format if we want. So we're using the web and GPT-4 at the same time, under our program control. If you want to find out more, please let me explain.

Okay, so as I said, we're accessing ChatGPT, or better to say the GPT models, from Python by writing a smart web search AI agent. Before we get cracking: of course we're talking about Python, so you're going to need a working Python environment. I'm going to assume that you know what Python is; this is not a Python tutorial. Just a few things: if you're using Linux, you'll probably have Python installed by default. If you're using Windows or macOS, go to python.org/downloads. On Windows, make sure you tick "Add Python to PATH" during the installation.
Also, to install one of the modules we're going to mention in a minute, you are going to need the Microsoft C++ Build Tools installed on Windows, so make sure you go to the Microsoft site and download and install those.

Talking of prerequisites, you're going to need the following Python modules, which can be installed using pip. Again, this is not a Python tutorial, so if you don't know what pip is, I suggest you go and read a tutorial about it. You need to install requests, openai, langchain and beautifulsoup4. A note on Windows: because LangChain needs to do some compiling, make sure you do the module install from an "x86 Native Tools Command Prompt", which is what you get when you install the Microsoft Build Tools; otherwise LangChain won't install. On Linux that shouldn't be a problem.

Okay, once you've got your Python environment set up and all the right things installed, we can start talking about how you actually talk to a large language model over at OpenAI. We're going to be using LangChain, a framework for developing applications powered by language models. The great thing about it is, first of all, that it's an easy-to-use API, as I'll show you in a second, but it also adds a layer of abstraction: as other APIs become available, you can very easily swap to a different one and your code still works. You don't need separate code for the Python library that comes with OpenAI, separate code for Claude, and separate code for llama.cpp; you use the same abstracted, higher-level API and your code stays untouched no matter which model you're using. In fact, I've tried the code I've written here with LM Studio, and it remains exactly the same apart from defining the model name. So three things is all you need to do in Python.
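The prerequisite install described above boils down to a single pip command; a sketch (package names as given in the video, versions not pinned):

```shell
# On Windows, run this from an "x86 Native Tools Command Prompt"
# so that LangChain's compiled dependencies can build.
pip install requests openai langchain beautifulsoup4
```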
Assuming, of course, that you have all the right imports at the top of the file, you basically connect to ChatOpenAI, which is the ChatGPT-style model interface, and give it the model name. You can optionally give it your API key, which we'll talk about in a moment, or have that defined in an environment variable. You then create your prompt, and then you basically say "send this prompt and give me the answer", and the answer comes back as a string. So three lines of code and you're up and running.

I mentioned the API key. To access OpenAI's GPT models you need to have set up a pay-as-you-go account. You add credit, normally just a few dollars, and then you're able to use the account; it deducts as you go along. You can't get into debt by giving it your credit card number and then suddenly waking up to a $1,000 bill; it's a pay-as-you-go setup, which is brilliant, and if you put $10 on it, it will last you a long time, because we're talking about cents, or fractions of a cent, per call. The key will look like a big long string. Remember, your API key is secret: do not share it with others or expose it in any client-side code. Actually, funny story: by mistake I uploaded the wrong version to GitHub. Even though I kept telling myself I wasn't going to upload the version with my API key in it, I still managed it somehow, and I instantly got an email from OpenAI saying my key had been rescinded because they had detected it in public. So, just to note, the key shown in the video is a fake, not the real one, but that's what they look like.

Now, as I said, we're creating a search agent, so it's also going to be important to get things from the internet. To fetch web pages we use the Python requests library, which is one of the modules we installed with pip earlier on. It's designed to make HTTP (Hypertext Transfer Protocol) requests simpler and more human friendly.
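The three steps just described might look like the following sketch. The helper names are mine, and note that LangChain's import paths have moved between releases; the path below matches the releases current when this video was published (late 2023), while newer versions provide ChatOpenAI from the separate langchain_openai package.

```python
def build_prompt(topic: str) -> str:
    """Hypothetical helper: the question we send to the model."""
    return f"Create a detailed summary about {topic}."

def ask_gpt(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    """Send a prompt to an OpenAI chat model via LangChain and return the reply.

    Needs `pip install langchain openai` and OPENAI_API_KEY set in the
    environment (or pass the key to the constructor instead).
    """
    from langchain.chat_models import ChatOpenAI  # import path as of late 2023

    llm = ChatOpenAI(model_name=model)  # 1. connect, key read from environment
    return llm.predict(prompt)          # 2.+3. send the prompt, get the answer

# ask_gpt(build_prompt("custard pies")) would return the model's summary.
```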
It abstracts away the complexity of working with things like cookies, headers, URL encoding and SSL for HTTPS; it does all that for you. All you really need to do, as in this example, is say response = requests.get(url), and then use response.text if you just want the text part of what comes back from the response, not the headers and so on. So it's really, really simple to get things off the internet.

Of course, what you get back is going to be HTML, and there might be some JavaScript and some CSS in there too. Beautiful Soup is a library that makes it easy to scrape information from web pages: it parses anything you give it and does the tree traversal for you. If you only want the human-readable text inside a document, you can use the get_text method, which returns all the text in a document as a single string; that's just perfect for what we're trying to achieve.

So think about it: we've got LangChain for accessing the OpenAI model, we've got requests for fetching things off the internet, and we've got Beautiful Soup for making sense of a web page. Use those three together and you've got a very powerful combination.

As I mentioned earlier, you're going to need to use a search engine to find the web pages. We fetch the first page, the second page, the third page (you could fetch 10 pages if you wanted to), and then you get the model to create summaries of each of those. You also ask ChatGPT, or GPT-4, about the topic itself, custard pies in my earlier example, and then you create a meta, executive summary from the web page summaries and ChatGPT's own information to give you your final result. So that's what we need to do.

As for the search engine, Brave offers one with an API, so you can access it via Python using requests, and the replies come back in JSON.
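Requests and Beautiful Soup together might look like the following sketch; `page_text` and `fetch_page_text` are hypothetical helper names, and stripping the script and style tags before calling get_text is an extra precaution on top of what's described above:

```python
from bs4 import BeautifulSoup

def page_text(html: str) -> str:
    """Return just the human-readable text of an HTML document."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):   # drop JavaScript and CSS blocks
        tag.decompose()
    return soup.get_text(separator=" ", strip=True)

def fetch_page_text(url: str) -> str:
    """Fetch a page with requests and strip it down to readable text."""
    import requests
    response = requests.get(url, timeout=30)
    return page_text(response.text)
```

Calling `fetch_page_text` on one of the search results gives you the string you feed into the summarization prompt.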
They're very easy to parse from Python's point of view. You'll need to sign up for a free API key, called an X-Subscription-Token, which you include in the request header. Once you get back the result (there are different ways of doing this, but to keep it simple): data = json.loads(response.text), where response is what you got back from requests. Inside that there's a "web" section, inside the "web" section there's a "results" section, and the first URL is results[0]["url"]. Get that URL and you can use it to go and fetch the web page; use results[1], results[2] and so on. So it's a pretty good way to get the results from the Brave search engine.

Once you've got that, you ask requests to go and get the web page at, for example, URL 1. You then use Beautiful Soup to give you the human-readable text. And here's the important thing: you can then create a prompt to send to the GPT model: "Summarize this text into a 300-word extractive summary, ignoring all HTML, CSS and JavaScript" (I put that in just in case anything has leaked through) "; the summary should be easy to read and engaging." You can craft your own prompt however you feel works best. Then notice the + human_readable: that's the bit that appends the actual web page text, after a colon and a newline, telling the GPT model that this is the text we want summarized. You run that and you get the summary back in a variable. If you do this three times you might have a list, or summary_1, summary_2, summary_3, whatever you want, and you can then collect all those different bits of data together. You can also add in a fourth one, "create a detailed summary about" plus the search terms passed in to the program; in my example that's whatever you typed in at the keyboard, so "custard pies".
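The Brave Search plumbing might be sketched as below. The endpoint URL is my assumption of the Brave Search API address, and the `sample` reply is a made-up, trimmed illustration of the "web" → "results" shape described above:

```python
import json

def top_urls(body: str, n: int = 3) -> list:
    """Pull the first n result URLs out of a Brave Search JSON reply."""
    data = json.loads(body)
    return [item["url"] for item in data["web"]["results"][:n]]

def brave_search(query: str, api_key: str) -> str:
    """Query the Brave Search API; the key goes in the X-Subscription-Token header."""
    import requests
    response = requests.get(
        "https://api.search.brave.com/res/v1/web/search",  # assumed endpoint
        params={"q": query},
        headers={"X-Subscription-Token": api_key},
        timeout=30,
    )
    return response.text

# A trimmed, made-up reply in the shape described above:
sample = '{"web": {"results": [{"url": "https://example.com/custard-pies"}, {"url": "https://example.org/pies"}]}}'
print(top_urls(sample))  # → ['https://example.com/custard-pies', 'https://example.org/pies']
```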
So: "create a detailed summary about custard pies". Again, you run the model and it just gives you back the summary. Now I've got the GPT summary, and maybe I've got summaries one, two and three. In fact, let's have a look at that. Once you've got all your data: "Rewrite the following text as a blog post" (I've chosen that because it seems to give good results; again, you can do your own bit of prompt engineering here) ", ignoring all duplicate information. This post should be easy to read and engaging", followed by summary 1, summary 2, summary 3 and then the GPT summary, so all those different paragraphs are in there. Finally, I get the meta summary, which comes out of running that prompt with all that text.

The only thing to watch out for, with some of the GPT models, is the number of tokens, because if you've done a lot of searching on the web and you've got all this text, that prompt may be too long. If you're using GPT-4 Turbo it's not a problem, as far as my experimentation goes.

The full code, as I said, is in my GitHub repository; go to the URL shown and have a look for yourself. It's not very difficult, it's literally just using the pieces we've covered: you type in the search terms, you connect to the search engine, you get back the list of results, you fetch the three pages, you create summaries of them, you add them all together and you build the final prompt.

So there you have it: accessing GPT-4, GPT-3.5 Turbo, whatever you want to use, from Python. I'd love to know what you think about these techniques in the comments below. Have you used Python to access OpenAI's large language models? Please do share your stories below. Okay, that's it. I'll see you in the next one.
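The final amalgamation step can be sketched like this; the helper name and exact prompt wording are mine, following the prompt quoted above:

```python
def build_meta_prompt(summaries, gpt_summary):
    """Fold the per-page summaries plus GPT's own summary into the final prompt."""
    combined = "\n\n".join(list(summaries) + [gpt_summary])
    return (
        "Rewrite the following text as a blog post, ignoring all duplicate "
        "information. This post should be easy to read and engaging:\n\n"
        + combined
    )

prompt = build_meta_prompt(
    ["Summary of page one.", "Summary of page two.", "Summary of page three."],
    "GPT's own summary of custard pies.",
)
# This prompt is then sent to the model exactly like the earlier ones,
# producing the meta summary / blog post.
```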
Info
Channel: Gary Explains
Views: 2,811
Keywords: Gary Explains, Tech, Explanation, Tutorial, Python, ChatGPT, GPT-3.5, GPT-4, GPT-3.5-Turbo, OpenAI, Langchain, ChatGPT API, Open AI, OpenAI API, API Key, Brave Search, API request, LLM, LangChain Libraries, Beautiful Soup, BeautifulSoup4, JSON
Id: 1BjFHn7Zqeo
Length: 11min 37sec (697 seconds)
Published: Wed Dec 13 2023