GPT-4 API Crash Course - Get Coding In 10 Minutes

Video Statistics and Information

Captions
Welcome everyone! In this video I want to show you how to add AI to your code using the OpenAI GPT APIs. I just released a course on the OpenAI APIs yesterday — it has tons of really cool projects and I had a blast making it; there's a link in the description with a coupon if you're interested.

Most of you, I'm assuming, are familiar with ChatGPT, the web-based client where we can interact with GPT-3, 3.5, or GPT-4, depending on whether you pay extra for access to GPT-4. Well, we can also interact with these models directly from our code, where we can add all sorts of really cool capabilities without having to understand anything about how the AI works — you just need to know how to talk to it. There are APIs to connect with GPT-3, GPT-3.5 (which is not a fully new model, but a more finely tuned version of 3), and GPT-4, the brand new fancy model that just came out last month in beta, or at least with a waitlist. GPT, by the way, stands for generative pre-trained transformer; the transformer is the underlying architecture powering the model. All of these GPT variants are language models that take text as input and continue that text as output — text in, text out. So we can do super simple things like ask it to complete a sentence ("the color of the sky is..."), ask it to tell us a joke, ask it to translate from English to Korean, ask it to write code for us, debug our code, or perform sentiment analysis. There are tons of different applications; it really is just up to your imagination.

Now, the first step is to sign up for an API key if you don't have one already. You'll need to go to openai.com — which is actually not the correct location: it takes you to their marketing page and you won't see a sign-up button — but if you click any developer page it will take you to the other website, platform.openai.com, which is where you want to sign up. You'll need to verify an email and put in a phone number, and it will give you an API key.

Today I'm going to be working in Python because it's super easy to integrate OpenAI with Python, but this can be done with any language out there — it's just an API you can connect to. There's a nice Node client for JavaScript and a nice, popular Python client, so I'm going to be using Python in a Jupyter notebook. What I've done is install the client I'm going to be using, which you don't have to do — you can manually make POST requests to the correct endpoints — but it's much easier to use the client, where it's just a matter of calling a method or two. So you'll need to install the client: pip install openai if you're using Python, or if you're going to try it in Node, install the openai package with npm.

When that's done, the first and most important thing is setting up your API key. If you're in a rush and just playing around, you can hard-code your API key right in your code — but of course you don't want to share that key with anyone, so it's not the best idea. I'm using a .env file where I write my environment variables, loading that .env file with a package called python-dotenv, and then setting the API key on the openai client to my environment variable. However you do it, we have to provide the API key to the client, as in the sketch below.
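Here's a minimal sketch of that setup — assuming the pre-1.0 openai Python client used in the video, a .env file containing a line like OPENAI_API_KEY=..., and the python-dotenv package:

```python
# pip install openai python-dotenv
import os

import openai
from dotenv import load_dotenv

load_dotenv()  # read variables from the .env file into the environment

# Hand the key to the client; never hard-code or commit the real key
openai.api_key = os.getenv("OPENAI_API_KEY")
```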
Now that we have that done, let's make our first request. The first thing you really have to know is that there are two different ways to make requests — two different families of endpoints. You can see this in the documentation: there's a completion endpoint and a chat endpoint. Both of them work with the GPT family of models; they're two different endpoints where we provide text and get text back from some GPT model, but there's a significant difference. The completion endpoint is the old way — I don't know if it will ever be deprecated, but it used to be the only option — and it looks something like this with the Python client: openai.Completion.create, where we specify a single prompt (tell me a joke, translate this sentence, write a function, analyze the sentiment, whatever we want). The newer option is the chat-based format, which expects a full list of messages, basically in the format of a conversation. This is new — it was just released in March of 2023 — and the reason it really matters is that the chat format is the only option we have for working with GPT-4 and GPT-3.5 Turbo. So you pretty much just want to use the chat format, to be honest, because it gives us access to those models and it seems to be the direction OpenAI is heading. Here's the same type of query asking for a joke, written in the chat format: instead of a single prompt we provide a list of messages, and the message has some content — "tell me a joke."

So why don't we try this for real in the notebook. Again, I'm using the Python client, so I'm going to call openai.ChatCompletion.create, and then we have to specify a model. We don't really have time to cover all the different flavors of models, how they compare, pricing, and tokens, but just know that if you have access to GPT-4, it's the most expensive and it also tends to be the slowest. I'm going to use GPT-3.5 Turbo to start — I'll show you that GPT-4 works too. Turbo is very quick and very cheap, and in a lot of cases it's completely indistinguishable from GPT-4 for simple queries. The next step is to provide a list of messages. It has to be a list — I have some rules written out here — a list of objects, or in Python what you'd call dictionaries, where each object has a role: who is the message coming from? Remember, it's supposed to represent a conversation; I'll talk more about this in a moment. Most importantly, each message also has to have the message content. So I'm going to start by adding a single message and asking it to write me a function in Python: I'll set the role to "user" and the content to "write a python function to convert F to C" — I'll say F to C because I don't want to spell Fahrenheit. That's all we need to do; this will give me a response back... and it looks like I never set my API key. There we go — it didn't have the API key set up.

Okay, I got my response back. First of all, you can see the actual text it continued for me, based on the prompt I gave it in the form of messages. Further down, though, we have some meta information — most importantly, the "usage" area of the response, which tells us how many tokens were used in the text it completed and in the prompt we provided. The reason this matters is that this is how we're billed. These models don't work with words as we know them in the English language, but with smaller pieces called tokens, which average about four characters of English text. For example, the sentence "I like hamburgers" is three words, but it's five tokens in the eyes of the GPT family.
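Putting that first request together — assuming the same pre-1.0 openai client setup shown earlier, with the API key already set — it looks roughly like this:

```python
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # "gpt-4" also works here if you have access
    messages=[
        {"role": "user", "content": "Write a python function to convert F to C"},
    ],
)

# Besides the generated message, the response carries metadata,
# including a "usage" section with prompt and completion token counts
print(response["usage"])
```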
Each model has a different price per token, and OpenAI will bill us based on the total number of tokens in the input plus the total number of tokens in the output. It varies a little with the messages format — it gets sort of complicated — but here's how the pricing shakes out: GPT-4 is the most expensive at 6 to 12 cents per 1,000 tokens; GPT-3.5 Turbo is very cheap at $0.002 per 1,000 tokens. The request here used 159 tokens, which works out to a fraction of a fraction of a cent — that's all you need to know for now.

Okay, let's take a look at how we actually access the content. The response text is inside "choices", which is a list; I want the zeroth choice (there's only one right now), then "message", and then specifically "content", and I'll print it so the newlines are formatted nicely. And here's the function it wrote for me. Let's see if it works: I'll copy the function and call it — convert F to C — with an easy one: 212 Fahrenheit should be 100 Celsius. There we go, it seems to be working. Now, there's a lot to be said about writing a good prompt and controlling the output, because if we're asking it to write code for us, do we really want all this extra text, examples, and explanation? Maybe we just want the pure code. There are ways of telling it exactly what we want back — I'm just showing you how to get any sort of result.

Okay, let's return to the messages format. Remember that in order to use this chat API — the only API that supports GPT-3.5 Turbo and GPT-4 — we have to provide a list of messages, and the idea is that it represents a conversation between you, the user, and the assistant, which is the model — basically GPT-4, the chat assistant. That's what the role attribute (or role property) is for: we set it to "user" if the text is coming from us, "assistant" if we're showing it an example of a response from the assistant, and "system" to provide some general, overall context.

Let me show you a more complicated example. Here's a function I wrote called get_tweet_sentiment, and I want to call your attention to the messages list. First we start with a system message; a system message gives some overall direction to the assistant, or to the model. In this case my message says: "You are a sentiment analysis assistant. Given a text input, respond with the sentiment as either positive, neutral, or negative." So I'm telling it how I want it to behave. Then I continue the conversation with an example. I probably could have gotten away with no examples, but sometimes it's very useful to provide one to pin down exactly the format I want back. In this conversation, the user role provides a tweet — it's a tweet about Frank Ocean, who had a controversial performance at Coachella last weekend — and then the assistant responds. Again, it's not actually responding with this; this is me telling it how I wanted it to respond — with the word "negative." Then I prompt it again, and this time the user role's content is whatever tweet is passed in. By building up this simple conversation — really just one example — I'm telling it exactly how I want it to respond: with a single word. If you don't provide examples, it might respond with a full sentence like "the sentiment is negative" or "the sentiment is positive"; here I'm telling it, no, just give me the word negative, positive, or neutral.
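As a sketch of how that function might be put together — the system message and the single-word response format come from the video, but the example tweet text below is a placeholder I've made up, since the exact tweet isn't shown in the captions; this also assumes the same openai client setup as above:

```python
def get_tweet_sentiment(tweet):
    # One worked user/assistant exchange shows the model the exact format we
    # want back: a single word, not a full sentence.
    messages = [
        {
            "role": "system",
            "content": (
                "You are a sentiment analysis assistant. Given a text input, "
                "respond with the sentiment as either positive, neutral, or negative."
            ),
        },
        # Placeholder example tweet (hypothetical), labeled negative for the model
        {"role": "user", "content": "That Frank Ocean Coachella set was such a letdown."},
        {"role": "assistant", "content": "negative"},
        # The actual tweet we want classified
        {"role": "user", "content": tweet},
    ]
    response = openai.ChatCompletion.create(model="gpt-4", messages=messages)
    return response["choices"][0]["message"]["content"]
```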
Then I take that list of messages and pass it off to openai.ChatCompletion.create — I'm using GPT-4 this time — and I just return the message content we get back. So here's a positive tweet, "Frank Ocean performing Godspeed, I'm so moved," and it responds with "positive." Here's one that's more neutral — it could have been better, but it wasn't a terrible set either — and it responds with "neutral." That's just a slightly more complex example of how you can use the messages syntax: you can create your own conversation history that gives the model context and informs how it should answer your query.

So this was just a super quick introduction. Of course, there's a lot more to working with these models and these APIs — sentiment analysis isn't even the best use case, it's just a very simple example to show how the conversation mechanism works — and there's so much you can do with these APIs to enhance your code and come up with really cool projects. If you're interested, I do have a course, just released, on working with these APIs; you can find a link in the description, and it's on sale. Hope you enjoyed the video, and I'll be back with some content tomorrow.
Info
Channel: Colt Steele
Views: 14,312
Id: hmTjZ8FvkfQ
Length: 11min 48sec (708 seconds)
Published: Wed Apr 19 2023