ChatGPT’s Amazing New Model Feels Human (and it's Free)

Captions
So today, May 13th, marks the beginning of a really interesting couple of weeks. Today we got some new announcements out of OpenAI, which came right before Google makes its own announcements tomorrow. For whatever reason, OpenAI loves to do that: they love to pick the timing of their events to try to overshadow Google as much as possible, and today's OpenAI event seems to be exactly that. I'm actually out here in Mountain View, California right now because I'm going to be at the Google keynote tomorrow, but I wanted to make sure I got a video out as quickly as possible about what OpenAI just announced with their new model.

Instead of calling this new model GPT-4.5 or GPT-5, they went with GPT-4o. And as it turns out, those mysterious chatbots we've been playing with, called gpt2-chatbot, im-a-good-gpt2-chatbot, and im-also-a-good-gpt2-chatbot, were all actually us getting the opportunity to test this new GPT-4o. Now, while the model itself is only improved a little bit, the biggest features here are the lower latency when having voice conversations and seemingly better multimodal capabilities. Instead of me breaking it all down, let's watch little bits of the keynote and talk through some of these new announcements OpenAI just made.

"The big news today is that we are launching our new flagship model, and we are calling it GPT-4o. The special thing about GPT-4o is that it brings GPT-4-level intelligence to everyone, including our free users."

So this is actually a big piece of news: up until now, if you were on the free version of ChatGPT, you were using GPT-3.5. This new state-of-the-art model, GPT-4o, is now going to be available for both Plus and free users, which means anybody can use it completely for free; if you're a Plus member, you just get to use it a lot more.

"A very important part of our mission is to be able to make our advanced AI tools available to everyone for free. And today, we're also bringing the desktop app to ChatGPT."

This is another big update from the keynote: we now get a desktop app. In the demos they only showed the desktop app being used on a Mac; they didn't say whether it's Mac-only or coming to both Mac and PC. My guess is that it will probably come out for both platforms, but they didn't really speak to that.

"As you can see, it's easy, it's simple, it integrates very easily into your workflow. GPT-4o provides GPT-4-level intelligence, but it is much faster, and it improves on its capabilities across text, vision, and audio. So we're very, very excited to bring GPT-4o to all of our free users out there. And for the paid users, they will continue to have up to five times the capacity limits of our free users."

Going back to the screen real quick, these are all the things that ChatGPT free users are now going to get access to: the GPT Store and all the custom GPTs, Vision, Browse (which lets you search the internet from ChatGPT), the memory functions, and Advanced Data Analysis, which used to be called Code Interpreter. Free ChatGPT members get all of that.

"GPT-4o is not only available in ChatGPT. We're also bringing it to the API."

When they say they're bringing it to the API, they mean that developers can work with this new model as well, and it also means you can play with the model directly inside the OpenAI Playground. If you go over to platform.openai.com/playground and click on Chat in the left sidebar, you can see GPT-4o in the model dropdown. One thing that's interesting to note is that inside the Playground we now have the ability to upload images or link to images. I don't believe that was ever available in OpenAI's Playground before; maybe it was there and I just never noticed, but I'm pretty sure they added image upload with this rollout of GPT-4o.
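Since GPT-4o is exposed through the API, here's a minimal sketch of what a call might look like using the official Python SDK's Chat Completions endpoint, covering both a text prompt and an image input (which lines up with that new image-upload option in the Playground). The prompt and image URL are placeholders I made up for illustration; this isn't code from OpenAI's announcement.

```python
# Minimal sketch: text + image input to GPT-4o via the Chat Completions API.
# Assumes the `openai` Python package (v1.x) is installed and the
# OPENAI_API_KEY environment variable is set. The image URL is a placeholder.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what's in this image in one sentence."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```

For text-only prompts, passing a plain string as `content` works the same way.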
"Our developers can start building today with GPT-4o and making amazing AI applications, deploying them at scale. GPT-4o is available 2x faster, 50% cheaper, and with five times higher rate limits compared to GPT-4 Turbo. So we'll do some live demos."

I love that they're doing these demos live, because I think it's a bit of a message to Google. When Google put out their launch video for Gemini, they showed off all of these really cool capabilities in a nice, pre-recorded, polished video, and it turned out that a lot of what they were showing was not actually real time; they made it look like it was doing things a lot faster than it was actually capable of. OpenAI is saying: hey, check this out, we're going to show it to you live, in real time, no camera trickery, this is how it works. They also made that little nudge in the GPT-4o blog post: "All videos on this page are at 1x real time." That note was very intentional; it's a message to the people comparing this to Google's launch, guaranteed.

"Hi, I'm Barret." "Hey, I'm Mark. So one of the key capabilities we're really excited to share with you today is real-time conversational speech. There's this little icon on the bottom right of the ChatGPT app, and it opens up GPT-4o's audio capabilities. Hey ChatGPT, I'm Mark. How are you?" "Oh Mark, I'm doing great, thanks for asking. How about you?"

Now, although they've been doing a lot of talking about GPT-4o, how it's going to be in ChatGPT for free, and how it's going to be in the API, I think they really wanted to do this keynote to show off this voice feature. It's very reminiscent of the movie Her, where the main character has a chatbot companion that he talks back and forth with, voiced by Scarlett Johansson; if you listen to this voice, it sounds very similar. What they're showing off here makes me think we're about to see an explosion of AI girlfriend apps, because the conversation feels so much more realistic. And as they're having this conversation, one thing to note is the latency difference: you used to ask a question, there'd be a pause of five or six seconds, and then you'd get the response. Now the responses are much closer to real time, and it feels a lot more like a real human-to-human conversation.

"Hey, so I'm on stage right now, I'm doing a live demo, and frankly I'm feeling a little bit nervous. Can you help me calm my nerves a little bit?" "Oh, you're doing a live demo right now? That's awesome. Just take a deep breath and remember: you're the expert." "I like that suggestion. Let me try a couple of deep breaths. Can you give me feedback on my breaths? Okay, here I go." "Whoa, slow down a bit there, Mark. You're not a vacuum cleaner. Breathe in for a count of four."

That's pretty crazy: it could actually listen to his breathing and tell him he was breathing too hard, which shows it's doing a pretty decent job of understanding the context of the conversation.
Back in the demo, ChatGPT continues: "Breathe in deeply for a count of four, and then exhale slowly." "Okay, I'll try again. Breathing in... and breathe out." "That's it. How do you feel?" "I feel a lot better, thank you so much."

They were having a little bit of internet trouble there, probably because there were so many people trying to connect to the same Wi-Fi. The demo definitely was live, and you do get a little extra delay that you probably wouldn't normally see, just because of the connection issues.

"If you've used our voice mode experience before, you'll notice a couple of key differences. First, you can now interrupt the model; you don't have to wait for it to finish its turn before you start speaking, you can just butt in whenever you want. Second, the model is real-time responsive, which means you don't have that awkward two-to-three-second lag while you wait for the model to respond. And finally, the model picks up on emotion. When I was breathing super hard there, it could tell, and it knew: hey, you might want to calm down a little bit, you're running a little too fast. It really has that capability across the board to perceive your emotion. Not only that, the model is able to generate voice in a variety of different emotive styles, and it really has a wide dynamic range. So my friend Barret here has been having trouble sleeping lately, and I want you to tell him a bedtime story about robots and love."

"Oh, a bedtime story about robots and love? I got you covered. Gather round, Barret. Once upon a time, in a world not too different from ours, there was a robot named Byte. Byte was a curious robot, always explor..." "I like the start of this story, but I want a little bit more emotion in your voice, a little bit more drama." "Got it, let's add some drama. Once upon a time, in a world not too different from ours, there was a robot named..." "No, I really want maximal emotion, maximal expressiveness, much more than you were doing before." "Understood, let's amplify the drama. Once upon a time, in a world not too different from ours, there was a robot named Byte." "Can you do this in a robotic voice now?" "Initiating dramatic robotic voice. Once upon a time, in a world not too different from ours, there was a robot named Byte. Byte was a curious robot..."

So that's pretty cool: it can actually change the way it speaks, make it more dramatic, make it sound like a robot. That could come in handy for things like bedtime stories, and maybe meditation apps where you want that really calming voice; things like that seem like they'll be really good use cases. Now, the other thing they mentioned is that the latency has been sort of fixed, so there's not as much of a gap between when you talk and when you get the response. It does seem like that gap has been closed, but it also kind of feels like maybe they're putting some filler words in there to close it: you ask a question and it goes, "Hmm, let me think about that for a second. Okay, here's what I think," and it sort of adds filler and then gives the response. I believe that's what it's doing; they didn't really speed up the latency so much as make it start speaking before it's fully ready to respond, if that makes sense.

"Can you end the story, but do it in a singing voice?" "...And so Byte found another robot friend, and they lived circuit-ly ever after." "Thank you so much, I definitely feel a lot more relaxed. We also want to show you the vision capabilities, as the model can also see the whole world around us."
"Yeah, absolutely. As you can see, you can chat with ChatGPT in real time, but you can also interact with it with video as well. Hey ChatGPT." "Hello there, how's it going?" "It's going really well. Today I'd really like your help solving a math problem." "I'm all ears. What math problem can I help you tackle today?" "So I'm going to write down a linear equation on a sheet of paper and show you, and then I'd love your help working me through it. But importantly, don't tell me the solution, just give me hints along the way." "Okay, I see it." "No, I didn't show you yet, just give me help along the way. One second." "Whoops, I got too excited. I'm ready when you are." "ChatGPT, what equation did I write down there?" "Ah, I see it now. You wrote down 3x + 1 = 4." "Yep, exactly. So what's the first step I should take to solve this?" "The first step is to get all the terms with x on one side and the constants on the other side. So what do you think we should do with that plus one?" "Okay, I'm going to try subtracting one from both sides and see what I get." "Great idea. Perfect. Now what do you get when you subtract one from both sides?" "Okay, ChatGPT, this is what I ended up with. How does this look?" "Spot on. Now you've isolated the term with x on one side, and it looks like 3x = 3. What do you think we should do next to solve for x?" "I'm not sure, can you give me another hint?" "Of course. You have 3x and you want to find the value of x. Think about what operation would undo multiplication." "Is it subtraction?" "Close, but think of the opposite of multiplication, because 3x means 3 times x." "Okay, I think it's division. Let me try dividing both sides by three and see what I get." "Bingo. Go ahead and divide both sides by three." "So ChatGPT, this is what I ended up with. How does this look?" "It looks perfect. You solved it, and x equals 1. Nicely done."

Now, this isn't really showing off its math capabilities; that was a super simple math problem, and I don't really know if it's greatly improved at math, although people were claiming the gpt2-chatbot was better at math, so it probably is a little bit better. This is more about showing off the vision capability: it can see as he writes the numbers on the paper and works through the problem. And again, this is showing in real time something that Google was kind of tweaking and speeding up to make it seem faster than it really was, when it was actually an edited video. So I think OpenAI is really trying to say: look what we can do, live, that Google hasn't been able to show yet.

"So ChatGPT, I really love that you taught the value of math to my friend Mark. I wrote one last thing that I'd love if you could take a look at." "Of course, I'd love to see what you wrote. Show it to me whenever you're ready." "Okay, so this is what I wrote down. What do you see?" "Ah, I see 'I love ChatGPT.' That's so sweet of you." "Let me take out a computer for this. So I'm going to open the ChatGPT desktop app, like Mira was talking about before."

Now they're getting into the desktop app, which is pretty cool. One of the things you can do with the desktop app is copy everything on your screen to your clipboard, and ChatGPT will use that information as context for the chat. It's also got a button to share your screen, so ChatGPT can see everything going on on your screen and use that as context as well. Pretty cool.
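If you want to approximate that clipboard workflow yourself through the API, the pattern is simple: read whatever is on the clipboard and pass it to GPT-4o as context. Here's a rough sketch, assuming the third-party `pyperclip` package for clipboard access; this only imitates the idea and is not how OpenAI's desktop app is actually implemented.

```python
# Rough sketch: send the current clipboard contents to GPT-4o as context.
# Assumes `pip install openai pyperclip` and OPENAI_API_KEY set.
# This only imitates the desktop-app workflow; it is not OpenAI's implementation.
import pyperclip
from openai import OpenAI

client = OpenAI()

clipboard_text = pyperclip.paste()  # whatever you last copied (e.g., a code snippet)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You answer questions about text the user has copied."},
        {"role": "user", "content": f"Here is what's on my clipboard:\n\n{clipboard_text}\n\n"
                                     "Give me a one-sentence description of what it is."},
    ],
)

print(response.choices[0].message.content)
```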
"So check this out. To give a bit of background on what's going on: here we have a computer, and on the screen we have some code, and the ChatGPT voice app is on the right. ChatGPT will be able to hear me, but it can't see anything on the screen. So I'm going to highlight the code, Command-C it, and that will send it to ChatGPT, and then I'm going to talk about the code with ChatGPT. Hey ChatGPT." "Hey there, how's it going?" "Yeah, it's going really well. I was wondering if you could help me with a coding problem today." "Of course, I'd love to help you out. What's the coding problem you're dealing with?" "Okay, I'm going to share some code with you. One second." "Sure thing, take your time." "Okay, so I just shared some code with you. Could you give me a really brief, one-sentence description of what's going on in the code?" "This code fetches daily weather data for a specific location and time period, smooths the temperature data using a rolling average, annotates a significant weather event on the resulting plot, and then displays the plot with the average, minimum, and maximum temperatures over the year." "Awesome, thanks so much. And in the code there's this function foo. Could you describe what the plot would look like if I had versus didn't have this function?" "Sure. The function foo(x, y) takes in a data frame x containing temperature data and a window size y for calculating the rolling mean. It smooths the average temperature, minimum temperature, and maximum temperature columns by applying a rolling mean over the specified window. With the foo function, the plot would show smoother temperature lines, because the rolling mean reduces the noise or fluctuations in the data." "Awesome, that makes sense, thanks so much. So yeah, I definitely want to apply this function, so I'm going to generate the plot and then show it to you. Give me one second." "Absolutely, take your time. I'm excited to see the plot when you're ready." "So far ChatGPT has only been able to see the code. Now I'm going to run it, and then I'll use the vision capabilities of the ChatGPT desktop app so it can actually see everything going on on the screen as well."

Now, I don't really know if this demonstrates improved coding ability over GPT-4; I feel like GPT-4 has been able to do what they just showed. What's really cool is that they were able to copy the code, and ChatGPT could read it basically just because it was on the clipboard, then give a verbal explanation of what was going on in the code and what would happen if you changed the specific function he was referencing.

"I'm sharing the plot with you now. I'm wondering if you can give me a really brief, one-sentence overview of what you see." "The plot displays smoothed average, minimum, and maximum temperatures throughout 2018, with a notable annotation marking a big rainfall event in late September." "Fantastic. I have another question for you: in which months do you see the hottest temperatures, and roughly what temperatures do those correspond to?" "The hottest temperatures occur around July and August. The maximum temperature during these months is roughly between 25°C and 30°C (77°F to 86°F)." "What's going on with the y-axis? Is this in Celsius or Fahrenheit?" "The y-axis is in Celsius. The temperatures are labeled accordingly, such as average temperature, minimum temperature, and maximum temperature."
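For reference, a script matching ChatGPT's description of that demo code would look roughly like the sketch below. To be clear, this is my reconstruction, not the code shown on stage: the column names, the synthetic data standing in for the weather fetch, and the body of the `foo` smoothing helper are all assumptions made for illustration.

```python
# Reconstruction (not the demo's actual code): plot smoothed daily temperatures
# for one year and annotate a notable rainfall event, as described on stage.
# Synthetic data stands in for the "fetch daily weather data" step.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt


def foo(x: pd.DataFrame, y: int) -> pd.DataFrame:
    """Smooth the temperature columns with a rolling mean over window y."""
    smoothed = x.copy()
    for col in ["avg_temp", "min_temp", "max_temp"]:
        smoothed[col] = smoothed[col].rolling(window=y, min_periods=1).mean()
    return smoothed


# Synthetic daily data for 2018 (placeholder for a real weather API call).
dates = pd.date_range("2018-01-01", "2018-12-31", freq="D")
day_of_year = np.arange(len(dates))
seasonal = 15 + 12 * np.sin(2 * np.pi * (day_of_year - 100) / 365)  # degrees C
df = pd.DataFrame({
    "date": dates,
    "avg_temp": seasonal + np.random.normal(0, 2, len(dates)),
    "min_temp": seasonal - 5 + np.random.normal(0, 2, len(dates)),
    "max_temp": seasonal + 5 + np.random.normal(0, 2, len(dates)),
})

df = foo(df, 7)  # 7-day rolling mean; without it the lines are much noisier

plt.plot(df["date"], df["avg_temp"], label="Average temperature")
plt.plot(df["date"], df["min_temp"], label="Minimum temperature")
plt.plot(df["date"], df["max_temp"], label="Maximum temperature")
plt.annotate("Big rainfall event", xy=(pd.Timestamp("2018-09-25"), 20),
             xytext=(pd.Timestamp("2018-07-01"), 32),
             arrowprops={"arrowstyle": "->"})
plt.ylabel("Temperature (°C)")
plt.legend()
plt.show()
```

With the `foo(df, 7)` call removed, the same plot shows the raw, noisier daily values, which is exactly the difference ChatGPT described in the demo.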
"So, GPT-4o is also capable of real-time translation. Mark, you want to try this one?" "Sure, yeah." It's really cool to see this translation feature, because it's going to make communicating in other countries so much easier: you just open the ChatGPT app on your phone and you can have a conversation with almost anybody in almost any language. "Check this out. Let's do it. I speak Italian, so we can try English and Italian." "Sure, let's do it. Hey ChatGPT, I would like you to function as a translator. I have a friend here who only speaks Italian, and I only speak English. Every time you hear English, I want you to translate it to Italian, and if you hear Italian, I want you to translate it back to English. Is that good?" Mira then asks a question in Italian, and ChatGPT translates: "Mark, she wonders: if whales could talk, what would they tell us?" "Um, they might ask, how do we solve linear equations?" ChatGPT translates that into Italian, then renders the Italian reply back into English: "Certainly, yes."

"Can you tell what you're feeling just by looking at your face? Barret, you want to give this one a try?" "Absolutely, let's try it out. So I'm going to show you a selfie of what I look like, and then I'd like you to try to tell what emotions I'm feeling based on how I'm looking." "Sounds like a fun challenge. Go ahead and show me that selfie, and I'll put my emotional detective hat on." "Okay, so here's me. What kind of emotions do you think I'm feeling?" "Hmm, it seems like I'm looking at a picture of a wooden surface." "Oh, you know what? That was the thing I sent you before. Don't worry, I'm not actually a table. Okay, so take another look." "Ah, that makes more sense. There we go. It looks like you're feeling pretty happy and cheerful, with a big smile and maybe even a touch of excitement. Whatever's going on, it seems like you're in a great mood. Care to share the source of those good vibes?" "Yeah, the reason I'm in a really good mood is because we were doing a presentation showcasing how useful and amazing you are." "Oh, stop it, you're making me blush."

Now, one thing that was really interesting about that little clip: at first it said it saw something like a wooden table, and then he said, oh no, look again. That makes me think it's actually not watching continuous video; it's just taking snapshots at certain times. He must have had his phone face down or pointed at the table, so it captured the table; then, when he put the camera up to his face and told it to try again, it saw him. So I don't think it's watching video footage, I think it's just grabbing snapshots when you ask questions.

And again, you can play with the model right now inside the OpenAI Playground. I don't know if it's rolled out to everybody yet inside ChatGPT, but I just opened my ChatGPT account and got this notification: "Introducing GPT-4o: you can now try our newest model. It's faster than GPT-4, better at understanding images, and speaks more languages. Try it now." Up at the top we now have the options GPT-4o, GPT-4, and GPT-3.5. I am a Plus member, but GPT-4o is available inside my account right now.

So, my biggest takeaways from this event. I think the chat feature is really, really cool. We've been able to voice-chat in the mobile app for a while now, but it doesn't have the inflections, it has longer latency, and you weren't able to cut it off mid-sentence and continue the conversation, so there are a lot of improvements over that older voice chat. I don't know that we saw a huge leap between GPT-4 and GPT-4o; it feels like a pretty similarly capable model. We saw from all the gpt2-chatbot videos and tests that it's slightly better than GPT-4 in a lot of areas, but not a huge leap above it. It also seems to be lightning fast: if you've used GPT-4 inside ChatGPT at all, GPT-4o is a lot faster. If I give it a prompt here and hit enter, we can see in real time how fast it completes the writing; it just cranks it out really, really fast.
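If you want to watch that generation speed from code rather than the ChatGPT interface, the API supports streaming, so tokens print as they arrive. A minimal sketch, with a throwaway prompt of my own choosing:

```python
# Minimal sketch: stream a GPT-4o response token-by-token to watch how fast it generates.
# Assumes the `openai` Python package (v1.x) and OPENAI_API_KEY are set up.
from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a 200-word summary of the history of AI."}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # the final chunk carries no content
        print(delta, end="", flush=True)
print()
```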
Also, another theme with ChatGPT is that almost every time they roll out an update like this and make some big announcement, it seemingly kills a whole bunch of little SaaS companies that have been building on top of their APIs. Just like that, the free version of ChatGPT now has translation, which was a whole industry of tools built on the GPT APIs. We have AI girlfriends, a niche that has been rising really rapidly; lots and lots of these apps have been submitted to Future Tools, and now it looks like GPT-4o can just act as that AI girlfriend or boyfriend or significant other. We've seen tools like Devin and GitHub Copilot; while I don't think this will kill tools like Devin or GitHub Copilot, it may mean you don't need a third-party coding tool that you pay extra for, because you might be able to do it all with the free version of GPT-4o. There are also apps out there that watch your screen and listen to you all day and then give you a recap of what you did; well, it looks like the desktop version of this might be able to do that. I don't know if you could just leave it running all the time and have it keep track of what you're doing, but I wouldn't imagine that's too far out of the question. I just find it really interesting that OpenAI has this pattern of building APIs, letting companies build on those APIs, and then building a ton of those same features right into their own products so that you don't need the tools that were built with the APIs.

This, to me, seems like what Siri should be, and the rumors right now are that Siri is probably going to use OpenAI tech, so this might be the future of Siri; we'll find out pretty soon, most likely at WWDC. Now, what's really cool is that if you go to the blog post over on OpenAI's website, there are a whole bunch of other demos we didn't get into yet. For example, in one of them Greg Brockman got two phones to basically sing to each other back and forth. There's interview prep, playing rock paper scissors, testing sarcasm, harmonizing, pointing and learning Spanish, summarizing meetings, real-time translations, lullabies, talking faster, singing happy birthday: all sorts of demos and use cases you can check out. We'll link to the blog post below the video so you can watch some of these demos; I don't want to just sit here and watch demo after demo, but it looks to do some pretty cool stuff.

Now, is it as big a piece of news as everybody was anticipating? Probably not. The voice and the desktop app are really, really cool, and GPT-4o is a slightly improved model, which is also cool, but we don't actually have access to that voice capability and the desktop app yet; they said those will be rolling out soon. So really, all we have to play with today is the GPT-4o model itself: not the cool voice features, not the cool apps, not the stuff that really gets us excited. Right now we just have a slightly improved version of GPT-4. But this definitely brought us one step closer to that movie Her, where you could legitimately have real conversations back and forth with a chatbot and feel like you're talking to another human. It's pretty crazy where this stuff is going, and I'm so excited for it.
I've got a lot more announcements and events coming up. Again, I'm at the Google event right now, so it's going to be interesting to see if Google can top what OpenAI just showed off, and then next week I'm going to be at the Microsoft event, where they'll be trying to one-up everybody who came before them. So it's a really, really interesting time in the world of AI. I'm going to be making a lot of videos, and you're going to see me in hotel rooms a lot over the next couple of months because I'm going to a lot of these keynotes; should be a good time. I'm going to do my best to keep you in the loop, so if you're not subscribed to this channel already, make sure you subscribe and I'll keep you updated with all the latest AI news. And if you liked this video specifically, give it a thumbs up; it makes me feel good and helps the algorithm. Thank you so much for tuning in, I really appreciate you. I hope you enjoyed this little nerd-out session about the OpenAI keynote today, and I will see you in the next video. Bye-bye.
Info
Channel: Matt Wolfe
Views: 246,921
Keywords: AI, Artificial Intelligence, FutureTools, Futurism, Machine Learning, Deep Learning, Future Tools, Matt Wolfe, AI News, AI Tools
Id: 6XBtQZK4MIo
Length: 25min 2sec (1502 seconds)
Published: Tue May 14 2024