Learn NextJS 13 by Building a Modern Full-Stack OpenAI Chatbot

Captions
Hi! Today you're going to learn modern web development together with me and with Benny. Benny is especially hyped about this build, because not only does it look good, work great, and make for an amazing portfolio project, it's also secure: we're going to secure our API endpoints and rate limit them, so nobody can spam your APIs and make you go bankrupt. The rate limiting we're going to build with the sponsor of this video, Upstash. Their sponsorship allowed me to really take the time for this video, make sure everything is easy to follow, and let the three of us build an amazing project together. I could go on and on about the technologies you're going to learn, most notably TypeScript, a very in-demand language in modern web development, and Next.js 13, but instead of talking about it for ten minutes, let me show you.

Okay, let's navigate to localhost:3000 and take a look at what we're going to be building today. I mocked out the website content, and as you can see, this is your go-to bookstore for fantasy and mystery books. I called it Book Buddy, a name I just made up (hopefully no real bookstore is actually named that), just to make it look presentable. The main thing I want to direct your attention to is right here in the corner. It's quite subtle, and that's the entire point: it's the Book Buddy support chat, saying "Hello, how can I help you?". As soon as we open it, first off, it looks amazing, and secondly, we're ready to start typing. For example, I can say "I'm interested in a mystery book" and hit enter. There's a very subtle loading animation, and then comes one of the coolest parts of the build: we stream the response back in real time. As the response comes in, we stream it to the client for the best possible user experience. Also notice that while we are sending the message, it's already in the chat, and if there were an error anywhere along the way we could handle it gracefully, because we're doing optimistic updates: as soon as we write a message it's put into the chat, and if there's an error at any point we can take the message out again. In 2023, users expect optimistic updates; they're super important for the best possible user experience, and that's what I wanted to ensure in this build. There's a nice scroll bar right here, we can scroll through the content, and everything looks exactly how it should.

And not only does it look good: if we try to spam it, we can't. For example, "recommend me a fantasy book", going in a different direction with this chat. It recommends fantasy books that actually exist and links us to the correct section of the website that I mocked out (your real website would obviously live there). And if you try spamming it with messages now, you can't: we have protected our API route and rate limited access to it, so nobody can just spam your endpoint and bankrupt you for the service we're using in the background. That was also very important to me — not only does it look good and make for a great portfolio project, it's also secure. Oh, by the way, one more thing I want to show you: it also has information about the books we have in store. For example, "which books can you recommend me?", hit enter, and let's say we're interested in mystery books. (I artificially shortened the rate-limiting window just to demonstrate it in the video.) Now we can say "mystery", and it links us to the mystery page.
If I ask for a specific mystery book, it asks me for a title; we can actually interact with the bot, and it knows which books we have in store. It knows the titles and can link us to the books: for mystery, it recommends Sharp Objects by Gillian Flynn, a book about Camille Preaker's troubled past, and The Past Never Ends by Jackson Burnett. It also knows about the book content, so this is not only a bot that can link us to certain pages; it knows what's on the website and can actually make recommendations for the customer. I don't think it gets much better than that. As a company, I would love this, because it saves so much time: the customer gets feedback really fast, and the bot refuses any question that doesn't have to do with the bookstore. If I ask "how are you today?", it probably won't answer that and will say it's only here to answer questions about the bookstore, which is super cool — you can't mess around with it, it stays focused on helping customers on their book journey. After all, the goal is to provide the best possible user experience. This is what we're going to be building; it's super cool, and along the way it demonstrates a lot of best practices in modern web development. With that said, I hope you're as excited as I am to get this build started.

Let's close this down and actually get started. I'll keep the finished project I built open on the side, so I don't get lost anywhere along the way and can give you the best video possible. Then we go into our terminal, navigate to the desktop, and get going. I decided to build this with Next.js, and it's the new version 13 we're going to use: it has a lot of functionality built in, and it makes the developer experience of this customer support chat, streamed back in real time, super enjoyable. To get started (I'll zoom in further so you can see this more easily), let's type npx create-next-app@latest, and because I really like the new features introduced in Next.js 13, let's also add the --experimental-app flag. That skips the question about whether we want the new experimental app directory and automatically answers yes, because I think the development experience with it is far superior. Let's call the project chatbot-youtube (I called the original one just "chatbot"), and yes, we're going to use TypeScript. Don't worry, I'll introduce all the relevant TypeScript concepts; I'm not going to assume you know any TypeScript. Let's also install ESLint. For styling we're going to use Tailwind CSS in this video; that's not a necessity, and if you prefer something like SCSS or any other approach, that works totally fine as well. We also want a src directory, so all our files are bundled into one source folder in the project root, and we want the import aliases. Great, that installs the project for us; I'll be right back once it's done. Done installing — so now we can cd (change directory) into chatbot-youtube, the directory we just created, and open it in Visual Studio Code with code ., which opens the current directory in VS Code.
It opens up right here on my right-hand monitor, and here we are — let me zoom in so you have an easier time following along — in a brand new project we're ready to get started with.

As the first step, we're going to mock out the actual website content, because it would look pretty bland if we didn't, and the way we do that in Next.js 13 is by changing this page.tsx. If you didn't know, Next.js has a file-based routing structure, which means directories and files you create inside this app folder (on the left-hand side) are represented as URLs on the website. If I created a folder called my-folder, and inside of it a page.tsx — a name reserved by Next.js — then the my-folder path would be an actual URL, while page.tsx would not be visible in the URL; it just defines which component is shown for the directory we're in. If I close all of this and we start up the server, you'll see what that means, because we're in the root directory. Let me quickly allow the additional TypeScript dependencies, and I typed yarn dev to start the development server. Navigate to the browser, refresh the page (I've stopped the other server), and we see the default page you get when you create a new Next.js project. Because we're in the root directory, in the root page.tsx, there's nothing extra in the URL bar: this is the main index page, and it's what we'll change now.

The main index page is going to be super straightforward. I'm just going to put a main HTML tag in here to mock out the website content — your actual website content probably already exists, and it's not the focus of this video; the focus is the chatbot. I'll give it an absolute placement, which is the position property set to absolute, an inset of zero so top, bottom, left, and right are all zero, then flex, justify-center for horizontal alignment, and items-center for vertical alignment. Save that, and it should just say "website content" in the center of the screen — exactly what we want to mock out the actual website. Perfect, now we're all set to get started with the actual chatbot.
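For reference, the mocked-out index page from this step looks roughly like this — a minimal sketch, not necessarily the exact file:

```tsx
// src/app/page.tsx — the mocked-out root page described above
import { FC } from 'react'

const Home: FC = () => {
  return (
    <main className='absolute inset-0 flex justify-center items-center'>
      website content
    </main>
  )
}

export default Home
```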
To get started, let's navigate back into our application and into the main layout.tsx; if you look at the file tree, it sits at the same level as page.tsx. A layout is kind of like a higher-order component for pages: the page.tsx you see right here is passed through this layout and rendered as its children. So if we put anything here, for example an h1 saying "hello", it would be rendered above the children — save that, go back to the browser, and you can see the "hello" at the top. That "hello" would show up for any page that sits below this layout.tsx in the file tree, and because this is the root layout.tsx, it would be visible literally everywhere. We don't want a "hello" everywhere, but what we do want everywhere is the chat component. Maybe you don't want it on every page; in my case, I'll put it right here in the root layout and call it Chat, because that's what it is. The Chat component doesn't exist yet, but first let's quickly update the website metadata, so when you navigate to the page it shows a correct title and description. I'll call the title "Book Buddy", and the description is going to be "Your bookstore for fantasy and mystery novels". Everything else can stay the same: we're setting a specific font, in our case Inter, which comes from Next.js's built-in font package, so we don't need to worry about it. Inter gives us a class name that's automatically applied to the body — it looks a bit cryptic in the actual HTML, but it just defines the main font used on the website.

Now let's work on creating the actual Chat component, which is going to encapsulate all the chat logic: the chat header, the messages, the input, and so on. Navigate into the src directory and create a new folder called components, so we don't mash everything into one page — technically we could, but it's not a good idea — and split things into separate components. The first component is the Chat we already declared, so create a Chat.tsx. As in my previous longer videos, I've got a very handy snippet where I type FC and hit tab, and it generates a full TypeScript component for me; if you want to use the snippet yourself, look in the description, where I've provided simple instructions (and a link to a video where I explain it in more detail). Don't worry if you've never used TypeScript — in fact, we don't even need an interface here, because this Chat doesn't receive any props. The only thing we do is declare it as an FC, which stands for functional component; that's just for TypeScript, we could leave it out, I just like to do it because it makes it easier to type props if there were any. Now let's import the Chat into our layout.tsx: switch over to the layout, import the Chat, save, and if we check the website, the chat is already showing up on the page, which is exactly what we want.
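Putting those pieces together, the root layout at this stage looks roughly like this — a sketch that assumes the default create-next-app structure and the @ import alias:

```tsx
// src/app/layout.tsx — root layout with metadata, the Inter font, and the Chat component
import { Inter } from 'next/font/google'
import Chat from '@/components/Chat'
import './globals.css'

const inter = Inter({ subsets: ['latin'] })

export const metadata = {
  title: 'Book Buddy',
  description: 'Your bookstore for fantasy and mystery novels',
}

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang='en'>
      <body className={inter.className}>
        <Chat />
        {children}
      </body>
    </html>
  )
}
```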
But it doesn't look like the chat I demoed just a moment ago, and that's what we're going to work on next. I'm going to zoom out a bit, because it's hard to keep an overview of the code when it's all zoomed in, but it should still be very readable for you.

In the demo, remember, there was an accordion: when you click on the chat it expands and collapses, and that's a feature we want here as well. Implementing that ourselves would be quite a lot of work, which is why we're going to rely on a UI library — but not a classical one. Navigate to ui.shadcn.com (the link is in the description; you don't have to go to this website, this is just for your understanding). It's basically an aggregation of React components built with Radix UI, an accessible but perfectly customizable UI library, and in our case we're going to make use of its accordion component. That's what we're using under the hood; it makes our development much faster and automatically implements accessibility best practices. You can see the code shadcn/ui uses on its site for this accordion, and the installation is the most straightforward I've ever seen. Copy the installation command, navigate back into our project, clear the screen, and paste it in: npx shadcn-ui add accordion. Hit enter; it loads, asks whether we want to overwrite existing files, and where to install the component. I'm going to say ./src (because we're using the source folder), then /components, and inside of components let's have a folder called ui specifically for UI components — we're only going to have two of those, one being the accordion — so type /ui and hit enter. That puts the component inside that directory and installs the dependencies.

Great, it's done installing, but it looks like there's an error in here: two lines are red, because a peer dependency is missing for the icons — lucide-react, the icon library you can see right here. Let's quickly install it with yarn add (or npm install, whichever you prefer) lucide-react. Then there's one very small utility function missing, cn, which we're going to implement right away. Navigate into the src directory and create a folder called lib — a lib folder is generally for preparing libraries for the context in which they're used in this project — and in here create a file called utils.ts, which is going to export this cn function. The cn function is for merging class names; you're going to see exactly what that looks like pretty soon, and it's just a super handy helper. There are two dependencies we need for it: one, tailwind-merge, and two, clsx for conditional classes. So yarn add (or npm install) tailwind-merge clsx, give it a second, and then let's write the utility function.

It's super simple: export function cn (for "class names") — and let me first disable GitHub Copilot — and it takes a bunch of inputs, which, if you're in TypeScript, are of type ClassValue[], a type we get from clsx. If you're in JavaScript, leave the types out and don't worry about it; I generally prefer TypeScript because it makes development faster and is a more enjoyable developer experience, which is why I'm using it here. The function returns twMerge, which combines certain Tailwind classes into one for optimization purposes — for example, say you had Tailwind classes like left-0 and right-0; tailwind-merge turns that into inset-x-0, because it means the same thing in a more concise way. Inside the twMerge goes clsx, which we also import from the clsx library, and it takes the inputs. You'll see exactly what this does in a second: it just allows us to write conditional classes, so "if this condition is true, apply that Tailwind class". That's what it's for.
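The finished helper is tiny — this is the whole utils.ts:

```ts
// src/lib/utils.ts — merge conditional class names and de-duplicate Tailwind classes
import { ClassValue, clsx } from 'clsx'
import { twMerge } from 'tailwind-merge'

export function cn(...inputs: ClassValue[]) {
  // clsx resolves conditional classes, twMerge resolves conflicting Tailwind utilities
  return twMerge(clsx(inputs))
}
```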
Okay, the accordion is all done — that was the most straightforward accordion integration I think I've ever done; it's super handy with the UI library we're using, and it gets us a massive step ahead in the chat component. So now we can actually start writing it out with this Accordion that we get from components/ui/accordion. The Accordion gets a type of "single", it's collapsible (because if the user doesn't want to chat, it should be collapsed), and a className of relative, bg-white, a z-index of 40 (z-40), and a little shadow for styling purposes. Inside the Accordion we need to define which items are in there, and if we look at the accordion code, we do that via an AccordionItem, which we import from the exact same file. This AccordionItem gets a value of "item-1", and it's the only item we'll have, but it's still important to declare it. Inside of it, let's first put a div with a className of fixed, a right-8 to give it a bit of spacing from the right-hand side, a w-80 (that's how wide the chat is going to be), a bottom-8 so it's not exactly at the bottom of the page (that would look a bit ugly), a bg-white, and a border. These, by the way, are the styles for the main chat you can see in the bottom-right corner — that's why we set the bottom and right values, because that's where the chat should sit; if you want it on the left side, change the right to a left and you're golden. The border is going to be border-gray-200, with rounded-md to give it a bit of border radius, and then overflow-hidden. Perfect. Then one last div in here with a short className: w-full, h-full, flex, and flex-col to list items beneath one another.

If we save that and take a look, we won't be able to see much — I probably also need to start the development server back up for us to see anything at all — and as you can see, there's just a little stripe in the corner, but that's pretty much it. What we need is something to click on to make the accordion expand, and we can do that by importing the AccordionTrigger component from the same file as the other two; it's all there and super convenient to use. This AccordionTrigger gets a className of px-6, a border-b for the bottom border, and border-zinc-300, which is kind of like a gray but a bit different. Inside of it, let's just put "hello" for now and see what happens: it does show up. The "hello" text looks a bit odd, but it's not meant to stay there anyway; I just wanted to demonstrate what this will look like. (Does it make sense to move into a side-by-side view? Not yet — later, when we get more into the styling, I'll switch to side-by-side; for now we'll keep switching back and forth.) We can worry about the AccordionContent later. Actually, let's first think about what should show up when the user sees the collapsed chat.
We're going to call that the chat header and give it its own component — this is what the user sees when the chat is not expanded, in its default state on the page. Create the file as ChatHeader in our components folder. The ChatHeader is a very simple, straightforward component: the main thing we return is a div with a className of w-full, flex (for display: flex), a gap-3 to separate the items a bit, justify-start to keep the items on the left-hand side, items-center because we want them centered vertically, and text-zinc-800 to make the text a bit darker. Inside of it, one more div with a className of flex, flex-col (so the children are stacked vertically), items-start, and text-sm. In here we put a p tag saying "Chat with", and this p tag gets a very short className of text-xs for extra small. Below the p tag goes one last div with a className of flex, a gap of 1.5, and items-center for vertical alignment, and inside of it go two p tags. The first one is self-closing, because it's the little indicator that the person you're chatting with is online: give it a w-2, an h-2, and rounded-full — so it's very small and fully rounded — with a bg-green-500. Let's quickly import the ChatHeader into our main Chat and take a look: it's this little green dot we just built. Right below it we put "Book Buddy Support", so you know who you're chatting with — you could also call it "chatbot" to make it clearer to the user that this isn't an actual person they're chatting with — with a font-medium. Take a look: that looks good, that looks like something you'd actually click on. Beautiful.

If we click on it right now, not much happens yet, and that's not ideal. What's also not ideal is that we don't see the little chevron on the right-hand side indicating whether the chat is currently expanded. To fix that, go into the accordion component and, inside the trigger — this trigger right here, where you can see the chevron icon — pass it a text-black to change its color. Save that, and now the chevron is there; if we click it, it even spins around, but there's no expanded state for the chat yet, and that's exactly what we'll work on next in the AccordionContent. Navigate back into the main Chat: we've got the trigger done (good job — that's an important step, because it's what lets users click on the accordion in the first place), and below the trigger we put the AccordionContent, also imported from the same file. It's going to contain a div with a className of flex, flex-col, and a fixed height of h-80, which translates to 320 pixels or 20rem. Inside of here there are going to be two things: first the chat messages, and then the chat input.
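Putting the accordion pieces together, the Chat component ends up looking roughly like this — a sketch that already includes the ChatInput created in the next step and a placeholder where the messages will go:

```tsx
// src/components/Chat.tsx — the chat shell built out of the shadcn/ui accordion
import { FC } from 'react'
import {
  Accordion,
  AccordionContent,
  AccordionItem,
  AccordionTrigger,
} from './ui/accordion'
import ChatHeader from './ChatHeader'
import ChatInput from './ChatInput'

const Chat: FC = () => {
  return (
    <Accordion type='single' collapsible className='relative bg-white z-40 shadow'>
      <AccordionItem value='item-1'>
        <div className='fixed right-8 w-80 bottom-8 bg-white border border-gray-200 rounded-md overflow-hidden'>
          <div className='w-full h-full flex flex-col'>
            <AccordionTrigger className='px-6 border-b border-zinc-300'>
              <ChatHeader />
            </AccordionTrigger>
            <AccordionContent>
              <div className='flex flex-col h-80'>
                {/* chat messages will be rendered here later */}
                <ChatInput className='px-4' />
              </div>
            </AccordionContent>
          </div>
        </div>
      </AccordionItem>
    </Accordion>
  )
}

export default Chat
```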
Two very important things, and I think it makes sense to begin with the input, because without an input there are no chat messages. Let's quickly write "messages" in here as a placeholder so we know it's going to live there later, and below that put ChatInput as a self-closing component. That component doesn't exist yet, so create it in our components folder as ChatInput.tsx. The chat input is what the user uses to interact with the chatbot, and it shows up at the very bottom of the expanded chat. Inside this chat input we're going to look at a few core concepts of modern web development, including React context for sharing state between components, and React Query, a super enjoyable library for data fetching — very cool stuff.

Initialize it as a regular TypeScript component, and this one does receive props. If you're in JavaScript, don't worry about this part; it's TypeScript-specific. In TypeScript, let's extend the interface, because we want to be able to pass a custom className to this component. We do that by saying interface ChatInputProps extends HTMLAttributes — something we get from React that accepts a generic in angle brackets — and we pass HTMLDivElement in there. The reason it's a div element is that the props we receive get passed on to a div: we first destructure the className, and then, as the second piece, ...props receives all other properties for this component, which we can pass right onto the div (it's inferred as an HTML div element). We also dynamically combine the className with the super helpful cn utility function we wrote earlier: invoke it, pass it a string — border-t for a top border and border-zinc-300 — and then, to merge in whatever className is passed to this component dynamically, a comma and the className we receive as a prop. And we know we can receive it as a prop, because this component accepts every prop a normal div in React can take: if I go to wherever we render ChatInput and hit Ctrl+Space, you can see it takes any prop a regular div would take, since we pass the properties straight onto a regular div. That lets us pass a className from the parent and apply flex-specific styles there — for example, this one is going to receive a px-4 — which I think makes organizing your code way cleaner than declaring those styles inside the chat input itself. (This point gets even clearer when we do the same thing with the messages.)

Now, inside this chat input, let's get started with the actual logic this component should handle, and the main thing we want is a text field to accept the user input.
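Here's the skeleton of that props-spreading pattern — a sketch of ChatInput at this stage, before the textarea goes in:

```tsx
// src/components/ChatInput.tsx — className and all other div props are passed through
import { FC, HTMLAttributes } from 'react'
import { cn } from '@/lib/utils'

// extends the native div props, so any prop a <div> accepts can be passed in
interface ChatInputProps extends HTMLAttributes<HTMLDivElement> {}

const ChatInput: FC<ChatInputProps> = ({ className, ...props }) => {
  return (
    <div {...props} className={cn('border-t border-zinc-300', className)}>
      {/* the auto-sizing textarea goes here next */}
    </div>
  )
}

export default ChatInput
```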
Before we add the text field itself, let's render one more div above it with a className of relative, an mt-4 (margin top of four), a flex-1 to fill out all the space it can, overflow-hidden, rounded-lg, border-none, and outline-none. I think I can go over the CSS properties a bit faster than I did previously, because I assume you know basic CSS — Tailwind is just an abstraction over regular CSS, it all boils down to the original CSS, so it's safe to assume you know most of this already. In here goes a textarea — but not just any textarea: one that scales automatically with the amount of text the user has written, something a normal textarea does not do. To make our life easier, let's add one very lightweight dependency for that, react-textarea-autosize; it does exactly what it says on the tin — an automatically sized React textarea — and we can simply drop it in as TextareaAutosize. The default import doesn't resolve on its own, but at the very top of the file we can write import TextareaAutosize from 'react-textarea-autosize', the dependency we just installed, and it takes all the props a normal textarea takes. It's self-closing, and we pass it a bunch of properties: rows of 2 by default; maxRows of 4 (if you type more than four rows, you need to scroll); a value (we'll do that later); autoFocus, so whenever this is rendered — when you expand the chat — the focus automatically lands inside the textarea, which is super handy; a placeholder of "Write a message..."; and then a className that's a bit longer, because we want this to look pretty: peer, disabled:opacity-50, pr-14 (padding right), resize-none, block, w-full, border-0, bg-zinc-100, py-1.5, text-gray-900 (so you can actually see the text being typed), focus:ring-0 (a convenience we get from Tailwind that makes this easier to work with), text-sm in general and on small devices and up, and leading-6 for the line height.

We also need access to whatever the user writes in there, but first let's quickly save this and take a look at what we've built. When we try to expand the component, it breaks, because we're importing a component that needs useRef: this needs to be a client component, since we expect it to handle client events, so we can't render it on the server the way Next.js 13 does by default with all components. To fix that, put "use client" at the very top of the file, meaning this component is rendered on the client. Refresh the page, and expanding the chat now works. One thing still doesn't: it just snaps open, which isn't ideal. To alleviate that, there's a Tailwind fix we can make — navigate into tailwind.config.js and add a couple of keyframes to the extend section.
Inside extend (which means we're extending the regular Tailwind configuration) we can put keyframes. The first keyframe is accordion-down: an object that animates from a height of 0 to a height of var(--radix-accordion-content-height), written as a string. But instead of typing all of this out yourself — it's not something you need to remember, and I didn't type it out either — we normally get this from our UI library. So let's just copy the whole keyframes section from the shadcn/ui accordion documentation and paste it in place of our current keyframes; that saves a lot of typing.

While we're in here, let's make one final change and install one little Tailwind plugin that's going to make our input a bit prettier — because if we look at it right now, let's be real, it doesn't look very nice. Go into the terminal, quickly stop the development server, and install @tailwindcss/forms, a tiny Tailwind plugin that makes everything related to inputs look better. To use it, require "@tailwindcss/forms" in the plugins array of the Tailwind config, then start the development server back up. Once it loads we can see: oh, this looks a lot better — that's what @tailwindcss/forms did for us — and the accordion now smoothly expands and collapses. Beautiful. One more thing you might have noticed: thanks to autoFocus, whenever you open the chat, the focus is automatically in the input field and we don't need to click there ourselves, which is a big improvement for the user experience.
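For reference, the relevant parts of tailwind.config.js after those two changes look roughly like this — the keyframes/animation block is the one copied from the shadcn/ui accordion docs:

```js
// tailwind.config.js (only the relevant parts shown)
module.exports = {
  // ...content, etc.
  theme: {
    extend: {
      keyframes: {
        'accordion-down': {
          from: { height: 0 },
          to: { height: 'var(--radix-accordion-content-height)' },
        },
        'accordion-up': {
          from: { height: 'var(--radix-accordion-content-height)' },
          to: { height: 0 },
        },
      },
      animation: {
        'accordion-down': 'accordion-down 0.2s ease-out',
        'accordion-up': 'accordion-up 0.2s ease-out',
      },
    },
  },
  plugins: [require('@tailwindcss/forms')],
}
```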
Okay, and now come a few very central parts of the whole build. First off, we obviously want to keep track of the state of the input by making it a controlled component, so we always know exactly what the user has typed into the input field. Let's do that with a piece of state called input; it's a string, because that's what the user types, and we need to import useState from React. To make this a controlled input, give input to the TextareaAutosize as its value, and add an onChange handler: we receive the event and call setInput with e.target.value — that's the exact string the user has currently typed, which goes into the state up here, so the input constant always reflects what's in the field. Great.

Before we do the rest, let's tackle the most complex part of this whole build, and it comes down to two things I want to quickly sketch out on excalidraw.com (this drawing is from a previous video). First, we want to stream the response back to the user in real time. Here's how it works: we have the input field (let me zoom out a bit, that was very large), and when the user types something into it, we want it to be submitted to our API route, which lives over here and comes from Next.js; we pass the text message from the user over to it. The API is then going to return — and it's not going to pass it back into the component directly, so let's draw an arrow over here — a readable stream back to the client. That's what allows us to stream the response in real time: as it comes in, we display it to the user, instead of first generating the chatbot's entire answer and only then showing it, which makes for a much better user experience. Second, the way we handle that on the client is React Query: the stream lands in React Query, and from there we display it to the user. The reason we're using React Query is that it makes loading states, error handling, and much more far easier and more enjoyable to work with than keeping all of this in regular state. And while this is the most complex part of the build, don't worry: we'll take it step by step, very sequentially, and it won't feel as complex as it sounds.

First, let's install React Query: yarn add @tanstack/react-query. They changed the naming a while ago, so it's now called TanStack Query, but it's essentially the same thing. By the way, if you want more detail on React Query — if anything in this process is unclear — I did a separate video specifically on React Query; that's the best resource if you get lost anywhere. To get started, we need to provide something to our entire app, the QueryClientProvider, in order to use React Query across the application. However, that's a context, and because our root layout is a server component by default (we didn't declare it as a client component), we can't just put a context provider in there. So in our components folder, create a file called Providers.tsx: a client component that encapsulates all the context providers we have in our application. The main one we want is the QueryClientProvider from @tanstack/react-query, so we can use React Query throughout the app. If we look at what it requires, it's a client, which we define above it: const queryClient is a new instantiation of the QueryClient class, which we also get from @tanstack/react-query, and that's the client we pass into the QueryClientProvider. Because this is a provider, it also receives children: we wrap our children inside the provider, which gives us access to the children prop. In practice, that means we put the Providers component around our body content in the layout, so the content is passed to Providers as its children, which means we need to receive them there — in TypeScript they're of type ReactNode, a type we get from React — and of course we also need to render them out, because if we didn't render the actual children, nothing would show up in our entire application. So we definitely want them in there.
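A minimal sketch of that Providers component:

```tsx
// src/components/Providers.tsx — client component that wraps the app in react-query's provider
'use client'

import { FC, ReactNode } from 'react'
import { QueryClient, QueryClientProvider } from '@tanstack/react-query'

const Providers: FC<{ children: ReactNode }> = ({ children }) => {
  const queryClient = new QueryClient()

  return (
    <QueryClientProvider client={queryClient}>{children}</QueryClientProvider>
  )
}

export default Providers
```

In layout.tsx, the body content then gets wrapped in this Providers component as its children.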
Hit save, and if we take a look at our development server (which isn't running any more), you'll notice nothing has changed, because we didn't change anything appearance-wise — but we've now set the foundation to actually work with React Query across our application. Since nothing visually changed, we can just plow ahead. Let's close a bunch of these files, by the way, to clean up the workspace a bit, and get started in the chat input.

The first step we're going to take is this part right here: collect the text message and send it to an API route with React Query, so that in the second step we can get back the readable stream, which is one of the core pieces of this whole build. To get started with React Query, write const and then an object, because we're going to destructure something from the useMutation hook. We import useMutation from @tanstack/react-query, invoke it, and pass it an object. If you look at what this object can contain, it's a lot of things; the most important one for us is the mutation function, mutationFn, which is an async arrow function we put right there. It's nothing other than a regular fetch (or axios) call — again, if you want more detail on React Query, there's a whole video I did on the topic — but what it handles for us is all the stuff around it: loading states, error states, and so on, which we no longer need to worry about. For example, to call this function we use something called mutate, which we can destructure and rename to sendMessage, because that's what it does, and we also get the isLoading state just like that from useMutation, instead of having to declare all those states ourselves, which is super neat.

Now let's make a request to one of our endpoints in the mutation function. It's a regular fetch request: const response = await fetch, and the route we'll use to generate the readable stream and pass it back is going to live under /api/message. Before we look at all the other things React Query gives us, we just want to make sure this endpoint works. The fetch request has a method of POST, because we want to submit the message data the user gave us, headers with a Content-Type of application/json, and a body of JSON.stringify with the messages — okay, I was a bit too fast here: the messages are something we don't have yet, so for now let's just send "hello". These aren't the real messages, but we mock them out so we can verify the endpoint works through React Query. Then we return response.body, which later on will contain the readable stream — that's also why we're not doing this through axios; even though the axios syntax is cleaner, I found this approach works much better when we actually want to return a readable stream to the client. Let's also quickly set an onSuccess handler, right under the mutation function — another big beauty of React Query, by the way.
If everything goes smoothly with this request, we can simply console.log a success message from the onSuccess handler. Now, currently this sendMessage is never invoked, so realistically, when should it be? Let's think about it for a second: the message should be sent when the user types something into the field and hits enter. To do that, go to the TextareaAutosize and add an onKeyDown handler, because in there we can check whether the pressed key is the Enter key, and if it is, define an inline function that sends the actual message. First we check e.key === "Enter" (with a capital E) and that the shift key is not pressed (!e.shiftKey): if the user presses Enter and Shift together, they want to move to a new line, whereas plain Enter should submit the message — a very common pattern you'll see in messaging apps. First, call e.preventDefault() on the event, and then construct the message we're going to send to the backend: const message. Let's think about what a message really needs to contain, and there are three key things. One, it needs an id, and to generate a unique identifier for each message we'll use a package called nanoid — very lightweight, millions of installs, very good — so yarn add nanoid (or npm install nanoid), and import it as a named import (the editor will probably do that automatically — great). Two, we need to know whether this is user input or not, because we're going to style the messages differently: user input goes on the right side of the chat in blue, and bot output goes on the left-hand side in gray. So isUserInput is true here, because this is what the user wrote (we'll rename this property in a moment to match the validator). And three, the text, which is the actual input. Those are the three properties we need on each message, and then we can call sendMessage and pass it the message — remember, this is the mutation we set up way above.

TypeScript is going to complain, because currently the mutation function isn't expecting anything while we're passing it a message: if I don't type the parameter explicitly, TypeScript infers void, because it doesn't know what type the value we pass into this function has. So we need to tell TypeScript that this is of type Message — but there is no Message type yet, so that's something we need to create.
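Here's roughly where ChatInput stands once this is wired up — a sketch, not the exact file: it already uses the Message type and the isUserMessage property settled in the next step, and the request body is shown as a single-element messages array (in the walkthrough the body is still mocked with "hello" at this point):

```tsx
// src/components/ChatInput.tsx — controlled textarea + react-query mutation (sketch)
'use client'

import { FC, HTMLAttributes, useState } from 'react'
import { useMutation } from '@tanstack/react-query'
import { nanoid } from 'nanoid'
import TextareaAutosize from 'react-textarea-autosize'
import { Message } from '@/lib/validators/message'
import { cn } from '@/lib/utils'

interface ChatInputProps extends HTMLAttributes<HTMLDivElement> {}

const ChatInput: FC<ChatInputProps> = ({ className, ...props }) => {
  const [input, setInput] = useState<string>('')

  // isLoading can later disable the textarea while a request is in flight
  const { mutate: sendMessage, isLoading } = useMutation({
    mutationFn: async (message: Message) => {
      const response = await fetch('/api/message', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // assumption: the real payload is an array of messages; the video mocks this first
        body: JSON.stringify({ messages: [message] }),
      })
      return response.body // will contain the readable stream later on
    },
    onSuccess: () => {
      console.log('success')
    },
  })

  return (
    <div {...props} className={cn('border-t border-zinc-300', className)}>
      <div className='relative mt-4 flex-1 overflow-hidden rounded-lg border-none outline-none'>
        <TextareaAutosize
          rows={2}
          maxRows={4}
          autoFocus
          value={input}
          onChange={(e) => setInput(e.target.value)}
          onKeyDown={(e) => {
            // Enter sends the message, Shift+Enter inserts a new line
            if (e.key === 'Enter' && !e.shiftKey) {
              e.preventDefault()

              const message: Message = {
                id: nanoid(),
                isUserMessage: true, // renamed from isUserInput once the zod validator exists
                text: input,
              }

              sendMessage(message)
            }
          }}
          placeholder='Write a message...'
          className='peer disabled:opacity-50 pr-14 resize-none block w-full border-0 bg-zinc-100 py-1.5 text-gray-900 focus:ring-0 text-sm sm:leading-6'
        />
      </div>
    </div>
  )
}

export default ChatInput
```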
One very, very cool thing I want to share with you is how to create this type while also making validation possible on the backend. It's a super valuable practice to implement, because it enforces security on your backend. The way we do it is by defining something called a validator with a library called Zod, which is basically a schema we can parse against. Say we define a schema whose object may only contain a prop1 that is a string and a prop2 that is a number: if we try to pass the backend something that doesn't match that schema — for example the number 600 as prop1 — parsing it against the schema fails. So we're enforcing security on our backend by using a schema, and you're going to see exactly what that looks like right now.

Go into the lib folder, create a folder called validators, and install the library: yarn add zod (z-o-d) — or npm install, that works as well. Then in validators create a new file called message.ts; this is what we'll use to validate each message. To create the schema we parse each object with, import z from "zod", the library we just installed. (If what I explained in the Excalidraw sketch wasn't fully clear, trust me, it's about to get clearer, because I'll show you exactly what I meant.) Let's export a const MessageSchema from this file — this is what Zod will parse the objects with — and it's a z.object(). The syntax you're about to see very closely resembles TypeScript, with the main difference that it actually allows us to parse against this schema at runtime. In here we define the same properties we already defined on the message down in the chat input: an id that is a z.string(), an isUserMessage — my typical naming, which means there's a quick rename we'll have to do in the chat input in a second — that is a z.boolean(), and lastly a text that is a z.string(). That's our message schema done.

Let's also quickly create the array validator, because we're going to pass an array of messages to the backend: it will contain all the previously written messages, so GPT has context and remembers the previous messages. To parse those, create a MessageArraySchema, which is a z.array() that simply takes the MessageSchema — now we can parse not just one message object but an array of them. And for the type, we can very easily generate a TypeScript type from the schema: export type Message = z.infer<typeof MessageSchema>. We can now use this type across the application, for example right back in the chat input, to validate we're passing the right thing. Import it from our validators, and TypeScript complains on the message object, because the chat input still says isUserInput instead of isUserMessage; fix that, save, and now we can type the mutation parameter as Message. We can also use this schema — the array validator — to parse the incoming messages on the backend, because never trust the client: we want to enforce security on the backend, inside this API route, and verify things there.
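The validator file ends up looking like this:

```ts
// src/lib/validators/message.ts — zod schemas for a single message and for an array of them
import { z } from 'zod'

export const MessageSchema = z.object({
  id: z.string(),
  isUserMessage: z.boolean(),
  text: z.string(),
})

// the backend receives all previous messages as an array, so GPT has context
export const MessageArraySchema = z.array(MessageSchema)

// the TypeScript type is inferred straight from the schema
export type Message = z.infer<typeof MessageSchema>
```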
So, whenever we hit enter in the textarea, our message is sent to the backend — but there is no backend yet. Nevertheless, let's quickly try it out. Is the development server running? No, so let's spin it up and check that this works: we type a message and hit enter, which invokes the React Query sendMessage function, and that should give us a 404, because this API endpoint obviously doesn't exist. Reload the page, open the chat, type "hello", hit enter, go to the Network tab — and there's the 404: the API route doesn't exist, which is exactly what we expect.

Creating this API route in Next.js is super straightforward. Go into the app directory, then under api (which is pre-created for us — we can get rid of the example route that's in there from project creation) create a new folder called message. The folder name is what's reflected in the URL, so this endpoint will be available under /api/message. In there goes a route.ts file, a name reserved by Next.js for defining a route. Declaring the function for the API endpoint is just as straightforward: we export an async function named after the HTTP verb of the request we expect — POST, because we're expecting a POST request at this endpoint — and it takes a request of type Request; that's a regular web Request, so it's not a type we need to import. Log out "endpoint works", save, and try again: clear the network panel, watch the fetch request, hit enter, and now we see an internal server error, because we're not returning any response from here yet — but the endpoint works, which is great, because now we can implement the logic it should handle.

And the logic, if we look at the Excalidraw sketch, is this: when this API is called, it goes over to OpenAI's GPT model, passes in the input, prompts GPT to answer the customer, and OpenAI gives our API a readable-stream response back, which we then pass on to the client. So it's a two-step process: the frontend makes a request to our API, the API makes a request to OpenAI, and the response flows back to the client. First off, though, we need information in this API endpoint, namely which messages the user has typed. We'll destructure the messages (plural, because there will be multiple) from the request body, and the way we read the request body is await req.json(). That only works if the request has a content type of application/json, but we know it does when the request comes legitimately from our application, because of the "Content-Type": "application/json" header we set in the fetch request. If somebody tried to hit this route via Postman — which they very well could — without that shape, the body could be anything. It's client input; we can't trust it. So we're going to parse it against the schema we created, to make sure the messages we receive really are an array of that exact type — that's how we make sure the client input is legit, because it could be anything, even something malicious.
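At this point in the build the route handler is roughly this — a sketch; the zod parsing is added in the very next step:

```ts
// src/app/api/message/route.ts — bare POST handler for /api/message
export async function POST(req: Request) {
  // only works when the client sends a JSON body with the
  // 'Content-Type: application/json' header, as our fetch call does
  const { messages } = await req.json()

  // `messages` is untrusted client input here; it gets parsed
  // against MessageArraySchema in the next step
  console.log('endpoint works', messages)
}
```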
So we say const parsedMessages = MessageArraySchema.parse(messages), parsing the messages we get from the body. We have no idea what those are — they're any — but if I hover over parsedMessages, we know exactly what they are, because the parse fails if it's anything else: all parsed messages are exactly of the type we defined in our schema.

What we want to do now is work with the OpenAI API to stream the response back to the client, and the first step is generating our outbound messages: const outboundMessages, the messages we're going to send to ChatGPT, of type ChatGPTMessage. That type doesn't exist yet; we're going to create it inside a file that will also let us parse the readable stream back from OpenAI — it's a little helper, very convenient — and we define it in lib, under openai-stream.ts. This openai-stream file is going to be the most complex file of the application; again, don't be intimidated — I can see it over here on my side, and it's maybe 50 to 80 lines of code, not that much. From here we export an interface called ChatGPTMessage, with a role of ChatGPTAgent and a content of string; this is the format the OpenAI API expects for its input. To get a ChatGPTAgent type we could install OpenAI's package — but you know what, we don't even need it; we can do just fine without it and define ChatGPTAgent ourselves, which I think is even easier and an approach I much prefer. So instead of importing it from a separate package: export type ChatGPTAgent, a literal of either "user" or "system". I didn't come up with that, and it's not just a convention: it's something OpenAI enforces when you make an API request — each message has to come from either the user or the system — and that's how I know this type. (Again, if you're in JavaScript, don't worry about this at all.) Now we can import ChatGPTMessage into our route.ts.

outboundMessages is of type array, because we pass around multiple messages, and it's equal to parsedMessages.map(): we go through every message, receive each one as the parameter, and return an object — an implicit return, by wrapping it in parentheses. In here we put the role, and if I press Ctrl+Space you can see it knows this object has a content and a role, because I defined the type up there; that's the beauty of TypeScript — in JavaScript I could pass anything in here and it wouldn't tell me I'm being silly. The role depends on whether this is a user message: if message.isUserMessage, the role is "user", otherwise it's a "system" message to OpenAI. The content is message.text, which was passed from the frontend. Great. And because we're going to render the messages in reverse order on the frontend (more on exactly why later — for now, just accept it), we say outboundMessages.unshift(): it's like push, the array method, but push inserts at the last slot of the array, whereas unshift inserts at the first slot. In here, TypeScript knows we need a role and a content, because we explicitly typed the constant. The role is "system", and the content is the main prompt we're going to pass to ChatGPT. I abstracted that away: it's quite a long prompt, so I put it into a separate file as a constant — we easily could have put it right here, I just didn't want it cluttering up the visual real estate of this API route — so let's call it chatbotPrompt.
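Put together, the helper types and the mapping look roughly like this — a sketch; chatbotPrompt is the constant created right below:

```ts
// src/lib/openai-stream.ts — the message format OpenAI expects
export type ChatGPTAgent = 'user' | 'system'

export interface ChatGPTMessage {
  role: ChatGPTAgent
  content: string
}

// --- back in src/app/api/message/route.ts ---

const outboundMessages: ChatGPTMessage[] = parsedMessages.map((message) => ({
  role: message.isUserMessage ? 'user' : 'system',
  content: message.text,
}))

// the main prompt goes into the first slot of the array
outboundMessages.unshift({
  role: 'system',
  content: chatbotPrompt,
})
```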
Because we're going to render the messages in reverse order on the frontend (I'll explain exactly why later, for now just accept it), we need outboundMessages.unshift(). It's like push(), but where push inserts at the last slot of the array, unshift inserts at the first slot. Because we explicitly declared the type of this constant, we know it takes a role and a content: the role is "system" and the content is the main prompt we pass to ChatGPT. I abstracted that away as chatbotPrompt; it's quite a long prompt, so I put it in a separate file. It's just a constant, we could easily have inlined it, I just didn't want it cluttering the visual real estate of this API route. So where does the prompt live? Let's create a new folder called helpers for helper functions and constants used across the whole application, inside it a constants folder, and inside that a file called chatbot-prompt.ts. I recommend grabbing the prompt from the main GitHub repository I provided rather than typing it out yourself; it's something you'd improve for your specific use case anyway, but I'll show you exactly what I used for the bookstore demo. Under helpers/constants/chatbot-prompt you can see my exact prompt; let's copy it, paste it into our chatbot-prompt.ts, and fix the line breaks that got mangled by the paste. The prompt also demands some book data: as you can see, on its own it contains no factual data that ChatGPT can rely on when answering, so I decided to also provide it with a sitemap of the website, which is what you see in the book-data file. I wrote that as a kind of pseudo-code sitemap and mocked it out with a few books I typed in manually; normally you would generate it dynamically and automatically, say once a day, and pass it to ChatGPT so it actually knows the structure and links of your website, how much each book costs and briefly what it's about, to provide the best possible response. Then it can genuinely recommend things, and users can ask about a book and get real feedback; we achieve that through this file. So copy that too and, in constants, create a file called book-data.ts with the data hardcoded. Again, dynamic generation would be the way to go for an actual application, but since use cases differ so much I can't really show that here.
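Putting the prompt step together, the route gains roughly this call, with chatbotPrompt being the constant exported from helpers/constants/chatbot-prompt.ts:

```ts
// app/api/message/route.ts — the system prompt becomes the first outbound message
outboundMessages.unshift({
  role: "system",
  content: chatbotPrompt, // long instruction string that also embeds the book data
});
```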
For a small website, just hardcoding it is totally fine; if you're on a larger, dynamic website, generate your sitemap every now and then and pass that to ChatGPT with the request. Now we can import the chatbot prompt into the route (the import looked a bit odd at first, but it works). With that, the ChatGPT messages have all the data they need plus the prompt telling the model it's a helpful bookstore assistant, and we're ready to create the readable stream to OpenAI and stream the response back to the user in real time. Honestly, this is one of the most complex steps, but also by far the coolest and most impressive; if you put this on your portfolio, it's really something to talk about. Welcome back, by the way: for me another day has passed, for you probably just a few seconds. I've got my coffee and a note of where I left off, and the most complex but also most rewarding part is right ahead: generating the OpenAI stream we can hand back to the client in real time. First, let's define the payload we'll send to the OpenAI API: const payload, an object. The model is "gpt-3.5-turbo", the normal ChatGPT API; you could use GPT-4 instead if you have API access. GPT-4 is better at reasoning but also much slower, and for customer requests like this I think 3.5-turbo is the better choice, and it's what I would actually use for a client project where I put this chatbot in. Then it takes a messages array, which is our outboundMessages, already prepared, so the most work for this payload is already done. The temperature is 0.4; if you've never worked with OpenAI, temperature is pretty much how creative the model will be. top_p is 1, frequency_penalty is 0, presence_penalty is 0 (and if you're wondering how I know these properties, they're straight from the OpenAI documentation, I'm not making them up). max_tokens, which controls how long the chatbot response can be, is 150, so very roughly 150 words. The stream property is true, which is very important: it's required for us to get a readable stream instead of one complete response at the end. And n is 1. If you need details on any of these properties, the OpenAI documentation has them; for example, frequency_penalty affects how often a given word will occur, and presence_penalty (a value up to 2, I believe) means the higher it is, the less the model reuses words already present in the prompt.
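As a sketch, the payload object looks roughly like this (the OpenAIStreamPayload type annotation refers to the interface we define in the next step):

```ts
// app/api/message/route.ts
const payload: OpenAIStreamPayload = {
  model: "gpt-3.5-turbo",
  messages: outboundMessages,
  temperature: 0.4,      // how "creative" the answers get
  top_p: 1,
  frequency_penalty: 0,
  presence_penalty: 0,
  max_tokens: 150,       // keeps answers to roughly 150 tokens
  stream: true,          // required so OpenAI returns a readable stream
  n: 1,
};
```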
We won't need everything else open right now, so I'll close it. Let's think about what we want: essentially one function that takes in this payload and returns a stream, so we can return the stream to the frontend. So: const stream = await OpenAIStream(payload). That's how we want it to read syntactically, and now we can work on the actual implementation of OpenAIStream. What we can already do is return a new Response containing this stream to the client making the API request, and because we're on the Next.js 13.2/13.3 route handlers, we can do that just like we previously would have on the edge runtime, without declaring the edge runtime anywhere, which is pretty handy. I'll also remove my "where I left off" note. Now for the OpenAIStream helper, a very important function in our application; in fact we've already started it, the lib/openai-stream.ts file exists. To make things easier for ourselves in TypeScript, let's copy the payload over and create an interface from it so we get type safety there as well: export interface OpenAIStreamPayload. The model is a string, the messages are a ChatGPTMessage array (that's what the type is for), everything else is a number, and stream is a boolean instead of a number. That's the interface done. If we built an OpenAIStream payload anywhere else in the application (in our case we won't, but you might if you extend this app), we could enforce type safety by declaring the type after the payload with a colon and importing it from our lib/openai-stream.ts. With that, the API route file is complete; what's still missing is the stream functionality itself, the OpenAIStream function we just called. In openai-stream.ts, let's export an async function with that exact same name (important) that takes a payload of type OpenAIStreamPayload, import it in route.ts, and then get to work in the function body. This is the most complex part I was talking about. First we need an encoder: const encoder = new TextEncoder(); hovering over it, it takes a stream of code points as input and emits a stream of bytes, so it encodes text. Similarly we want a decoder, const decoder = new TextDecoder(), which we'll use later to decode the text that comes in from the OpenAI stream, because that's not a string but a buffer. Don't worry if you don't know exactly what these do or have never worked with them; you'll see what they do in just a second. Then let's create a counter variable: let counter.
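Here is a sketch of the helper file at this point, plus how the route hands the stream back to the client, under the same assumptions as before:

```ts
// lib/openai-stream.ts
export interface OpenAIStreamPayload {
  model: string;
  messages: ChatGPTMessage[];
  temperature: number;
  top_p: number;
  frequency_penalty: number;
  presence_penalty: number;
  max_tokens: number;
  stream: boolean;
  n: number;
}

export async function OpenAIStream(payload: OpenAIStreamPayload) {
  const encoder = new TextEncoder(); // text -> bytes for the stream we return
  const decoder = new TextDecoder(); // bytes from OpenAI -> text we can parse
  let counter = 0;                   // tracks how many chunks we've enqueued

  // next: fetch the completion from OpenAI and wrap it in a ReadableStream
}
```

```ts
// app/api/message/route.ts — returning the stream to the client
const stream = await OpenAIStream(payload);
return new Response(stream);
```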
We use let because we're going to mutate counter later, and you'll see exactly why; it's kind of like an iterator. Now let's get the actual readable stream from OpenAI by passing this payload to their API through a regular fetch request: const res = await fetch("https://api.openai.com/v1/chat/completions"). I didn't make that URL up, it comes from their documentation. We need to pass data to this endpoint, so it's a POST request: method: "POST". The headers are very important and can't be omitted: the first is Content-Type: application/json, and the second, very important, is the Authorization header. It's a template string of Bearer, then a space, then process.env.OPENAI_API_KEY; the Bearer prefix is the common pattern you see with authorization tokens. You can call the environment variable whatever you want, as long as it holds the API key from OpenAI that you define in your environment file. If you forget this header, all your requests fail with a 401 Unauthorized. As for the body, that's JSON.stringify(payload), the same payload we pass into OpenAIStream from route.ts. Now comes the actually interesting part, creating the readable stream: const stream = new ReadableStream(), invoked with an object. If we look at what it accepts, there's pull, cancel, start and type; we're going to use start, and make it async because we'll do some asynchronous work in there. It receives a controller, and hovering over it shows the type, ReadableStreamDefaultController, already inferred correctly. Inside start, the first thing we do is define a function called onParse. The next few lines might get a bit cryptic to some; everything is in the GitHub repository, so if I lose you anywhere, visit the repository and get unstuck that way, but I'll do my absolute best to explain it. onParse takes an event, and since this is a function we define ourselves, we also need to type that event: it's either a ParsedEvent or a ReconnectInterval. Because readable streams aren't super intuitive to work with, we'll install one little helper library to ease that: yarn add (or npm install) eventsource-parser. If we look at the underlying npm package, there's a little summary of what it does: you create an instance of the parser and feed it chunks of data.
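Before the parser logic, here's roughly what the fetch call and the skeleton of the stream look like (all inside OpenAIStream; OPENAI_API_KEY is whatever you name the variable in your env file):

```ts
// lib/openai-stream.ts — inside OpenAIStream(payload)
const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // without a valid key the request fails with 401 Unauthorized
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify(payload),
});

const stream = new ReadableStream({
  async start(controller) {
    // onParse and the chunk-feeding loop go here (next steps)
  },
});
```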
That chunk-feeding behavior matters because what we get back from OpenAI is nothing more than chunks of a string, partial or complete, and the parser emits parsed messages only once it has received a complete message, which is exactly what we want and why we're installing this library. After installing, start the dev server back up. The ParsedEvent and ReconnectInterval types I just declared come from the library we imported; for some reason the second one doesn't auto-import, so import ReconnectInterval manually. Now we can actually get started with the onParse function. If the event we receive has a type equal to "event" (the string literals show the options are an event or a reconnect interval, and we want the event case), we declare const data = event.data, which gives us access to the actual data, a string, and we save it in that constant. Then a little check: if data is strictly equal to "[DONE]", the specific sentinel OpenAI sends when the stream is finished (a capitalized DONE in square brackets), we want to close the whole readable stream with controller.close() and then return to stop any further code execution. If the data isn't "[DONE]" and there's an actual chunk to read, let's write the code for that, wrapped in a try/catch block, because an error might happen along the way; in the catch we call controller.error() with the error we receive, and that's all the error handling we're going to do. Inside the try block is the data-to-text logic: const json = JSON.parse(data), and then we convert that JSON into a regular string: const text = json.choices[0].delta?.content (note the optional chaining) || "" in case it's undefined, which it very well could be. I know that path isn't very intuitive; it becomes apparent when we log out the json, which I'll do so you can see what it looks like. And if this text is just a prefix character, for example a leading newline, we don't want to do anything with it: if counter is less than 2 and a regular expression match for a newline, text.match(/\n/) || [], has a nonzero length, then we want to return.
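Put together, onParse looks something like this; it lives inside start(controller), so it can see controller, encoder and counter, and the import goes at the top of the file (the enqueue and counter lines are the finishing touches described just below):

```ts
// lib/openai-stream.ts
import { ParsedEvent, ReconnectInterval } from "eventsource-parser";

// ...inside new ReadableStream({ async start(controller) { ... } })
function onParse(event: ParsedEvent | ReconnectInterval) {
  if (event.type === "event") {
    const data = event.data;

    // OpenAI sends a literal "[DONE]" once the completion has finished
    if (data === "[DONE]") {
      controller.close();
      return;
    }

    try {
      const json = JSON.parse(data);
      // each streamed token arrives under choices[0].delta.content
      const text = json.choices[0].delta?.content || "";

      // skip the leading newline "prefix" chunks at the start of a completion
      if (counter < 2 && (text.match(/\n/) || []).length) {
        return;
      }

      controller.enqueue(encoder.encode(text));
      counter++;
    } catch (error) {
      controller.error(error);
    }
  }
}
```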
I want to be very real with you here: this is a prepared video, I wrote all of this in advance. If I were actually coding out this app, there's no way I would write the code straight through like this; it would be one step at a time, logging the JSON, figuring out what to do with it, converting it into text, logging again. The linear approach you see in this video is good because you don't get errors throughout the build and you save time, but nobody writes code like this in actuality, and neither do I in my actual projects, so don't worry if yours don't look this tidy; I just figured that's important to say. Anyway: after this, const queue = encoder.encode(text), then we enqueue the queue we just created into our controller with controller.enqueue(queue), and lastly we increment the counter with counter++. We're mostly done; one more thing is to actually create the parser, because at the moment onParse is never being called, and that's not ideal. Below the onParse function, but still inside the async start, we say const parser = createParser(onParse), createParser being a function we can import from eventsource-parser, very handy stuff. If you've lost track of why we're doing all of this, which is very possible since we're just writing a bunch of code right now: it's to turn the OpenAI stream into actual text, a string we can display on the frontend; that's what this whole code is for, and I totally get that it isn't very intuitive, it really isn't. The last thing inside start is a for await loop over const chunk of res.body: res.body is going to be a readable stream because we set stream: true in the payload, and we're now splitting the response we get back from the fetch request into chunks. Let's type the chunk as any so we don't get an error there, and feed each chunk of the OpenAI response into our parser with parser.feed(decoder.decode(chunk)). Then we're done in this file, except for returning the actual stream from the main function: return stream. Save that, and let's just try it out, I think that's the most fun part; how about we log the text right here as we get the response back. Our development server is already running, so let's open the network tab side by side with the chat window, type "hello how are you", hit enter and see what happens.
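Before the test, the tail end of OpenAIStream, sketched:

```ts
// lib/openai-stream.ts — still inside start(controller), after onParse is defined
const parser = createParser(onParse); // import { createParser } from "eventsource-parser"

// res.body is a readable stream because the payload set stream: true;
// decode each incoming chunk and feed it to the parser, which invokes onParse
// whenever it has assembled a complete server-sent event
for await (const chunk of res.body as any) {
  parser.feed(decoder.decode(chunk));
}
```

```ts
// ...and at the very end of OpenAIStream, outside the ReadableStream constructor
return stream;
```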
And we got a 404, which is not great. Okay, what's the problem? I just checked, and it's that the API route expects an array of messages, but just to mock things out we are currently only sending a single message containing "hello", which isn't really what we want. So instead, let's try taking the message we pass into the mutation function and putting it into an array instead of inside that object, so wherever we call sendMessage we pass the message and convert it into an array before sending it to the endpoint. Let me send this again... no, that didn't work either. I'm going to take a couple of seconds to fix this and be right back; it's fixed in the final project, I just want to show you how far we currently are, and we can't really do that with just one message. Okay, so if you look at what we expect in route.ts, it's a messages array, so one little fix is to put messages as the key right here in the request body object and wrap the single message we're currently sending (it's going to be multiple soon) in an array. Save that, and now we can actually send the request, but it's not going to go through yet, because we haven't set the OpenAI environment variable we depend on in openai-stream.ts, where we send it as the Bearer token. For this video, let's create a .env.local; I'm going to show you my environment variables because I think it makes it much easier to follow along, and I'll change them after the video. Declare OPENAI_API_KEY and let me generate a new key for this video: visit platform.openai.com and log in (if you don't have an OpenAI account yet, it's very straightforward to set up, and you get around $20 of free credit, which is enough for many months if you don't use it heavily), then under your account, under API keys, create a new secret key, copy it from OpenAI, and paste it into the environment variable. After that we need to restart our server with yarn dev so the environment variable changes are reflected, give it a hot second to load, navigate back to the main page, open the chat, type "hello how are you" and hit enter. The request went through, and you can see the response size continually growing, but we don't get any console logs yet, which is weird. The stream does seem to be working, because the size increased as the response came back, and under the response tab we can see the full response we got back from OpenAI, so the API communication is working as expected; it's just not showing up in the console yet.
Actually, you know what, it does work, I just looked in the wrong spot, which makes sense: this code executes on the server in route.ts, so it isn't logged in the client console but in the VS Code terminal, and there it is. That's where we can see the JSON, with the delta as an object; if we logged it with JSON.stringify you could actually see the text in there, and we can see the choices GPT has gone for. The "[DONE]" data is also being logged, so everything seems to be working correctly. Let's close openai-stream.ts and actually work on displaying this to the user. In the response body we know we're getting back a ReadableStream (it takes a generic of Uint8Array, which isn't very important; the important thing is that it's a readable stream), and the question is how we display that to the user. I think a good first step is just logging it to the client console, so let's receive the stream in the onSuccess handler of the chat input. I also want to correct something I explained the wrong way earlier: in the OpenAIStream file I said we were converting the stream into text for display, but the display-side conversion to a string is actually what we do right now, here in onSuccess, just to make that very clear. So: we're now going to receive the readable stream from the server on the client, right here in onSuccess, and display it as a string to the client in real time. First, if there's no stream (there should be), let's just throw new Error("No stream found"); not the most graceful error handling, but if anything goes wrong we'll handle it in the error handler anyway. Then let's decode the readable stream and log it to the console so you can see in real time what OpenAI gives us back. The logic mirrors what we did in OpenAIStream: first a reader, const reader = stream.getReader(), then a decoder, const decoder = new TextDecoder(), then let done = false as a kind of flag, because we're going to use a very simple while loop: while we are not done, we destructure from await reader.read() (an asynchronous operation, so the onSuccess handler itself needs to be declared async) the value and the done flag, which I'll call doneReading, and we set our done to doneReading so the while loop eventually gets interrupted.
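A sketch of that onSuccess handler so far; the surrounding useMutation setup comes from earlier in the build, so treat the exact shape as an assumption:

```ts
// ChatInput — reading the streamed response on the client
onSuccess: async (stream) => {
  if (!stream) throw new Error("No stream found");

  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let done = false;

  while (!done) {
    const { value, done: doneReading } = await reader.read();
    done = doneReading;
    // value is a Uint8Array chunk; decode it into a plain string
    const chunkValue = decoder.decode(value);
    console.log(chunkValue); // later this gets written into message state
  }
},
```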
If we're not done reading, we can get the chunk value: const chunkValue = decoder.decode(value). The value we get from the reader is a Uint8Array, and decoding it turns it into a regular string we can read, the very text we're going to display to the user. We could do that using state, but for now we just want to log it; I briefly considered declaring let text = "" and appending to it, but that's not the best approach here, so let's simply log the chunk value. It might look a bit weird in the console, but you'll get the idea, and later we're going to put this in state so it gets shown to the user in real time, and we'll also clean up. First, let's see if the values get logged: type "hello how are you" into the chat, hit enter, and there it is, the real-time stream, very nice. As I said, this looks weird because we're logging each chunk separately instead of appending them to one string, but you get the idea: we get the data back in real time, and adding it to state will be super simple. We don't want it in the console, though; we want it right here in the chat window, and the way we do that is through context, because we handle the chat input (where we fetch the data) and the chat messages in separate components. Let me quickly walk you through the architecture: we have the main chat component up here, and below it two components at the same level, the chat input, which we've already built (what we're writing in here), and the chat messages component, which we haven't created yet and which will encapsulate all the logic for previously sent messages. Now the question is: if we fetch the data in the input component, how do we transfer it over to the messages component without doing some weird thing across the parent? That's an approach you could take, I guess, but I went with context because I think it's the more intuitive solution: we declare a context and pass the data over to the messages component using it. To keep the codebase maintainable (it just looks cleaner in the file system as well), let's create a new folder called context inside src; all application contexts go in there, in our case only one file, messages.tsx. The TSX extension is important because we're also going to declare the provider right in this file instead of wiring up all the values wherever we render it. We create the context with export const MessagesContext = createContext, a function we get from React. If you don't know what context is: it allows us, for example, to wrap this whole part of the component tree in a context, which is what the red line in my drawing is meant to indicate.
Each of the components inside it then has access to whatever values are in the context. If the context held const name = "John", for example, then the chat, the input and the messages component could all access that name without props being passed around, because the value is encapsulated in the context wrapping all of them; we avoid prop drilling, and that's what context is good for in React. Now let's declare our context. In TypeScript, by the way, we also need to declare the type of the context; if you're not in TypeScript this will be even easier for you, but I am, so we define the type inline inside the angle brackets, with curly braces for the object shape, and then the second part is the actual context value, which you would also need in JavaScript. As for the type, we'll have six properties inside our context. First, messages, of type Message[] using the Message type from our validators; the reason we're doing this is so all components wrapped by this context have access to the messages and can manipulate them, as you'll see. Second, isMessageUpdating, to show a loading state to the user: what "updating" means is that we're putting in the content we get back from the OpenAI response, so the message is updating while OpenAI is effectively still writing its answer. Third, addMessage, which is what we use to push the message from OpenAI into the chat, and also the user messages; it takes a message of type Message as the argument and returns nothing, so void. Fourth, removeMessage, in case anything goes wrong, because we're going to update the UI optimistically, which means we assume everything goes right and put the message into state the moment the user sends it; so when I send a message it goes straight into the message state, and if anything goes wrong we can remove it again. This is better for user experience than showing a loading state while the message is actually being sent, because nowadays users kind of expect optimistic updates. It takes an id of type string and also returns void. Fifth, updateMessage, a function that takes an id of type string and a callback, an update function that receives the previous text of type string and returns a string; updateMessage itself returns nothing. This is how it's going to go: when we first get back the initial response from OpenAI, it doesn't contain an answer string yet, we just know there is an answer coming, so we add that empty message to the messages; if anything goes wrong we can remove the message; and while we're getting the data from OpenAI we update the message to show the data we've received inside of it.
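Here's a sketch of the finished context declaration, including the sixth property (the setIsMessageUpdating setter described next) and the fallback values we define in a moment; the validator import path is an assumption carried over from earlier in the build:

```ts
// src/context/messages.tsx
import { createContext } from "react";
import { Message } from "@/lib/validators/message"; // assumed path to the message type

export const MessagesContext = createContext<{
  messages: Message[];
  isMessageUpdating: boolean;
  addMessage: (message: Message) => void;
  removeMessage: (id: string) => void;
  updateMessage: (id: string, updateFn: (prevText: string) => string) => void;
  setIsMessageUpdating: (isUpdating: boolean) => void;
}>({
  // fallback values, only used if no provider wraps the component tree
  messages: [],
  isMessageUpdating: false,
  addMessage: () => {},
  removeMessage: () => {},
  updateMessage: () => {},
  setIsMessageUpdating: () => {},
});
```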
The sixth and last property is setIsMessageUpdating, to show a loading state to the user: if that boolean is true, the loading indicator will be shown, and to set the value we need this function. It takes an isUpdating of type boolean and returns void. With the type in place we get really nice IntelliSense for our actual context object, which holds the fallback values: what we're declaring now is what the context falls back to if we never provide real values. messages is just going to be an empty array, isMessageUpdating defaults to false, and addMessage, removeMessage, updateMessage and setIsMessageUpdating are all just empty functions (also note addMessage needs to be singular, not addMessages, so let's fix that). That's a very big part of the context already done. The last thing we're going to do is define the main provider we can wrap our app in, so all the components encapsulated by the context, as I've shown you in the drawing, actually get access to the proper values. Let's export a function called MessagesProvider; this is a regular function, not a const expression, and it takes children, because we're going to wrap certain React nodes in it. In TypeScript we type that inline as { children: ReactNode }, ReactNode being a type we get from React. Now we can work on the actual component, which is why we declared this as a .tsx file: inside it we return a regular context provider, MessagesContext.Provider, wrapping the children we receive (and don't forget the return statement, or it won't work). The MessagesContext.Provider expects a value, and we can see it expects all those properties, but we don't have them yet; we've only declared the default values, we don't know the actual proper values and we can't update them. For these to work we need to keep the values in some kind of state, and the question is where. The answer is right here in our MessagesProvider; it's the most convenient place to keep track of all of this. For example, let's declare the messages state, which will encapsulate all the messages sent in a given chat, pass a default value into the state (an array, because we have a messages array), declare the type of that message array right on the state, and import useState, which we still need to do.
As for the default value we want in that state, it's what's going to show up as the first message in the chat by default. It needs an id property, for which we'll use nanoid (import it for this to work), then a text (again we get beautiful IntelliSense, that's the beauty of TypeScript); what do we want the first message to say? For example "Hello, how can I help you?". And of course isUserMessage is false, to indicate this was sent by the chatbot, not an actual user. What we can do with this messages state now is pass it into the provider value. There are a few more values to provide, and the way we'll get them is by declaring three more functions in here. First, const addMessage: what should happen when we want to add a message? We need access to the actual message we want to add and pass it to this function, and then it's very straightforward: we use the state and push the message we receive into it by calling setMessages, getting access to whatever the state was previously as the first parameter, and returning an array with all the previous chat messages plus the message we just got. Now we can pass that to the provider as well, separated by a comma. removeMessage is very similar: const removeMessage takes the id of the message we want to remove, and again updates the messages state with setMessages, this time returning prev.filter(), a regular JavaScript array method, keeping every message whose id is not equal to the id passed into this function, thereby filtering out the message we want to remove and setting the state to the filtered version. We can pass that into the value too. The last one is updateMessage: it takes an id of type string and a callback function I decided to call updateFn, which receives a previous text that is a string and returns a string. The reason I went with a callback that hands you the previous text is that it lets us push each chunk of the answer we get back from OpenAI into an existing string, so in the end there's one string containing all the words in sequential order, just as it should be, instead of the answer chopped into separate chunks; nobody wants their messages chunked out like that.
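Sketched out, the provider state and the three helper functions look roughly like this, including the map logic that's finished just below (useState comes from react and nanoid from the nanoid package, both imported at the top):

```ts
// src/context/messages.tsx — inside MessagesProvider
const [messages, setMessages] = useState<Message[]>([
  {
    id: nanoid(),
    text: "Hello, how can I help you?",
    isUserMessage: false, // the initial greeting comes from the bot
  },
]);

const addMessage = (message: Message) => {
  setMessages((prev) => [...prev, message]);
};

const removeMessage = (id: string) => {
  setMessages((prev) => prev.filter((message) => message.id !== id));
};

const updateMessage = (id: string, updateFn: (prevText: string) => string) => {
  setMessages((prev) =>
    prev.map((message) => {
      if (message.id === id) {
        // only touch the message we're streaming into; keep the rest as-is
        return { ...message, text: updateFn(message.text) };
      }
      return message;
    })
  );
};
```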
Inside updateMessage we also set the messages state (this needs to be an arrow function): we receive what we had previously and map over all the messages with prev.map(), and for each message we check whether its id is strictly equal to the id we passed in. If it is, that's the specific message we want to update, so we return an object that keeps all the properties the message previously had and only replaces the text, setting it to updateFn(message.text), i.e. invoking the callback with the current text so that wherever we call updateMessage we can treat it much like a state setter, which is very handy; the return value is what ends up in the actual state. In the other case, if this isn't a message we want to update, we just return the message as-is. Now we can pass updateMessage into our value too, and we're almost done: looking at what's still missing, it's only isMessageUpdating and setIsMessageUpdating, and we keep track of those with a very simple piece of state, const [isMessageUpdating, setIsMessageUpdating] = useState<boolean>(false) (the type annotation only if you're in TypeScript). Pass both of these into the value as well. That gave me an error because I misspelled "updating" in a few places; after fixing the spelling everywhere the error is gone, and I'll move the state declaration up a bit purely for stylistic purposes. With that, the context is done, great work, very nice, and now we can get started on the messages component, which is the last step of the build. We get a real-time response back from OpenAI, we've verified that; what we can do with the context now is share that state, the messages we display in the chat window. Let me quickly draw what it looks like: down here is the chat input component, where we receive the OpenAI response in real time; literally above it we render the chat messages as a component; and above that comes the chat header, which is already done, the thing you click on to open up the chat. What we're doing now is passing data from the chat input into our context, keeping it there as state, and then passing that context to the chat messages; that's how we communicate across two components at the same level. We could also do it through the parent, or without context using a state manager like Jotai or Zustand; that's entirely possible if you prefer that approach, I just found context to be a good fit for this purpose. After finishing all that, we can exit the drawing and actually show the chat messages: currently we just mock them inside our chat component, so not anymore, let's declare a real ChatMessages component instead.
It's a self-closing component, and of course it doesn't exist yet, so let's create ChatMessages.tsx inside our components folder, type fc and hit tab for a full-blown TypeScript component, which is pretty cool. Inside ChatMessages we want the same pattern as the chat input, where we can pass it a className from wherever we render it, because if there are any flex properties applied I think it makes sense to keep them right there for visibility. Again, the way we do that is: interface ChatMessagesProps extends HTMLAttributes, a thing we get from React, with HTMLDivElement passed in as the generic, which allows us to pass any prop a normal div would take. We destructure the className right away, receive all the other props, pass them right on to the div, and we'll also combine the className in a second, since it's going to be a bit longer. Saving that gives an error because I called this ChatMessage; it needs to be ChatMessages, so let's fix the component and the file name, that's better. Now let's import the component, and we can pass it all the properties a regular div takes, for example a className of px-2 py-3 flex-1. That's what I meant about the flex properties being on the same level as the div that has the flex layout; I just think it's easier to see and debug styling this way instead of declaring the flex at the very top level of the div. Of course the result would be the same, it's just my approach. Inside the component we want access to the messages from the context, and we get them by destructuring: const { messages } = and then, to subscribe to the context, the useContext hook React provides, imported from react, passing it the MessagesContext from @/context/messages. Because we're using a hook here, this needs to be a client component; server components don't have access to hooks, and by default this is a server component, so we opt out of that behavior with "use client" for client-side functionality like useContext. Now, this is what I meant earlier with the inverse messages: we want to be at the very bottom of the chat by default, so if there's a very long conversation we don't have to scroll down, and we achieve that in a second through a certain style on the div. To display the messages in the correct order we declare const inverseMessages and invert them by spreading: [...messages].reverse(), reverse being a regular JavaScript array method, which gives us the array in reverse order that we can use in the component down here. And let me show you the property responsible for the correct reversed order: the className, built dynamically with our cn helper function.
The values that are always applied are flex and, now comes the interesting part, flex-col-reverse: the entire column is reversed, so the first message sits at the very bottom and the other way around. What it also does is put us at the bottom of the chat by default, so with a lot of messages we don't need to scroll down; instead, if we wanted to see earlier messages we would scroll up, which is the desired chat behavior, and we achieve it through this class. And because flex-col-reverse also reverses the visual order, combined with inverting the messages array we're effectively reversing twice, so everything ends up back in the correct order with the behavior we want. Let's also apply a gap-3, an overflow-y-auto to enable a scroll bar, and some custom scrollbar styles: scrollbar-thumb-blue, scrollbar-thumb-rounded, scrollbar-track-blue-lighter for a light blue track, scrollbar-w-2 for a little width, and lastly scrolling-touch. Then we merge all of that with the className we passed into the component from above, for example the flex-1 we declared in the overlying chat component, simply by adding a comma and the className we receive in this component, so the class names passed in from outside get merged with these. First, let's put a div in here that fills all the remaining space of an empty chat, so that if there's only one message it shows up at the very top instead of at the very bottom, which it would by default and which just looks kind of bad: a div with a className of flex-1 and flex-grow, so it takes up all the available space. Then let's map over the inverse messages, inverseMessages.map(), and for each message... actually we don't need a code block with an explicit return, I just noticed I did that in my project; we can directly return some JSX with normal parentheses. Inside goes a div with a key; I started making a composite key out of a template string with the message.id, but actually, no, the message id alone is already unique, so let's just use message.id, that's totally fine, no need for a composite key. Then as the className let's use chat-message: this class name doesn't have any styles applied, but it's cool for debugging purposes, because in the browser window you know exactly where a chat message begins, so I like to do that; you could also use an id for this so it doesn't get confused with your Tailwind styles, but this works really well for me. Inside that, one more div with a dynamic className, curly braces and cn() invoked: the class names we always pass are flex and items-end, and then, in an object, we conditionally apply justify-end only if message.isUserMessage, so a user message is displayed on the right-hand side of the chat and a bot message on the left; that's what that line is for.
Inside the div we just created, let's put one more div, and I promise that's going to be the last div of this component, it's almost done. Its className is dynamic as well, curly braces, cn(), you know the drill by now. First the class names that are always applied: flex, flex-col, a space-y-2, text-sm, a max-w-xs, an mx-2 for horizontal margin, and then overflow-x-hidden. The reason for the overflow-x-hidden is that sometimes, when there are links in the chat, the chatbot tends to write past the width of the chat; because we'll render markdown, that's not a problem once it's done writing, but during the writing it might look a bit weird, and this forces that weird scrolling behavior to stop. Then the conditional class names in an object: if this is a user message we want bg-blue-600 and text-white applied, again keyed on message.isUserMessage, and the other conditional is bg-gray-200 with text-gray-900 when it's not a user message, i.e. a bot message. That's the className done, great. Inside, let's put a p tag; this is where the text of the message goes, and later it will render a very light version of markdown that only knows about links. This p tag gets a dynamic className as well, you know the drill: always applied are px-4, py-2 and rounded-lg, and the conditional ones are bg-blue-600 and text-white. Inside it we'll render our light version of markdown, a MarkdownLite component we're going to create together that doesn't exist yet, and it needs to know about the text it should render. What this MarkdownLite component will do for us is display the regular text as it is, but it knows about links: we asked GPT in the prompt to only include links in markdown format, and to actually be able to link to those pages we need a component that knows how markdown links work and converts them to actual Next.js links. So it needs the text we're trying to render, which is the message.text we pass into it, and I think that's one of, if not the, last component we'll create in this whole project. Let's go into the components folder, create MarkdownLite.tsx and initialize it as an FC. Actually, let's ditch that approach for a moment; don't worry, we're still going to do it, but I just want to show you the actual message even if it's not formatted correctly yet, so for now let's just render out the message text directly and remove the MarkdownLite usage, and we're golden. Save all of this and see if it's already working: back in the browser, let it load, send "hello how are you" in the chat, hit enter and watch the network tab; if I hit enter it makes the request, and it seems like we are getting a response back.
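A sketch of the rendered output of ChatMessages as described so far; cn is the class-merging helper from earlier in the build, the scrollbar classes assume the Tailwind scrollbar plugin configured earlier, and MarkdownLite is shown in its final position even though we temporarily render plain text first:

```tsx
// components/ChatMessages.tsx — render sketch
return (
  <div
    {...props}
    className={cn(
      "flex flex-col-reverse gap-3 overflow-y-auto scrollbar-thumb-blue scrollbar-thumb-rounded scrollbar-track-blue-lighter scrollbar-w-2 scrolling-touch",
      className
    )}
  >
    {/* fills the empty space so a lone message sits at the top of the chat */}
    <div className="flex-1 flex-grow" />

    {inverseMessages.map((message) => (
      <div key={message.id} className="chat-message">
        <div className={cn("flex items-end", { "justify-end": message.isUserMessage })}>
          <div
            className={cn("flex flex-col space-y-2 text-sm max-w-xs mx-2 overflow-x-hidden", {
              "bg-blue-600 text-white": message.isUserMessage,
              "bg-gray-200 text-gray-900": !message.isUserMessage,
            })}
          >
            <p className={cn("px-4 py-2 rounded-lg", { "bg-blue-600 text-white": message.isUserMessage })}>
              <MarkdownLite text={message.text} />
            </p>
          </div>
        </div>
      </div>
    ))}
  </div>
);
```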
But the message is not being displayed yet. The reason is that in the chat input, where we fetch the data (by the way, we could also abstract this into its own hook; I decided not to because it adds a lot of overhead, but you totally could), we are never updating the message, so how should the chat message know its content? Currently it can't, and that's exactly why we put all of this into the context. So let me switch back to my German keyboard and let's destructure some things from our context: const { ... } = useContext(MessagesContext), and what we want to destructure are messages, addMessage, removeMessage, updateMessage and setIsMessageUpdating, the five values we need access to inside this component. The React Query onSuccess handler, where we get back the stream, is where we want to perform all these actions. In order to add to our messages array we need a message to add in the first place, so right below where we check for a valid stream, let's say const id = nanoid(), and then construct our message: const responseMessage, of type Message (we already know the type, so we can enforce it right here for some additional type safety). It takes the id we just created, an isUserMessage of false, and a text that is just an empty string: we know this is a bot answer, and we know the text is empty because we've only just created the stream, its content hasn't arrived yet; we're going to add to this message later, which is what updateMessage is for. So the first thing we do is add the message to our context state with addMessage(responseMessage), and then setIsMessageUpdating(true) to display a loading state to the user for a better experience. Then, inside the while loop, where we have access to the current chunk value, the callback-style update function comes in super handy, because we can treat it just like a regular state update: updateMessage(id, prev => prev + chunkValue). We're updating that specific message, the one with the id we created above with nanoid (not necessarily the last one in state), and appending each chunk to the previous value, which constructs one coherent string instead of the separate chunks we previously saw in the console; that's not how we want to display it to the user. Then let's clean up, we're almost done: setIsMessageUpdating(false), then set the input to an empty string, because everything went well, the message has been submitted and we can clear the chat input field, and then let's refocus the text area so the user can type a new message right away. To focus the text area we need a ref.
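Sketched end to end, the onSuccess handler now looks roughly like this; Message, nanoid and the context functions are as described above, setInput is assumed to be the input state setter from earlier in the build, and the textarea refocus gets added next:

```ts
// ChatInput — onSuccess, wired up to the messages context
onSuccess: async (stream) => {
  if (!stream) throw new Error("No stream found");

  // create an empty bot message up front and stream the answer into it
  const id = nanoid();
  const responseMessage: Message = {
    id,
    isUserMessage: false,
    text: "",
  };

  addMessage(responseMessage);
  setIsMessageUpdating(true);

  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let done = false;

  while (!done) {
    const { value, done: doneReading } = await reader.read();
    done = doneReading;
    const chunkValue = decoder.decode(value);
    // append each chunk to the message we created above
    updateMessage(id, (prev) => prev + chunkValue);
  }

  // clean up once the stream has finished
  setIsMessageUpdating(false);
  setInput("");
},
```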
We can pass a ref into this TextareaAutosize and call it textareaRef. Now this ref doesn't exist yet, so we need to create it at the very top of the file: const textareaRef, which is equal to useRef, a hook we get from React. If you don't know what useRef is, it grants us access to a certain DOM node that we can then work with programmatically through textareaRef, so for example we can focus it. It's null by default, so we need to include null in the type, but then it's going to be an HTMLTextAreaElement that we can focus via this ref, because we now have access to the DOM node. We do that in a setTimeout; if you try to do it directly it doesn't really work, at least it didn't for me, which is why I decided to wrap it in a setTimeout. We can say textareaRef.current, which is the actual DOM node, the textarea element, and then .focus() and invoke that. Remember the optional chaining, because when the component hasn't fully rendered yet, the DOM node in the ref isn't accessible just yet. Then give it a very small timeout of around 10 milliseconds; users won't notice it, but again, doing it directly didn't work. Great, let's save that and see what happens. Back in the input field: "hello how are you", hit enter, and we still want to display a loading state, and it seems like we are not seeing the message, but it did go through successfully because our message disappeared. Interesting, so it's showing neither the new message nor the original "hello, how can I help you" message. Oh, and I figured out why: it's because we never provided our context, so it always falls back to the default values, and we're mapping over an empty array in our chat messages; that's why no message shows up. That's exactly what I said earlier: if we forget to provide this to our application, it falls back to the defaults. So what we need to do is go inside of our providers and provide our entire application with the messages provider by wrapping our children inside the MessagesProvider, which we still need to import from our context. Now let's see if it actually works: restart the app, and now we can see the initial message, that's great. Let's say "hello, I wanna buy a book", hit enter, and it does show the bot answer correctly, but it does not show my message. Interesting. This is where the optimistic updates come in; that's what I meant earlier, users expect optimistic updates at this point, they're super important. So what we're going to do is: as soon as we hit enter, we want the message to be put into the state, and the way we do that is by using the onMutate function in react-query. We get access to the message inside of it, and onMutate gets called as soon as the mutation runs; we don't know yet whether it succeeds or errors, but as soon as it runs we want to addMessage the message the user just sent, so we don't end up with a missing message in the chat. We want our message in here too, not only the bot message, so when I write "hello", it also shows up in the chat. Now it doesn't look perfect yet: it's blue, but the borders are not rounded, which looks a bit weird.
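For orientation, the mutation wiring at this point might look roughly like the following sketch; the /api/message route name and the shape of the request body are assumptions based on the earlier parts of the build.

```tsx
// ChatInput.tsx (sketch) – sendMessage mutation with the optimistic update
const { mutate: sendMessage, isLoading } = useMutation({
  mutationFn: async (message: Message) => {
    const response = await fetch('/api/message', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ messages: [message] }),
    })
    return response.body
  },
  onMutate(message) {
    // optimistic update: show the user's own message immediately, before the API answers
    addMessage(message)
  },
  onSuccess: async (stream) => {
    // ...stream handling as in the earlier sketch
  },
})
```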
To fix that, let's go into our ChatMessages, and inside of here let's put a p tag just for the styling, to make it look better, with a class name. Again, this is in the ChatMessages component; we're inserting a p tag right here, and it gets a dynamic class name as well with cn, invoked with the default class names px-4, py-2 and rounded-lg, which is how we get the nice border radius, followed by conditional class names; what we can do is just copy those from the div above, very cheeky, and paste them right here. Those are the conditional class names we're going to apply. Save that and... oh, that doesn't look better. You know what, let's not obsess over this for now; I promise it will get fixed later, but let's get the core functionality working first, I think that's more interesting for you. We know it's working though: we can see all messages, and we can see the bot answer in real time, which is really good. One thing that doesn't work yet is displaying links. "Link me a good mystery book", for example: it will actually write the link in markdown as we expected, but it won't be displayed properly yet, and that's what the MarkdownLite component we're rendering down here is for. We can already import MarkdownLite because we created it, so let's get rid of the placeholder and switch over to the MarkdownLite component. We know it receives a text, and this text is of type string, and the task inside of the MarkdownLite component now is to take that text and render only the links as proper markdown. To do that we first need a link regular expression, so we know what counts as a link in the text we're getting. Let's call it linkRegex, and as always with a regular expression it goes between two forward slashes: an escaped opening square bracket, then a capture group in parentheses with a period, a plus and a question mark, a closing parenthesis, an escaped closing square bracket, then an escaped opening parenthesis, another capture group with a period, a plus and a question mark, a closing parenthesis for the group, an escaped closing parenthesis, and after the final slash a g flag for a global search. It looks very cryptic, and I kept mixing up what the brackets are called, so feel free to just copy it from the GitHub repository; it might be easier. All we're checking is whether something is an actual markdown link or not: you know how GPT puts the link text in square brackets and the actual URL in the round brackets, the parentheses, and that's what this expression matches. Then let's create parts as an empty array, and let's create two more things: first initialize a lastIndex, which is 0, and then a match, the actual match we get. Then let's create a while loop; its condition, in parentheses, is match = linkRegex.exec(text), so we execute the search on the text (you can see right here: the string on which to perform the search), and if that is not equal to null, so if we do have a match, we execute the code inside the while loop.
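In code, the setup described above might look like this sketch; the exact regex is my reading of the spoken description, so double-check it against the repository.

```tsx
// MarkdownLite.tsx (sketch) – matches markdown links like [link text](https://example.com)
const linkRegex = /\[(.+?)\]\((.+?)\)/g

const parts: (string | JSX.Element)[] = []
let lastIndex = 0
let match: RegExpExecArray | null

while ((match = linkRegex.exec(text)) !== null) {
  // handle each link match here – see the next step
}
```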
Inside the loop, let's destructure some properties from the match: the fullMatch, the linkText and the linkUrl are going to be equal to match, so from that array we're destructuring the first three entries. The matchStart is match.index, and the matchEnd is the matchStart plus fullMatch.length; if you've ever taken computer science classes, I think this makes sense intuitively. Then, if lastIndex is smaller than matchStart, we push the plain text in between into our parts with parts.push(text.slice(lastIndex, matchStart)). Again, it might seem a bit cryptic, understandably. Next we create the actual link components from Next.js, and the way we do that in practice is parts.push with a Link component we get from next/link. This will contain the linkText, so whatever is inside the square brackets, and if you click on it you go to the URL that's in the parentheses; that's what we're building right now. The Link gets a target of _blank and a rel of noopener noreferrer, so the link opens in a new tab instead of overwriting the current one, with a class name of break-words, underline, underline-offset-2 and text-blue. We're almost done, it just needs a key, let's give it the linkUrl as the key, and then an href, which is mandatory, and that's the linkUrl as well; that's our Link component done. Then let's quickly set lastIndex to matchEnd so the search keeps going, and with that we're almost done with the MarkdownLite component. Of course you don't have to do everything yourself; you could also import a library to do this for you, but that would add a lot to the bundle size if you use it on the client as a client component. You could keep it on the server, but it would still take longer to load, and if you only want to render links I figured this is the better approach; well, not the easier one, but the more performant one, and I figured that trade-off was worth it. One last check we want to do: if lastIndex is smaller than text.length, then let's push the remaining text into our parts with parts.push(text.slice(lastIndex)). Great, and now we can return the JSX we want: let's return a fragment, and in here parts.map, and for each part and the index of that part we render out some JSX directly, which is going to be a React.Fragment. The reason I'm using React.Fragment is so we can attach a key to it; the key is the index, and we need to import React for this, so let's import React and the FC type from react, which lets us use React.Fragment, and then in here we render out the part. Great, MarkdownLite is finally done; we can put a check on that, save it and go back into our chat component. Really, really good. Okay, let me check why this still looks off; this is not how it should look, give me a hot second, I'm going to look at the CSS and see why it looks so weird. Oh, I think I found it.
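Assembled into one file, the component described above could look roughly like this; treat it as a sketch rather than a line-for-line copy of the repository (the exact Tailwind classes and the key choice may differ).

```tsx
// MarkdownLite.tsx (sketch) – walks the text with the link regex, keeps the plain
// segments as strings, turns the links into next/link components, then renders the parts.
import Link from 'next/link'
import React, { FC } from 'react'

const MarkdownLite: FC<{ text: string }> = ({ text }) => {
  const linkRegex = /\[(.+?)\]\((.+?)\)/g
  const parts: (string | JSX.Element)[] = []

  let lastIndex = 0
  let match: RegExpExecArray | null

  while ((match = linkRegex.exec(text)) !== null) {
    const [fullMatch, linkText, linkUrl] = match
    const matchStart = match.index
    const matchEnd = matchStart + fullMatch.length

    // keep whatever plain text sits between the previous match and this one
    if (lastIndex < matchStart) {
      parts.push(text.slice(lastIndex, matchStart))
    }

    parts.push(
      <Link
        key={linkUrl}
        href={linkUrl}
        target='_blank'
        rel='noopener noreferrer'
        className='break-words underline underline-offset-2 text-blue-600'
      >
        {linkText}
      </Link>
    )

    lastIndex = matchEnd
  }

  // trailing plain text after the last link
  if (lastIndex < text.length) {
    parts.push(text.slice(lastIndex))
  }

  return (
    <>
      {parts.map((part, i) => (
        <React.Fragment key={i}>{part}</React.Fragment>
      ))}
    </>
  )
}

export default MarkdownLite
```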
It's because we didn't wrap the MarkdownLite inside of this p tag; until now it was self-closing, and that's not how it should be. Let's instead wrap the MarkdownLite inside of that p tag and take a look at what changed. That does look better, but it's not rounded yet. Oh, I think I found it: this is not correct, this should be applied to the div instead. The conditionals on the div should be order-1 and items-end if it's a user message, and order-2 and items-start if it's not a user message. Let's save that and take a look at the messages, and they look better, great, really cool. So let's say "I'm interested in a fantasy book", hit enter, and that looks really good; we're almost done. One thing that doesn't work yet is that the message field isn't disabled while loading, and there's currently no loading indicator to show that the message is loading; that will be the last thing we fix, and then we're completely done with this build. If you're following along, really good job. Oh, and one more thing we're going to do is the API route security; that's also very important, I'm not going to leave you hanging there, we're going to implement that together. And don't worry, because this is very important: if you're going to allow users access to something like ChatGPT, which costs you money, you definitely want to consider securing your API route and rate limiting it, so there isn't some random person who can just bankrupt you by spamming your API endpoint. We're going to make sure that doesn't happen, together. Great, the first thing we're going to do is go back into the chat input, down into the JSX, and improve it just a little bit. First thing we're going to add is the loading indicator. We'll do that right below the TextareaAutosize: we create a div with a class name of absolute, inset-y-0 (so left and right), then flex, a py of 1.5 and a pr of 1.5. Great, then inside of here goes a kbd, which gets a class name of inline-flex, items-center, rounded, border, a bg-white, a border-gray-200, a px-1, a font-sans, a text-xs and a text-gray-400.
And then inside of this kbd, what we render is determined by the loading state: if we are loading, we display a Loader2, which is an icon we get from lucide-react, to indicate a loading status, with a class name of w-3, h-3 and animate-spin to make it spin; and if we're not loading, we display a CornerDownLeft, which resembles an enter symbol, with a class name of w-3 and h-3, indicating to users that they can press enter to submit the message. Then, one last thing for a beautiful input highlight when the field is focused: we create a self-closing div down here. There's no content, it's purely for visuals, which is why we hide it from screen readers: aria-hidden is going to be true, because we don't want this announced for visually impaired people on their screen readers, it's just decoration, so we won't need it there. Its class name is absolute, inset-x-0, so left and right are zero, and oh, by the way, the inset-y-0 from before was top and bottom, not left and right as I said earlier, which is why we also need to add a right-0 up there on that other div. Continuing with the decoration element: it gets a bottom-0, a border-t, a border-gray-300, and a peer-focus:border-t-2 (that's why we declared the textarea as a peer up there: whenever it's focused, this element reacts), as well as a peer-focus:border-indigo-600, which is going to make it look very nice. We can format that, save it, and let me show you what we just did: whenever we click into the field, there's a beautiful highlight at the bottom of the input. One thing we also want to do is disable the input field while we're in a loading state, while the readable stream is being generated, and the way we do that is by going into our TextareaAutosize and saying disabled equals isLoading, so whenever the isLoading property, which is managed by react-query, is true, the textarea is disabled. Beautiful, let's try it out: save, go into the chat field, reload everything so it's all up to date, and type a message, "I'm interested in mystery books for my brother" for example. Hit enter: there's a beautiful loading indicator, the message is right there, and it works perfectly, super cool; if we click on the link, we get to the correct page. Awesome job, really good job if you're following along. There are a few tweaks here and there that we still want to do, but we're like 90 to 95 percent done with the build. We still want to do the API error protection, very important, and one quick thing: if there are a lot of messages in here ("please elaborate", I'm trying to get GPT to fill the chat), you can see there's this scroll bar. It doesn't look bad, but we can make it look better; it's not horrible, it's just the default scroll bar, and that's not ideal, so we'll go into our globals.css, because we've already applied all the utility classes we need. Oh, by the way, while we're already in the chat input, let's do one very quick thing, and that is error handling: if anything goes wrong during the fetching of the stream, we can handle it gracefully in onError, where we get access to the error, the variables and the context.
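The markup for that indicator and the focus highlight might look roughly like this sketch inside the chat input's JSX; the exact Tailwind classes are my transcription of what is described above.

```tsx
// ChatInput.tsx (sketch) – loading/enter hint plus the decorative focus underline;
// Loader2 and CornerDownLeft come from 'lucide-react', and the textarea above carries
// the `peer` class so the peer-focus: styles can react to it.
<>
  <div className='absolute inset-y-0 right-0 flex py-1.5 pr-1.5'>
    <kbd className='inline-flex items-center rounded border bg-white border-gray-200 px-1 font-sans text-xs text-gray-400'>
      {isLoading ? (
        <Loader2 className='w-3 h-3 animate-spin' />
      ) : (
        <CornerDownLeft className='w-3 h-3' />
      )}
    </kbd>
  </div>

  {/* decoration only, so hide it from screen readers */}
  <div
    aria-hidden='true'
    className='absolute inset-x-0 bottom-0 border-t border-gray-300 peer-focus:border-t-2 peer-focus:border-indigo-600'
  />
</>
```

The TextareaAutosize itself then simply gets disabled={isLoading}, so it can't be typed into while the stream is being generated.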
We won't need the context, and we won't need the error either, but we do want to notify the user that something didn't work. For example, we can show a toast, and we don't have toasts yet, so let's quickly install a library to display toast notifications to the user. That's going to be react-hot-toast, a very popular library for showing toast notifications to your users. Let's start the dev server back up after the dependency has installed, and inside of the onError let's import toast from react-hot-toast and call toast.error, informing the user that something went wrong, please try again. That's a good approach to error handling. Then we want to remove the message: remember, we did optimistic updates, so it's very important that we remove it again, otherwise the user will think the message actually went through when in reality it didn't. We receive the message right here as the second parameter, and then let's focus textareaRef.current once again so the user can just hit enter to send their message again. Okay, let's mock an error: let's go into the route.ts and just throw a new Error, whatever, that's going to make the chat input error, and let's see what happens. I type "thank you", hit enter, it's going to error here in a second... and the reason I suspect the onError did not get called is that there's one thing we're still missing; let's write "thank you" once again and verify, and indeed the onError is not actually being called, which is why the error isn't handled properly. So what we want to do, up here in the fetching function, is check if the response is okay, and if it's not, if (!response.ok), then we throw a new error, which puts us into the error handling path. Let's try this again: "thank you", and now hopefully the error will be handled accordingly... and it did not. Okay... and then it worked; I just had to restart the server, sometimes that helps. So this does work correctly, I was pretty sure it would, it just didn't with hot reload, but if you restart the app it works perfectly as expected. Great, so when we send a message now, "thank you", and it does not go through correctly, it's removed from the chat again and the text is put back for us, so all we need to do is press enter again. That is the best possible user experience I could imagine in this case, and a very graceful way to handle errors. Very good, so now that we're handling the error correctly, we can remove the artificial error we were throwing inside of our route.ts, which we only used to verify the handling. Now let's quickly work on the scroll bar styling; it's just going to take a minute and it's going to make the scroll bar look better.
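Before the CSS, here is roughly how that error handling could look inside the useMutation config; the toast text and the removeMessage signature are assumptions based on the description above.

```tsx
// ChatInput.tsx (sketch) – throw on non-ok responses so react-query calls onError,
// then notify the user, roll back the optimistic message and refocus the field.
// `toast` comes from 'react-hot-toast', `removeMessage` from the messages context.
mutationFn: async (message: Message) => {
  const response = await fetch('/api/message', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages: [message] }),
  })

  // without this check a 4xx/5xx response would still end up in onSuccess
  if (!response.ok) throw new Error('Something went wrong.')

  return response.body
},
onError: (_, message) => {
  toast.error('Something went wrong. Please try again.')
  removeMessage(message.id)
  textareaRef.current?.focus()
},
```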
Let's go into our globals.css. The first rule is .scrollbar-w-2::-webkit-scrollbar, and it does need to be -webkit-scrollbar; inside of it we put a width of 0.25rem and also a height of 0.25rem, so we're essentially extending the scroll bar styles. Then .scrollbar-track-blue-lighter::-webkit-scrollbar-track: inside of here we set a --bg-opacity of 1, a background-color of the hex value #f7fafc, which is a light gray, and then a background-color of rgba with 247, 250, 252 and the --bg-opacity variable we just declared. Beautiful. Next up, .scrollbar-thumb-blue::-webkit-scrollbar-thumb: inside of here we set a --bg-opacity of 0.3, a background-color of rgb with 10, 102 and 194 to resemble a Tailwind color, and then we copy the background-color down as rgba with the same values and the --bg-opacity variable. Okay, we're almost done, one last rule: .scrollbar-thumb-rounded::-webkit-scrollbar-thumb, and inside of here we set a border-radius of 0.25rem. Beautiful, that's just going to change the scroll bar styling when there are a lot of messages. And now the last thing to do in our entire application is to protect our API routes; it's last but not least, and it's one of the most crucial things: if you're working with APIs that cost you money, this is something you definitely always want to do, and honestly it's super straightforward. The way we're going to do that is by creating a rate limiter inside of our lib folder, and what we're going to use to rate limit is Redis. This video is sponsored by Upstash, and Upstash is a Redis provider; I have used Upstash before, they sponsored this video, but I also used Upstash in my most popular video ever without them sponsoring it, because I genuinely think it's the best solution for rate limiting in Next.js. So they are sponsoring this video, that was important to mention, and with that said, let's continue with the route protection. To get started with Redis, let's navigate to Upstash and create a new Redis instance we can use for our API route protection. I'm going to log in with Google with my admin account to create a database. Once you're in the dashboard, click on create database; that prompts us for a name, let's call this gpt-rate-limits (and I'm back on the English keyboard). This is going to be regional, for me that's Frankfurt; it should be closest to your users. Then we enable "encrypt traffic" and click create, and that sets up everything for us; the creation just takes a few seconds, and then we have all we need right here with the Upstash REST URL and the Upstash token. Let's go over to our environment file and put them in: the REDIS_URL is going to be this value for me, and the REST token, the Redis secret, I'm going to put in as REDIS_SECRET right below; it's a bit longer. Again, this is sensitive information, I'm showing it to you here, but you should not push it to source control or share these values with anyone. Once we've done that, let's create a new file called redis.ts in our lib folder, where we create the Redis instance we're going to use for rate limiting; again, Upstash makes this super simple. Let's add two packages for this: the first is @upstash/redis, and then for the rate limiting we install @upstash/ratelimit, written as one word without hyphens. Install those two dependencies with either yarn add or npm install, and we can get a working rate limitation up and running in a matter of minutes, which is super cool. We'll let that install and start the dev server back up.
Then, inside of our redis file, let's say export const redis, which is equal to a new Redis class we get from @upstash/redis. This takes an object, and we need to pass it a url, which is process.env.REDIS_URL; I'm going to tell TypeScript with an exclamation mark that I know this exists (maybe you'd want a function that checks it first, I'm not going to bother), and then a token of process.env.REDIS_SECRET, also with an exclamation point to tell TypeScript we know it exists. Then let's create the rate limiter: create a file called rate-limiter.ts, and inside of it we say export const rateLimiter, the one we're actually going to use in our middleware to enforce the API rate limiting. It's equal to a new Ratelimit class we get from @upstash/ratelimit (you can't tell me that's not intuitive), and in here we pass an object with the redis instance we just created, and then a limiter function, which is the algorithm used for the rate limiting. In our case let's use Ratelimit.slidingWindow, one of the available methods; you can see right here that it controls how many requests are allowed per window, using a combined approach of sliding logs and fixed windows. Honestly I don't know the internals super well, but we can simply enforce a certain rate, for example four requests per ten seconds, and that's how we define it. The redis import doesn't seem to be resolving, so let's say import { redis } from './redis', that should work, great. And let's quickly add a prefix of '@upstash/ratelimit'; this is optional, you don't need to put it in, and in the comment I've written here it says: an optional prefix for the keys used in Redis, useful if you want to share a Redis instance with other applications and avoid key collisions; the default prefix is '@upstash/ratelimit'. So this is the default, I just decided to put it in explicitly; you could leave it out, no problem at all. Great, so we have created the rate limiter, and now all that's left is applying it in our middleware, which validates each request. We do that by creating a middleware.ts at the same level as our app directory, so in the src folder; it's important that you name it exactly that, because the name is enforced by Next.js. From here we export an async function called middleware, which takes in a request, a NextRequest if you're using TypeScript. First off we need to determine the IP that's making the request, and we have access to it via req.ip; if that is null or undefined, let's fall back to '127.0.0.1', localhost, because we're the ones making the request if there's no IP present. Before we actually rate limit the API calls, we need to tell the middleware which paths to run on, and we do that by exporting a const config (again, a name enforced by Next.js); inside of this config we put a matcher, which makes this middleware run on every path we define in here. In our case we want to rate limit our message API, and we want any sub-path of it to be rate limited, so we put '/api/message/:path*', a colon path star, for any subsequent path; we don't care what it is as long as it's the message API. Okay, now for the rate limitation itself, it's very straightforward.
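As a sketch, the two small files described above might look like this; the env variable names match the ones we put into the environment file, and the four-requests-per-ten-seconds limit is the value mentioned above.

```ts
// lib/redis.ts (sketch)
import { Redis } from '@upstash/redis'

export const redis = new Redis({
  url: process.env.REDIS_URL!,
  token: process.env.REDIS_SECRET!,
})

// lib/rate-limiter.ts (sketch)
import { Ratelimit } from '@upstash/ratelimit'
import { redis } from './redis'

export const rateLimiter = new Ratelimit({
  redis,
  // sliding window: at most 4 requests per 10 seconds per key (we key by IP later)
  limiter: Ratelimit.slidingWindow(4, '10 s'),
  // optional – only matters if the redis instance is shared with other apps
  prefix: '@upstash/ratelimit',
})
```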
Inside of a try block, we can destructure something from await rateLimiter.limit(ip) (oh, I had typed "iq", this needs to be ip), so we are limiting by the IP. We could destructure a bunch of properties, but the one we care about is success, and if there is no success we return a new NextResponse saying "You are writing messages too fast." (we need to import NextResponse from next/server). This text is actually going to be shown inside of the chat window, which makes for a super cool user experience. If there is success, we return NextResponse.next(), because everything is fine with the request and they're allowed to access our API. And if there's an error anywhere in the process, let's return a new NextResponse saying "Sorry, something went wrong processing your message. Please try again later." (I'll close the sidebar so you can see this better). Awesome, we can save that, and that is the rate limiting done; how cool is that, that was super easy. Just to try it out, let's allow one request per ten seconds to make it easier to see, restart the app and the development server... it's online, great. Let's try it out: reload the page and see if it works. It says "Hello, how can I help you?", I type "I'm looking for a book", hit enter, and then let's exceed the limit by just saying "thanks", and now it says "You are writing messages too fast." If I try spamming it with requests inside the ten-second window, I can't; that's the beauty of Upstash rate limiting, it's super cool and enforces security for our backend API routes, on any route you want. By the way, on the first try this didn't work, so I started debugging, but again what fixed the issue was restarting the application; if I say "hello" and then spam it, it works just fine, so I removed what I had added for debugging, and everything is working correctly and as it should. We have successfully secured our ChatGPT API route, so you're not going to get some incredible bill from ChatGPT because somebody decided to spam your API route. Beautiful, we are done; that's all we need to do to get a beautiful and very functional support chatbot. There are many ways to extend this, and I of course encourage you to extend it, to turn it into your own project, something you made and can be proud of putting on your portfolio, because I think that's generally the best way to learn. We marked out the website content, we finished the chat, it looks beautiful, it's functional, it's secure, very well done. Thank you very much if you followed along with the video, I hope you got a lot of value from it; I tried to make it as intuitive and code-along friendly as I possibly could. Enjoy, and have a lot of fun with your chatbot. Thank you very much for watching, I'll see you in the next one, have a good one, and bye bye.
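Putting the whole middleware together, a sketch could look like this; the import path for the rate limiter is an assumption based on where we created the file.

```ts
// src/middleware.ts (sketch) – runs on every /api/message request and limits by IP
import { NextRequest, NextResponse } from 'next/server'
import { rateLimiter } from './lib/rate-limiter'

export async function middleware(req: NextRequest) {
  // fall back to localhost when no IP is present (e.g. local development)
  const ip = req.ip ?? '127.0.0.1'

  try {
    const { success } = await rateLimiter.limit(ip)

    if (!success) {
      // this text ends up rendered inside the chat window
      return new NextResponse('You are writing messages too fast.')
    }

    return NextResponse.next()
  } catch (error) {
    return new NextResponse(
      'Sorry, something went wrong processing your message. Please try again later.'
    )
  }
}

export const config = {
  matcher: '/api/message/:path*',
}
```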
Info
Channel: Josh tried coding
Views: 23,681
Keywords: nextjs 13, nextjs, typescript, next js 13 api routes, react, react typescript, nextjs typescript, nextjs tutorial, nextjs 13 tutorial, react tutorial, modern web development, tailwind, tailwindcss, shadcn, shadcn ui, upstash, redis, rate limit, api rate limit nextjs, chatgpt, gpt 4, josh tried coding, joshtriedcoding
Id: KiWClrSVgfU
Length: 167min 11sec (10031 seconds)
Published: Sun Apr 30 2023