GraphQL with Next.js 13 Server Components

Captions
Here we have a Next.js application that's using Next.js 13 and React 18.2. You'll also notice that we have GraphQL Code Generator installed with the client preset plugin, and we have a custom script for running the codegen and running our backend. This will run a backend locally so we can interface with it from our Next.js application, and it will create APIs to create, read, update, and delete, as well as link entities, from this simple GraphQL SDL.

Because this is a Next.js 13 app we won't be using the pages directory, so we can get rid of that. Instead we'll create a new page inside of the app directory called page.tsx. Here we'll create a new page and export a simple h1 tag. To use the new app directory, inside of the Next.js config we'll need to add an experimental flag saying that we want to use the new app directory, and we'll set this to true.

Now that's set, let's head back to our terminal. Inside of one window we'll run our backend, then inside of another we'll run our application. We can see in the terminal that Next.js 13 has updated our tsconfig and that the server is running and ready to go, so let's head on over to localhost:3000, and here we can see the h1 "hello graphql" that we wrote inside of the new app directory.

If we head to localhost:4000, here we can run a mutation against our backend to create a new post. We'll give our post a title, a slug, and some content, then from the response we'll grab the post ID and slug. If we execute this mutation we can see that this new post has been saved locally for us, and we're able to access it over the GraphQL API. If you haven't used the Grafbase backend before, we can see inside the documentation that it generates all of these different queries, mutations, and input types, and it uses custom directives to handle things like validation.

Now let's head back to our code. We'll leave the server running for Next.js, and we'll leave GraphQL Code Generator and Grafbase watching for new changes. You'll notice that we have this file layout.tsx inside of our app directory; this was created automatically for us when we ran the Next.js server. It's inside of this global layout that we'll make a query to get all of our posts, and we'll include links to all of our posts. Then as the children we'll render the posts page, so when you click on a link from the root, all of the contents will load into the children here.

To get started, let's install graphql-request and graphql. graphql-request takes care of some of the formatting of our requests; it's not necessary, and you could just use plain old fetch, but we'll use it because it abstracts some things for us so we don't have to repeat ourselves throughout this tutorial. Inside of our root directory let's create the folder lib, and inside of here we'll create the file graphql-client. Here we'll import GraphQLClient from graphql-request, and we'll export a new const called graphqlClient that invokes that GraphQLClient class. We'll need to pass it the URL for our API, and here we'll pass an environment variable; if we open up our .env file, we can see that we've set the Grafbase API URL here. We don't need an API key when working locally, but if you were to deploy this to Grafbase and to the web, you'll need an API key. So inside of GraphQLClient let's add our API URL as a string, and then, for good measure, we'll add the headers for our API key: the header key will be x-api-key, and the value will be our Grafbase API key. This will work when you deploy, and it will also work locally.
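A sketch of that lib/graphql-client file, following what was just described, might look like the following. The environment variable names here are assumptions; use whatever your own .env defines.

```ts
// lib/graphql-client.ts — a sketch of the shared client described above.
// NEXT_PUBLIC_GRAFBASE_API_URL and GRAFBASE_API_KEY are assumed names.
import { GraphQLClient } from 'graphql-request'

export const graphqlClient = new GraphQLClient(
  process.env.NEXT_PUBLIC_GRAFBASE_API_URL as string,
  {
    headers: {
      // Not needed when running the Grafbase backend locally,
      // but required once the API is deployed to the web.
      'x-api-key': process.env.GRAFBASE_API_KEY as string,
    },
  }
)
```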
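For reference, the earlier setup steps from this section (the experimental app directory flag and the simple root page) could be sketched like this; the exact wording of the heading is just what the video shows on localhost:3000.

```js
// next.config.js — the experimental flag Next.js 13 used at the time
// to opt into the new app directory.
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    appDir: true,
  },
}

module.exports = nextConfig
```

```tsx
// app/page.tsx — the simple page exporting an h1
export default function Home() {
  return <h1>Hello GraphQL</h1>
}
```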
Because we're using the new GraphQL client preset plugin with GraphQL Code Generator, we have this gql folder in the root of our project. From the file in here we're going to import graphql. Right now there are no strings in here, but as we begin to type our GraphQL queries they will automatically be transformed into TypedDocumentNodes and will be available inside of this function.

So let's see what this looks like. Let's import graphql from gql, and we'll import the graphqlClient from our graphql-client file. Inside of here we'll create the variable GetAllPostsDocument and invoke graphql as a function, passing it a new GraphQL query. Let's give our query the name GetAllPosts, and we'll in fact use a GraphQL variable as well, so we can see what it looks like to have fully typed variables inside of our GraphQL client. Here we'll specify that the argument first is of the type Int, then we'll use the query postCollection, passing in the first value from our variables, and for each of the edges we'll grab the node and take the id, title, and slug.

Because we have GraphQL Code Generator running and watching for changes to files, if we head back to the gql file we should now see that we have a new document inside of our object. GraphQL Code Generator is watching for file changes and inspecting anything inside of our app directory that looks like a GraphQL query, so it's able to automatically create a new TypedDocumentNode from the string that we passed to the graphql function.

Now, for our root layout, we update this function to be async, then inside of here we make a request: we call and await graphqlClient.request and pass in our TypedDocumentNode, which is GetAllPostsDocument. For the second argument we can see that the variables must match this exact object, with the value first of the type number; this is because GraphQL Code Generator automatically generated all the types that we need and, using generics, passed them to the GraphQL client. Here we'll fetch the first 10 posts. Then from the response we'll destructure postCollection, and when we hover over it we can see all about the type and what it looks like.

Further on down, inside of our body we'll create a new nav tag, and for each of the edges in our postCollection we'll map through and grab the edge. If the edge contains a node we want to render a new list item, otherwise we'll just return null. We'll need to add a key, and here we'll use edge.node.id. Inside of our list item we'll invoke the Link component from Next.js and pass it the path /posts/ followed by edge.node.slug. Inside of the Link we no longer need to provide an a tag; we can just render edge.node.title. And for good measure, let's wrap this inside of an unordered list.

What we want to do when we click one of these links is render the contents inside of main. Previously with Next.js this would require a full page reload, but with the new Next.js 13 updates we can render the contents here without having to re-execute all of this code, by using React Server Components. If we head back to the browser and refresh our page, we should see our default page, "hello graphql", and then at the top we have this "Hello Next 13" link coming from our API. By default, all of the content here is static.
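Putting the layout that was just described together, a sketch of app/layout.tsx under the assumptions already noted (the postCollection query shape from the transcript, and the client and codegen setup sketched earlier) might look like this:

```tsx
// app/layout.tsx — a sketch of the root layout described above
import Link from 'next/link'
import { graphql } from '../gql'
import { graphqlClient } from '../lib/graphql-client'

// The client preset turns this string into a TypedDocumentNode,
// so the variables and the result below are fully typed.
const GetAllPostsDocument = graphql(/* GraphQL */ `
  query GetAllPosts($first: Int) {
    postCollection(first: $first) {
      edges {
        node {
          id
          title
          slug
        }
      }
    }
  }
`)

export default async function RootLayout({
  children,
}: {
  children: React.ReactNode
}) {
  // Fetch the first 10 posts from the local GraphQL backend
  const { postCollection } = await graphqlClient.request(GetAllPostsDocument, {
    first: 10,
  })

  return (
    <html lang="en">
      <body>
        <nav>
          <ul>
            {postCollection?.edges?.map((edge) =>
              edge?.node ? (
                <li key={edge.node.id}>
                  <Link href={`/posts/${edge.node.slug}`}>
                    {edge.node.title}
                  </Link>
                </li>
              ) : null
            )}
          </ul>
        </nav>
        <main>{children}</main>
      </body>
    </html>
  )
}
```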
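The video doesn't walk through the codegen configuration that produces the gql folder, but a typical client-preset setup for a project like this is sketched below; the schema URL, documents glob, and output folder are assumptions rather than values taken from the video.

```ts
// codegen.ts — a sketch of a GraphQL Code Generator client-preset config.
// Adjust the schema URL, glob, and output path to match your project.
import type { CodegenConfig } from '@graphql-codegen/cli'

const config: CodegenConfig = {
  schema: process.env.NEXT_PUBLIC_GRAFBASE_API_URL,
  documents: ['app/**/*.tsx'],
  generates: {
    './gql/': {
      // Writes the graphql() helper and TypedDocumentNodes into ./gql
      preset: 'client',
    },
  },
}

export default config
```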
Similar to Next.js before, we can create a new file inside of our app directory, inside of a posts folder. But instead of using square brackets to define the file name, here we'll define [slug] as a folder name, and inside of here we'll create the file page.tsx. You can also create its own layout, so the posts slug page can contain a layout, and then you can nest other pages inside of that.

We'll begin by importing graphql and graphqlClient. Then we'll declare a new const called GetPostBySlugDocument, and here we'll invoke that graphql function as we did before. We'll specify query as our operation and declare the operation name GetPostBySlug, and we'll use a GraphQL variable here as well; this time it's a String, and this String type and non-nullability must match what is in your GraphQL schema. Here we'll call the query post to get a single entry, and using the argument by we can provide an object where we pass the slug from our variable; from that we'll grab the id, title, and slug. If we go back to our gql.ts file, we should now be able to see that GetPostBySlug has been automatically generated for us.

Now let's create a new Page const that is async, and it's inside of here that we can call out to our API. Let's destructure post from our request: we'll await graphqlClient.request and pass in the GetPostBySlugDocument, and here we can see slug is available for our variables. You're probably wondering at this point where slug actually comes from. If you've used Next.js before, you can get it from the context of the current page; just like before, you can fetch it from params. Let's type our params here so we can specify slug is a string, and then for slug we can pass params.slug. Then we'll check to see if there's a post: if there is no post we'll simply return a 404, otherwise we'll return a pre tag and render the JSON that's returned from our GraphQL API. All that's left to do is export default the Page component that we've just created.

Now if we go back to our application and click one of our posts, "Hello Next.js 13", it will render the contents of that page inside of the layout, which contains our links from the database. What you see here is static, and similar to Next.js before, we've got the option to revalidate. We can do this by exporting a new const called revalidate, and the page will revalidate once that time has passed.

So this has been a brief introduction to Next.js server components, making requests to our local GraphQL backend that we can also deploy to the web. Next.js 13 and the app directory are very experimental, so I'd recommend that you check out the documentation to make sure that the approaches taught in this video still apply to future versions of Next.js.
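For reference, here is a sketch of the dynamic post page and the revalidate export described above. The notFound() call and the 60-second revalidate window are illustrative choices, not taken from the video, which simply says it returns a 404 and revalidates after some time.

```tsx
// app/posts/[slug]/page.tsx — a sketch of the post page described above
import { notFound } from 'next/navigation'
import { graphql } from '../../../gql'
import { graphqlClient } from '../../../lib/graphql-client'

const GetPostBySlugDocument = graphql(/* GraphQL */ `
  query GetPostBySlug($slug: String!) {
    post(by: { slug: $slug }) {
      id
      title
      slug
    }
  }
`)

// Revalidate the statically rendered page after this many seconds (arbitrary value)
export const revalidate = 60

const Page = async ({ params }: { params: { slug: string } }) => {
  const { post } = await graphqlClient.request(GetPostBySlugDocument, {
    slug: params.slug,
  })

  // The video returns a 404 when no post is found;
  // notFound() is one way to do that in the app directory.
  if (!post) {
    notFound()
  }

  // Render the raw JSON returned from the GraphQL API
  return <pre>{JSON.stringify(post, null, 2)}</pre>
}

export default Page
```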
Info
Channel: Jamie Barton
Views: 32,325
Keywords: web development, reactjs, javascript, es2015, html, css, code
Id: yuzvcZ6yTho
Length: 9min 55sec (595 seconds)
Published: Wed Nov 02 2022