GraphQL Microservice in Go - Quickstart and Deploy to Heroku

Video Statistics and Information

Captions
Hey everyone, welcome back to the channel. Today we're looking at just how easily and quickly we can set up a GraphQL API in Golang. Go isn't a language I see used very frequently for GraphQL servers, but it's 100% doable, and it's actually pretty effective and pretty clean. This channel focuses on Go, so I wanted to show how straightforward it is. Today we're going to set up an API and deploy that API onto Heroku.

First off, I'm on graphql.org. If we go to Code, we'll see all of the different languages supported with GraphQL and the libraries available for each of them. If you go to Go, you can see server libraries as well as clients and tools. The server libraries are what you'd use if you're building an API, i.e. serving resources to clients or users; there are also plenty of Go libraries for consuming other GraphQL APIs. For us, we're building a server today.

You can take a look at all of these and play around with each of them, but the one I highly recommend is 99designs/gqlgen. It's a really handy Go library, and the big selling point for me is that you set up a single GraphQL schema, give the library that schema, and it generates types and resolvers from it. The schema is written in GraphQL syntax, and from it gqlgen generates Go files that we use to implement our resolvers with automatically generated types.

That matters because a big part of GraphQL is having a single schema — one source of truth — used by many clients and servers, and you want your GraphQL tooling to auto-generate from that schema. If you're building a front-end app in React, it would generate its queries, mutations, and types from the schema; your back end generates its types from that same schema; and you never have to duplicate the schema across different repos. That's why I recommend gqlgen: one schema, one source of truth, with everything generated from it automatically. With some of the other libraries you have to manually create Go structs, and then for your front end you'd have to go back and manually create all those types too — nothing is generated for free.

There's also a comparison page where you can see gqlgen's features next to the other common libraries. Obviously that page is put together by the gqlgen folks, so I kind of expected it to say gqlgen is the best library in the world; there are probably some trade-offs, and you can dig into the others and play around with them too, but today we'll be using gqlgen.
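To make that single-schema point concrete, here is roughly the starter to-do schema that gqlgen's quickstart generates and that the rest of this walkthrough builds on (graph/schema.graphqls). Both the server's Go types and any client-side codegen work from this one file:

type Todo {
  id: ID!
  text: String!
  done: Boolean!
  user: User!
}

type User {
  id: ID!
  name: String!
}

type Query {
  todos: [Todo!]!
}

type Mutation {
  createTodo(input: NewTodo!): Todo!
}

input NewTodo {
  text: String!
  userId: String!
}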
Another great thing I like about gqlgen is that it has a really excellent quickstart guide for setting up a project. It sets up a bunch of source code and tooling — everything you need to get going with a GraphQL API — so we'll follow their steps and then make the modifications and additions we need to build our API.

The first thing we'll do is set up a directory and initialize the Go module there, so let's do that now. Second step: gqlgen uses some tooling to take the schema we have and generate files from it, and the guide gives a command that pulls that down and writes some code into a new file, tools.go. Let's run that, then go mod tidy, which adds dependencies and cleans things up.

Now we're inside that folder with our Go module and some dependencies, and in tools.go we have that specific import — the dependency we use to go run a command that gqlgen gives us, which generates our resolvers and types. That's why this file exists. The next step is to actually run that command: go run github.com/99designs/gqlgen init. This generates a bunch of files and gives us a pre-built server structure that comes out of the box and is pretty workable.

At this point we could just run go run server.go and we'd have a fully working, functional GraphQL server. Let's walk through the code it gives us. It sets a default port of 8080, which is useful later: when we deploy to Heroku, Heroku specifies the port we need to listen on, so it's nice that this already reads the port from the environment out of the box. Then it sets up a default server with all of the GraphQL resolvers we generated and registers two endpoints. The first, "/", is the GraphQL playground handler — think Swagger UI for OpenAPI. It's a UI where users can test queries and explore schemas and types; one of the big selling points of GraphQL is that your API is automatically documented, and the playground is where you see that documentation and play around with all the queries, mutations, and types. The second is the /query endpoint, which is where you actually fire off queries — your front end would call this endpoint.

Finally it logs what it's listening on and starts the server. Notice it's just calling http.ListenAndServe, which takes an http.Handler; behind the scenes, handler.NewDefaultServer is simply producing an http.Handler — an interface that only needs ServeHTTP implemented. If you don't want to use the built-in handler setup you could use chi, gorilla/mux, or the standard library's net/http directly, but I really like this library because it gives you all of this out of the box.
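For reference, the server.go that gqlgen init writes looks roughly like this (the graph import paths depend on the module name you chose with go mod init, so treat those as placeholders):

package main

import (
	"log"
	"net/http"
	"os"

	"github.com/99designs/gqlgen/graphql/handler"
	"github.com/99designs/gqlgen/graphql/playground"

	"github.com/your-username/gqlgen-todos/graph"           // placeholder module path
	"github.com/your-username/gqlgen-todos/graph/generated" // placeholder module path
)

const defaultPort = "8080"

func main() {
	// Heroku (and most other platforms) inject the port via the PORT env var.
	port := os.Getenv("PORT")
	if port == "" {
		port = defaultPort
	}

	// NewDefaultServer returns an http.Handler built from the generated schema and our resolvers.
	srv := handler.NewDefaultServer(generated.NewExecutableSchema(generated.Config{Resolvers: &graph.Resolver{}}))

	// "/" serves the interactive playground; "/query" is the endpoint clients actually call.
	http.Handle("/", playground.Handler("GraphQL playground", "/query"))
	http.Handle("/query", srv)

	log.Printf("connect to http://localhost:%s/ for GraphQL playground", port)
	log.Fatal(http.ListenAndServe(":"+port, nil))
}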
Another thing you'll probably want is authentication, or logging, things along those lines, and you can add that just as easily with middleware. You can wrap srv, or even the playground handler, in any middleware you want: authentication middleware, logging middleware, tracing, anything you'd like. We'll add one test middleware later on, but for now let's look at the other files that got generated.

gqlgen.yml is the configuration for generating code with gqlgen. We ran the init command to set up all these files; from now on, instead of init, we run generate — any time we change our schema we rerun it with generate. You'd probably have that as a build or pipeline step: as you deploy your app, you generate the types. We don't really need to change anything here, but it specifies where things go. The generated execution code goes to graph/generated/generated.go — that's a bunch of types and plumbing for running our queries, and we don't need to worry about that file too much (the package name is configurable). Then there's where the models go. Remember I said you create a single GraphQL schema — this is the schema gqlgen gives you out of the box — and the types, or models, are generated from it into graph/model/models_gen.go. If we split the editor and open the schema next to it, we can see each GraphQL type alongside the Go type that was generated automatically for us.

Next, the config specifies where the resolver implementations go. We have a query, todos, which returns all of the to-dos, and a mutation, createTodo, which creates a new to-do, and this section says those resolvers land in the graph directory, in schema.resolvers.go. There's a bunch of other configuration after that which I won't go too in-depth on; it specifies type mappings. GraphQL has an ID type and an Int type, while Go has several int types and a bunch of different things you might want to use for IDs — GraphQL IDs are usually long, UUID-like strings, but you can also use ints. That section just shows some of the customization gqlgen can do: Int, for example, can be represented by Go's int, int32, or int64, and so on. So that's the config, and again, when we make schema changes we run go run github.com/99designs/gqlgen generate and it updates all of the generated files for us.
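The gqlgen.yml that init writes looks roughly like this (trimmed; exact paths can vary slightly between gqlgen versions):

schema:
  - graph/*.graphqls

exec:
  filename: graph/generated/generated.go
  package: generated

model:
  filename: graph/model/models_gen.go
  package: model

resolver:
  layout: follow-schema
  dir: graph
  package: graph

models:
  ID:
    model:
      - github.com/99designs/gqlgen/graphql.ID
      - github.com/99designs/gqlgen/graphql.Int
      - github.com/99designs/gqlgen/graphql.Int64
      - github.com/99designs/gqlgen/graphql.Int32
  Int:
    model:
      - github.com/99designs/gqlgen/graphql.Int
      - github.com/99designs/gqlgen/graphql.Int64
      - github.com/99designs/gqlgen/graphql.Int32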
So what do we actually change in all this generated code? In schema.resolvers.go we see our generated mutation and query — CreateTodo and Todos — and each one just contains a panic statement saying it's not implemented yet. We can implement them right now: CreateTodo needs to return a *model.Todo plus an error, and the Todos query needs to return a slice of to-dos. Let's get them running with mocks first: CreateTodo receives an input, so even though we're just mocking for now, we can build the returned to-do from the input's text and user ID and return nil for the error; for the query we can just return an empty slice.

That would run, but notice each resolver also receives a context. Remember we talked about adding middleware — logging, tracing — well, we can put whatever we want into that context and pull it out here. For example, one thing we might want in the context is our database: if we're creating a to-do, we probably want to write it to a database. You could imagine a GetDBFromContext(ctx) call that fetches the database, and then something like db.WriteTodo(...) to actually store it — maybe in some kind of SQL database — giving us persistent storage. We're just going to use an in-memory store, but we'll set up exactly that hook: pulling our storage out of the context and saving to-dos into it.

To do that, let's add a new folder, store, with a file store.go. We define type Store struct, and inside it our to-dos: a slice of *model.Todo (adding the package declaration at the top). This is basically our store — a list of to-dos, and that's where we'll keep our data. Then a NewStore function that creates the empty list of to-dos and returns a *Store.

Next, two more functions. WithStore will be our middleware — a regular HTTP middleware that injects the store into the request context — and GetStoreFromContext will pull it back out. For the middleware we need to work with http.Handler; if you don't know how to write HTTP middleware in Go, I'll put a link in the description to a good article on it, but basically we just need to implement the http.Handler interface. WithStore takes the store we've set up plus a next http.Handler, and returns an http.HandlerFunc that takes a ResponseWriter and a Request. Inside it we build a new request with the store attached, using r.WithContext with context.WithValue: WithValue takes the parent context, a key, and a value, where the parent is r.Context(), the key is "store" for now, and the value is the store itself. Then we call next.ServeHTTP with the ResponseWriter and, instead of the old request, this new request that has the store in its context.
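Pulling the pieces together, here's a sketch of what store/store.go ends up looking like once the helpers discussed next — the typed context key, GetStoreFromContext, and the AddTodo method — are in place (the module path is a placeholder):

package store

import (
	"context"
	"net/http"

	"github.com/your-username/gqlgen-todos/graph/model" // placeholder module path
)

// storeKeyType gives the context key its own type so other middleware
// using a plain string key can't collide with ours.
type storeKeyType string

const storeKey storeKeyType = "store"

// Store is a simple in-memory stand-in for a real database.
type Store struct {
	Todos []*model.Todo
}

// NewStore returns a store with an empty list of to-dos.
func NewStore() *Store {
	return &Store{Todos: []*model.Todo{}}
}

// AddTodo builds a Todo from the input and appends it to the store
// (the equivalent of an INSERT against a real database).
func (s *Store) AddTodo(input *model.NewTodo) error {
	s.Todos = append(s.Todos, &model.Todo{
		Text: input.Text,
		User: &model.User{ID: input.UserID},
	})
	return nil
}

// WithStore is HTTP middleware that injects the store into every request's context.
func WithStore(store *Store, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		reqWithStore := r.WithContext(context.WithValue(r.Context(), storeKey, store))
		next.ServeHTTP(w, reqWithStore)
	})
}

// GetStoreFromContext pulls the store back out of the context.
// In production, return an error here instead of panicking.
func GetStoreFromContext(ctx context.Context) *Store {
	store, ok := ctx.Value(storeKey).(*Store)
	if !ok {
		panic("store not found in context")
	}
	return store
}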
So that's a really simple WithStore middleware. The next one is GetStoreFromContext, which just returns a *Store. To pull it out, it takes a context and reads the value with ctx.Value, casting it to a store pointer; that type assertion also returns an ok flag. If it's not ok you could panic, or you could return an error from this function; I'm just going to panic, but if you're doing this in production, return an error instead. Also, a best practice for context keys and values: don't use a hard-coded string, because other middleware could use that exact same string. Typically you'd define your own key type — say a storeKeyType based on string — and then a storeKey constant of that type. That's the better pattern: other middleware can no longer overwrite it, because it's a different type, so nothing interferes. So that's set up, and we're following good context practice.

The next step is wiring the middleware up in server.go. We create the store first with store.NewStore() (the compiler reminds us it's declared but not used until the next part), and then we just wrap the srv handler with our store middleware, store.WithStore, passing in the store as well. I hit an error because I named the variable store, which shadows the package name, so I'll rename it to db — naming is tough. Now the middleware takes the db and injects it into the context of every request coming in.

With that, our resolvers can pull the db out and do useful things with it: db := store.GetStoreFromContext(ctx). There's our database, and we have access to the to-dos. We should write a helper that saves a to-do for us, so back in store.go, alongside our little context helpers, we add a method on *Store — the equivalent of your query when you have a real DB — called AddTodo. It takes a *model.NewTodo, returns an error, and appends a new model.Todo built from the input onto s.Todos. That updates our state and abstracts it away so the resolvers don't have to worry about it.

Back in the resolver, CreateTodo now calls db.AddTodo with the address of the input, handles the returned error, and otherwise returns that same to-do. And down in the Todos resolver, when we want to retrieve all of our to-dos, we get our store from the context and simply return db.Todos — everything we have stored. Let's run this now: go run server.go starts the server, and we can open up the playground and actually play around with all of our code.
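With the store in place, the two resolvers in graph/schema.resolvers.go end up roughly like this (again, module path is a placeholder):

package graph

import (
	"context"

	"github.com/your-username/gqlgen-todos/graph/model" // placeholder module path
	"github.com/your-username/gqlgen-todos/store"
)

// CreateTodo pulls the store out of the request context, saves the new to-do,
// and returns the stored object to the caller.
func (r *mutationResolver) CreateTodo(ctx context.Context, input model.NewTodo) (*model.Todo, error) {
	db := store.GetStoreFromContext(ctx)
	if err := db.AddTodo(&input); err != nil {
		return nil, err
	}
	return db.Todos[len(db.Todos)-1], nil
}

// Todos returns everything currently held in the in-memory store.
func (r *queryResolver) Todos(ctx context.Context) ([]*model.Todo, error) {
	db := store.GetStoreFromContext(ctx)
	return db.Todos, nil
}

And the wiring in main(), where the /query handler gets wrapped with the middleware:

	db := store.NewStore()
	http.Handle("/query", store.WithStore(db, srv))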
We'll allow the prompt, and we're on localhost looking at the playground. Remember we talked about the pre-built documentation: we can explore the schema, open up Query, and see that we can query for all of our to-dos, see the mutation, and so on. Let's build a query to get our to-dos: todos, with id — and what else is on a to-do — text, done, and the user's id and name. Running that, we don't have any to-dos saved yet, so it's just an empty array. Now let's write a mutation, createTodo, which takes an input — a NewTodo object with text and a userId — and returns a to-do, so let's query the whole to-do from the response. Run createTodo and there we go, we've created our to-do. Run the todos query again to see if the store was updated — yep. Create another one, fetch them, and there they are. That's not persistent storage — it gets erased every time we restart the application — but you get the idea. That's really how easy and quick it is to set this up.

Changing your schema around is also pretty straightforward and fast. For example, we can add description comments, and maybe give User an email, which is just an optional String. Then we regenerate: instead of running init we run generate this time. Run the server again with go run server.go, and you can see the documentation for User has updated. At the start of the video we talked about how GraphQL is self-documenting — all those comments we put in our schema actually show up in the playground, so users of our API can see them. Here we see email: String.

What happens if we change a mutation? Say we rename createTodo to addTodo. If we go to schema.resolvers.go after regenerating, the new resolver has been created — we changed it from create to add — and it has the default "not implemented" panic. Our old query is still fine; it updates and resolves. But down at the bottom of the file you'll see a section of code marked as to-be-deleted: any old code that's no longer used by the schema gets moved down there, so you can either erase it entirely or keep it, comment it out, and work from it. So whenever you add new queries and mutations you'll get new resolvers to implement, and some old code to clean up at the bottom. That's looking good — let's undo that rename, rerun generate just to clean things up, and there's our server, there's our GraphQL API. Now let's finish this out and get it deployed.
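For reference, the two operations run in the playground above look like this when run one at a time (the values are just examples):

mutation {
  createTodo(input: { text: "deploy to heroku", userId: "1" }) {
    id
    text
    done
    user {
      id
      name
    }
  }
}

query {
  todos {
    id
    text
    done
    user {
      id
      name
    }
  }
}

And the schema tweak from this section — a description comment plus an optional email on User — would look roughly like this (comment wording is just an example), followed by a regenerate:

"""
A user who owns to-dos.
"""
type User {
  id: ID!
  name: String!
  email: String
}

go run github.com/99designs/gqlgen generate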
Before we deploy, I want to talk a little bit about what we're going to do next, which is containerize — dockerize — our Go application. I'm a really big fan of using Docker whenever I'm working with Go. Once you finish writing your code, it all compiles into an executable, and executables depend on whatever architecture and operating system you're using. Say you're on a team of three developers: one uses a Windows machine (why? no idea, they just like Windows), another uses Linux, and another uses macOS. We can all be working on the same devtopics GraphQL API, but all three of us will build and run a different executable, a different binary. When you run go build you can specify the target OS and architecture — Linux, macOS, Windows — so that's three different binaries, which can make things a little awkward to work with (there's a quick cross-compile sketch at the end of this section).

Instead, we use Docker: it doesn't matter what OS you're running on — as long as it can run Docker, it can build and run our application. One single command, one go build, produces an executable for whatever OS the Docker image uses, and it's good to go. Similarly, by using Docker we open ourselves up to a wider range of cloud providers. Today we're deploying to Heroku; if we weren't using Docker we'd have to customize some things to get this deployed onto Heroku, and if we were running this on an EC2 instance on AWS without Docker, again there'd be things to customize — maybe the EC2 instance is limited in the operating systems it can run, maybe it only has Linux, maybe only Arm processors are available, and so on. Without Docker we'd be limiting ourselves to an executable that runs only on that EC2 instance, and if we ever wanted to move to GCP or Microsoft Azure or another cloud provider, there'd be a lot of work translating and transforming our build scripts and code just to get it running there. So Docker, number one, makes it easier to develop with multiple team members on different operating systems, and it also opens up a wider range of products we can move across to: we can easily deploy a Docker image onto Heroku, GCP, AWS, Azure — I'd guess the majority of cloud providers offer some support for Docker. I highly recommend getting familiar with it; it's pretty straightforward. That said, let's set up a very simple Dockerfile and a docker-compose.yaml so we can do local development and boot this up and run it.
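As a quick aside on the per-platform binaries point above: without Docker, each target needs its own cross-compiled build, roughly like this (the output names are just examples):

GOOS=linux GOARCH=amd64 go build -o main-linux .
GOOS=darwin GOARCH=arm64 go build -o main-macos .
GOOS=windows GOARCH=amd64 go build -o main.exe .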
To get started dockerizing our GraphQL API, back in VS Code I'll make a new file called Dockerfile and one more called docker-compose.yaml. For the Dockerfile I use a similar pattern in many of my other videos, so if you've watched those you probably know what we're doing here. We pull from golang as our base image, which lets us run go build inside the container. Next we add all of our source code — server.go, tools, graph, everything in this folder — into a directory in the image, and set that directory, /app, as the working directory, so any commands we run from here run inside /app. Then we build our executable: go build with the output named main, built from the current directory. Next we expose port 8080, and the command the container runs is just the binary we made, /app/main. That's it — pretty straightforward.

One thing to point out if you want to improve this Dockerfile: we're pulling in all the source code, so every bit of source we have ends up in every Docker image we build. You don't have to do that; a better way is to use stages — multi-stage builds — which helps keep the image size down. Builds like this need the source code to run a single command, but after we generate the executable we don't need any of the other files: we care about exactly one binary, and that's what we want to ship. Maybe you're working on a big API with hundreds of files and tons of resolvers — none of that needs to clutter up the final image. With stages, your first stage might pull secrets or environment variables, the second stage builds the binary, and the third stage ships just the binary. The build stage has all your code and is the biggest image, but that's not the image you upload and run; you ship a very minimal image containing only the binary. I'm not going to implement that or go into depth on it here — I'll leave it to you — but if you want to productionize this, I'd recommend splitting it into stages.

So that's our Dockerfile; let's move on to docker-compose. We'll have one service, which we'll just call web. To set it up we specify build with the current directory, then ports: our API listens on 8080 and we'll expose it as 8080. For environment, the only environment variable we use is PORT — double-checking the server code, it looks for PORT and defaults to 8080 — so let's set PORT=8080 (I don't typically use the other syntax). Now our server code can pick up that port. When we deploy to Heroku, Heroku sets this dynamically for us, so it won't always be 8080.
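Written out, the simple Dockerfile and docker-compose.yaml from this section look roughly like this (the Go image tag is just an example; use whatever version you're on):

FROM golang:1.18
ADD . /app
WORKDIR /app
RUN go build -o main .
EXPOSE 8080
CMD ["/app/main"]

version: "3.8"
services:
  web:
    build: .
    ports:
      - "8080:8080"
    environment:
      - PORT=8080

And if you do want the multi-stage variant mentioned above, a minimal sketch might look like this — the build stage holds all the source, while the image you actually ship carries only the binary:

FROM golang:1.18 AS build
WORKDIR /app
COPY . .
RUN CGO_ENABLED=0 go build -o main .

FROM alpine:3.16
COPY --from=build /app/main /main
EXPOSE 8080
CMD ["/main"]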
Let's make sure this is working: docker compose down, then docker compose up, which builds it and should just run it. And we hit an error: found packages main and tools in /app — package tools versus package main in the same directory. The simplest fix is to create a tools folder and move tools.go into it. Again, the reason we have this file is that the gqlgen dependency isn't used anywhere else in our code, but it is used to generate all of the types, resolvers, and models, so having this import ensures it stays in our go modules; otherwise we couldn't keep it as a dependency. Whenever you see a tools file in Go, that's typically the purpose: some script or dependency you need — usually something you run from the CLI — that isn't actually imported by the code. Let's see if that fixed it — yep, we're up and running on 8080. Pull the playground open, run the todos query, create one, fetch it — things are working, our dockerized API is up and running, so we can move on to deploying this onto Heroku.

I've signed into Heroku and I'm on my dashboard, so let's create a new app; we'll call it devtopics-gql-api and create it. If we were deploying via Git we could use the guide on this page, and every push would trigger a build, but we're using Docker, so we go to Container Registry and follow that guide instead. The first thing, if you haven't already, is to install the Heroku CLI — I've already done that, so I won't run through it, just follow the guide. Second, you log in: heroku login signs you into Heroku and lets you use the CLI. They also point out you need Docker installed locally, which I'm assuming you have, and then you sign into the container registry. From there, there are basically two commands you'll run. The first is push, which pushes your image up — you can keep a bunch of images and roll back to older ones, in case you release something broken and need to fall back to a previously deployed image. Then you release, with heroku container:release, which releases your latest image.

Let's run those commands. First docker compose down and close things out, since we don't need them anymore. I run heroku login and sign in, then heroku container:login. Now heroku container:push web — usually I combine both commands with &&, but I'll do them one at a time — and it tells us we need an app, so let's specify ours: devtopics-gql-api. The container is pushed, so now we can release it: heroku container:release web, again specifying the app (the first attempt couldn't find the app because I'd left that flag off). And if you don't want to specify the app every single time like we're doing here, that goes away if you set up the Heroku Git remote — the CLI will then pick up the app for this directory automatically.
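The full Heroku sequence from this section, roughly (the app name here is just the one created in the dashboard above; the git:remote step is the optional one that lets you drop --app):

heroku login
heroku container:login
heroku container:push web --app devtopics-gql-api
heroku container:release web --app devtopics-gql-api

# optional: associate this directory with the app so --app can be dropped
heroku git:remote -a devtopics-gql-api
heroku container:push web
heroku container:release web

heroku open --app devtopics-gql-api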
So we've pushed our container and released it; we should be good to go, and we should be able to hit the endpoint and see our playground. From the CLI we can run heroku open, and that pulls up our app. We're in our GraphQL playground — I can just copy the operations over from the localhost one. We can check out the docs, see our todos query and our mutation, and there's that description comment we added on the User type. Let's run our queries: todos shows no to-dos yet, so let's create a couple — there's the first one, create another, then fetch them — and we have two to-dos now. Awesome. That's how we deploy to Heroku, and hopefully, because we used Docker and docker-compose, you can take this base server pretty much anywhere you need to in the future.

Hopefully that was helpful. We went through and set up a GraphQL API with gqlgen, and we also showed how to set up middleware, so you could do things like database injection or authentication. If you watch my series on authenticating with Amazon Cognito, you could set up middleware that verifies tokens on every request — so if you haven't checked that out, go back and look at how to authenticate your applications using AWS Cognito. Beyond that, thank you all for watching, and I hope you found this video helpful. Please like, comment, and subscribe — it helps the channel grow and lets me know I'm doing something right. Thanks for watching, and I'll see you all next time.
Info
Channel: devtopics
Views: 4,258
Keywords: microservice, golang, microservices, heroku, api, backend, software, docker, docker-compose, docker compose, deployment, deploy, go, graphql
Id: RroLKn54FzE
Length: 43min 33sec (2613 seconds)
Published: Tue May 10 2022