Serverless Node.js Tutorial – Neon Serverless Postgres, AWS Lambda, Next.js, Vercel

Captions
In this course you'll dive into deploying Express.js and Node.js applications on AWS Lambda using Neon serverless Postgres and the Serverless Framework. By the end of this course you'll also learn how to adapt your deployment for Vercel, alongside mastering database setup, migrations, GitHub Actions, and more. Justin Mitchell developed this course. Justin has created many excellent courses on his Coding for Entrepreneurs course platform and his YouTube channel. Neon provided a grant to make this course possible.

Hello and welcome to the Serverless Node.js API course. In this one we are going to learn Express.js to build what's called a REST API. We're really going to be building a JavaScript application that runs on the back end in a serverless environment, which means we just focus on our code; we don't have to focus on the servers at all. Now, traditionally speaking, a big challenge that comes up with serverless is what database to use and how to actually connect to it. For that we're going to be using Neon and serverless Postgres. Neon is paving the way for so many applications to leverage serverless databases, and of course they are a partner in this course. The idea here is that we need an application we can scale up or scale down as needed, and we'll do that using AWS Lambda serverless functions; but we'll also need our database to match that scaling up and down, and that's where Neon comes in. It gets even better than that: we can also have multiple versions of our application deployed, so we can do these iterations, share them with other people, and test out little pieces, while simultaneously having our database made and ready for each of them. Neon does this really cool thing where it can branch data at any given time: you can take a point-in-time snapshot and branch it, so you can work on multiple versions of your application at once and allow people to use them, without major overhead. It's fantastic, and a really straightforward way to make all of this work.

Now, the idea here is really to get back to focusing on the code and being able to iterate as much and as fast as we possibly can. Sure, we are going to be deploying an API service itself, but that's really just a foundational piece so we can focus on making our code better and better over time. This is about setting that foundation, and serverless is the perfect way to do it. So let's talk a little more about why serverless is important and why you should care about it.

What does serverless mean for your application, and why should you consider using it? First and foremost, it's all about focusing just on the code and effectively running that code. The best analogy I have is lights: quite literally the lights in your room. When you walk into your room you turn the lights on; when you walk out you turn them off. But what if you forget to turn them off? They just keep running, whether or not you're using them. Serverless is just like that. A normal server application is the light you leave on: you turn the server on when you want it to run, and you could leave it running forever. That doesn't mean it's being used all the time, but it does mean it's running forever, and of course there's a lot of cost associated with that. There's another part to this too: what if you have just a small desk light on, and then a hundred people walk into the room? That desk light is not going to light up the room; you'll need a lot more lights, and therefore you'd have to scale your lights up.

It's no different in a server environment. Yes, you can have a tiny server that costs a few bucks a month to run your application, but as soon as you get thousands of people trying to access that server, you're going to need to scale up and load-balance across different servers, which means you'll have to understand the infrastructure behind it and learn how to scale those things. Don't get me wrong, I find that stuff really fascinating; it's a fun orchestration. But not everyone is ready for that or interested in doing it, especially with business-critical applications; maybe you don't want to manage those servers. That's where serverless services come in. Serverless is really just focusing on the code: you deploy your code, and somebody else makes sure all the systems scale up to meet demand, and all that scaling happens automatically. With something like AWS Lambda, you just focus on making sure your code runs; once it does, you deploy it to Lambda, and Lambda will scale up and down to meet request demand. Now, Lambda isn't the only serverless service out there; there are other options, but Lambda is a great one, and it helped pioneer so much of what we take for granted with serverless today.

Actually deploying to Lambda presents a few problems for us, though. Number one, deploying directly to Lambda with code is tricky; there's a lot to consider, and if you don't know AWS it becomes even trickier, because there's a lot to AWS. That's where we use something like the Serverless Framework: we use this open-source framework to help orchestrate the deployment to Lambda so that it's only a few commands instead of trying to figure out a bunch of commands ourselves.
There are things we will discuss in terms of IAM, access policies, and all of that with Serverless and AWS, but it's actually fairly straightforward with the Serverless Framework. I've seen a lot of different services over the years that deploy to AWS Lambda, and I can honestly say Serverless is one of the best, if not the best, way to do it, specifically for Lambda and what we're trying to do with Node.js.

The other challenge that comes up when you deploy a serverless application is that it typically doesn't interact well with databases, because databases historically are always on: they pretty much always need to be available to serve that data, so traditionally speaking, that's what they do. That's the reason we're using Neon. Neon is pioneering serverless for databases, specifically for SQL databases and Postgres. Postgres is probably the most used database in the world, and serverless Postgres unlocks the same model for your database: in addition to your application code, your database scales up and down as well. There's a huge advantage to serverless databases, too, which is point-in-time branching. Whenever you need a copy of your database with Neon, you can just branch it, just like you would with your code, and it actually makes a point-in-time copy of that data. You can then stage your application many different times: if you're working on version 10, or version 1, and all the versions in between, you can create different branches of your database, run the migrations, and make the changes you need, right there. All of this is possible because of how Neon approaches serverless and how easy they make it for your applications to run and consume all of this. That's really what we're going to be doing: orchestrating all these things together.

Now, I realize it might sound complicated, but it's actually fairly straightforward thanks to both AWS Lambda and Neon. Really, you just focus on code; in the case of Neon, you'll have a database URL string that Express will consume using the Neon serverless package that works with JavaScript. There are some moving parts, so you probably want to know some JavaScript to get this going (we'll talk about that in just a moment), but that's the high-level understanding of why serverless in the first place: it's far more effective and far more efficient than managing a server yourself. Of course, if you're not ready to really scale up your application, and you're not ready to build in this environment, you can still bring your Express.js application and your Neon database with you wherever you want to deploy; it doesn't have to be serverless. That's the very reason we also take a look at how to deploy this on Vercel: so you can see how portable it can be by leveraging a managed database service, in this case a serverless one that ends up costing significantly less. I'm really excited to get into this one, so let's take a look at this course, talk about the requirements you'll need to do well, and then do a high-level overview of the technology we're going to be using.

First and foremost, this is a JavaScript-heavy series, so you need to know things like functions, classes, objects, arrays, string substitution, handling promises and callbacks, and maybe even JSON. Once you know that stuff in JavaScript, you can move to the next part, which is using Node.js, or server-side JavaScript. Be sure to install Node.js; grab the 20 LTS release. Version 20 is the version we're going to be using, but whatever version you end up using, make sure it's an LTS one, unless of course you know what you're doing. The idea is that we're going to use Node.js to actually run our JavaScript: we'll create a JavaScript file and then use Node to run it. If you're from the browser world of JavaScript, as in writing stuff like React or vanilla JS, Node.js is just the server side: your web browser won't run this JavaScript; Node.js will, although they work basically the same, which is one of the magics of Node.js.

Once we actually have our application running locally, we'll start to deploy it to a serverless function on AWS Lambda. At a high level, serverless just means it runs when it needs to; that's what we're going to be using Lambda for, and that's how they can offer things like one million free requests per month on their free tier, which is amazing: that's a lot of requests to pay $0 for. The next part, of course, is not just running the actual code itself but also storing the data we need to store, and for that we're going to be using serverless Postgres, thanks to Neon. Neon is really pioneering the next generation of what Postgres can be by implementing things like serverless and many other features that make it a very powerful offering for your applications. This is very straightforward, as we'll see, but the idea is that these work together so we have a fully serverless application, from the Node.js side of it to our database, which is fantastic. Then, to actually build and deploy the application, we're going to be using GitHub, and more specifically GitHub Actions, to have a serverless part of that as well; that's going to handle so much of the deployment, so if you don't know Git, definitely brush up on that too. Finally, the main part of AWS that we're going to be using directly is AWS IAM. Because of the tools we'll use, we really just have to focus on making the right policies to do the deployments; we're not going to manually deploy to AWS Lambda, because that's kind of tricky and takes a long time to configure, when we can really just automate the process.
Really, we'll only focus on the IAM access and the policies; that's one of the things we'll be doing for sure. So make sure you already have an account on AWS, and also an account on Neon (all of those links are in the description, of course), and then definitely sign up for GitHub as well. A couple of optional things we'll do towards the end: deploying to Vercel, which will basically take everything we did for AWS Lambda and deploy it on Vercel. That's where we'll wrap things up, to see how portable we can make this application. And of course, when we deploy to Vercel, we will still be using Neon, our serverless database, so that's going to be very straightforward: we can move the code around without doing much to our database, which is really nice for keeping that integrity. The final optional thing is using Visual Studio Code, or VS Code. This is the text editor I like using the most; to me it's by far the best one that is also free. If you have one you prefer, by all means go ahead and use that one; I'm going to be using VS Code. With all those requirements out of the way, let's jump into actually building out our project, of course with Node.js already installed, which we'll verify when we build out that project as well.

Now, when it comes to building for AWS Lambda, one of the key things we have to think about is how we even test or develop locally for a serverless function. The way we're going to do this is by using the Serverless Framework from serverless.com. This framework will make it easy for us to test locally as well as deploy into production. But before we even get there, we need to jump into the AWS console to verify which version of Node.js we should use. I know I mentioned that we should download and install the LTS release in the last part, but we want to verify that that LTS is even available in Lambda itself. So go into AWS, into Lambda (you can do a quick search for Lambda), hit Create function, and look at the runtimes that are available. Right now, as we can see, there are a number of them; I would say Node and Python are probably among the most popular ones there. The point is that Node 20 is available, and therefore the current LTS is also available, so download what's available there so we can actually test it locally. In my case I've got Node version 20, I've got npm 10, which comes with Node version 20, and then I've got the Serverless Framework, which is what I want to use to actually test out AWS Lambda. We're also going to use Serverless to create our project, so that's what we want to do now. To do this, run npm install -g serverless, and this will install that serverless package globally; in my case I already have it installed, so it's actually pretty fast. Once we have Serverless, there are a lot of different templates we can use from it; it has a lot of different features, many of which we are not going to cover at all. What we're going to do is create a new Serverless project, navigate into that directory, and then bring it into our VS Code instance. So I'm going to navigate into a location where I like to store my projects, which in my case is the dev folder, run serverless right there, and hit Enter. What we should see now is different kinds of serverless projects we might want to build out. I'm going to pick the Express API template; this is just to get some pure Express.js scaffolded for us. I'll save that, and I'm going to call it my serverless-node-js-api.
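As a quick sanity check on the runtime match discussed above, you can compare your local Node major version against the Lambda runtime target from Node itself. A small sketch (the `20` target is the one this course uses; adjust it if Lambda's current LTS runtime has moved on):

```javascript
// check-node.js: compare the local Node.js major version against the
// Lambda runtime target (nodejs20.x in this course).
const TARGET_MAJOR = 20; // the Lambda runtime major version we deploy to

// process.version looks like "v20.11.1"; strip the "v", take the major part
const localMajor = Number(process.version.slice(1).split(".")[0]);

if (localMajor === TARGET_MAJOR) {
  console.log(`OK: local Node ${process.version} matches nodejs${TARGET_MAJOR}.x`);
} else {
  console.warn(
    `Warning: local Node ${process.version} differs from the nodejs${TARGET_MAJOR}.x Lambda runtime`
  );
}
```

Run it with `node check-node.js`; it only warns, since a minor mismatch is usually fine for local development.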
Okay, and so it's going off of the Node Express template and all that. I'm not going to register a login for the Serverless Framework; I don't need to do that. I'm also not going to deploy right now; in fact, I don't really have a way to deploy yet, so we'll worry about that when we get there. For now, I want to navigate into this serverless-node-js-api folder and open it up in VS Code, just like this. I'll start off by saving this as a workspace, and then I'll also do my git init; I definitely want to make sure I have that. Notice there's already a .gitignore file in here, so that's not something I need to worry about just yet.

The next thing I want to do is prepare this to work with the correct Postgres client. When it comes to Postgres databases, you need a client of some kind to connect to the database. Traditionally speaking, you might use something like node-postgres. node-postgres is a fantastic package that makes it really easy for something like Express.js to connect to a Postgres database, and it works really well when the Node application is continuously running: even if it's updating and has some downtime, overall it's continuously running. Something like AWS Lambda does the opposite of that: it only runs when it needs to run, and therefore it's going to be down a lot, and that's where node-postgres isn't well positioned. Maybe at some point it will be, but the thing is, Neon actually pioneered the Neon serverless driver, which is ideal for serverless and edge deployments: it uses HTTPS and WebSockets, so the persistent connection can be interrupted on a regular basis because of those two things, which makes it very well suited for any kind of serverless, including AWS Lambda. It can also use persistent connections, because it's a drop-in replacement for that node-postgres package, which is also what it's based on; but again, it's designed for the modern serverless architecture that AWS Lambda has and that we've seen in a lot of other places as well. That's the package we'll end up using to connect to our database, and of course our database is also provided by Neon. The idea here is that we definitely need this Postgres driver and client to connect our Express.js application, so let's go ahead and install it with npm install. While we're at it, we'll also do npm install -g for the Neon command-line tool, neonctl; we'll use this for all sorts of things related to Neon. With that installed, I'll also run neonctl auth so I can log in to Neon and verify my local machine, which in my case I'll go ahead and authorize. You can also use an API token, which is something we will do when we get into the GitHub Actions part, and we can see where those credentials are saved right there.

So we now have our baseline project set up; it's ready to go. Overall there are still a few other installations I want to bring in, related to how we connect to our database and how we manage the database schema. This will be done using Drizzle ORM and Drizzle Kit, and both will be installed right now. So let's install drizzle-orm, and then we'll install Drizzle Kit with npm install -D drizzle-kit (capital D), which it might have actually installed along with Drizzle; it did not. Drizzle Kit will be used to help us with our migrations, and Drizzle ORM will help us design out our schema once we get to that portion. These are the main packages we're going to be using; if we need to add any later, we will, but overall, for right now, this is what we're going to start with. What we really need to see now is how to do a hello world with this serverless package, and what that's going to look like in conjunction with our Neon database when we get there.

In a standard server environment, the way you run Express.js is by modifying what we have here with app.listen: you pass in some port value, and that port value can take a callback function that runs once the server starts, where you can do something like console.log('running at http://localhost:3000'). What this allows us to do is open up our terminal and run node index.js. Before I even look at that, I'll close it out and turn that off; I'll comment app.listen out again and run it again, and we see it goes away. This is actually a good way to think about serverless: the first version will continuously run no matter what; the second version runs when we tell it to run, and the way we restart the first version is, quite literally, restarting the application itself and making sure it runs. Now that we've got that, let's take a look at the home page. We can do that in our web browser, or we could open up a new terminal window and run curl http://localhost:3000, and that gives us the same response back. Great, that's showing us that Express.js is working; in some sense it's actually a hello world from Express.js. But of course I want to make sure my serverless app is actually working and I can use it locally, and the way we do that, of course, is by commenting out this app.listen, which is for a server-full app, which we are not building.

So let's close out this node index.js. What we want to do is update our local environment to handle offline mode from the Serverless Framework. To do this, run serverless plugin install -n serverless-offline. Once we install that, it should add an entry to package.json for the serverless-offline package; again, this is for development mode, so we can emulate what actual AWS Lambda serverless would end up doing. To run it, we can now use serverless offline, and that triggers the exact same application, running at the same port in this case. So if I run the curl again, I get the exact same value, except now it's going off of an AWS Lambda event, or rather it's emulating exactly that; it even shows you the duration for this request. Another cool thing is that it can also emulate different regions: if we wanted to run with a different region, say --region us-west-1, you could do that, and it would show you that different region. We can also do different stages, which allows for a lot of options: if I do --stage prod and hit Enter, I have another stage I can work off of. Of course, we're going to build this out quite a bit more throughout this course, but the idea now is that I have a really quick and easy way to test this serverless application itself. There are still some things I need to modify, though. The first one is package.json: what I want in here is the scripts definition, so we can call out the scripts we might want, like dev, which is going to be simply serverless offline; you can add in the stage, let's say dev, for example. The region in this case doesn't really matter that much, so I'll leave it like that.
That way I can do npm run dev, and it's still the same thing, still working just as it was before, but I don't have to remember all of the commands related to serverless offline. That's a big part of the reason I use scripts like this; there are other reasons as well, but again, we're just sort of preparing for development mode at this point.

The next thing is going into serverless.yml. What you might see right off the bat is that, yes, it added this plugins section. Perhaps you've looked at this file, perhaps not; if you haven't, we see the functions block in here with its function handler (we'll talk about that in a second), but then I also have a big red flag, which is right here: the runtime is nodejs18.x. This is not the runtime we want. To remember what runtimes we have available, we can log into the console on AWS, navigate into Lambda, go into Create function, and look at what's listed there. We've already talked about it: nodejs20.x should be the runtime we are targeting here. So we're going to save that and run this again. In this case it doesn't actually change anything, technically, about how this runs, because Node 18 and Node 20 aren't a whole lot different, but the point is to make sure that our runtime matches between our development mode and our production mode; that's really the point there.

The next thing I want to do is change how my API handler is working. The handler is really just looking for this index.handler, which in our case is coming from the Serverless template itself: the module right here that's exporting handler is what ends up being used. So we've got index.handler; if I changed this to app.handler, I would need to rename index.js to app.js, and you might end up doing that. In my case, what I usually do is create a folder called src and then move my app inside of src. It's probably a good idea to keep the file named just index.js; our application itself is not going to get that complicated, so we'll leave it as index.js. But moving it into src separates my application code from a lot of the configuration for the project itself, which is what I always do. So back in serverless.yml, we can change the handler to src.index.handler, going off of the folders themselves. Let's save that and try npm run dev again; this time I get a problem. So let's change the handler to use not dot notation for the folder but the folder path itself, src/index.handler, and run it again; this time it works correctly. That dot notation is the intuition I would typically use for a handler like this, but it's important to be able to test these things out, try them, and see exactly how to configure it. So now we can actually start developing with serverless-offline; it's really just that simple, and I don't think there's a whole lot more we need to talk about to get serverless-offline working. I will say that you could probably experiment with a lot of things related to just your server-full app; that's probably going to be okay. Realistically, I wondered whether having the server-full app.listen in here alongside the handler would cause any issues, and it doesn't appear so, mostly because it's the exported name, handler, that's being used; it's not actually calling index.js the way this app.listen would end up doing. So basically we have support for either one, which is also pretty nice.
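Putting those changes together, the serverless.yml for this setup ends up looking roughly like the sketch below. This is based on the aws-node-express-api template; the service name is a placeholder, and your generated file may carry additional fields:

```yaml
# serverless.yml (sketch; names are placeholders, details may differ)
service: serverless-node-js-api

provider:
  name: aws
  runtime: nodejs20.x          # match the local Node 20 LTS, not the default 18

functions:
  api:
    handler: src/index.handler # slash notation: src/index.js exports `handler`
    events:
      - httpApi: "*"           # route every HTTP request to the Express app

plugins:
  - serverless-offline         # local emulation via `serverless offline`
```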
Okay, so now that we've got this, let's see how we can load environment variables into our application. Right off the bat: at this point in your coding journey, you probably realize that it's not a good idea to hardcode API keys, database strings, or really any secret data right inside the code itself. It's much better to inject that when the code runs, so it's handled in the environment itself, and the way we're going to do that is by using a .env file. This is very common across all sorts of programming languages. The idea here is that in index.js, if I want something like a database URL, we don't want the actual database URL string in here; we want that abstracted away. So we'll use process.env: we write process.env. and then the environment variable we want to use, in our case DATABASE_URL. If I save that as is, run our serverless offline application, and curl to it, what I get is this hello world here. Looks like maybe I didn't save anything, so let me try that curl again; I'm still not getting anything, so let's change this a little bit more and add a fallback value like 'not here'. So now, if it is there, it will be set; otherwise we'll just have 'not here'. We'll re-run our serverless application, and now I get this 'database URL not here'. Of course, this is just a fallback value, not something we'll necessarily use for much longer, but the idea is that we want this to actually work, and so that's what we'll do now.

Now, traditionally speaking, you might use something like the dotenv package. This is a fantastic package and is used a ton, so it's something you would absolutely consider using for non-AWS-Lambda services. But we want to prepare this for AWS Lambda, and the way we're going to do that is by using the serverless-dotenv-plugin. It's a fairly straightforward plugin, and it allows for a lot of different support, as we'll see right now. The idea is that we want this DATABASE_URL to work, and the way we can do that is by coming into our serverless.yml and defining our environment. In this environment block I can add DATABASE_URL and set it equal to ${env:DATABASE_URL}, as in the value we want from the .env file; I can also give it a fallback, like that. Okay, great, so this sort of configures it, but we still need to add the plugin itself. So let's close out the server and run serverless plugin install -n serverless-dotenv-plugin. Hit Enter, and this will install an entry in our serverless.yml as well as in package.json, this right here. Now that we've got that, back in serverless.yml, we see it's one of the plugins available to us, which of course will be very useful when we actually push this into production. Then I also want to add useDotenv: true.

We'll save that, and I just want to see whether the fallback value shows up, the .env value shows up, or the inline code value shows up. So let's try this out by running it again; I don't need to reinstall, I'll just run npm run dev and then our curl command, and we got something else in here. So what's happening in our serverless.yml is that the fallback value is actually not being used; rather, the .env value is being used. At this point, in my .gitignore, I want to add .env* with a star at the end, because what I can also do is have a file like .env.dev, as in my development mode. This is another value I might have, so we'll put a dev DB value in there. If I wanted to use this dev mode, I could go into package.json and quite literally change the stage to dev, which I still have; it's actually on that stage. So this dev right here: if I called the stage dev-abc, I would want to name the .env file for dev-abc, that type of thing. The stage is how I can modify which file loads, which is also super nice, and it actually helps prepare us for our local environment, then our staging environment, and finally our production environment. We could quite literally have all of those environment files in here and just call a different stage to load the different environment variables, which we can test out really quickly right now; what we see is that DATABASE_URL coming through. Fantastic.

So the next thing is, maybe we want to add another environment variable. I'm going to say DEBUG and set it equal to 1, so this is now in my .env. Back in serverless.yml I'm going to do something similar: I'll add DEBUG, and this time I'll give it the fallback value of 0. And now, in index.js, I'll go ahead and add that in as well: DEBUG.
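One thing worth flagging before we compare DEBUG to anything: values read from a .env file or the shell always arrive in process.env as strings. A tiny self-contained sketch of the pitfall (DEBUG is set inline here instead of via a .env file):

```javascript
// process.env values are always strings, even when they look numeric.
process.env.DEBUG = 1; // the assignment itself coerces the number to "1"

console.log(typeof process.env.DEBUG);  // "string"
console.log(process.env.DEBUG === 1);   // false: string compared to number
console.log(process.env.DEBUG === "1"); // true

// A common pattern: normalize once, then use a real boolean everywhere else.
const DEBUG = process.env.DEBUG === "1";
console.log(DEBUG);                     // true
```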
I'll leave it like that and save it — actually, I'll say it's equal to one, so DEBUG equal to 1 based off of the .env. Let's save that, restart everything, and run it again. This time I'll curl again, and what we've got is false — so the env value is not equal to 1. Let's see what it is equal to: just console.log it, restart the application itself, run it, and my console log prints 1. What just happened here? Well, let's try comparing it to the string "1" instead — put it in as a string value here, and maybe wrap the whole thing as a string as well. Rerun this, run the curl again, and now we get true. So the actual values are going to be treated as strings, and you need to be aware of that when you're using environment variables inside your Express application. Now, there are more advanced usages of the plugin itself, and I encourage you to go into the serverless framework's plugin documentation to see all of the advanced usage, but generally speaking there are three main things we want to know. One: we have to define the environment variables in serverless.yml that we're going to use in our application. Two: we can use different .env files to load those environment variables, and those files can be as simple as the standard .env or based off of any given stage we might want — dev, test, prod, v1, v2 — and each stage can still inherit from that base .env, which means we don't have to rewrite the same variables over and over and over again. In the case of DEBUG, maybe we just want it always to be 1 so all of the other stages can go off of that while we're in our local development environment, especially because these .env files should never be committed to the git repo. And three: our code itself needs to account for the data types not working as you might expect — that's another thing to be aware of when using environment variables. So at this point we have a way to test our application altogether: notice that when I changed .env.dev, the stage-specific file, that value came through and overrode what was in the base .env, which is also pretty nice. Great, so now we're really just ready to start using that DATABASE_URL and begin the integration process with our Neon Postgres database, so let's start that process now. We're going to take a look at Neon serverless Postgres, and what I want to show you here is the Neon console, so we can really highlight what branching is all about: it's a point-in-time snapshot of our database that we can use as a brand-new copy, isolated from the original — which is great for production applications alongside staging instances or versions of our application — and it happens really, really quickly. Of course we get all the benefits of serverless as well: it scales down to zero and then scales up to meet demand as our application needs it, and the latency to scale from zero to one is surprisingly fast with Neon. So go ahead and go to neon.tech and create a new project — I'll keep the defaults, and I'm going to use the default database name, but feel free to change this as you see fit. The last and most important part is the region itself.
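Backing up to the DEBUG experiment for a second, the string-typing behavior can be pinned down in a few lines of plain Node (DEBUG here is the same hypothetical variable from the walkthrough):

```javascript
// Everything in process.env is a string, even if the .env file says DEBUG=1.
process.env.DEBUG = 1; // even assigning a number stores the string "1"

console.log(typeof process.env.DEBUG);  // "string"
console.log(process.env.DEBUG === 1);   // false: string vs number
console.log(process.env.DEBUG === "1"); // true

// Normalize once, then use the boolean everywhere:
const DEBUG = process.env.DEBUG === "1";
console.log(DEBUG); // true
```

Comparing against the string (or normalizing to a boolean up front) avoids the surprise the transcript runs into with the curl test.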
This region should be as near to your application as possible. In our case we're going to use the default, which for me is US East (Ohio), and the reason is that it's where we'll deploy our application on AWS — that Ohio region is us-east-2 in AWS, which we'll come back to once we actually deploy. The idea here is we want to start off with our database in that region. It's actually really easy to create a new project with a new region, so if you have to, you could always use whatever region is physically closest to you right now, and then change the region when you actually start deploying your application. Okay, so at this point let's create that project, and what we get right out of the gate is a connection string — a Postgres connection string that gives us our username, our password, our host, our database, and a few other configuration items. The other thing that's really cool when you first start out is that you've got a lot of options for integration in here, and there is one for Node.js, but we are actually not going to use that integration option — we don't need to, for a number of reasons, one of them being that we will just use the connection string itself; that's what we want. Basically, we'll copy this string, whatever it ends up being — you can click copy right here — and then bring it back into your local project's environment variables, from when we did the .env setup, and paste it in; that's what we'll end up loading once we get there. For now I'm going to leave it as the original "dev db" value, but the idea is that eventually we'll use this string. What I want to do right now is play around with branching inside of our project, so let's close out this window.
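For reference, a Neon connection string follows the usual Postgres URL shape — the one below is a made-up placeholder, not a real credential:

```
postgres://<user>:<password>@ep-example-123456.us-east-2.aws.neon.tech/neondb?sslmode=require
```

The hostname encodes the Neon endpoint and its region, which is why the region choice above also shows up in the string you copy.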
Of course we can still get all of that data right in the dashboard too, but what we want to do is actually insert some data into our database. If we look at our tables here, we have no tables yet — there's really nothing in our database — so let's add some just by using the SQL editor, which gives us some default code we can run that will insert data for us. What we'll see is that if I go back into the tables tab, I now see there's a table in which I can select and actually see the data coming through. We could also just run that a few more times: going into our history, I can select a couple of lines and run them again and again, just so we have some more data to play with — I've got 40 rows in here right now. So let's take a look at the magic of branching. In this branches area I'm going to create a new branch and call it my dev branch, from the current point in time. Notice we've got other options in here: we can also select which branch we want to be the parent, the one we're basically copying from. I'm going to create this new branch from my main branch, since that's the only one I have, and now I get another connection string — this is a different version of the exact same data, but a brand-new, unique connection string that I can use in my project specifically for something like a dev branch that has a lot of the same data. Let's verify that real quick by going back into that SQL editor, switching our branch to the dev branch, and running those two lines once again — run it, and run it again. What we'll see is that if I go into my tables once again, I'll change the branch to make sure I'm on the new branch, select the table I was just playing with, and scroll to the bottom, and
notice that I have 60 items in here now. And if I go back into the main branch — which I can do that quickly — and select the table again, we see the main branch is still back at that 40-row amount. It happens just that quickly. The other thing that's great about this is that if at any time we feel like deleting this, I'll just copy the name and delete it — now it's completely deleted — and what we can do, of course, is just bring it right back. This will be an empty one with empty databases, but overall it's the same configuration, just without the data, and I can spin it back up that quickly. Bringing this back to what we just looked at: I would just go back into that SQL editor and create those tables — and before creating tables, I could verify that it actually is an empty database, which, no surprise, it is; that kind of makes sense. The idea being that if I come back into the SQL editor, run it again, and maybe again, and say I wanted to create a new branch once more: I'll go into branches, create a dev branch from the main branch, and then let's create another branch from the dev branch — dev-v2, right? You can create as many branches as you really need; my project in this case has a limit of 10, but you're probably never going to go past 10 — it would be surprising if you ever did. The idea, of course, is that with all these different branches I can go back into the SQL editor once again and run a few commands against the various branches as I see fit, and it's just that fast. It's so cool, and it makes things really, really easy. Granted, there are a lot of other configurations we might consider, like new roles and new databases — these things get a little bit more
advanced on the Postgres side itself, but overall there are other options we can implement. One of the main ones we want is doing these things in an automated fashion, and we will do that using the Neon command-line tool later — it will do the branching for us. It won't necessarily run SQL directly, because that's not what we're going to use it for; instead, you'll use a connection string to run the various SQL commands themselves. What we'll also do is implement the API keys: inside of the developer settings here, we're going to generate a new API key and then use the command-line tool specifically with it, so that so much of what we just did will be automated. You can even automate creating the project itself — something we're not going to do — but we will automate the process of branching, because it's a fundamental part that really unleashes a lot of value for our development process. Whenever I need a dev version or dev-v2, I can still use production data if I want to, or I can build out data just for local testing, and that data can live in its own branch of our original database. Then, when we run migrations — as in, change the structure of the database whenever we need to — we can migrate each branch as we see fit, so that part is also pretty cool. Overall, the idea is that if you let these branches sit for a little while, they won't be active anymore: the databases will still be ready to be used, they'll just become inactive, and then actually reusing them or making any sort of call against them will spin them back up to meet the demand you need, as we'll see once we start implementing this database into our project. So at this point, go ahead and clean up
the branches — delete the ones that you don't need. I'm going to go just off of the main branch, and we'll start integrating that into our project here in just a little bit. Now let's take a look at how to use the Neon command-line tool to automate generating branches, getting our connection strings, and of course deleting branches when we need to. The way we're going to do this is by starting from scratch: we're going to create a brand-new project — I already deleted that project — and then of course use the command-line tool. There's a lot of documentation on it; we're only really scratching the surface here with a few of the primary actions I've been using. To do this, we want to make sure neonctl is installed — in my case it is, and I can also just use plain `neon`. If you remember, we ran `npm install -g neonctl`; that's actually how I went about installing it. Now, at one point I will use something called jq, which really just helps me parse JSON data, as you'll see, and to install it, it's `brew install jq`. This is absolutely optional, but it's one more thing you can potentially use to automate. Okay, so once you actually have it installed, you definitely want to run `neonctl auth`. I already authenticated — I did all of this earlier — so if you haven't done that, go ahead and do it now and make sure you have access to your project. In my case, if I run `neonctl projects list` and hit enter, what I see is absolutely nothing — I actually deleted my project — and it's really simple to create one, so let's create one real quick with a name: I'm going to call it serverless-nodejs-api using the --name flag; it's pretty straightforward as far as how to create something. I go ahead and create it, and there we go: I get all of the data I might want, including our connection URI, or connection string, that we can use to
actually connect to this. We're not going to use that yet; instead, what we're going to do is just delete this project. I'll copy the project ID here, and we'll run `neonctl projects delete` with that ID, and that will actually delete the project itself — so when I go to list everything out again, I get nothing. It's really that simple to create a project and delete a project, and being that it is that simple, we need to be aware that yes, you could absolutely accidentally delete a project you didn't want to delete. In our case, we're going to stick with just this for now, so I have my project created and I already have this connection URI. Now what I can do is run `neonctl connection-string`, hit enter, and this gives me that same connection string, which I can use with psql, the local client that I have — and it's psql, not pqsl. Using that connection string, I can jump in and run something like `SELECT now();`, and it gives me the exact time. We can quit out of here in a number of different ways — you can write out quit, you can write out exit — it's actually pretty flexible, as Postgres works these days. So that's the connection string — let's go back up, there it is — and I can use it in many different ways. Now, the other idea is: what if we needed another branch? Let's take a look at that. `neonctl branches` — actually, let's do `neonctl branches list` first. We can see all of these things, and to me the format works much like kubernetes, if you're familiar with that: you've got `neonctl` (or just `neon`), then the actual resource you're looking for, and then the action you want to do — get, list, delete, or create. So we've got branches, and if we do a get,
let's grab this ID here and paste it in — we get that particular branch. If we do `neonctl branches create` — well, let's try just writing out `dev`: unknown command 'dev'. Just like with our project, we use `--name dev`, and then we get a name. So the name and the ID are not the same: we've got the ID here, and what do you know, that ID also corresponds to the actual connection string itself. Once again, if I go to list out all of the branches, I can see the various branches in here, and if I do `neonctl branches get dev`, now I can actually refer to it as dev — so when I go to create a branch I need to pass in the --name argument, but for other commands I don't have to. The other thing is, when we create one without a name argument at all, what do you know, it actually creates one for us — it will come up with its own name — and we can list all of those out just like that. Then, if we want the connection string, we can use `neonctl connection-string` and pass in the branch we're looking for — we can use the autogenerated one, where the name and the ID look to correspond, or we can use our own; let's use dev in this case. That branch's string should be different than the main branch's, and sure enough it is. Notice, though, that the user is not different — we could absolutely create different users to run this connection, which you can verify inside the console as well; there are different roles in there that you can create for any given user, which takes some extra work we'd have to do on the Postgres side as far as their permissions and whatnot, whereas this user has full permission to the database itself. But the idea here is that now we can create branches really quickly, we can create connection strings really quickly, and then of course we can actually use those connection strings. In the case of Linux and Mac, you can export, let's say,
for instance, our DATABASE_URL, and set it equal to that neon command — `neonctl connection-string` with our branch of dev, something like that. You hit enter, and that runs the command; now I can use this connection string with psql like this — and let's make sure we have it in double quotes, not single quotes — and there we go: I'm actually able to connect while keeping the DATABASE_URL hidden, which is one of those key things you'd want to do. When it comes to Postgres itself and the different clients, downloading and installing psql is something I recommend you do if you want to learn more about Postgres. Using what we did in that SQL editor before, I can absolutely do the same thing inside psql — I'll go line by line to keep it simple: we've got this example here, we created that table, then I can bring in some of those values once again, and then I can select all and see all of those values right here. So that's really how you do the second part of what we were doing within the console's SQL editor: you would actually use a Postgres client of some kind. Now, we can do a lot of these same sorts of things within JavaScript itself — we can call SQL queries inside the JavaScript, as we'll see here very shortly — but the idea overall is that we have the ability to automate so much of this. Being able to create branches on a whim is really great, because then, when we are actually committing our code, we could create those branches, get the connection strings, and deploy wherever we need to with that stage and all that. It just lends itself really well to automation and staging purposes across many, many different stages, which is why I wanted to show you this. Again, this is foundational for what we'll do later, but overall I recommend playing around with this and being comfortable with just deleting assets whenever you feel like it.
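The export-and-connect step can be sketched as a small shell snippet. Note that the `neonctl` function below is a stub that just echoes a fake connection string, so the command-substitution wiring can be shown without the real CLI or a live database:

```shell
# Stub standing in for the real neonctl CLI (installed via `npm install -g neonctl`).
neonctl() {
  echo "postgres://user:pass@ep-example.us-east-2.aws.neon.tech/neondb"
}

# Capture the branch's connection string into an environment variable.
DATABASE_URL="$(neonctl connection-string dev)"
export DATABASE_URL

# Double quotes keep the URL intact when handing it to psql:
#   psql "$DATABASE_URL"
echo "$DATABASE_URL"
```

With the real CLI installed and authenticated, dropping the stub function leaves the same two lines you'd run on Linux or macOS.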
So, in the case of our project: we can do `neonctl projects list`, and we can go ahead and delete the project — `neonctl projects delete` with that ID. Now, in this case I only have the ability to create one project. Let's just try create — hit enter, and notice it creates one for me; I try to create another one and I get "limit exceeded." So I can really only create one project, but within that project you can create multiple branches, as you can experiment with on your own. If you wanted multiple databases, what you would do is have multiple branches: the project can have multiple branches, and each branch can have its own database — or multiple databases — or they can all share the same sort of thing. So that's pretty cool as well. If you have any questions on the Neon command-line tool, definitely consult the reference here — there's a lot to it, and it's just a matter of playing around with it to get the most out of it — and of course download and install some sort of Postgres client so you can really get your hands dirty with SQL, if that's what you want to do. And if you're not wanting to do that — if you want to get going with JavaScript — that is what we need to do now: it's time to actually integrate our Neon connection string, that DATABASE_URL, and also the Neon serverless package into our Node.js application, so we can do a lot of these things with JavaScript. Let's take a look at how to do that. We're going to integrate our Node.js application with our Neon Postgres database, and the key part of all of this is the fact that we are using serverless — the way we approach it has to be from a serverless perspective, and that's how we have to configure things, which means we're going to be using the very specific package @neondatabase/serverless. There are a lot of Postgres packages out there that integrate with Node.js — node-postgres could be one you'd otherwise end up using — but the important part for us, as we've said before, is that we are serverless and we need to use the Neon serverless version. The first thing I want to do is grab that DATABASE_URL, wherever it might be: in my case, you can go into the Neon console and grab the database URL that's right here, or we can use `neonctl connection-string` as well — and you could always create a new project if you don't already have one. I'm going to grab this connection string and bring it into my .env.dev file — not .env, but specifically .env.dev. With this in mind, I'll go back into index.js, and now I need to start configuring my database client. I'm going to create a function called dbClient, and for the moment we just want it to return the DATABASE_URL itself from process.env. So now what we want is our DB client basically equal to that: `const db = dbClient()` — roughly speaking, that's where we're going with this. To actually make this work, we need to start integrating @neondatabase/serverless — this package itself — and at this point I already have it installed; if you don't, just go ahead and run that npm install. We already have that DATABASE_URL set up; next we need to use it, and this is how: don't worry about the actual SQL command that's right here — we're not going to look at that one — instead we're going to use these two items right here. Let's paste them into index.js, and before we go any further, I'm going to use this down in the dbClient instead, so that what I really want to return is that sql client from neon — that's the point of having it. With this DB client, I can then run different commands directly from it: we could do something like `const results = db` with these backticks — it's really cool that you can use these backticks — and SELECT, and just like we've seen before, this should give us some sort of result that we can then pass into our API response. Okay, one thing that's important to note — let's actually try to run this. If you recall, we have `npm run dev` inside our package.json, which runs serverless offline specifically with the stage of dev — which is why I put the string into .env.dev. So let's run that again, and what we get when we go to the endpoint at localhost:3000 is "Cannot use import statement outside a module." By default we do not have a standard ES module; we have CommonJS — it's just a slight difference, which means we need to change this from an import to a constant using require. If you wanted to convert it to an ES module you could, but that takes extra configuration we're just not going to cover at this time; instead we'll use CommonJS, which is what has traditionally been used by Express.js in a lot of these Node.js applications. Now that we've got that, we should be able to use this just fine, so I'll go ahead and refresh, and now I get something — not necessarily exactly what I wanted, but something. Why is it not giving me what I want to see? Because we need this to be all asynchronous: the database client itself we want to change to async, which means that down here we need to change our callback handler for the route to be async as well. If you think about how any given route works inside Express.js, it is also a function — we could say an index route that by default comes in with the request, the response, and next, and this function could then handle all of the
things from that function — it could go like this, and like that, right? So we can absolutely separate these things out; it's just not a common pattern in Node.js applications — you usually put the callback right in there — and putting async in front of it is the same thing as putting async in front of a normal function, if you didn't know that's what's going on. Now that it's asynchronous, we can await our database client, and then within our database client, to actually use it, we can await it as well. To follow the convention a little more closely, you could also just call this `sql` — that fits a little more with what you'll see throughout the documentation on the SQL client from @neondatabase/serverless, which is what we do see in the docs: as you can see, it says await, and that's how we're able to do this, by making everything asynchronous. So now we should actually see some sort of result that comes from the database itself. Back in localhost we rerun this, and we should see something coming from our database — the first time we run it, it might take a moment, because our database on the Neon side has to become active for it to work, so that first little bit of latency does make a difference. I'm also using a database in US East, quite literally on the other side of the country from me, so how fast it goes is really, really great — and the speed should improve quite a bit once I actually deploy this application into AWS Lambda. What we've got here as results is actually an array with some data — that's just what happens with this particular command — so we can unpack the results like this. If I refresh, that should give me a dictionary value; I might need to restart the server itself first.
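The shape of that async handler pattern can be sketched like this. The real `neon()` comes from `@neondatabase/serverless` and returns a tagged-template function; here it's stubbed with a canned row so the await/unpack flow is visible without a live database:

```javascript
// Stub of the neon() factory: the real one takes a connection string and
// returns a tagged-template function you call as sql`SELECT now()`.
const neon = (connectionString) =>
  async (strings, ...values) => [{ now: "2024-01-01T00:00:00.000Z" }];

const sql = neon(process.env.DATABASE_URL || "postgres://placeholder");

// An Express-style async route handler (request, response signature).
async function indexRoute(req, res) {
  // Queries resolve to an array of row objects, so unpack the first row.
  const [result] = await sql`SELECT now()`;
  return { results: result.now };
}

indexRoute({}, {}).then((out) => console.log(out.results));
// prints 2024-01-01T00:00:00.000Z (the canned value from the stub)
```

Swapping the stub for the real `neon()` import keeps the handler body unchanged; only the factory comes from the package instead.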
Let's run that again and refresh — now it's a dictionary value, and I can just do `results.now`. We might have to refresh the server once again, so let's try that out: I refresh, and there are my results. This is a timestamp coming directly from the database, like we saw when we were in the SQL editor and ran `SELECT now()` — the exact same command going to the exact same database and returning roughly the same value; of course, it changes for however long I look at it. So now we are fully integrated with our Neon database inside our Node.js application, which means we can start doing a lot of more interesting things. Before we do, I want to point out that this right here is raw SQL: with `await sql` you could write out your own raw SQL, so if you wanted to experiment with SQL itself, you could create an endpoint that would in theory accept data to run various SQL commands — in other words, an endpoint that handles a form, very similar to what we see inside the console. That's not something I want to do: I want to make sure the commands I am running are exactly the structure I want them to be — I don't want to have arbitrary SQL commands running through my application; at least, that's not how I want to design this one. Now that we've got this all integrated, there is one more thing I want to add to the configuration for Neon, and this just has to do with the connection itself. For when we go into our deployed serverless environment, I want to change the configuration to use an experimental feature called fetchConnectionCache, and we're going to set that equal to true. This is for HTTP connections and non-pooling — that's really what's going on here; feel free to look at the documentation — but we're not going to have pooling in a serverless
environment, because that server is going to be scaled down — it's a little more difficult to pool connections when they aren't persistent, and it's going to cause a lot of havoc and issues. That's why we're going to be using HTTP connections, and that's also why I put that configuration in, just to make sure it's there well before we actually go into production. Pretty great. Now we're going to do our first deployment of our Node.js application, of course using the serverless framework, to make sure that it runs in AWS Lambda. To make this work, I'm going to use the serverless command, but instead of serverless offline I want to do something different: we'll give it a new script name of deploy, and then we'll use `serverless deploy` with the stage we want to deploy. That stage works very similarly to offline, where it can have its own environment variables; let's set the stage to simply prod. Now, to run this, I run `npm run deploy`, and of course it runs this command — one of the big benefits of writing the script in package.json like this is that you don't have to remember which flags you're adding for different deployment options; in this case, the stage of prod will always be the stage whenever I run npm run deploy. What we've got here is "loading environment variables from .env" — so it looks into the base .env, and it would look into .env.dev if that were the stage; if we wanted to add in our prod stage file, we could do that just like that. So now let's add in just another key in here — HELLO_WORLD equal to something like that — run the deploy again, and we can see the different environment variables coming through. And of course it fails to deploy, for a number of reasons, one of them being that the security token is invalid.
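Backing up to the connection-cache setting from a moment ago, the configuration sketch looks roughly like this (not run here, since it needs the real package and a database):

```javascript
// Sketch: HTTP connections with the experimental connection cache enabled,
// instead of a pooled client that assumes a persistent server.
const { neon, neonConfig } = require("@neondatabase/serverless");

neonConfig.fetchConnectionCache = true; // reuse the HTTP connection between calls

const sql = neon(process.env.DATABASE_URL);
```

The flag was experimental at the time of recording, so check the package's current documentation before relying on it.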
At this point I have credentials on my local machine, but they're not the valid ones, and I also have this us-east-1 region in here — so two kind of major issues that we want to address. We want to verify that this is the correct region, and in package.json I can actually bring in that region flag; in this case, I'll leave it as us-east-1 for now. The question, of course, is how do we know which region to use? Well, it has everything to do with our database: we want the region to be as close to it as it can possibly be. To find it, we can run `neonctl projects list`, see the project we're using and its region ID, which is us-east-2 — we can ignore the "aws-" prefix, because us-east-2 is the AWS region itself. So we update the region to that, run deploy again, and this time it attempts the correct region, which is exactly what we were hoping for. The security credentials we still need to address, but before we do that, I want to bring your attention to something important and interesting. So yes, it is deploying using our .env.prod — it's going to use all of those environment variables and build something. And go and do what, exactly? What's happening is it builds this .serverless folder inside our local repo, and then creates this zip file in there. What we could do is actually unzip that zip file and take a look at what happens: inside, what do you know, it's everything related to my repo — my main code itself is zipped up and brought in here. It's actually missing the .env files — it doesn't include those, which is pretty nice and fairly important — and it also doesn't have the .gitignore, so it is missing some things, but it does have all of the node modules in there. So it's kind of like a point-in-time instance of the application itself.
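Pulling the scripts together, the package.json pieces described above might look roughly like this — the script names, stages, and region follow the walkthrough, and everything else is illustrative:

```json
{
  "scripts": {
    "dev": "serverless offline --stage dev",
    "deploy": "serverless deploy --stage prod --region us-east-2"
  }
}
```

Baking the stage and region into the script means `npm run deploy` always targets the same environment, which is the benefit called out above.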
But it's also showing me something in this state file: all of my environment variables. On one hand this is great — it uses my .env.prod, and in theory I could deploy this application and it would just run, which is nice. The thing is, this state file has our DATABASE_URL directly in it, so it's a security concern for sure, and I'll show you where it ends up in just a moment. To make it end up where it needs to, we first need to update our credentials. If you've used the AWS CLI before, you'll know that most of the time you log in and it creates something inside ~/.aws/credentials, and you'll see something like this. These credentials, in my case, are no longer valid in general, but they're also not valid to actually do the deployment — they're not valid for Serverless. The Serverless Framework ideally gets its own policies and its own credentials to be able to deploy everything. Maybe it doesn't strictly need its own credentials, but ideally with each project you attach different policies and then create credentials for them. If you're not familiar with this process, I'm definitely going to show you, but first I want to show you this reference we have here for the IAM policy. This is the policy we're going to use for the Serverless Framework, and it's pretty extensive. One important thing: when you use this policy, you want to change the AWS account ID to the correct one for your account, which is something we'll do in just a moment. The idea is that once you have those permissions in place, then in your .env, for example, you'll have something called AWS_ACCESS_KEY_ID and then AWS_SECRET_ACCESS_KEY. Cool — we need to set these up, so let's create them now. To do this I'm going to go into IAM and create a user group. This group I'm
going to call serverless-framework — just naming it serverless-framework — and then we'll create the group. Next I'm going to create a user; I could call it serverless-framework-user, but realistically I'd want to give it the name of my project, so serverless-nodejs-api, just like that. We do not need to give this user access to the Management Console, but we do need to give them the permissions we'll add to the group, so I'm going to add this user into the serverless-framework group. Next we can review and create the user, so let's create them. All I did there was create a new user and add it to the group I just created; at this point they have no permissions, and they also don't have any security credentials. Going back into our user, we go into Security credentials and create a brand-new access key. We'll choose Other, hit Next, and we don't actually need a description tag unless you really want one. Here are those access keys: I'll copy the first one, which is the access key ID, then the next one, which is the secret, and paste them in. The names of these variables have to be exactly right — the key values will differ, of course, but AWS_ACCESS_KEY_ID has to be written in exactly this format. You could put these inside ~/.aws/credentials as well, but instead I'm putting them in .env so they don't conflict with my system at large.
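In .env, the two variable names have to be spelled exactly as the tooling expects; the values below are placeholders, not real keys:

```
# .env — names must match exactly; values come from the IAM access key you created
AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```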
So I want to once again run npm run deploy and hit enter. This time it's going to attempt to use these environment-variable credentials — but one of the problems is that it's also loading them into my environment at large, which is definitely going to cause an issue very soon. For now, our new error is a little different: it says the user is not authorized to perform this action. The user I just created is right here, and that's the user we want to give permission to perform all of the things Serverless wants to do. The reason the credentials are being picked up at all is because they're in the .env itself — this right here. So now we need to grant those permissions; we actually need to create that policy. Again, in the actual GitHub repo you can go through and see this exact same policy — I'm going to copy it; the repo is right here, and it will be linked in the description as well. In my case, I'll create this policy reference file, policy.
json — pasting the policy in here. I'm also going to add this file to .gitignore, because I do not need to commit it. What we want to do is change this AWS account ID: I'll do a Command-F or Ctrl-F find-and-replace, with a colon in front and behind it, so we change it all across the board. We find the actual AWS account ID for our account inside IAM: close this out, go into the IAM dashboard, and we can see the account ID right there. I'll copy it, replace it inside this policy, and use replace-all just to make sure our account ID is everywhere. You could, in theory, use just a star here, but I like narrowing this down to the specific account in this particular policy, so that if I were to reuse it in the future I'd actually review it before using it — that's the idea. I will say that if you wanted to remove permissions you totally could, but that might cause problems with the Serverless Framework itself that you'd have to look into even more. Now that we've got this policy, let's create it: back in IAM, into Policies, Create policy, click on the JSON tab, and paste it in. So now I've got that policy in there with, I think, everything we'll need — I did test this a number of times. Let's create the policy now; I'll call it the serverless-framework-permissions-policy, and notice that it has very limited access. I'll go ahead and create it, and we want to attach it to either our user or our group. I'll attach it to the group: click on the group, click on Permissions, Add permissions, Attach policies, and do a quick search for the policy we made.
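One scripted way to do the account-id find-and-replace described above is with sed; the placeholder token and file names here are assumptions, and the first line just writes a one-line stand-in for the real policy file from the repo so the command has something to act on:

```shell
# demo stand-in for policy.json (the real file is the IAM policy from the repo)
printf 'arn:aws:iam::ACCOUNT_ID:role/*\n' > policy.json

ACCOUNT_ID=123456789012   # placeholder — substitute your account id from the IAM dashboard
sed "s/ACCOUNT_ID/${ACCOUNT_ID}/g" policy.json > policy.local.json
```

Writing to a second file keeps the committed reference untouched while the localized copy stays out of git.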
That's the serverless-framework policy; I'll go ahead and attach it. Now, the important thing about using permissions and policies within AWS is that you can attach too much permission. Admin access gives quite literally full access to all AWS services, which could get you into trouble — you could accidentally provision way too many things and cause a lot of havoc within your account, and charges as well, especially if you ever accidentally expose your access keys. The policy we're using is really limited to what the Serverless Framework needs to do; it doesn't have unlimited permissions, just a few, which is why we created our own and why it works this way. So at this point we have all the policies in place to run what we need within the Serverless Framework. Before we do, I will say: if for some reason the policy is not working, we do have a gist available for the policy reference, with the latest up-to-date information related to this policy and the Serverless Framework, so be sure to check that out — it's linked in a lot of different places. The idea being that the policy we use needs to actually work for everything we're trying to do here. With that out of the way, let's actually run our deployment: npm run deploy, hit enter, and this will attempt to deploy now with all those policies in place. While it deploys, I'm also going to create another script in here called remove — deploying and removing are basically two sides of the same coin; it's really just that simple to deploy and then turn around and remove it. I'm going to let this finish — it might take a minute or a little more depending on our speed and everything going on in the AWS configuration — and then we'll come back
and actually test out that deployment. It's not quite done yet, but it did say a moment ago that it was uploading to S3, so since that just finished I want to look at S3 to see what that means exactly. If I do a quick search for S3 inside AWS, click on S3, and search the buckets for "serverless," we see our serverless application right there. This bucket is auto-generated for us — that's what the Serverless Framework does. Yes, we could designate our own bucket, but that configuration involves a number of steps we're just not going to cover. The idea is that we can click into this bucket, click on serverless, and look at our project, the stage we're deploying, the actual commit for the stage, and all of that same sort of data — the JSON state and all of this. If I open the state file once again, I'll see that my DATABASE_URL is there, and I also have my AWS access keys in there — that's not great either. All of these things we definitely want to move toward getting rid of, and we actually might have an error in our deployment because of those AWS access keys — and sure enough, by the time I said that, we do. We definitely do not want these access keys inside our deployment. I wanted to show you this error because it's related to our .env. We're at this weird spot: I need to use these keys, but I also don't want to deploy them, so how do I handle that? Luckily for us, the serverless dotenv plugin has the ability to do this — we can customize which environment variables get submitted to our backend. Under custom, and then exclude, we can put a list of environment variables that we don't want, like the access key here and the secret key here. We can do all of that, and it's really just that simple.
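Assuming the project loads .env through the serverless-dotenv-plugin (the narration doesn't name the plugin explicitly), the exclusion looks roughly like this in serverless.yml:

```yaml
custom:
  dotenv:
    exclude:
      - AWS_ACCESS_KEY_ID
      - AWS_SECRET_ACCESS_KEY
      - AWS_SESSION_TOKEN   # occasionally present as well
```

Anything listed here stays usable locally but never gets baked into the deployed function's environment.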
Every once in a while we might also have to exclude something like AWS_SESSION_TOKEN, and we'll eventually get to a point where we want to remove the DATABASE_URL too — we're not quite there yet, but we will be. With this in mind, I have some changes made, so I'll run the deploy again. Notice that those variables are still coming up, so it looks like I did something wrong — this should say exclude. I'll cancel that out and run it again, and now those environment variables are gone, the ones we simply don't want anymore. Great — this is exactly what we want. Once again it's uploading, and once it's done we can see the progress: going into prod, it shows a new folder based on the stage and also on the time itself, so this one looks to be a little later — I'd imagine this is the more recent one — and sure enough it's already uploaded. I can verify the state once again: opening it up, the things I'm okay with exposing for now are exposed, and the other ones are not. Once this actually finishes we'll test out the Lambda function — and we can also review the function itself inside Lambda, which is something I'll leave for you to explore on your own. For now I'll wait for this to finish... and there it is, fully deployed. If I go to this URL — let's open it in the web browser to make it a little easier to see — it might take a moment to spin up, but once it does it's going to be really, really fast, as we see here. There it is, and of course we can also curl it and get the same result. It has the timestamp coming directly from the database itself, which is what this is, and it's also still using the environment variable for the DATABASE_URL, which is not as secure as we want. What we need to do is move this toward being a lot more secure than it currently is, and that's definitely something we will do in
the next part. Overall, what we've seen here: we removed our .env variables from the deployment, and we can deploy really simply using basically the same things we've been doing with offline. Then there's our policy — the part that's trickiest to get configured, set up, and working correctly. If at any point you need to change things, or test which policy problems are happening, you can always go into IAM and into the policy directly. Let's search for it — serverless — and click on it; since we have the JSON in here now, if I go to edit, I can navigate to any statement block. Say, for instance, our S3 isn't working correctly — like this PutBucketTagging — and it showed you there was some issue with that. What you could do is go into that one statement block; notice that over here it says edit statement, and I can look for various actions. If I type out "tag" and scroll down, we see PutJobTagging, PutBucketTagging, PutObjectTagging — we can add these into this one statement really simply, and it adjusts as it needs to. This is a good way to experiment with what's absolutely necessary as you go through and start building things out. So, for example, let's just try this out: I'm going to get rid of this Describe right here — I want to refresh real quick — and edit out just one single permission so we can see what that error might look like. We come in here — it looks like I'm still editing — and I'll get rid of this Describe* for the CloudFormation stack itself, cut that out, hit Next, and save changes. I've now removed a pretty major permission, so going back into my local project, I'll run npm run remove, which will of course attempt to remove this deployment,
and right off the bat I get "not authorized to perform DescribeStackResource," so I can't even do anything here as far as deleting it — that alone stops any given user in their tracks. I can come back and change that once again by going into the CloudFormation section of the policy and bringing it back in, and that one single permission allows me to describe. We can verify this in the statement itself: go to CloudFormation and, as soon as you put Describe* there, it covers all of these describe actions right here — which is why they're highlighted and selected. Because of this, if I remove it and type out "describe," I can see that none of them are selected. It's actually kind of nice that I can go about doing that whenever I need to. So we definitely want to bring that back in — I don't need to do it manually; I'll just write in that one single change and save everything — and now we'll attempt the remove again. It might take a moment for the change to fully propagate, but it happens basically right away, really quickly. Getting in the habit of deploying and removing is perfectly okay, because our code itself is tracked by git, so we don't really have to worry about our code, and our actual database is not managed by Serverless at all — the only thing Serverless manages in relation to the database at this time is the DATABASE_URL, and very soon that's going to be abstracted away to something a lot more secure. Overall, the idea is being able to deploy and remove really quickly at a moment's notice, and having the correct permissions on your AWS user to do that. You could also practice right now: take a pause, actually remove access for this particular user, and see if you can recreate that access, because it's fairly straightforward —
but if you don't know how to do it, this is a good time to practice before we move on to securing that database URL a little more. So let's take a look at that now, if you're ready. Now we're going to configure our DATABASE_URL to be more secure using AWS Systems Manager. Do a quick search for Systems Manager inside AWS; we're going to configure this to work with our application — more specifically, through the Parameter Store of AWS Systems Manager. This is not the only place you can store secrets or variables you want to encrypt, but it is one of the best ways to do it. We're going to create a parameter here, starting with a slash: /project-name/stage/var-name, something like that — that's the format I use when thinking about these parameters. So this one will be serverless-nodejs-api, the stage will just be prod, and then database-url. That's it. This path is something we'll use in the future, so let's grab it: in index.js I'll add a const — this is going to be my DATABASE_URL_SSM_PARAM — and pass the path in. That's just the name we'll reference. Next we need the database URL itself, so I'll use the Neon command-line tool to get the connection string, copy it, and place it inside the parameter value. So we've got our name; we can use the Standard tier; we're going to use a SecureString so it encrypts the data; and we'll use our current account — you don't need to change anything else, the defaults are fine. The last thing is the value we want to encrypt, and then I'll create this parameter. It's encrypted at rest, so if I go and
look at it, it's stored encrypted until I decrypt it — this toggle decrypts the value right there, and of course if I go to edit it I can see it decrypted as well. So it is stored encrypted. What this allows us to do is a couple of things. Number one, we could abstract the parameter name to an environment variable. I'm going to leave it as a standard variable, but since you now know how to create environment variables, that DATABASE_URL_SSM_PARAM would be a good candidate to put in your environment. It's called SSM because that's just the default name for AWS Systems Manager, as we'll see when we start using the package inside our application. The main thing is that I then want to decrypt this data — I want to be able to look it up and grab it. Before I can, I want to take the DATABASE_URL out of the environment variables: inside my .env I can leave it, but inside serverless.yml I'll add it to the exclude list so it never goes in. At this point I wouldn't have access to my database client at all, and that's what we want to change. To change it, we'll use the AWS client for JavaScript — the AWS SDK — so we want to install it: I'll copy the install command and install it into our project. It's really important that we have it in our project as a dependency, because it is an actual dependency of our runtime. The way it works: inside index.js we bring it in with const AWS = require('aws-sdk') — no surprise there. Then we initialize an SSM connection: ssm equals AWS.SSM with the region passed in, and once again I'll add another const in here for AWS_REGION and
use us-east-2 — once again, you could make this an environment variable you change when needed; the region and the parameter work in tandem. This is going to let me get at all of the parameters, or at least all of the available parameters, inside that particular region — the Parameter Store in that region will show everything that's in there. Next up is the actual database URL: I'll say const dbUrl equals... well, what exactly? To get it, I'll bring in the data itself from the param store — let's call it paramStoreData — and this is await ssm.getParameter. We also want this to say new on AWS.SSM, because we're creating a new instance of that class. We pass in some arguments: the first is Name — this is the name right here — and the second is WithDecryption set to true. We stored it as a SecureString; if it were just a String, we wouldn't need WithDecryption, it would just work. Then, if I call .promise(), it turns this into an asynchronous call, which is exactly what I want. After I've got that, I can use this Neon connection in here with paramStoreData.Parameter.Value, just like that.
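The lookup described above can be sketched with a couple of pure helpers so the request shape is easy to see; the helper names and exact parameter path are my own, and the commented-out usage assumes the AWS SDK v2 (`npm install aws-sdk`):

```javascript
// Parameter path convention from above: /<project>/<stage>/<variable-name>
function ssmParamPath(project, stage, name) {
  return `/${project}/${stage}/${name}`;
}

// Request shape for SSM GetParameter; WithDecryption is needed because the
// value was stored as a SecureString.
function getParameterRequest(name) {
  return { Name: name, WithDecryption: true };
}

// Usage with the AWS SDK v2 (assumed setup, requires AWS credentials):
//   const AWS = require('aws-sdk');
//   const ssm = new AWS.SSM({ region: 'us-east-2' });
//   const paramStoreData = await ssm
//     .getParameter(getParameterRequest(
//       ssmParamPath('serverless-nodejs-api', 'prod', 'database-url')))
//     .promise();
//   const dbUrl = paramStoreData.Parameter.Value;
```

Splitting the path and request builders out like this also makes the stage easy to swap later without touching the SSM call itself.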
So this should actually now allow us to make this work. Obviously we need to make sure it's working locally before we push it to production, because there's something else we'll have to change in our production environment. Let's run npm run dev, and once again we see that DEBUG is in here; the DATABASE_URL is not, and I'm not even using it anyway — nowhere in my application does the client use that DATABASE_URL now. So let's open up localhost and just take a look — let's make sure I can still connect and still get that data. And sure enough, I am; it's still giving me exactly the data we were expecting from the database itself. In other words, this paramStoreData is actually working. We can console.log it and see that parameter store data as well: refresh the application, reload the page, and there it comes through — there's the data coming directly from the Parameter Store. Great. This same package could be used to set a parameter as well — that might be something we do later — but for now we have the ability to use a production-ready parameter for any given stage. Naturally, we could also come in here and add a const for stage, equal to process.env.
STAGE, or simply prod as a fallback; then we'd change the parameter path to something along those lines — it should go right here — and that gives us at least the ability to modify the stage. Once again, this could be its own environment variable. Now that we've got this, I need to prepare the whole thing for production. One of the things about AWS services is that they can communicate with other services without a secret key or an access key — they can just do it together. In our local environment, we have those keys in .env, and the only reason I can communicate with SSM using those environment variables has to do with what we did in the last part, in IAM. If we go into IAM, into our user groups, into the particular policy we've got on there, we can see something important that explains why this user can access that data: it's this right here, ssm with a star. Generally speaking, you probably don't want to give every resource full access to the Parameter Store; I did this as a quick and easy way to have access locally. Yes, it's something I'd want you to change as well — I knew I was going to point it out, so here it is, and this is where you'd change it, so that any given project doesn't have access to all of the parameters across your entire account. But here's the thing: when we deploy this to Lambda, it won't have permission — it won't necessarily have default permission in here just like this. So we need to add in the permission it needs; we need to grant it. One of the ways to do that is inside serverless.
yml: within the provider block, we can add additional permissions by creating iam, and under it a new role. We'll give the role a specific name that starts with serverless — I'll show you why in a moment — so serverless-ssm-role. Then come the statements we need: an Effect of Allow, a Resource — we'll say all for now — and then all the actions I want to allow. I'll copy and paste them, and we'll talk through them now and where you can diagnose these things: ssm:GetParameter — what do you know, we just used getParameter — ssm:GetParameters, which would be multiple parameters, GetParametersByPath, and so on. These aren't all of the permissions we necessarily need, and it's also opening up the resource everywhere. If we wanted to get more specific, we can go into edit on our policy — or any policy — because this is how we can start learning about those various permissions, like we've talked about before. We've got SSM here, which is Systems Manager; if I get rid of this all — remove that all call right there, get rid of that comma — and search for Systems Manager as the service, we start seeing all of the various actions we can use, and these correspond to the SDK itself, how the SDK can be used. Each one of these actions matters: we see GetParameter there; what if we search for create — there are a bunch of create actions, like create resource — and we also might want PutParameter, for adding one, and maybe DeleteParameter. All of the different permissions you might want for a resource — this is where you'd find them, and it's the easiest
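The role block described above might look like this in serverless.yml — a sketch, with the name and exact shape assumed from the narration, and the wide-open Resource being something you'd want to narrow later:

```yaml
provider:
  iam:
    role:
      name: serverless-ssm-role   # must start with "serverless-" under our policy
      statements:
        - Effect: Allow
          Resource: "*"           # narrow this to specific parameter ARNs later
          Action:
            - ssm:GetParameter
            - ssm:GetParameters
            - ssm:GetParametersByPath
```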
way to find them, in my opinion. Now of course we could just grant all of them here, but our Lambda role certainly does not need all of that; the actual Serverless Framework policy is mostly just for our deployment user. So going back to those serverless permissions — I'm going to abort all the saves and changes — these permissions right here have nothing to do with our Lambda project itself; that's what the new role does. The only way they relate to the Lambda project is that my local environment, or really this particular user right here, has access to do what it needs to do. These access keys and secret keys are not on AWS Lambda by default, which means this code would not work unless, of course, I were to add that actual role on here. And the reason we have this serverless- prefix on here is so that we can actually manage it: inside the JSON of our policy, if we scroll down a bit and look for IAM — let's do a quick find — we'll see that we've got some role actions here, CreateRole, PutRolePolicy, and so on, and notice the resource name: it starts with serverless-, which is exactly why I named the role starting with serverless-, then ssm-role. To be even safer I could say my-ssm-role or project-ssm-role if it conflicts with anything else the Serverless Framework gives us — but if I went in here and called it my-ssm-role like that, I'd have to update this policy for this particular user to adjust for it; it's only able to create a role if the name starts with serverless-. Great. Now that we've got this, let's try to deploy: npm run deploy, and what we should see is our two environment variables working. I don't have any errors so far about the role itself, but the main thing we want to
verify is that the application, once it's all said and done, runs off the database parameter that comes through — so I'll let this finish and then we'll check that out. And now I get this error: iam:TagRole — I don't have the ability to tag this role, so it could not perform the TagRole action. This actually gives us the opportunity to see how we can adjust our policy to make that work. Back in our Serverless Framework permissions policy, we'll edit it: I'll navigate all the way down to the IAM section, where we've got GetRole, CreateRole, and so on. In here I'll select it, go to the included service actions, and look for — was it put, or no, TagRole — so we search for tag role, and there are TagRole and UntagRole; I'll add both of those so I have the ability to do that tagging. We'll save this — hit Next, save changes — and then of course run the deploy once again. And that's really how it goes; this is how I was able to build out this policy in general. These things change every once in a while — there are little things the Serverless Framework adds that it needs access to, like tagging a role, which I don't think is so critical, but there may also be things behind the scenes I don't know about that have a good reason to tag that role or have that particular permission. So I'll let this finish, and we'll come back — and here we go, we got the success message that it was able to deploy. I'll grab the URL itself and do a quick curl; it might take a moment to boot up completely across the board, but once it does, it gives us the results we were expecting. Now what I want to do is simulate a problem with something on Lambda itself — as if there were some issue on Lambda. Let
me just go ahead and break the parameter name — I'm getting rid of the "l" there altogether — and the only reason is just to see what's going to happen with these results. I also want to bring in something else: I'm going to grab this right here and call it now, and then declare a delta equal to Date.now() minus the start time, divided by 1000. I want to see how long it takes the application to reach the database and come back, and that's what this new delta will show — just so I can see exactly what's happening with the database. Once we actually get everything working, we'll see a delta — something that changes at least a little bit. And this is going to be our now — or maybe our dbNowResult; let's make it a bit more verbose so it's not too confusing. There we go. Now that we've got this, let's do another deploy — I don't need to push anything; I just run npm run deploy and it starts the process. What's going to happen is it's going to fail: it's not going to find the parameter, and it's going to be a major system error. So what we're going to need to do is look for Lambda — we can actually go into Lambda itself, no need to type out AWS — click into the function, and we'll see very soon where we can track down the problem: it's under the Monitor tab, right here, where we can go to View CloudWatch logs. This lets us see some of the errors that come through for this particular project from these log streams, and other logs as well, like the things actually working. The idea is that this log stream is how you might diagnose errors with your particular project.
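The timing logic from a moment ago can be sketched as a small helper: capture a start time, run the query, and report the elapsed seconds. The helper name is mine, and the commented query assumes Neon's tagged-template client:

```javascript
// Run an async query and report how long the round trip took, in seconds.
async function timedQuery(runQuery) {
  const start = Date.now();
  const result = await runQuery();           // e.g. await sql`SELECT NOW()` against Neon
  const delta = (Date.now() - start) / 1000; // elapsed seconds, as in the narration
  return { result, delta };
}
```

Returning both values together makes it easy to include the delta alongside the dbNowResult in the endpoint's JSON response.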
In my case, that error is going to be that the parameter is invalid — and the application itself won't attempt to use that parameter until it attempts to use the database client, at least at this point. Realistically, when I boot the application I might want to initialize the database client right away instead of always inside the endpoint itself, but those are optimizations that aren't really necessary right now. The idea is that we've got an error coming through, and I just want to see if the deploy is done — and here we go, it's done. I'll do a curl here, and we should get an error back: invalid, or an internal server error. So let's go back into our production logs; we've got two new streams here. I'll go to the newest one, click on it, and we see that we've got a runtime error right here: parameter not found. That's the actual error itself, and it obviously means the parameter name was not found. I don't see anything in here for catching or handling that error — right here we might want to handle it, or down here we might want to handle it in the actual application itself — but I wanted to simulate what would happen if there were an error with the parameters themselves, so we can update things and make sure they're working correctly on our next deploy. This is something you'd also want to test locally and make sure all of that is working, but it is important to be able to see the logs when you need to inside a Lambda function. Just a quick recap on how to get there: you go to the Lambda function, go to Monitor, and go to "View CloudWatch logs," and then you'll be able to see all of the various logs in there, including the metrics for the application and how quickly everything is working. So once this actually finishes, I do want to see that updated delta.
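Since nothing in the handler catches the parameter-not-found failure, one hedged way to handle it — a sketch under my own naming, not the course's code — is to wrap the lookup so the endpoint can return a clean error instead of a bare runtime crash:

```javascript
// Hypothetical wrapper: a missing SSM parameter yields a structured result
// instead of an unhandled runtime error in the Lambda.
async function safeLookup(lookupFn) {
  try {
    return { ok: true, value: await lookupFn() };
  } catch (err) {
    console.error("lookup failed:", err); // this line is what lands in CloudWatch
    return { ok: false, error: String(err) };
  }
}

// In the endpoint (sketch):
//   const result = await safeLookup(getDatabaseUrl);
//   if (!result.ok) return res.status(500).json({ error: "config unavailable" });
```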
I want to see what that value is, because it's pretty impressive. While that's still building, I'm going to run npm run dev, and then I want to compare and contrast the production version versus the local one. If I do curl http://localhost:3000, I should get that delta — and it gives me a pretty fast delta locally. Then if I go into production and do the same thing, it will take a moment to boot — the first request might be a little longer — but after that it's going to be just as fast or a lot faster. It really depends on where you are located in the world, and that affects how the delta ends up as well. But overall, that's it: we can now check how fast we can get a database call and a return back to us, have the database spun up, and have our application live and out there so we can use it in many different ways — which is great. From here on out we just need to build on top of this, because we realistically have the foundation we'd need to build full-fledged applications: we've got our database, a way to deploy it, and a way to secure the sensitive data itself. It's fantastic. Now, one of the things we're not going to cover is custom domain names. Adding a custom domain name to this system isn't natively supported by serverless, because there are a lot of different ways to put custom domain names on AWS Lambda; there is some support out there for them, but it's just not something we're going to cover at this time. That would be the only thing I'd want to improve about this project for fully production, consumer-facing projects. If it's not consumer-facing — which is kind of what we're assuming — this could go anywhere: we can use it anywhere just by using that endpoint itself and making updates as we see fit. Now, we're going to go ahead and
decouple our application so it's a little more concise in how it works and just cleaned up. Then we're also going to update our AWS SDK — we want to upgrade it to a newer version. The older version is still valid, it still works, and it'll probably work for years to come, but overall we want to start adopting the newer stuff: the package itself is smaller, and a little easier to use than what we've got here with this .promise() call and all of that, which is kind of a mess. So let's use the new package, which is @aws-sdk/client-ssm. We'll install it into our project, and since we're installing that, I'll also run npm uninstall aws-sdk — I want to get rid of the old one right away to make sure I don't have it any longer. With this in mind, I want to change how I grab that parameter. To do this, I'm going to create a new module at lib/secrets.js, and I'm quite literally going to copy everything from this DB client section up. This is where we'll change everything: I'll get rid of all the Express.js things we simply don't need in here, and this will just be getDatabaseUrl now, with roughly the same values. I don't need the Neon stuff just yet — we'll decouple that one as well — but overall, if I were to keep the old version, this is how I would decouple it, and then I'd update where I need to use this database URL with the imports, as we'll see in a little bit. Okay, so with these things in mind, I want to use that new SDK. We can look in the documentation: scroll down a bit and you can see the installation process (we've already done that), and the import process, which is basically this right here. I'm going to copy this and paste it up here, and I'll get rid of this old AWS import. And to create the client, we'll
come down here and say const client equals — well, let's see how they do it in the documentation: new SSMClient(). Hey, what do you know — it's very similar to what we saw before; in fact, we can pass the region in exactly the same way. So that modification was really simple, and it's not a whole lot different going forward. Next is the actual param data: I'll leave that in, but I'll get rid of the old getParameter call style and just turn this into a plain object. Then we need to create this as a command, so we say const command — and what we see in the documentation is this ListAssociationsCommand, which is not what we want. There are a lot of different commands you can run in here; if you scroll down, you'll see all of them, and the one we need is of course the get-parameter command. GetParameters is in here, so let's grab that — unfortunately, searching doesn't surface the exact command I want, but if you go into the API reference you'll see it. What I know is I can just change this to GetParameterCommand — it's really that simple. The command itself will then be new GetParameterCommand with these arguments, and then we grab the result with await client.send(command). The final data is simply result.Parameter.Value. And that's it — just a small modification to how this works. Then, at the bottom, we'll do module.exports.
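Put together, the new lib/secrets.js comes out roughly like this — a sketch assuming the v3 SDK; the parameter name and the environment variables shown here are illustrative placeholders, not values from the course:

```javascript
// lib/secrets.js — sketch using @aws-sdk/client-ssm (the v3 SDK).
const { SSMClient, GetParameterCommand } = require("@aws-sdk/client-ssm");

const client = new SSMClient({ region: process.env.AWS_REGION });

async function getDatabaseUrl() {
  const paramData = {
    Name: process.env.DATABASE_URL_SSM_PARAM, // whatever name you stored in SSM
    WithDecryption: true, // only needed if the parameter is a SecureString
  };
  const command = new GetParameterCommand(paramData);
  const result = await client.send(command);
  return result.Parameter.Value;
}

module.exports.getDatabaseUrl = getDatabaseUrl;
```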
We'll export getDatabaseUrl as that function. Great. So what we could do now, of course, is change our DB client to work off these secrets. I could do it in index.js, or I can follow this pattern of decoupling and move the DB client into its own location. That's what I'll do: db/clients.js — I'll use the "s" in there for a reason, as you'll see once we get to the Drizzle stuff. I'll copy all of this once again, just because it's already mostly working, and I'll get rid of the things I don't need, which are serverless-http and Express. I also won't need this AWS part here; instead, right under the Neon import, I'll do const secrets equals require of my new secrets library, and that's the one I'll use. So I'll get rid of all of the SSM stuff in here now — this param data store is no longer needed, and we'll get rid of all these comments even. I'll just do const dbUrl = secrets.
getDatabaseUrl — and of course this is an asynchronous function, so we await it. Now I can use that database URL just like that, so it's cleaned up a lot in both places, if you ask me. The database URL's SSM parameter name I might put in here, or maybe accept it as an argument if I needed to change it and support other things — but overall it's definitely cleaned up a bit. The last thing, of course, is to bring our database client into the index itself. To do that, we'll do module.exports, and the database client export is going to be that function — let's stick with the same naming convention we just used with getDatabaseUrl and call it getDbClient, changing it ever so slightly. Now in index.js, all we need to do is bring that in as well; this time I'll just import the function itself: const getDbClient = require('./db/clients'), just like that — and of course the index file isn't inside any given folder, which is why we can use that path. Now that we've got that, here is the old DB client, which I'll get rid of, and sql itself can just be await getDbClient(), right there. I can get rid of all the old AWS client stuff, as well as Neon, right inside this index.js, because it's now imported in places where it makes a lot more sense. Of course, the last thing is to test that this is actually working, so let's do npm run dev — there's my dev server running — and a quick curl call. After a moment or two, because everything needs to spin up, it does that exact call. Great. So there's one last thing I want to fix, and that's this delta — I don't think it's actually accurate. What we want is to grab now — Date.now() — and call it before we run our SQL command, so the captured time is in the past and the delta measures the round trip to the database.
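The decoupled client module described here would look roughly like this — a sketch that assumes lib/secrets.js exports getDatabaseUrl as shown earlier and that @neondatabase/serverless is installed:

```javascript
// db/clients.js — sketch of the decoupled database client.
const { neon } = require("@neondatabase/serverless");
const secrets = require("../lib/secrets");

async function getDbClient() {
  const dbUrl = await secrets.getDatabaseUrl();
  return neon(dbUrl); // the sql tagged-template client
}

module.exports.getDbClient = getDbClient;

// And in index.js (which sits outside the db/ folder):
//   const { getDbClient } = require("./db/clients");
//   const sql = await getDbClient();
```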
So we're going to change the ordering of this slightly, and I'll get rid of the divide-by-1000 here for a moment. We'll save that, refresh real quick, run it, and just see how quickly I can grab this data. Okay, so let's take a look at the delta now — it's a different number, still in milliseconds, so we can put the /1000 back to make it read as seconds. Overall, we just wanted something a little closer to how long it actually takes. I'm making this request from the other side of the world and it's still going pretty fast, which is nice. With this in mind, we'll do npm run deploy — I did say the other side of the world; I meant the other side of the country, that's all. So I'll deploy it and let that run. There's one more thing I might want in my scripts here, which is simply info: every once in a while when you run deploy, it won't necessarily show the endpoint you're using, so an info script will give us that endpoint. We can see that by running npm run info — it gives us the data for the current deployment, including the endpoint. The endpoint could change if you took the stack down and brought it back up — it will most likely be different — but I'm going to let this finish, and then we'll verify everything is working with our stuff decoupled. Looks like it finished — it was pretty fast this time — so we'll curl it out, and there we go: the delta is really, really small. That's the point of what we see there. As for the timestamp itself, it's a little tricky to get a perfectly accurate representation of the timing that's happening.
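The scripts section of package.json at this point might look something like this — a sketch; your dev and deploy commands may differ from the ones assumed here:

```json
{
  "scripts": {
    "dev": "serverless offline",
    "deploy": "serverless deploy",
    "info": "serverless info"
  }
}
```

The serverless info command prints the current stack's endpoints without redeploying, which is what makes it handy here.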
But overall, we do see that the actual delta is not really that different — it's almost instantaneous because of how close the function is to the database — whereas when it ran against my machine it was a little longer, which makes sense because of how far away it is. But it is very fast. Cool. So now that we've got this decoupled and we're using the new AWS client, it's time to start building out our schemas so we can get to the process of actually putting stuff into our database. We're going to implement Drizzle ORM. ORMs are great because they allow us to use our runtime language to manage our database — in this case, our runtime language is JavaScript, it's going to be managing our Postgres database, and Drizzle ORM is going to handle that interaction for us. So we can write a bunch of JavaScript instead of a bunch of SQL. It will allow us to insert data like you would with SQL, retrieve data, delete data — anything you'd need to query — but it also treats the data with the same data type. In other words, if it's an integer in the database, JavaScript is also going to treat it as an integer. This is also true with timestamps, and we've already seen that: using just the Neon database client, we saw that SELECT NOW() doesn't come back as a string — it comes back as a Date object itself, which is how we were able to do our delta. In other words, I don't have to parse the date from it or build out some complicated parsers just to use the value from the database. And that's the key here: we want it to be super easy to use values directly from the database without making any major changes. So let's create our first schema. Inside of db here, we'll do schema.js — or let's call it schemas, actually. So schemas.
js is where we'll define our tables. What we're going to do is const leadTable, and for now I'm just going to put a plain object here and really think out what I want in a lead table. Well, I want an email; perhaps a timestamp — a created timestamp; maybe a description; maybe a source; maybe a first name and a last name. These are all columns I might want to store in my database. To keep things simple, I'll just do email and timestamp for now — but instead of calling it timestamp, I'll call it createdAt. So these are the fields we'll want to use, and we need to design them in a way that makes sense the Drizzle way. Let's comment this out for a moment and take a look at the Drizzle documentation to see how this is done. So inside of Drizzle's docs — orm.drizzle.
team — if you scroll down, we've got these column types for managing the schema. This is where I like to start, because I want this to manage my database, so we need to begin by designing the tables we're going to create. Of course, you could reverse-engineer existing tables too, but that's outside the scope of what I want to accomplish — we just want to create the ones we want to create. So in here, let's look for a column type called email — and we're going to be severely disappointed: there is no email column type. This is for good reason: an email is really just a string of data, like abc@abc.com. Validating that it's an email is something we'll do later; for now we just need to treat it as data, and that is text. So we're going to follow this line of thinking to build out our lead table. I'll copy and paste the example in, but we can't use it in that form — we have to change it to const equals require, which is how we do our imports with the ES5, or CommonJS, module style. The next thing is our table: the const name — the actual JavaScript variable — we want to keep as leadTable, and the actual database table itself I'm going to call "leads". So this is my JavaScript name for the table, and this is the actual database name for the table. The first column is our email: we leave the key as email, it's the text data type — the text column — and inside of it we leave the database column name as "email" as well, so they correspond. You don't always have to do that, but it's one way to do it.
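Translated into CommonJS requires as just described, the first cut of the table looks roughly like this — a sketch assuming drizzle-orm's pg-core helpers:

```javascript
// src/db/schemas.js — first pass: just the email column.
const { pgTable, text } = require("drizzle-orm/pg-core");

// leadTable is the JavaScript name; "leads" is the actual database table name.
const leadTable = pgTable("leads", {
  email: text("email"), // JS key and database column name happen to match
});
```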
Next, we're going to do our createdAt. Within JavaScript I typically use camel case — I think that's the best practice in JavaScript — but in the database I might name it a little differently and use snake case, which is common inside databases (though you could use camel case there too if you want). So what I've got here is a leads table — but hopefully a little red flag or bell is going off: should I treat something like createdAt as text? Well, no — there's another data type, and we've already seen it and used it, which relates to a timestamp. Going back into the column types, we can scroll down and see there's one called timestamp, one called time, one called date. I can use a timestamp, and that's what I want — so I'll come into my schemas and use timestamp here instead. We can also set a default with defaultNow. defaultNow works on timestamps; it doesn't work everywhere else. If you wanted to set a default for, say, a description, let's do that now: we'll add in that new column, and I can add a default with simply .default("this is my comment") or something like that. We can really build this out however we want, based off of the tables we have here. Another advantage of using specific data types has to do with doing calculations or running functions inside the database itself. If I jump back into the Neon console, I want to show you this in action. First off, this SELECT here — I can run it, and it gives me a bunch of numbers: generate_series generates numbers from, in theory, 0 to 100, but it actually comes to 101 rows — it goes from 0 to 100 in however many rows that takes, which is what generate_series does. And now we
can also count how many rows there are by selecting COUNT and running that — a database function to count up all those rows. We can also use a database function to sum up all of those rows; there are many different functions you can use, and you can see how this would work on modified values as well: if you multiplied each row by 1.2, that would give us a different sum, whatever that ends up being. And of course you can run much bigger series as well. This is where databases start to really outshine doing something directly in JavaScript: a calculation this big you'd do on the actual database side, and then bring the result into JavaScript. So that's yet another reason to store these fields with the correct data types. In the case of the timestamp, this is great because I can use JavaScript to calculate how long it's been when I want to display it; and if I wanted to track something like a number of entries, I could use an integer type — we come up here, we've got integer, and I could bring an integer in here with a default of zero. (The integer itself we'd also want to import.) Okay, so there are a lot of different ways to think about writing out these tables. I'm actually not going to do that number-of-entries column this way, because there's a way to count entries in the database where the email is the same — that's outside the scope of this one — but these are generally the fields I want to have. There is one more field I want to keep, though, and that's an ID field: every time I do an entry, I want a new ID. The way we do that is by using something called serial, which we can set as our primary key.
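The Neon console experiments described above boil down to a few queries like these (the ×1.2 multiplier is the illustrative one from the video):

```sql
SELECT generate_series(0, 100);                         -- 101 rows, 0 through 100
SELECT COUNT(*) FROM generate_series(0, 100);           -- 101
SELECT SUM(s) FROM generate_series(0, 100) AS s;        -- 5050
SELECT SUM(s * 1.2) FROM generate_series(0, 100) AS s;  -- 5050 * 1.2
```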
This will be our primary key for the database, which helps with indexing and speed and all sorts of things. We'll also say notNull — in fact, wherever I don't want a column to be empty in the database, I'll say notNull. The description itself can be empty; that's okay. So now I've got the schema I want to use for my entire project: a table with id, email, description, and so on. It's time to actually bring this into our database — but the thing is, I don't have a way to bring it in just yet. So let's do module.exports and leadTable. What we've got here now is the start of creating an actual table in the database, but we don't have the SQL behind it. We need to create that SQL, then run — migrate — that SQL and actually execute it, and that's what we'll do in the next few sections. Now we're going to convert this JavaScript schema into an actual SQL migration file. To do this, I'll start off by creating a folder called migrations. I'm putting it inside src because I really just want this all bundled in one place; sometimes it's recommended that you put it in your main project root instead. With that folder existing and my schemas existing, I'm going to create a new file called drizzle.config.js.
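Putting all of that together, the finished schema described above comes out roughly like this — a sketch, with the database-side names following the snake_case convention mentioned in the video:

```javascript
// src/db/schemas.js — the lead table with id, email, description, created_at.
const { pgTable, serial, text, timestamp } = require("drizzle-orm/pg-core");

const leadTable = pgTable("leads", {
  id: serial("id").primaryKey(),                        // auto-incrementing primary key
  email: text("email").notNull(),                       // must never be empty
  description: text("description"),                     // optional, so no notNull()
  createdAt: timestamp("created_at").defaultNow().notNull(),
});

module.exports.leadTable = leadTable;
```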
This configuration is fairly straightforward: we declare a constant called config and set it equal to — well, our schema, the location of where it is, and then where we want to output the generated SQL, as in that migrations folder — and then we export it as a default: export default config. The reason I can do this is because of the way I'm going to use this file, right inside package.json. I'll come below here and add a script called generate, and this is where we use drizzle-kit. drizzle-kit will let me run drizzle-kit generate:pg — as in Postgres — with --config=drizzle.config.js, that file name and file path. In this case, drizzle.config.js sits right next to package.json; if you had it in the src folder, you'd adjust the path accordingly. So with this in mind, I now have a way to call this file, but the configuration isn't correct yet. I'll copy the relative path to the schema and bring it in, and — what do you know — copy the relative path to migrations and bring that in too. We can use ./ if we need to, but the idea is that it's a relative path from wherever drizzle.config.js lives — and once again, if it were in src, we'd drop the src prefix, no surprise there. Okay, with this, let's run npm run generate and hit enter. All this does is create our SQL file — quite literally the SQL that will run to match this schema right here. If we were to change anything — say we got rid of this description — and generate the migration file again, it once again shows me exactly what needs to run for my database to match my schema. It's going to do each one of those things, and it's really just that simple. And if I were to add a whole other table?
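The config file and the generate script together look roughly like this — the paths assume the schema and migrations live under src/, as in the video:

```javascript
// drizzle.config.js — sits next to package.json.
const config = {
  schema: "./src/db/schemas.js", // where the tables are defined
  out: "./src/migrations",       // where generated SQL migrations go
};

export default config;

// And in package.json:
//   "generate": "drizzle-kit generate:pg --config=drizzle.config.js"
```

The generate:pg subcommand shown here matches the drizzle-kit version used in the video; newer drizzle-kit releases have reworked this CLI, so check your installed version.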
Let's add in a whole other table — I'll call it leadTable2 and "leads_two" or whatever; let's export it too (just make sure you export it as well) — and then I run the generate again. Now it shows me, hey, what do you know, it's creating a new table in here. Since I haven't actually touched the database yet, I can feel comfortable deleting this migrations folder. If I run generate again, it's either going to give me an error or it's going to redo the entire thing — in this case the migrations folder was recreated, though I've had times where it wasn't — and what we see is that it now combines those two changes into one migration. This is where you can really learn a lot about SQL itself, because we're using JavaScript to generate that SQL. In my case, I want to get rid of all that and keep just that one single migration file, and that's where we'll leave it. One other thing to note: this meta folder in here keeps a snapshot of each change that's happening — it's keeping track of the changes between these SQL files. So when we run generate and nothing has changed, that's how it knows not to do anything. It's pretty nice that it has a history of these changes. The other great thing is that I can check this into git, so I can keep those changes throughout and see all of the schema changes that happen. If at any time I need to roll back those changes, I could; and I could also run these migrations on, say, a branched version of the database and make sure everything is working correctly before running them on the main version — which is yet another thing to think about when developing with schemas and Drizzle ORM. So at this point,
now that we can generate our migrations, we actually need to perform them, and to do that I'm going to make my own CLI tool — just because it automates things a little for the serverless environment we have, and more specifically for our Neon database and all of our Neon client–related pieces. As it stands, this generated SQL won't perform the migrations for us; we need a slight variation on it. So let's take a look at how to do that. Now, there are a few ways to perform these migrations. One is to just copy and paste the SQL and run it wherever you run SQL — for example in index.js, or in the console — but those approaches aren't scalable and they're prone to a lot of errors. Instead, we want to automate them. To do that, I'm going to create a new file at src/cli/migrator.js. This is going to be invoked with tsx — so tsx src/cli/migrator.js is how we want to call it — and the way the script detects it's being run directly is: if require.main equals the module itself, we'll console.log "run this!". Obviously we need to fill out all of these things, but before I do, I want to make sure tsx is even working. So in my package.json I want to add two dev dependencies: npm install --save-dev dot
env and tsx. dotenv is going to play an important role, as we'll see in just a moment. In my migrator here, let's just run this: I'll add the command to package.json and call the script migrate — there we go. So npm run migrate, hit enter, and it runs. Great — that's exactly what we wanted to see. The next thing I want to check is the environment variables — the access key and secret access key; we really just need one of them. In my migrator I'll do console.log(process.env.AWS_ACCESS_KEY_ID) — and, ooh, undefined. This is why we use dotenv: if I just add require('dotenv').config() and run it again, now we can see that environment variable, coming directly from the .env file. So just like our serverless package can use .env, our CLI script can also load in those environment variables like that. Now, why is this important? Hopefully you realize it's important because we need to grab our secret — we need to get that database URL. So back in my migrator, let's import it by doing const secrets = require('../lib/secrets'). And now, how am I going to call this secret, given that it's asynchronous? I'll do async function performMigration, then const dbUrl = await secrets.getDatabaseUrl(), and then I'll just console.log that dbUrl. Then, in the main-module check, I can do the promise handling: .then with the value, and .catch with a console.log of the error. We can exit the process with process.exit(0) on success, or process.exit(1) as unsuccessful.
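At this stage the migrator script looks roughly like this — a sketch of the skeleton before any pool or Drizzle logic is added:

```javascript
// src/cli/migrator.js — run via: tsx src/cli/migrator.js (the npm "migrate" script)
require("dotenv").config(); // load AWS credentials from .env when not already set
const secrets = require("../lib/secrets");

async function performMigration() {
  const dbUrl = await secrets.getDatabaseUrl();
  console.log(dbUrl); // just verifying the secret loads, for now
}

// Only run when invoked directly, not when imported elsewhere.
if (require.main === module) {
  performMigration()
    .then(() => process.exit(0))  // success
    .catch((err) => {
      console.log(err);
      process.exit(1);            // failure
    });
}
```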
Okay, so there we go. Now I want to see this database URL: let's run it again, and our database URL shows up. At the very minimum, this is how I can use those environment variables. We might need to change them or use different ones, but it's important to see what happens when the environment already has values: let's change things slightly and export the access key as something like "abc". Now it's in the environment; I'll comment out the dotenv line, run this again, and now it just shows me abc — and I get an error. That is why we use require('dotenv').config() — though it's not strictly required: as long as the environment variables are actually in the environment, it will use them. In our case they won't be, so we add dotenv to get the correct data — which, of course, is what was still causing this error. So let's refresh the terminal, create a new one, and npm run migrate again — and there we go, now we've got that data. Okay, cool. Now we need to build out and use that database URL, and to do this we need a pool — a Postgres pool, but the Neon serverless Pool. The idea is to support multiple levels of a transaction that might need to happen, which is exactly why we're using the serverless Pool. The repo for the package actually has some good documentation on how to do exactly what we need: using the Pool with a websocket. So that's what we're going to do — I'll copy all of this and bring it into our command-line code. The first thing is to take the import statements and change them so they're CommonJS requires — like that, and the same with this one. And here we go. Great, so let's go
Let's tab this in a little bit. First and foremost, the connection string: what do you know, it's set to this database URL. It's no longer process.env; it's just that database URL variable. We also have a handler for errors with the pool itself; in that case we'd basically want to reconnect or try something new. The next part is how we actually run the query, and the piece we're really going to change is right here: this is where we grab drizzle and change how we use it, because inside of here is where we'll run our migration. So at the top we'll do const { drizzle } = require('drizzle-orm/neon-serverless'). Then down here we'll do const db = drizzle(client, { schema }), which means we also need to define where our schema is. Right above secrets, I'll bring in the entire module, pointing at our db/schemas; there are our schemas. Next we want to bring in migrate: const { migrate } = require('drizzle-orm/postgres-js/migrator'). This is a manual way to call migrate, and the next part is just await migrate(db, { migrationsFolder: ... }), using the migrations folder, which in our case is in src/migrations, right here; we grab that relative path and put it in. That's it; that's how we'll run those migrations. At the very end, we await pool.end() just to close out the connection after all of that is done.

This is definitely spread across a lot of different places in the documentation, but the key thing is that we've got our database. And of course, if there's not a database URL, we'll say if (!dbUrl) return, something along those lines. Now we can run this migration again. We should get rid of these console.logs, but overall it should actually update our database, and we'll see that: I'll console.log "running migrations" at the start, then "migrations done", get rid of this, and then "migrations error". Okay, with that in mind, let's run it again: no big deal, "running migrations", "migrations done". Great. To verify, let's go back into our Neon console, into our tables, and what we should see is our leads table; inside that leads table, if we look at its details, we see the different columns and their data types. Great, fantastic.

So let's take a look at our actual schemas themselves; notice the id is down here, as well as email and createdAt. Let's bring in the description column and do an end-to-end example where we run our generate and then run our migrate. Okay, done; refresh in here, go to leads, and look: the description is right there. So now we have a way to change the design of our schema based on whatever columns we might want, and a way to actually migrate that schema. Now, the client, the Pool itself, can do a lot of different transactions at once, which is why we use the Pool, and it does it using WebSockets. You're not going to do this in a serverless environment, which is why we needed to create a command-line interface for it: you would not run these migrations once you deploy to serverless.
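Pieced together, the migration script looks roughly like this. This is a sketch, not runnable as-is: it assumes the @neondatabase/serverless, ws, and drizzle-orm packages are installed, and the ../lib/secrets and ../db/schemas paths are guesses at the course's project layout (the migrator import path is the one used in the video):

```js
require('dotenv').config();
const { Pool, neonConfig } = require('@neondatabase/serverless');
const ws = require('ws');
const { drizzle } = require('drizzle-orm/neon-serverless');
const { migrate } = require('drizzle-orm/postgres-js/migrator'); // path as spoken in the video
const schema = require('../db/schemas');   // assumed path
const secrets = require('../lib/secrets'); // assumed path

// The Neon serverless Pool runs over a WebSocket, per the package docs.
neonConfig.webSocketConstructor = ws;

async function performMigration() {
  const dbUrl = await secrets.getDatabaseUrl();
  if (!dbUrl) return;
  const pool = new Pool({ connectionString: dbUrl });
  pool.on('error', (err) => console.error('pool error', err));
  const db = drizzle(pool, { schema });
  console.log('running migrations');
  await migrate(db, { migrationsFolder: 'src/migrations' });
  console.log('migrations done');
  await pool.end(); // close the connection when everything is finished
}

performMigration()
  .then(() => process.exit(0))
  .catch((err) => {
    console.error('migrations error', err);
    process.exit(1);
  });
```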
You run them beforehand, which is even more reason to get the production database URL, or whichever one you're wanting to deploy to. You could add some additional arguments for that; maybe you always have the production database URL as part of your secrets, so in here you might grab this, make a getProdDatabaseUrl, and change the stage to simply "prod". I'm not going to do that; I'll leave it as is, because that's really the only string we're using as production, but you could also change the stage itself in the environment if you wanted something different: get that database URL assuming it's available, and if it's not, it'll just return and not try to migrate anything. So yeah, the Pool is great; you can run a lot of heavy queries with it. If you were not deploying to serverless, this is probably the method you'd want to use for your database in general, for those bigger transactions that might take a little longer. Cool. Now that we've got this, we just need to start inserting data into our database using CRUD methods, and then we'll need to validate that data to make sure it's valid. Those are really the main things left to do as far as the schema- and database-related queries are concerned.

With our schema created and our migrations done, we're now ready to start actually creating and retrieving data from our database using drizzle and Neon. To do this, we'll use curl. We'll run curl http://localhost:3000/leads, and that will be how we get the data; in this case it's currently not found. Then we'll use the same exact idea for sending POST data. To POST data with curl, we do -X POST, pass in a header, and this particular header is a Content-Type of application/json, which is very common for data going through an API of some kind; then we pass in data, with single quotes on the outside and double quotes on the inside, something like {"email": "test@test.com"}, against that same endpoint. So those are the two endpoints we need to configure.

The first step is to let our Express.js application handle this data. In index.js, grab this GET path and just say /leads; this will let us get that data. If we run that curl command after restarting our server, we can see the response come through, which for now just says "hello world". The POST handler is no different: grab the same thing and use .post instead of .get, and that's it. With that in mind, let's curl again; actually, we'll restart the server one more time, then press up a few times to that curl POST, and once again it says "hello from path". So both methods are now working, HTTP POST and HTTP GET. Typically speaking, you use the POST method to create data, or sometimes update it (often you'd use a different method for updates, but we'll leave it as create here). The way we can actually create data is, first and foremost, to change the callback handler to asynchronous, and then grab the data coming through from the request with req.body.
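Written out cleanly, the two curl commands described above look like this (assuming the dev server on localhost:3000 and a /leads route, per the video; these need the server running, so they're not runnable on their own):

```shell
# GET: list leads
curl http://localhost:3000/leads

# POST: create a lead with a JSON body
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"email": "test@test.com"}' \
  http://localhost:3000/leads
```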
What we want to see here is just the body data echoed back, so we'll return something like that. Once again, restart the server, run another curl command, and there's our body data coming back, but it looks kind of weird: it says Buffer. The reason is that natively this data isn't automatically turned into JSON. To turn it into JSON, and basically to accept that application/json Content-Type header, we just change our Express app slightly with app.use(express.json()). With this in place, restart the server again, run the call again, and now it actually echoes that data back to us, which means we can start bringing it into our database. So the body: we can now say that's our data, the email is inside of that data, and I can just echo back the email. Great. So here is where we need to insert data into the database. How are we going to go about doing this? First and foremost, in the db folder I'm going to create a file called crud.js. We're focusing on the C and the R of CRUD, that is, creating data and retrieving data; that's it, nothing more at this point, but once you have this foundation in place you'll be able to use it wherever you need it. The main thing is to create another asynchronous function: async function, and we'll call it addLead, or maybe newLead; newLead might be better. In here we just take in the email itself, and for now I'll console.log the email. So that's that function.
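The Buffer point above is worth seeing in plain Node, with no Express needed. A raw HTTP body is just bytes; express.json() is what does the Buffer-to-object conversion when the Content-Type is application/json:

```javascript
// A request body as it arrives on the wire: raw bytes in a Buffer.
const rawBody = Buffer.from('{"email": "test@test.com"}');

// What req.body looks like WITHOUT app.use(express.json()):
console.log(rawBody); // <Buffer 7b 22 65 ...>

// What express.json() effectively does for you:
const parsed = JSON.parse(rawBody.toString('utf8'));
console.log(parsed.email); // "test@test.com"
```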
Next, we'll create another function called listLeads; we don't necessarily need an argument here, and I won't console.log anything at this point. Then we do our module exports: module.exports.newLead = newLead, and module.exports.listLeads = listLeads. Now, no surprise here: to use this back in our Express.js app, we can just require it. I'll say const crud = require(...) pointing at our db/crud module. The reason for that is that in my /leads handler I can now grab my data: let's say our results equal await crud.listLeads(), and down here for the single-result case it will be newLead, passing in that data. My GET handler also needs to change to an asynchronous callback. That's the goal we're working towards: a quick and easy way to add a new lead, and all that. Nothing about this so far should be that challenging; if it is, this might be a little too advanced for you. Overall, what we want to do now is use this schema to do those two things: getting new leads into our database and listing them out. Of course, the first thing I actually need to do is add data in there before I can list any of it out. So how do we do this? If we jump into our client, we have a database client, but we don't have anything related to drizzle yet, and drizzle has its own database clients. If we go into "Access your data" in the drizzle docs, what we see is this drizzle command that takes in a client and can also take in a schema. So what we need is just the drizzle portion of this; basically, we want to have a drizzle client.
So now I'm going to add a new client function inside my database clients module, and it's going to be getDrizzleDbClient, roughly speaking. The SQL client itself will come from the other, normal getDbClient, because we definitely need to use that one. All I need to do is bring in drizzle: const { drizzle } = require('drizzle-orm/neon-http'). What we want within drizzle-orm is related to the database itself; in our case the database is Neon, so we use neon-http, and this is coming directly from their docs. We wrap the SQL client in drizzle, and then export getDrizzleDbClient. Great; that's the first step.

Back in crud.js, we'll do const clients = require(...) for our clients module, and then const db = await clients.getDrizzleDbClient(). So this is the drizzle DB portion, and per the documentation we're now at this part; we're close to doing something like this. The drizzle querying documentation is vast; there's a lot you could do in there, and we're really just scratching the surface, but the idea is something not too dissimilar to this once we actually get to listing out our leads. Before we get there, I want to insert new leads, and I'll do that with const schemas = require(...) for our schemas module. With that schema, I'll grab a result: const result = await db.insert(...), where what we're inserting into is the schema's lead table. This again is from the documentation: under "Access your data" you can see this insert, and where it says db.insert(users), that users corresponds to what we've been calling leadTable; I just made it that way so it's a little easier to understand and unpack at any time. Then we see .values(), and that's the key to actually inserting data: values, and what data we want to insert. In our case our schema really only has one thing we're inserting, the email, but if we had other fields, this is where we would bring them in. What I'll do here is pass in an object, unpack that object, and bring it into the values, email being email; in other words, once we have this data, maybe we'd build a newLead object and pass that in, something like that, but for now I'll leave it verbose, with just the actual email itself.

So what's happening here is it inserts that data, and then it's just done. If we actually want data back, we can chain .returning(), and that will return the result of what came through. We can narrow that down even more if we want; let's say, for instance, just the timestamp. Let's do that: we'll return a timestamp based on the schema and the data being inserted, so .returning() with the schema value, which in our case is createdAt. There we go; that allows me to return the timestamp as some sort of result, and we'll return that result and see what happens.

Okay, it's already wired into that route, so let's echo back the results; just changing it slightly, we're not echoing back the request data but rather what goes into the database. With this in mind, we want to call that curl POST command from a moment ago. I'll run it, and of course it might take a moment to fully boot up; I actually might have to restart my server altogether. Let's try once again with that email, and we get an empty reply from the server, and if we come back in here I get a null column value error. Part of the problem I'm currently facing has to do with validation: is this valid data? The other problem is how I'm passing the data into newLead: I just talked about it, I turned it into an object and then unpacked that object right here. We actually want to just pass in the raw data for now, so we'll undo that; now the request body goes directly into newLead, but we still need validation of some kind. While this is working, let's run it and see if we get anything back from the server. In this case I do get something back: that timestamp, or at least what I've been calling a timestamp, and notice that I get an actual array back, a list of items. So what's happening is the insert goes into the database and returns a list of items. Basically, then, we'll say: if result.length equals 1, return result[0]; otherwise return the whole list of results, or we could return null or an empty list or something like that. The idea is that if this is a list, we just return back the first index value in that list.

And rather than returning a specific timestamp, we can also return, say, the new email, the actual inserted email, and do something like that, which changes what we return. We might have to restart the server, but if I try again... yeah, let's restart the server and run this one more time, and what we get back should be that email, the new email, as the object coming back to the Express.js application. Great, so we now have a way to insert data. In this case I actually don't want just one single field; I want all of the data, the entire schema row, to come back to me, because that's what will go into the front end, though we could narrow it down if we were interested. So if I do it again, restart that server and run this, what we get is all of the data including the id. At this point it seems that I have seven items in here, based on how many times I submitted to the backend, and notice that the id is auto-incrementing, the created timestamp is changing, and the description has the default text we set. The last thing we really need to do with this CRUD module is build out listLeads. It's actually very similar to what we've got here, just using different database commands: when you think about listing things out, you're not inserting any data, but you use the same schema.
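The unwrapping logic just described is worth isolating. drizzle's .insert(...).returning() resolves to an array of rows, so a single-row insert usually wants the row itself or null; here it is as a tiny pure function (the helper name and sample row are illustrative, not from the course):

```javascript
// Unwrap a drizzle .returning() result: one row in, that row out; anything
// else (empty or multi-row) falls back to null.
function oneOrNull(rows) {
  return Array.isArray(rows) && rows.length === 1 ? rows[0] : null;
}

// Example shape of a single-row .returning() result:
const inserted = [{ id: 9, email: 'test@test.com' }];
console.log(oneOrNull(inserted)); // { id: 9, email: 'test@test.com' }
console.log(oneOrNull([]));       // null
```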
So instead of insert, we use db.select(), and instead of .values() it's just .from(), and that's roughly all we need to do. I can get rid of the values here; this time we just leave it as results, get rid of this altogether, and put that in there, just like that. Very simple, very straightforward. We can also chain something called .limit(), called with the number of rows we want back. So now, with this in mind, we've got our /leads route and there are our results; I'll return results as the results, and I probably don't need that message in there any longer on either one. Let's save that, and this time just curl the single /leads endpoint, which will be a GET request by default. I'll restart the server, run that curl command, and what I should see is all of those leads coming back under results; and if I were to submit some more and curl again, once again I'm getting more results.

Back in crud.js, we could do a lot of different things to make this a little more organized. For example, I'll add in one quick element: const { desc } = require('drizzle-orm'). I'll reorder this and chain .orderBy() (camel case here), and what we want is the descending value of the schema field. We grab the schema data, then the field we want to order by; in this case we have createdAt, so the descending value of createdAt. Let's restart the server again, refresh the request, and what we should see is a reordering: the ID of 9 comes first instead of 3.

And to take this a little further, we can also add something like getLead: an individual item, which we'll grab by its id. Once again we select from that same table, but now we can use a condition called .where(), and within where we can import something from drizzle-orm called eq, as in equal. So we say .where(eq(leadTable.id, id)) against the id argument, and then export this as well: getLead. These results should actually be a single result now, similar to what we saw with newLead, so once again I'll grab the single result if it exists, otherwise null; if it's not a single result, it's null, and it should be a single result because of the id itself. So back in our index, what we can do is try this out with getLead and some id we're pretty sure exists, which is 9. Let's restart the server, rerun our curl call, and this should give us that one single result back, and sure enough it does. This where clause is also very powerful because it can filter things down, in this case to just the one row where the id equals this value; if you're familiar with SQL, this is a very SQL-like command. And of course, if we only wanted, let's say for instance, just the id, we could select just that column: { id: table.id }.
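Putting the whole CRUD module together, it looks roughly like this. This is a hedged sketch, not runnable without drizzle-orm and a live Neon database; the module paths, table name (leadTable), field names (createdAt), and the limit value are reconstructions of what the video describes, so adjust them to your own schema:

```js
const { desc, eq } = require('drizzle-orm');
const clients = require('./clients');   // assumed path to the db clients module
const schemas = require('./schemas');   // assumed path to the db schemas module

// C of CRUD: insert one lead and return the inserted row (or null).
async function newLead(email) {
  const db = await clients.getDrizzleDbClient();
  const result = await db
    .insert(schemas.leadTable)
    .values({ email })
    .returning(); // resolves to an array of inserted rows
  return result.length === 1 ? result[0] : null;
}

// R of CRUD: newest leads first, capped by a limit.
async function listLeads() {
  const db = await clients.getDrizzleDbClient();
  return db
    .select()
    .from(schemas.leadTable)
    .orderBy(desc(schemas.leadTable.createdAt))
    .limit(10); // illustrative limit
}

// R of CRUD: one lead by id, or null if it isn't exactly one row.
async function getLead(id) {
  const db = await clients.getDrizzleDbClient();
  const results = await db
    .select()
    .from(schemas.leadTable)
    .where(eq(schemas.leadTable.id, id));
  return results.length === 1 ? results[0] : null;
}

module.exports = { newLead, listLeads, getLead };
```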
That selects just the id from the row, and then we can curl it again; let's restart the server and run it one more time, and that gives us only that data, and that would be true whether it's an individual item or a listing of items. So drizzle gives us this really powerful way to perform SQL without writing a bunch of SQL commands, but they're SQL-like, so it's getting closer to what you would need to write yourself; while you're learning drizzle, you're also learning a lot about SQL without writing much of it, and it's really just that straightforward. At this point I'll challenge you to go into the documentation and find how you can play around with this data more, because there are so many different queries you could write; and of course you'd eventually want to update data and delete data as well, but with all of these pieces we have a foundation for how you might do all of those things. So I do encourage you to read more about the database itself, but at this point it's actually pretty powerful where we're at. We can definitely improve it, but as far as an API to store data, and more specifically email data, is concerned, this is it; we can absolutely do it. But we are leaving out one critical piece, and that is of course data validation. That's what we need to do now, so let's take a look at Zod. We're going to do some very basic validation using Zod. You can absolutely make this more robust by using drizzle-zod, which corresponds really closely to the schemas we've already built, but for now we're going to use Zod to validate the data coming through from our request. I'm actually going to rename this data to postData instead.
Then we'll grab the actual data from somewhere else, which will be await validators.validateLead(postData), passing in the post data. Naturally we need to bring these things in and actually make them happen, so we'll go into our db folder and create something called validators.js, and in it generate some sort of validation function: async function validateLead(postData), which takes in the post data and returns some things, plus module.exports.validateLead = validateLead. The things we want to return are, for sure, the data itself, the actual validated data; that's a key part of this. I also want to see if there's an error, so I'll say hasError, and then maybe a message. The reason for that is that I can then change the response based on those two values. In other words, if hasError is true, we return one type of response; otherwise, if hasError is undefined, we do another response. If it's undefined, I'll give a message of "server error" and a status code of 500, and if it does have an error in general, we'll just say 400, with a message based on what comes back, or something like "invalid request, please try again". Then down here, with newLead itself, assuming that actually worked and we didn't have to do a try block, we'll change the result status to 201, because that's typically the status code when you create data. So with this in place, we need to build out this validateLead function.
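The status-code branching just described can be captured in a small pure function (the helper name is illustrative, not from the course):

```javascript
// Map the validation outcome to an HTTP status code:
//   hasError === undefined -> validation never ran: 500 (server error)
//   hasError === true      -> bad input:            400 (invalid request)
//   hasError === false     -> lead was created:     201 (created)
function statusForValidation(hasError) {
  if (hasError === undefined) return 500;
  if (hasError) return 400;
  return 201;
}

console.log(statusForValidation(undefined)); // 500
console.log(statusForValidation(true));      // 400
console.log(statusForValidation(false));     // 201
```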
The first things are how we're going to respond back, and that's hasError starting as true and message as an empty string. The way we're going to validate this is with a package called Zod. Zod is very popular, and it has a lot of other features as well, including integrations with drizzle and, for more robust validation messages, the zod-validation-error package, but what we're doing is very straightforward: we're just validating the email, making sure that when we do insert data into our database it's valid data and not invalid data. To do this, I'll install Zod with npm install zod, and then inside validators.js do const { z } = require('zod'). What I would normally do is put this validator schema and the database schema together, right next to each other, because they're based on each other, but in this case I'm just going to put it in this validateLead function. So I'll say my lead object validation, or just simply lead, equals z.object(), and then we can start to design or format what we want to validate. What happens is we can put any sort of fields in here: we can put a name, an email, a description, and then fill in the types that correspond, like name, email, description. We could fill these out more, but the idea is that what you put in here might correspond directly to what's inside the lead itself; in our case it's just going to be the email. The way we validate that is by having an actual Zod type in the object, and that means z.string(); we definitely want it to be a string, so it's not going to be a number, that's for sure. Then we can use .email() as well, another method available to us because of Zod. If we look in the Zod docs and do a quick search for email, we can see the validation that's going on, along with the other baseline string validations: we can validate the size, make it bigger or smaller, validate the length (which is similar), check what it starts with or ends with if we wanted a certain domain name. There are a lot of different ways we could go about the validation itself, which is beyond the scope of what we're trying to do here; I really just want some simple validation. So that's designing the way we're going to validate it; then, when we actually want to validate, we'll use lead.parse.
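Zod isn't installed in this snippet's environment, so the sketch below swaps z.object({ email: z.string().email() }).parse(postData) for a crude regex check (Zod's email validation is stricter and its error messages richer); the shape of the return value matches the validateLead described above. Treat the regex as a stand-in, not a replacement for Zod:

```javascript
// Stand-in for the Zod-based validateLead: validate { email } out of the
// post data and report { validData, hasError, message }.
function validateLead(postData) {
  let hasError;
  let message = '';
  let validData = {};
  try {
    const email = postData && postData.email;
    // Crude stand-in for z.string().email(); Zod's check is stricter.
    if (typeof email !== 'string' || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
      throw new Error('Invalid email. Please try again.');
    }
    // Like Zod's parse, only the declared fields survive validation.
    validData = { email };
    hasError = false;
  } catch (err) {
    hasError = true;
    message = err.message;
  }
  return { validData, hasError, message };
}

console.log(validateLead({ email: 'test@test.com' }).hasError); // false
console.log(validateLead({ email: 'testtest.com' }).message);   // "Invalid email. Please try again."
```

Note the course's version is async (so the route can await it); it's written synchronously here only to keep the sketch minimal.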
Now, what we're parsing: we pass lead.parse an object of some kind, and it needs to be an object, which in our case it is: the request's POST body object. And of course, if the data is incorrect, parse throws a validation error, so I'm going to add a catch for the error; we want it in that catch statement. Basically what we do then is say let hasError, leaving it undefined; then hasError = false after a successful parse, and hasError = true in the catch; if it doesn't have an error, we set it accordingly, because the error only comes when we try to parse out this data. So this validData really becomes let validData, set initially to an empty object. That's what's happening here: the parse call parses out only the data I actually declared inside the Zod object, so if the post data had a name, for example, it's only going to validate the things I declared, and it will only return the things I declared. Now that I've got this validData, I can pass it along, and then I'll do let message = '' with the failure message being "invalid data", or rather "invalid email, please try again". Okay, so now we've got our validations and our various validation errors; let's try to run this. npm run dev, then a curl call like we've seen. I run it and get an empty reply from the server, so let's see what's going on: "validators is not defined". Let's go back into this page and actually bring in our validators module; make sure we exported it, and it looks like we did, so let's try again. Rerun the server, trigger a call, and I get "invalid request, please try again".

That "invalid request, please try again" is coming from this branch right here; the message doesn't seem to be coming back from validateLead itself, so let's see why. Let me bring this back here: we got our message and message... ah, there it is, right here: I need to pass in that message. Save that, refresh the server again, come back in here and run it again, and now I get "invalid email, please try again". So let's do an actual valid email by putting in the @ sign; hit enter, and now, after the database spins up, we see the result. The nice thing about this validation is that it doesn't actually hit our database until the data is valid, which is another reason to do it. Of course, this is sort of the basic level of this kind of validation; the more robust, more advanced level would be using drizzle-zod, because you can build validation schemas based off of other schemas. What I would actually do from here is modify the validation schema I've been writing and do something along those lines, with an insertUserSchema as in the drizzle-zod docs; you could import that into validateLead if you'd like, or just use the schema directly in the view as well. The reason I like doing this validateLead here is that I can make the methods a bit more robust, but of course there's nothing stopping you from passing in the post data directly. So that's how we can actually validate this data, and we do it using Zod. Feel free to make it more powerful, and research a little more about Zod itself and how you can do that much more validation. To me, validation is only as good as you can actually use it, as you can actually implement it; I could definitely spend a lot of time figuring out how to give a much better message than "invalid email".
In fact, the Zod validator itself will give you better validation errors than just "invalid email"; I can see what those messages would be directly from the validator. Let me show you real quick: if I restart this and send invalid data again, it says "invalid email", but back in the Node output I can see exactly which validation errors are firing. So we can absolutely make the messages more robust than what we've got here, but to me the main thing is to validate the data and then provide some sort of message that makes sense.

Another thing you could consider, in addition to validating the data, is checking whether the email already exists in your database. The way you'd do that is very similar to what we covered with CRUD: listing out the leads, but specifically filtering where the email matches the requested email, and if a row exists, raising another validation error. That's outside the scope of this section, but we're working in that direction. The key point whenever you're inserting data from the internet is that you almost always want to clean and validate it before sending it into the database. The database itself helps with the data types, whereas the validator gives us these kinds of messages and prevents as many database server errors as possible from poorly formed data. So validators are great and fairly important; hopefully you start using them now. It's pretty simple, and it can be a lot more robust thanks to Zod and its powerful tooling. Let's keep going.

Now we're going to automate the process of staging our application, like the dev stage: when I make some code changes, I want to push them to GitHub and have GitHub automate deploying the application to AWS Lambda, while also branching our database at that point in time so we can work off of it. The way we'll do this is, of course, by updating our secrets; that's the key part, because we need separate secrets for each stage. If you think about serverless.yml, we had our production environment going through a single database URL, and we actually don't use that anymore, so it's going away altogether. What we need instead is the stage itself, based on an environment variable we'll set later, with the default stage being prod: we use the production stage as the default. The reason is that in our secrets module, where the database URL is stored, that stage determines which parameter is read, so in getDatabaseUrl I can modify which stage comes through. In other words, in my .env.dev I can add a stage of dev and get rid of the database URL, and get rid of it basically everywhere, so we don't accidentally expose it when we don't need to. Our AWS access keys still give us the ability to fetch the database URL stored in that parameter.

One thing you're hopefully thinking about is that we set this parameter manually: we literally went into AWS and typed it in, which is quite the opposite of automated. So we need to automate not just the stage, but also setting that database URL. Luckily for us, with neonctl it's really easy to branch the database. With `neonctl branches list` we can see all the branches available, and right now it's just the main branch. If we want to create a new one, we can do it with a name like dev, and that gives us back a connection string.
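The stage-to-parameter wiring just described can be sketched like this; note that the `/myapp/` prefix and the exact parameter name are my assumptions, since the actual path used in the course isn't shown in this section.

```javascript
// Build the Parameter Store path for a given stage, defaulting to "prod"
// as described above. The "/myapp/<stage>/database-url" layout is an
// assumption, not necessarily the exact path used in the course.
function getDatabaseUrlParamName(stage) {
  const paramStage = stage || process.env.STAGE || "prod";
  return `/myapp/${paramStage}/database-url`;
}
```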
Of course, if I just want the connection string on its own, I can run `neonctl connection-string dev`, which returns the exact same string. We want to treat this dev branch as ephemeral: we want to be able to delete it at any time, and branch again later at any time. Both of those need to be true, but that presents a new challenge for our production application, and when I say production I mean the one running on AWS Lambda, regardless of the current stage. We need to automate updating our Parameter Store, and luckily that's not a whole lot different from getting data.

So let's put data: we'll use the PutParameterCommand, which is almost identical to getDatabaseUrl, so I'll copy that over. The idea is that the parameter path needs to change based on an argument, so we'll write putDatabaseUrl with a stage passed in, and `const paramStage` equal to the incoming stage, falling back to dev. The first thing I check is whether paramStage equals prod, and if so, I just return: I never want to change the production stage automatically. That's something I'm happy to go in and change manually if I need to. This paramStage now determines the parameter path, really the database URL name, down here, so we build it off of paramStage and we'll be able to modify it in just a moment. This putDatabaseUrl also needs the database URL value itself, so that becomes part of the arguments. Everything else is pretty much the same; if we wanted to change the region based on environment variables we could, but I'm leaving in the default we've been using, since I'm not changing anything else.

The other parameter we want is the value, so we pass in Value with the database URL, and if that value doesn't exist we return early, because an empty value simply isn't going to work. We still want encryption, but instead of WithDecryption we declare the type itself: Type set to the string "SecureString". That comes straight from the console: when we created a parameter manually there were three type options, and I'm using SecureString with all the other defaults. Yes, there are ways to modify this too, so feel free to look at the AWS SDK documentation. The final option is Overwrite, which I'll set to true: whenever I put a new stage database URL, I want to overwrite whatever value is currently there, so it's always my latest value. Of course this process can have issues and errors, which is exactly why we never do it against the production database, only against everything that's not production, everything that's not our main stage. The final calls are basically the same, but with PutParameterCommand instead of GetParameterCommand, and then I just return the result and save.

Before going further: there's also a DeleteParameterCommand that can delete the parameter and basically undo everything we just did, and it's a bit closer to getDatabaseUrl in that you basically just pass the name. I'm not going to delete the parameter, though; it makes things a little more complicated, and since we'll be overwriting the database URL on a regular basis anyway, it doesn't really matter whether the stored connection string is currently correct. So I'll leave it with just put. Next we do `module.exports.putDatabaseUrl` to export it, so we can start automating.

The way we're going to automate this is with our own command-line tool. I'm going to copy the migrator script and call the copy putSecret.js. It's really more like putDatabaseSecret, so maybe you'd want to rename it, but I won't. First off, we don't need the migration function anymore; we just created the function we're going to run, so I'll strip out everything related to migrations. The main reason I copied that file is the dotenv configuration: remember, for the AWS SDK to work, I need the environment variables for the AWS secret access key and the key ID, which is exactly why we have this command-line tool and why I like copying my other command-line tools; they use the same setup. So what goes in here? First, I want to declare args from process.argv.
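The putDatabaseUrl helper described above might look roughly like this. To keep the sketch free of the AWS dependency, the SSM client is passed in as an argument; in the project it would be an `SSMClient` sending a `PutParameterCommand` from `@aws-sdk/client-ssm`, and the parameter path shape here is an assumption.

```javascript
// Sketch of putDatabaseUrl as described above. The injected ssmClient stands
// in for an @aws-sdk/client-ssm SSMClient; the object passed to send() mirrors
// the PutParameterCommand input. The "/myapp/..." path layout is assumed.
async function putDatabaseUrl(ssmClient, stage, dbUrlValue) {
  const paramStage = stage || "dev";
  // Never overwrite the production secret automatically.
  if (paramStage === "prod") return null;
  // An empty value simply isn't going to work, so bail out early.
  if (!dbUrlValue) return null;
  return ssmClient.send({
    Name: `/myapp/${paramStage}/database-url`, // assumed path layout
    Value: dbUrlValue,
    Type: "SecureString", // matches the console's SecureString type option
    Overwrite: true,      // always replace with the latest value
  });
}
```

The caller only needs the returned metadata's 200 status code to know the write succeeded, as the video notes shortly.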
We take process.argv and slice from index two, because the invocation works like the migrator: instead of migrate it'll be put-secret, then the stage, then the DB URL, so slicing off the first two entries gives me everything after them. Then all I want is a check: if args.length is not equal to two, console.log a usage message and process.exit(1). Simple enough. Let's try running it: I'll clear the terminal and do `npx tsx` with the relative path, hit enter, and it says "run migrations". There's also a problem: the check should be "not equal to two". Run it again and now it prints the usage. So let's try `dev someDbUrl`, hit enter, and it says "run migrations" again, which of course should say something like "update secret". There we go.

So how do we update the secret? We grab the arguments with `const [stage, dbUrl] = args`, which unpacks them by position. That means if we pass things in the wrong position, we've got issues; this might be where you'd check that the stage equals dev or something along those lines before running everything else. I'm not going to do that; I'll assume we can use any stage and any database URL. Now we set the secret from our new secrets module: `const secrets` equals a require of lib/secrets, and then secrets.putDatabaseUrl, and what do you know, we can pass in our stage and our dbUrl. It's an asynchronous function, so you'd normally await it, but the nice thing about async functions is that we can treat them as promises. So we chain .then with the value and console.log "secret set" with that value, whatever it might be, in backticks, then process.exit(0) because it succeeded; then .catch with the error, where we do something similar, "secret not set" with the error, and exit(1).

Okay, cool. Now let's actually set a secret. I run it, and we get "update secret", then "secret set [object Object]". So what object is being logged? Let's console.log the value and change the message to "updating database URL". Run it again, and here's the value that comes back: it's the default result of the PutParameterCommand call, right here. We can see the status code is 200, which means success. We could also fetch the result back, quite literally using something very similar to getDatabaseUrl to grab the stored value, but it's not really necessary; we just need to know the status code is 200, and I'll let you unpack that and handle the errors later. Overall, we just want to verify the secret is actually in our Parameter Store now, in the correct region I set it in. I'll go in, and what do you know, there it is.
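The CLI argument handling just walked through can be condensed into a small testable function; the usage string and the decision to return null (where the script itself would process.exit) are my framing of the video's steps.

```javascript
// Sketch of the put-secret CLI's argument handling described above.
// Returns { stage, dbUrl } on success, or null when usage is wrong
// (the real script prints usage and calls process.exit(1) instead).
function parseArgs(argv) {
  // argv looks like [node, script, stage, dbUrl]: slice off the first two.
  const args = argv.slice(2);
  if (args.length !== 2) {
    console.log("usage: put-secret <stage> <db-url>");
    return null;
  }
  // Positional unpacking: order matters, so swapped arguments go undetected.
  const [stage, dbUrl] = args;
  return { stage, dbUrl };
}
```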
Scrolling down, I can decrypt the value, and there's our someDbUrl, so it's definitely working: it really is putting that database URL in there. Every time I run it, it creates a brand-new version of the parameter, so I can refresh and watch the version number change; if I change it again and refresh, we see that too. The nice thing is that there's a way to retrieve different versions, so if for some reason you later need an older value, you could totally do that, which is pretty cool.

Now we actually need to put the correct database URL in there. Using the same idea, we run `neonctl connection-string dev` to get the dev branch's connection string. The way I'm going to automate this is by putting it into an export command; if you're on Windows, this won't work for you until we get into GitHub Actions, where there's a way to do it, but this isn't it. So we export it as, let's call it, STAGE_DB_URL, capturing the command output. Now I can run the npx command with dev and that variable with a dollar sign (after cleaning up a stray character I typed), and we've got the updated database URL, version six. Refresh the Parameter Store and the real database URL is there.

Of course, we need to verify that it's actually being used, and we can, thanks to process.env.STAGE; our .env.dev has that stage set as well. Let's run our local `npm run dev`, our offline version, and notice the stage coming through. One thing we might also want is to show the stage on our homepage, so let's grab that value and put it in the homepage response, just to make sure we really are using the correct stage in that environment. Restart, open localhost, and we see the stage of dev.

So now we can start adding some data, and inside the Neon console we can verify it: going into the tables, we see both our main branch and our dev branch. Let's add a few leads using the same curl command from the last part; I'll run it a few times, jump back into the Neon console, and we see all of those new leads. If I switch the branch back to main, those new leads are not there, so it really is working against the dev branch, which is great. The last thing we'd really need to do is push this into production, and one thing I also want to make sure of is recording this command, perhaps in package.json, just to remember it. But since I'm going to be automating it, what I'll really be doing is running it inside GitHub Actions itself, as a CLI command in the workflow when we get there. Overall, this is fairly straightforward for setting the staged URL and the branch.
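The homepage stage check mentioned above can be sketched as a plain handler function; the response shape and the STAGE variable name are assumptions based on the transcript, not the project's exact code.

```javascript
// Sketch of surfacing the current stage on the homepage, as described above.
// In the Express app this would back something like app.get("/", ...);
// the exact response shape here is an assumption.
function homeHandler() {
  const stage = process.env.STAGE || "prod"; // default stage is prod
  return { message: "Hello there", stage };
}
```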
The last thing we can really test now is deleting that branch and doing it all over again. We run `neonctl branches list` to make sure it's there, then `neonctl branches delete` with the name of dev; my first attempt had the wrong argument, so trying again with just dev, there we go, the branch is deleted. Now let's create a new one: `neonctl branches create` with the name of dev. I'll scroll up a few times to grab that connection string into our environment variable again, and then use it once more, putting that secret in; it's updated, version seven now, and we can verify it's a different branch. Looking in the Parameter Store, a quick search shows the endpoint, something like ep-soft-glade, matching on both sides. Great.

So let's refresh our Node app and add one more lead. If I remember correctly, off the top of my head this branch has 13 or 14 items, so this should be item 13, and we see it come back with an ID of 13; run it one more time and we get ID 14. So the exact branch we originally tested on is gone, and looking at the new dev branch, it's at 14 leads, with the timestamps and all of that changed. Great, this is awesome. The next part, of course, is turning this into a GitHub Action that does all of that for us, so we don't actually have to touch it anymore; let's take a look at how we'll do that.

Now we're going to use GitHub Actions to automate our deployment stages. First we'll automate the production deployment and see how that works; then we'll go a little more advanced and use a different git branch to deploy a different stage, the dev stage itself. Once that's deployed, we'll submit a pull request to the production branch to have production run at that point. I actually already started some of the workflows, so if you're following along from what we have now, just go into the workflows on GitHub and we'll go through them. The idea behind GitHub Actions is that it's a serverless way to just run some code, and the code we run is whatever we define in these workflow files. In our case, the production workflow grabs a copy of our code, sets up Node.js, specifically version 20, runs our npm install, and then runs our npm run deploy, which is the deploy command we already have. That's it; that's the production stage itself.

One dependency this production stage has is the AWS environment variables, so let's put those into our GitHub repo now. Inside the repository settings there are a couple of things we want to do. The very first thing is under Actions: go into General, scroll down, and set workflow permissions to read and write, and allow GitHub Actions to create and approve pull requests; those are important for what we'll do in the staging process. Next, go to Secrets and variables, then the Actions section, and scroll down to repository secrets; this is where our environment variables go. We want those two AWS variables: the first is the access key ID. I actually want to create a brand-new access key specifically for GitHub Actions, so inside IAM I'll navigate to my user, open security credentials, go into access keys, create a brand-new one, choose "Other", and this time give it an actual description, something like "GitHub Actions workflow".
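Pieced together from the steps just described, the production workflow looks roughly like this; the file name, action versions, and job name are assumptions, so check the repository's actual workflow file.

```yaml
# .github/workflows/prod.yaml -- sketch of the production workflow described
# above; action versions and names are assumptions, not the course's exact file.
name: Deploy Production App
on:
  workflow_dispatch:
jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      STAGE: prod
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm install
      - run: npm run deploy
```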
We hit create, grab that access key, and bring it into our GitHub Actions secret; then we create the secret access key entry, scroll down, copy the secret access key, and paste that in just the same way. Okay, great: now our GitHub Action should have everything necessary to deploy the production application, so I'll run the workflow. The nice thing about these workflows is that they're fast and efficient, and they have really good logs for figuring out what's going on and what might be wrong; one thing that would have been wrong is if I hadn't actually filled out those secrets. Something cool about how the environment variables work here is that the Serverless Framework will pick up the AWS access key and secret key from the environment instead of needing a .env file, which is a really nice feature.

The next thing I want to do is set the stage in this workflow, and I'm going to say it's prod. The reason is that in my dev stage workflow I also want the stage to be dev, which I already have in there; I just want to make sure the production one explicitly says prod too. That also means that since I'm declaring the stage in more places, I want to make sure that when I grab my database secret, the stage comes from an environment variable. It's certainly possible you're thinking "I'll just use the production one", but hopefully you won't, based on a number of things we did in the last part around setting and automating the secrets. With that stage in mind, I also want to update serverless.yml and change the IAM role to match the stage, because originally it was only for the production stage; we really only had one stage creating a role, and now we'll have multiple stages creating roles, so we want to update that as well. We save everything, and in git I'll commit "prepare for stages" and sync those changes.

Back in GitHub, notice that my deployment actually did happen: deploy our production app, in 43 seconds. We can click into the deploy step and see our endpoint come through, so I can open that up, and sure enough it works; if I go into /leads, I get just one lead back based on some configuration we set up, which we can verify in the code: the leads default handler actually has getLead in there instead of getLeads. That's a minor change, and maybe one I'd want to land in the dev stage first. So in our prod stage YAML, I want to change the trigger a little, to be based on pull requests; in other words, my dev stage workflow is going to have to generate that pull request, which takes a number of steps and gets a bit more complicated than what we just did. If you want things really simple and want to go straight to production, which I don't advise but you definitely could, you could automate this with a push trigger and point it at whatever your main or primary branch on GitHub is, which would just be main, and it would automatically run the workflow we just ran. I'm going to leave only workflow_dispatch for now and work on the dev stage, and really figure out what we're doing there. We already have a few of the environment variables in place; we don't yet have the Neon API key, which we'll need in a moment, but we do have our stage.
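The stage-aware serverless.yml change described above might look something like this sketch; the role name pattern and app name are assumptions, and only the standard Serverless Framework fields are used.

```yaml
# serverless.yml (excerpt) -- sketch of making the stage and the IAM role
# name stage-aware, as described above; names are assumptions.
provider:
  name: aws
  runtime: nodejs20.x
  stage: ${env:STAGE, 'prod'}   # default stage is prod
  iam:
    role:
      name: my-app-${self:provider.stage}-role  # one role per stage
```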
We also already have a built-in GITHUB_TOKEN, provided to us by GitHub Actions; we don't need to set it or find it anywhere, which is really nice for later when we use the GitHub CLI, which also comes in by default, as we'll see in a moment. Again, the workflow checks out the code, grabbing the full history, mostly for the pull request action we'll do very soon; once again it sets up Node.js and does the default project installation; and then I make sure I have the Neon command-line tool and tsx. That's so we can automate creating branches like we've seen before, as well as getting the connection string and outputting it into our AWS Parameter Store. All of this is mostly stuff we've already seen. The one new thing is the API key call: basically, anywhere we run a neonctl command inside a GitHub Actions workflow, we pass in an API key, and that's how we're able to do all of these things.

The way I designed this workflow is to delete the previous dev branch, so that I can take a point-in-time version of the database and create a new dev branch at the moment the workflow runs. That way my staged dev deployment gets a fresh branch of the database, a new instance carrying the data that's been stored on the production branch. That part is optional; you don't have to do it, I just want to show you how it's done. Next, of course, is getting the connection string from that branch and putting it into our secrets, and finally we grab info about the branch itself. This is the workflow I want to run, and for it to work correctly we need to bring in the Neon API key, so let's do that now.

I'll go into my repository, into settings, down into those secrets again, and create a new repository secret for the Neon API key. Where do we find it? In the Neon console: go into your user account, into the profile, into developer settings, and generate a new key; I'll call it "github actions", grab it, and bring it into GitHub Actions. Do be aware that this key has access to our whole Neon project, so it could quite literally delete all of our databases, which might be something you want to have happen, but it also might not be; just keep that in mind when you actually use this API key.

With that in place, let's run the GitHub Action. I want to bring the dev stage workflow in, so `git status`, commit with "updated all for dev", and sync those changes; then back in GitHub Actions I'll deploy the dev stage. Now, the key thing so far is that I've been working on the single main branch of git. What we actually want is to work on a different branch, a dev branch: with git, the dev branch will correlate to the dev stage of the application as well as the dev branch in Neon, so it's dev all across the board, just to keep things consistent so I know exactly what's going on. Locally we do `git checkout -b dev`, which switches us over to the dev branch; all the changes we make here only go into the dev branch until we merge it into the other branch, that is, into main. For example, if I change just one single thing in this workflow, commit "updated dev branch", and publish the branch, it pushes into GitHub.

So back in GitHub, if I refresh, I should see something related to my dev branch: I have this branch here, and I now have the ability to merge the two branches together. It's one commit ahead, as we see, just from that change, so we can create a manual pull request and hit create pull request, and now I've got one; in my case it says number 11 because I did a bunch of testing, but yours will probably say one. Then we can just merge it, confirm the merge, and also delete that branch if we want to. All that did was bring the change into main, where we still have just the one branch, and in the dev stage workflow we can double-check that the branch push actually did something; it did make a change to that workflow. So let's do this once more, still on the dev branch and still manually, before we automate it. This time I'll go into index.js and get rid of the homepage message; I just want the delta and that's it. Save, then `git status`, `git add`, `git commit` with "updated homepage", and `git push`. Once again it pushes, and we can click right here to create a new pull request.

Before I actually create that pull request, I want to modify my production workflow a little. Back in the production workflow, I've got this pull_request trigger, and we can set `types: [closed]`: when a pull request is closed, it triggers a GitHub event for that pull request, and we can verify whether it was merged; if it was merged, then we run all of the production deployment steps.
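The dev-stage branching steps described above could be sketched as workflow steps like these; the script path, app names, and step names are assumptions pieced together from the transcript.

```yaml
# .github/workflows/dev.yaml (excerpt) -- sketch of the dev-stage branching
# steps described above; script paths and names are assumptions.
    env:
      NEON_API_KEY: ${{ secrets.NEON_API_KEY }}
      STAGE: dev
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0        # full history, for the PR step later
      - name: Recreate the Neon dev branch
        run: |
          neonctl branches delete dev --api-key "$NEON_API_KEY" || true
          neonctl branches create --name dev --api-key "$NEON_API_KEY"
      - name: Store the branch connection string in Parameter Store
        run: |
          DEV_DB_URL=$(neonctl connection-string dev --api-key "$NEON_API_KEY")
          npx tsx put-secret.js dev "$DEV_DB_URL"
```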
production deployment stuff so we'll save that and now we'll go ahead and do get status get add Dall get commit prepare for auto deploy via merged PR go ahead and push okay so now I refreshed it everything should be up to par and I'm going to go ahead and now create this poll request I could have created the poll request before but now I have it no descriptions in here yet but this poll request now when I actually merge it what should happen is it should actually start a deployment process but since I just pushed some code into the dev stage itself I might want to actually wait for this to finish before I run anything else with serverless or maybe not I don't know let's go ahead and give it a shot I'm going to go ahead and merge this poll request and delete the branch so we goe merge confirm merge and delete branch and it's deleted going back into our GI up actions now we've got this deploy production app it's starting to deploy it based off of that merged poll request cool so if we wanted to automate this it's just one more step and it's a sort of major step inside of the Dev stage but not really that major and all it is is taking this Branch information here and then using the GitHub actions PR create command which is the GitHub command line tool to actually create a poll request for US based off of this actual branch and the branch that we're going to so what we want to do then is actually run the command here so I'm going to go ahead and say title and we'll give the title of automated PR from Dev right or really from Dev stage okay so that's the title we'll give it and then I'll go ahead and pass in a body and coming soon for now and then I'll go ahead and pass in the actual uh base branch which is going to be our default branch and then the pr branch is going to be the next one and that's going to be head branch and there we go and then we can also do our GitHub repo in here and just say repo and the repository okay so there we go we've got our branches that 
So with those branches set, let's see how this ends up working. Once again, I'm still in that dev branch, so I'll do git status, git add, git commit with "updated dev workflow for auto PRs", and git push. That's going to take a moment to finish. Going back into GitHub Actions, we see that the production app run was successful, which makes sense, because it's really not that different from what we've been doing. As for the pull requests, I should have none in here; if I have any dev pull requests open, this GitHub Action will fail, because you can't have more than one for any given branch. There are more advanced ways to handle multiple branches and such, but I'm not doing those; I'm keeping this hopefully simple and straightforward. And so we've got this branch info step here, whose name I still need to change, but it does create the PR; it looks like everything was successful, and we can see that the PR branch is dev and the base is the default branch. Great, so now we can go into our pull requests, and there it is: our automated PR from the dev stage. The body itself needs to change, and we'll change that in a moment, but overall I can now actually merge this, and again I can delete the branch if I want to. It's really up to you whether you keep the branch or not, because locally I can still work on that branch and push it again. The last part of this is our dev stage pull request, and what we want in the body, instead of "coming soon", is the info from our deployment, which is going to go in here, so we need to update that as well. Let's see; it looks like our dev stage has been running our deployment in here as well. No, our dev stage is actually missing the deployment and the info, so that was a key part
that is missing. Let's go back into package.json and make the deployment happen for our dev stage as well. Overall, the GitHub Actions stuff is working correctly; there was just one piece missing, which is the deploy dev stage script, and of course we'll change its name to simply dev and all that. Next up, we're going to add info dev stage and info. The reason for this: going back into the dev stage workflow, we'll grab one more step name here, deploy dev stage, and run the deploy dev stage npm script; that's going to take as long as it takes. Then, down here, we're going to export a dev stage info variable, set equal to the output of the info dev stage npm script, and that is what's going to go inside of this body right here; we'll pass that in just like that. Now we've got a new body, so let's try this out with git status, git add --all, git commit with the message "updated for full dev deployment stage", and git push, and of course we'll let that go through. I need to make sure that my pull request list is empty; looks like it is. Of course, if I were to close or deny any of the pull requests in here, the production stage won't run; it only runs if they are merged, and that's it. Great, so there are obviously a lot of options for how you could go about running these things. I'll let this finish so we can polish off this GitHub Actions workflow, and there we go; the workflow looks to be successful. So let's go into our pull requests; now we can see the automated pull request from the dev stage, and what do you know, there is the endpoint URL. So much of this pull request was reliant on that, so that somebody could go in and review some of those changes; we'd actually be able to open up that page in a new tab and review what's going on.
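Putting the deploy and info steps together, the dev-stage workflow might look roughly like this. The npm script names `deploy-dev-stage` and `info-dev-stage` are assumptions based on the narration; the scripts in the actual package.json may be named differently:

```yaml
# Hypothetical: feeding the serverless info output into the PR body
- name: Deploy dev stage
  run: npm run deploy-dev-stage

- name: Create pull request with deployment info
  env:
    GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  run: |
    DEV_STAGE_INFO=$(npm run info-dev-stage)
    gh pr create \
      --title "Automated PR from dev stage" \
      --body "$DEV_STAGE_INFO" \
      --base main \
      --head dev
```

This is what puts the dev-stage endpoint URL into the pull request body so a reviewer can open it directly.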
Reviewing it, I made a change to the leads and I get this internal server error, so clearly that's something I do not want going into production. Let's go back into our local environment, see what's going on with that problem, and then see how we might fix it. Of course, the leads endpoint is having a problem: getLeads is simply not a function. So let's go back into our actual db crud module; it's listLeads, there it is. Okay, we found the error. Go back into our index page, fix this error, still in our dev stage, then git status, git add, git commit with "updated to list leads", and git push, and again that's going to run through and finish. This is why we actually have those PRs in place: to do some checks and make sure everything is good. In this case we do not want this change, so we're just going to close this pull request, and we will delete that branch as well; we actually do not want that one. Of course we could put comments and things like that, but once you close it, what will not happen is the deployment actually running; it won't build, it won't go through, and the reason is that it was skipped because of the way we set it up: nothing merged, nothing changed. So that also shows this is working correctly, which is actually pretty exciting to me. We can now see the dev stage in its full capacity; we can do all sorts of tests there, in the sense that I could even share this with other developers so they can go through the different tests they might need for this particular API. Granted, this is beyond the scope of really building out an Express.js application, because the GitHub Action itself could be used in many different places, but the thing that's really nice is that our dev stage has a lot of features built into it that are really only possible by using Neon's database
service, as well as AWS Lambda. And that's because, when I'm not using the database or AWS Lambda, it's not charging me; it's practically zero cost to have this dev stage sitting there until I need it, and only then does the cost start to incur, and only for a few minutes or so. Of course, I get this little error here, and I think it's related to deleting that branch too far in advance, so I'm going to make another little change. I'll just add a line here and do git status, git add --all, git commit with "minor fix", and then push, just to verify that the GitHub Action is working before we call this part all said and done. And there we are with that minor fix; it should be in there. So I'm going to merge this pull request, and again we can test this out as we see fit. I'll run this, confirm the merge, and I'm going to leave the branch this time. You could absolutely delete it if you needed to, but I think one of the problems we saw with that previous "updated to list leads" change was that, when I went to merge it, the branch got deleted while the GitHub Action workflow was still running, which caused that issue. Now we've got a brand new one that's going to go into production and all that. The other thing about this is that the production workflow does not have to use serverless at all; once this thing gets merged, we can now deploy wherever we want. And of course the source code itself changes based off of that merge; there's our listLeads in there, and also that little extra space, confirming it's all working as well. So it's fairly straightforward how we go from here. The most important part of all of this was making sure that we could
automate the dev stage itself. Now, any time I need to make a change to my code, I work in that dev branch, commit it, and let the dev stage do its thing, so that my main branch can work off of that whenever I need it to. I basically won't need to work off of the main branch any longer unless there's some serious reason to; as long as I'm working through this dev branch and running through this dev stage, I should be pretty good from here on out. Yeah, GitHub Actions is pretty great for a lot of reasons. Let's take a look at another place to deploy this application besides AWS Lambda, while still using everything we've been working with. Now, as you may know, Express.js was not really designed for the serverless world, but there are reasons we can run it serverless anyway. Number one, it's a minimal framework; there's not a lot of overhead to run Express itself. Number two, we have serverless environments out there we can run it on, like AWS Lambda, which has Node.js and can run Express for us. But almost as important, if not more important, is our database: as soon as Express.js needs a database, something like Neon and serverless Postgres is how we're able to run all of this effectively, without overloading our database, while staying ready to scale up if we need to. Now, a better option for serverless is probably Next.js. Next.js was designed for front-end and back-end operations, and those back-end operations are designed to run on serverless; they're like small little functions that can just run serverless, and it's really, really fast. The thing is, you have to learn React to really get the most out of Next.js, and maybe you're not ready to do that yet. What we will do is integrate everything we just did with our project and bring it right in, where it feels very native to your Next.js
projects as you start building them out, and this is thanks to Vercel. Vercel, of course, manages Next.js, but they also made it really simple to have rewrite rules for our API routes, as we'll see in just a moment. Generally speaking, Express.js is not going to be deployed directly to Vercel, but I want to show you a way to do that as well; that's something we'll do in a moment. First, let's see how we can integrate Next.js with the entire ecosystem we've already created in our serverless Express.js application. I actually created a repo specifically for this one, and the only thing you need to change in this repo is the rewrites in vercel.json; that's all we need to change to make all of this work, and more specifically the destination, like we've got here. You can simply clone this project, or you can go to your local computer, run npx create-next-app@latest, go through all the default options, and then just bring in this vercel.json. We want these rewrites in here because of how Next.js handles API calls: inside of Next.js you can add a place to run back-end API calls, which is why we're doing this rewrite. Basically, we can treat the back end we already have in existence as one of the routes you'd use for the back end of your Next.js application, and that's what we want to implement. To do this, I want to jump back into my serverless Node.js API application. In here, I'll run serverless info, with the stage being prod and the region being the one we've been using, us-east-2, and hit enter; this gives me the actual endpoint I want to use. So copy this endpoint, and we'll paste it in just like that. I'm going to leave it as is, and then I also want to make sure the Vercel CLI is installed.
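With the endpoint pasted in, the vercel.json might look something like this. The API Gateway URL is a placeholder, and the exact `source` and `destination` patterns are assumptions; they depend on how your routes are laid out:

```json
{
  "rewrites": [
    {
      "source": "/api/:path*",
      "destination": "https://<your-api-id>.execute-api.us-east-2.amazonaws.com/:path*"
    }
  ]
}
```

Any request to `/api/...` on the Vercel app then gets proxied through to the Lambda-hosted Express app.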
We'll install it with npm install -g vercel@latest. The reason for that is so we can use Vercel to actually run these rewrites; Next.js on its own won't apply them by default, so using Vercel allows that to happen. We'll go ahead and just run vercel to start out. It's going to ask us to set up a project; I'm going to select all of the defaults in here. I don't need to link an existing project, I'm going to use the directory's name, I'm going to keep it in the local directory itself, and I'm not going to modify any settings. Now it's going to push this into Vercel itself and maybe even make a production version of this. We can verify this by going into our Vercel account, and of course I'm on a free account here; there is my application. We of course want a production build, and we'll do that in just a moment, but before that I'm going to open a new terminal here and run vercel dev. This is the Vercel development environment running our Next.js application, with a port value which in this case gives us localhost:3000. Now, my other application, the one we've been working on, is not running locally at all; that's another part of this. It's not running through localhost; it quite literally goes to our production endpoint here. So I'll open up localhost now and go to /api/leads. What I get is "not found", but this is coming directly from Express.js, so it quite literally is calling Express.js, our actual endpoint, directly from a Vercel-managed API. This endpoint is from Vercel, so just having this alone allows me to create a custom domain; right in Vercel I could add a custom domain right now, and that would give me this /api/leads. In other words, if I wanted example.com/api/leads, I could quite literally just deploy it as is. Now, of course, I need to update my actual API itself to manage
things a little bit differently, which we'll see in a moment, but the idea is that using Vercel with this rewrite is just a quick and easy way to do a sort of proxy service, which gives us another advantage as well. Okay, so going back into /api/leads: I actually don't want "not found", I want it to go to those leads. As it stands right now, the API is at /leads; that's how we designed it, and we didn't create a path for /api. So we can change that destination just like that, run it again, and then I'll refresh in here, and now it should actually make an API request to my endpoint on AWS Lambda, with that Neon database, and get back that Neon data. Of course, I can also post data as well. Overall, what's happening here is that it's now treating this as just another path, an API path, in Next.js. If you're familiar with Next.js, you'll know that typically the /api part is the back-end part of things. I'm not going to go into the Next.js side of it beyond this right here; that part is great. We can also test this. Inside of page.js, I'm going to do a quick little test; page.js is just the homepage of our Next.js application, and it's all in React, so if you're not familiar with React this might be a little confusing; just bear with me if it is. I'm going to get rid of all of the stuff in between main, save, and refresh in here; notice everything's gone. If I put an h1 of "hello world", save, and refresh, it now says hello world. Great. All I really want here is a button of some kind that says "press me" and emulates actually calling our API; there's that button right there. Great. These are all Tailwind CSS classes in here, so I can do bg-green-500, or 400, to have a background. Again, that's outside the scope of this, but Next.js has a lot of features that come
into it by default, which can make it a little overwhelming to start out with. But the idea here is that we want to press this button and make an API call; that's really what I want to see. To make this happen inside of Next.js, we're going to get rid of this import here, and I'm going to put the string "use client" at the top, to basically treat this page as a front-end page. Then I'm going to bring in something from React itself: we'll import useState from react, and then I'll do const data, setData, equal to useState, with just an empty string for the moment. Then I want a click event in here, so I'll do const handleClick, which takes an event, and we'll console.log that event. Then we can add a click handler on the button itself, with onClick equal to that function. Now, this is straight-up React; if you don't know this stuff, it's okay. I just wanted to do one simple thing with a fetch call in this onClick. As for the data itself, we'll say that if the data exists, we'll render JSON.stringify of that data; we really just want to see that it's working. The way we're going to do that is with fetch, and we want to fetch /api/leads; what do you know, it's going through the rewrite itself. I do not have to put that entire URL in there; just the destination can be left inside of vercel.json. What that does is let us treat this like you normally would in a Vercel application, or a Next.js application on Vercel; I don't really have to think about how to handle this, or what URL I'm using, or anything like that. Realistically, it's not a whole lot different from calling any normal API, but in this case it augments it in a way that's really cool, as we'll see. Now that we've got this, let's actually get the data back: I'll say response equals await fetch, which means I need to turn handleClick into an asynchronous function, and then I'll do setData with await response.json(). This could of course throw errors, which we're not going to handle right now, but overall I've now got a click event in here. Let's make sure everything's running; I'll refresh my create-next-app and hit "press me". At first the data doesn't necessarily come through, because the Lambda needs to spin up and all that, but after I press it, it comes through, and we could console.log the event each time it happens. I'll press it now, and you can see that it's clicking, it's doing that call for us, and it's getting that data from our API in a way that's very Next.js based. In other words, if you were working with somebody else, the Next.js developer could just work with their normal development process altogether. And of course we could post data in here as well. For example, to post data we could change the method to POST, set the header with our Content-Type being application/json, and then make our body JSON.stringify of an object with email "abc123@gmail.com"; it will still have a response. So this is how I can post data.
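The POST options just described can be sketched as a small helper. `buildLeadRequest` is a hypothetical name for illustration; in the video, this object is built inline in the click handler:

```javascript
// Hypothetical helper: build the fetch options for posting a lead,
// matching the method/headers/body described above.
function buildLeadRequest(email) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ email }),
  };
}

// Sketch of use inside the click handler:
//   const response = await fetch("/api/leads", buildLeadRequest("abc123@gmail.com"));
//   setData(await response.json());
```

Note that the body must be a JSON string, not a plain object, which is why `JSON.stringify` is there.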
Of course, if you know React, you'll be able to change this as you see fit. So let's try that out: I'll post some data, and there's the result back from it, coming directly from my API itself. So it's now almost fully integrated here, which is pretty cool. At this point, we could add this into our repo, which I will: git add, then a commit like "first prod deploy", and we'll push this into production. What's going to happen is that it goes into Vercel, which goes to the repo itself and does all of the build it needs to make sure this works properly, which might take a moment. The idea is that once it's ready, the production application will be able to use /api/leads right in there; we already can, and of course we could then customize our domain and all that, but once it's fully ready it will work the way we need it to. The way I would actually change the real API itself is that I would change it to serve /api/leads and run off of that. That process would be git status, git add, git commit with "updated vercel endpoint for nextjs api", and so on; I'm basically going backwards now. In my actual serverless Node.js API, I would then need to add in these paths here, which I already did off-video, and then I'd go back into that application, serverless-nodejs-api, look at those pull requests, and commit those changes; this is of course assuming that the dev stage was correct. I'm not actually going to go through that process, but the idea is that we'd release a new production version of the API, really just separating these two things out. So realistically, what our Next.js application is doing is just consuming a pre-existing
API; it's not actually deploying our Express.js application on Vercel, but it's getting close to what we might want to do with our application here. Every time you run a release, it should start building out a production version, assuming we're on the right branch and all that, and it actually goes into our repo; looks like it has. It should take just a moment for the deployments to start filling out right here. It looks like my deployment is not actually filling out, so let's go into our settings, into Git, and oh yes, we need to connect our Git repository. I'll go into GitHub here and connect the serverless one we have, the api-next one; I'll connect that now, and I might have to run another production push, but there's our main here. So now we have our project connected to Git, and it should do that deployment when I make changes. Let's do a quick change and call it a day; this one I'll say "Hello World V2", and we'll commit again and push again so we can trigger a deployment automatically. That should happen now that I've pushed, and there it goes; it's now going to build that next version. By now, hopefully, our /api/leads is working correctly; it's still working based off of that new rewrite we just did right here. Let's refresh that server just to make sure that happened; I can refresh in here, and all of that happened because of everything we set up with our Serverless Framework and all of that serverless automation in there. Our production build takes a minute to deploy a new version of the API, whereas the Next.js application might take a little bit longer; it depends on how big your Next.js application ends up being. But overall, once we
actually deploy this, we'll have a new version right on Vercel, as we see here, and now we can visit it. Here's my production version on Vercel, and there it is: it's actually committing and making new data in my API back end, and /api/leads is definitely showing that back end as well. Great, this is showing us how easy and how powerful it can be to start leveraging that API in our projects, in our Next.js projects. The other huge benefit is that a developer who knows Next.js doesn't need to know how you built out this back end, and there it is. I also want to point out that I never actually put any AWS keys in here, and I never put anything related to Neon in here. I could quite literally just share this back end, this API, as I see fit, and build out my front end. So if you're working with another developer who's not necessarily on your team, but you want them to build out a feature, this is one of the ways you could do it: you could just send them your production API or your development API, send them this link, and say, hey, this is how you're going to do it, and now they can build out what you need, and you can go from there. Of course, you can still use wildcards; there are a lot of different things you can do with these rewrites, so just check the Vercel documentation for that. But this is the methodology I would take to leverage everything we've done, to then start learning and building out a Next.js application, and maybe eventually migrating over to Next.js, because a lot of the code itself can migrate. Not exactly as written; we've got all this require here, and you'd have to change some of the import statements and all that, but overall it's moving toward what you would want to do within a Next.js application, which is really exciting. The final piece I want to do is deploy our serverless Node.js API itself, this pure API, directly to Vercel as well, and just see what that
looks like, and how, well, it's not actually a whole lot different from what we just did, but it's not going to give us the functionality of the React framework; we'd have to worry about that ourselves. So let's take a look at how to do that now. Now we're going to look at how to deploy our Express.js application directly to Vercel. This can be combined with what we did with Next.js, but I wouldn't recommend it; generally speaking, you probably don't want to deploy Express.js to Vercel even if it's technically possible. Next.js is a much better alternative, specifically to leverage a lot of the things that Vercel does really well, and part of the reason I showed you the previous approach first is because that's a better method to start leveraging your Express.js application and building off of it. The first thing I want to do in here is add a new project. This project is going to take a few things, so let's grab this serverless-nodejs-api, the actual repo we've been using, and I'll just deploy it as is. The problem, of course, is that it's not going to work, not at all, because of what Vercel attempts to do, so we need to make it work by changing a few things in our project. The first thing I want to change is to add vercel.json, making sure the extension is actually .json, and I'm going to bring in those rewrites again. This time the destination is a little bit different: it's going to be a catch-all, and it's going to go to the destination of simply /api. In order for this to work, we need to create a folder called simply api, and in this folder we'll create index.js. Here we need to import, using ES module syntax, from ../src, and in our case it's index right there. What we're importing is the Express.js application, which is defined right there, and we also export
it, or rather we need to make sure it's exported with module.exports, with app equal to app, so that it can be imported, and then we'll do export default app. So far so good: we've done that same rewrite we just did, and we're also adding in the index itself, the actual app, inside of api, and that rewrite is going to route everything to that app via /api. Next up, we need a public directory. Vercel is really good at leveraging the public directory to serve the project; with Next.js it actually generates a public directory for you that manages the front-end side of things, because Vercel is optimized for front end plus serverless on the back end, and we're trying to emulate that here a little bit with just Express. The first thing I'm going to do is put a .gitkeep in here, for the case where I don't want a front end of any kind, but I'll also add index.html, and I'll just put in some quick, loosely formatted HTML, just to have a hello-world page of some kind. Okay, so I didn't actually change anything in the Express.js application itself; it's still using these /api paths like we saw before, so we might need to change that as well, but the reason I changed those in the first place was really for that integration with the Next.js application based off of a rewrite. With this in, let's do git status, git add --all, git commit with "prepare for vercel", and git push. What Vercel is going to do is wait until this is on the main branch; right now we're working on that dev branch. Vercel also doesn't really care about our production stage at all; it's going to do its own thing, basically running a similar deployment of its own, which means we need one more step, and that is our package.json.
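The shim just described might look roughly like this, paired with a catch-all rewrite in vercel.json along the lines of `{ "source": "/(.*)", "destination": "/api" }`. The relative import path is an assumption based on the project layout mentioned in the video:

```javascript
// api/index.js (sketch): re-export the Express app so Vercel
// can serve every rewritten request through it
import app from "../src/index.js";

export default app;
```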
Vercel is looking for a script called vercel-build, and it's also looking for a script called build itself. In this case, we're just going to echo the word "hello", and that's it; in other words, our build command is not going to do anything. The reason for this is that maybe we'd add another build script in here at some point, but vercel build is going to attempt to build something, and if this command does not exist, it's going to fail. During the Vercel build, it does something like our npm run deploy, but instead it runs vercel build if that's available. So we'll leave that as is, and then I need to redeploy this, but to do that I need to go back into GitHub, specifically into this repo. We'll look for that repo, and here it is. What I want to do is just accept the PR that I just made, commit, pull, and confirm, and there we go; notice that Vercel actually picked that up as well. I'll do another one with git status, git add, git commit with "updated script for vercel", and push that. Okay, it's going to take a moment to build out. Back in Vercel, we actually need to update this a little bit more, too. Back in our project, we've got it right here, and the problem with our project as it stands is that it doesn't have the environment variables we need. Our environment variables are going to be related to AWS, and the reason we need those, of course, is for the runtime. Think about our production stage: we've got our prod env file, which might have the stage variable in here for prod, and it definitely will need various things related to our secrets. In our secrets library here, we've got the AWS client coming through; when we deploy to AWS, those variables are already in there, as in the AWS access key ID and the secret access key.
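As for that build script, the package.json addition might be sketched like this; the no-op echo is exactly the placeholder described, and you could later swap in a real build step:

```json
{
  "scripts": {
    "build": "echo hello"
  }
}
```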
AWS also has a role in here that we created specifically for the different stages and all that, but we aren't pushing that into production anymore, so we need to make sure Vercel has AWS access keys itself. Of course, we could go into AWS and create some new ones; I'm just going to use these local ones I've got, just so I can show this example working correctly. There we go, and I'll save this. For the environments themselves, I don't necessarily need a bunch of different environments, but I'll save those variables for all of them. One of the things I don't need in this version of my Express.js application is a Neon API key of any kind, or my database URL, because of course we've already done this; the idea is that my secrets are stored on AWS. If you wanted to get rid of that functionality, you'd have to readjust how you're doing this, and you'd probably be re-adding your database URL as an environment variable. Again, this isn't a deployment that I recommend, but it is one you could do; the question, of course, is whether or not you should. Now, with this deployed, I see that it does say hello world, so it looks like I'm on the right track; that hello world, of course, is coming from the public index.html. Let's visit this, and at least the front end works, which is kind of cool. I want to visit the actual main deployment as well, so let's go back into the project and grab the actual deployment URL, which is this one right here. There we go; now I'll jump into /api/leads, and if I did everything correctly it will work. Of course, the problem it's facing right now is merely that I didn't have those environment variables when I deployed this version, so let's fix that now. I'm going to go
ahead and update my index page with a quick "env working" note and try this again. One thing I might also need to do is verify any pull requests that I have in here; this will also trigger another deployment, which should solve that environment variable issue, so I probably don't even need to submit this yet. Let's see if this recent deployment solves the environment variable problem we had with Vercel.

Going back in here, we'll take a look at the deployments that are coming through, and of course it's building one. Notice it doesn't take very long, which isn't surprising because it's only installing a few things; it's not using the cache in the same way AWS Lambda is, it's literally installing everything in our `package.json` all over again. And there we go: we've got a new production version. I'll open up my app again (I can just go back to this URL), and if I refresh, everything's working, and there is my Express.js application running on Vercel.

So yes, we could keep this; I don't recommend it, but we could, and then build out the front end using something different. Express has a lot of front-end support as well; in other words, `index.html` could bring in something like Pug, or some other JavaScript you might want, that then calls the API itself. But that starts to get complicated, and at that point you might as well use Vercel with Next.js to build out the front end and make it even more powerful. I really wanted to show you how those two approaches work, so that if you wanted to use something like Vercel to deploy your applications, you totally could. All of this is possible because of how flexible our deployment
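To illustrate the "bring your own front end" idea, the static `index.html` could fetch `/api/leads` and render the JSON itself; a hypothetical sketch, where the `{ email }` record shape and the `results` field are assumptions about the API's response:

```javascript
// Turn an array of lead records into list-item markup.
// The { email } shape is assumed for illustration.
function renderLeads(leads) {
  return leads.map((lead) => `<li>${lead.email}</li>`).join("");
}

// In the browser this might be wired up roughly like:
//   fetch("/api/leads")
//     .then((res) => res.json())
//     .then((data) => {
//       document.querySelector("#leads").innerHTML = renderLeads(data.results);
//     });

// Prints: <li>a@example.com</li><li>b@example.com</li>
console.log(renderLeads([{ email: "a@example.com" }, { email: "b@example.com" }]));
```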
options are, and Neon is well suited for all of it. The fact that we didn't have to touch our database at all for those last few deployments was so nice, and that's because of how we configured everything to work in this automated fashion.

When it comes to deploying to Vercel, the other thing to consider is that you don't always have to have your database URL exposed inside the application itself. Even if you were using Next.js, you could still use the Parameter Store approach we did, and in your workflows you could still run the dev stage: as far as the dev stage is concerned, you would stop there, create a branch (maybe the deployment branch), and you could also keep the pull-request stage. Actually, everything here, with maybe the exception of deploying a development stage on AWS, could still work in a Next.js application, and Next.js itself can do a bunch of things with that.

If we wanted to deploy our development version on Vercel, we could go back into Vercel, into our project here. The first thing I need to do is jump into the settings, go into my environment variables, and add one specifically for a Preview environment. I'll grab the branch itself, which will be our dev branch, and add the STAGE variable with the value "dev", then save. So now we've got a preview environment on that stage, which gives us dev there. The next thing is to jump into our deployments and create a new deployment: the commit or branch reference is going to be dev, so let's do a quick search for the dev branch. There it is right here; it happened just a little bit ago. I'll create a deployment for that, and this should work: it's basically the same thing, just using the dev branch itself. So
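The same branch-scoped variable just described in the dashboard can also be added from the Vercel CLI; a sketch, assuming the CLI is installed and the Git branch is named `dev`:

```shell
# Add a STAGE variable scoped to Preview deployments of the dev branch.
# (vercel prompts for the value; enter "dev".)
vercel env add STAGE preview dev
```

Scoping the variable to a branch means only preview deployments built from `dev` see `STAGE=dev`; other previews and production are unaffected.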
in other words, the dev branch should have the same data our production one has, because we've been deleting and recreating those branches. What I really want to see here is that the dev branch actually works and is something different; again, it's just another way to branch out our database, run some tests and preview environments, and all that.

So here it is: there's our preview environment, and I think it's working. Let's give it a shot by looking at the domain itself; there's our "hello world". Let's go into the API, `/api/leads`, and make sure that's working. What we should have here is something maybe slightly different from the other one; this has 18 values in it.

Once again, I could go back and create a staged version. I'll update this, commit it as "staged version", and push. Of course, one of the challenges with pushing this is that my dev stage will quite literally create and delete a branch, which could cause havoc or issues with staging this version, so it's not necessarily something I recommend doing. But we can try it out, just to see that you can absolutely deploy different versions of your application with different stages and different branch databases at the same time. In our case, the staged version just came out 20 seconds ago, so we can take a look. I'm not confident that the API is going to work here, but we can give it a try; at least we've got some other text in there. Let's hit `/api/leads`; I'd imagine it's either going to work or it's not, and in this case it actually did work, which is great. So we now also have those different stages on Vercel, and each stage could have a different domain, as well as the
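One way to reason about the create-and-delete hazard mentioned here is a simple stage-to-branch policy; a hypothetical sketch, where the branch naming and the protected stage names are assumptions rather than this course's exact workflow:

```javascript
// Decide whether a workflow stage may delete its database branch.
// Throwaway dev branches are fine to delete; production never is.
function canDeleteBranch(stage) {
  return stage !== "prod" && stage !== "staging";
}

// Map a deployment stage to the Neon branch it should read from.
// "prod" uses the primary branch; other stages get their own branch.
function databaseBranchForStage(stage) {
  return stage === "prod" ? "main" : `${stage}-branch`;
}
```

A guard like `canDeleteBranch` in the workflow is what would keep a staged deployment from having its branch yanked out from under it.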
main production one, which can also have its own custom domain. That's maybe one of the main reasons you'd end up deploying to Vercel: it's quite a bit easier to put a custom domain on Vercel than it is on AWS in a serverless environment. So yeah, that's pretty cool. I think Vercel is a great tool, and now I would recommend that you start converting all of this to Next.js, which is really the next part of so much of this. But even if you don't, we now have a really solid foundation for an API service where we can build out a lot more endpoints if we need to.

Hey, thanks so much for watching; hopefully you got a lot out of this. I want to leave you with two challenges. One is: how do you actually implement authentication, whether it's users or other applications? How do you secure our API more? That will be a really good challenge to undertake at this point if you were able to get this far. The other is: how do you convert this Express.js application into a Next.js application? Those are the two challenges I want to leave you with. They are fairly straightforward, especially because there's a lot of third-party support out there, but I think they're really good challenges you can undertake, and they can feed into each other as well.

So thanks again for watching. My name is Justin Mitchell, and I really enjoyed sharing this with you, because I think serverless is one of the most fantastic things you can learn about and implement: it really helps us focus on just our code. We don't have to worry about the infrastructure; that infrastructure is taken care of by the different services we leveraged. AWS Lambda will scale up if we get a lot of users coming in, and so will our Neon database, and so would Vercel if we went there. It's really great to just focus on the code itself and not worry about the infrastructure; that is one of the keys of serverless, and that's why I wanted to create
this course for you. So thanks again for watching, and I look forward to seeing you again in the near future. Take care.
Info
Channel: freeCodeCamp.org
Views: 57,489
Id: cxgAN7T3rq8
Length: 244min 24sec (14664 seconds)
Published: Wed Feb 28 2024