Live Coding! Serverless API with AWS Lambda, Typescript, Nodejs and DynamoDB!

Captions
All right, what is up? Welcome to another night of live coding. Tom McGirl here, thanks for coming by. If you've been following along, we've been building a craft beer cataloging application using Recoil.js, React.js, and Expo/React Native. Tonight we're doing something a little different but in the same vein: we're going to spin up a back end for that app, but we're not going to host a server ourselves. We're going to let AWS host it. We'll build a serverless backend using AWS Lambda and DynamoDB, written in Node.js with TypeScript. I'm really excited for this; it's something I haven't really done. A little disclaimer: I took a quick look beforehand so I wasn't totally unprepared, and we do use this stack at my current job at Electric AI, but I've never personally spun up anything serverless, so I'm excited to try it out here with you all. You don't need to have watched the last series; that's not a prerequisite. We're spinning this up from scratch, and while we'll use it to back that app, this is a fresh one-off video. The Twitter URL is down below if you want updates on when I'm going live, and the GitHub URL is where I'll host the repos. I open source this stuff so everyone can join in. You can also check out the YouTube channel, where all the videos get posted. So let's dive right in.
Before we do that, I forgot to mention: I'm sticking with the schedule of Tuesdays from 9:00 p.m. to 11:00 p.m. Eastern. I'm hoping to add more days as this gets more fun and I have more time on my hands, but for now we'll stick to that. Again, follow on Twitter and you'll get those notifications. All right, let's jump in. As I mentioned, we're building a serverless REST API with Node.js, AWS Lambda, and DynamoDB, and here's why. In past streams we've been building out this craft beer cataloging app, and up until now all the data has lived locally on the device. I have the dev server started here, so I'll run the iOS simulator to show what I mean. I want to make a back end to hold this data, and again, you can apply this serverless API approach to anything you're building. We're going to build an API backed by DynamoDB, a non-relational wide-column store. It will let us store and query data, and we don't have to think about deploying to a provider like Heroku, spinning up our own instance, or writing routes in Node and Express. We'll let Amazon handle that, and we'll go through it as we build. There are a lot of steps, so I'm hoping we can get it all done today; if not, we'll continue next week, but I think we'll get through a good portion. I'll post this on YouTube this evening, so if you miss anything you can rewind or watch at whatever speed you want. I've seen in the comments that some folks listen at 1.7x speed, so I must sound really fast to them. If that works for you, awesome, I appreciate it. Right now I'm loading up the application. This is just the domain I'm building for; it doesn't have to be what you're working with, but it gives me an idea of what I actually want to store. I have these beers we've been adding in previous streams: there's a screen where we add new beers, and we show them sorted by brewery and sorted by style. But they're stored on the device. They're local, which means if I spin up another instance of this app, for example the web version, you lose the list, because web storage is separate from the app's storage. And sure enough, in my web version I don't have any beers; if I added one there (well, looks like that blew up, but if I did) it would be a different list. We want to persist this across all of our devices and have different instances pull from the same data. That's why I want this back-end API. So let's look at what we're going to put in it. We need an API to store these beers, and the fields I currently store for each brew are the name, the brewery, the image, the style, and the rating. I want to put those into a database so that when I load the app I can request that data. The app was built with Recoil.js, so it would be cool to eventually figure out how to make that request with Recoil.
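The brew record I just described can be sketched as a small TypeScript model. This is my sketch, not code from the app; the field names mirror what I listed (name, brewery, image, style, rating), and the sample values are made up.

```typescript
// Sketch of the brew record described above; field names mirror what the
// app stores today. Sample values are invented for illustration.
interface Brew {
  name: string;
  brewery: string;
  image: string;  // URL of the label/bottle image
  style: string;  // e.g. "IPA", "Stout"
  rating: number;
}

// Example item as it might be stored in DynamoDB. Because Dynamo doesn't
// require a schema up front, fields can be added later without migrations.
const sampleBrew: Brew = {
  name: "Example Haze",
  brewery: "Example Brewing Co.",
  image: "https://example.com/haze.png",
  style: "IPA",
  rating: 4.5,
};
```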
Again, though, today we're going to focus on just building the API, and you can adapt it to whatever you're building. You don't have to store brews; follow along and change it to whatever you want to store. So let's talk about serverless. I have the Amazon tutorial up here in case we want to follow along, plus a tutorial from Serverless (the framework we're going to use, which helps you scaffold things out) and an example TypeScript repo from the Serverless team. Before we dive in, let's talk about AWS Lambda a little; I'll zoom in on the diagram they already made. Serverless is going to let us host an API without thinking about servers, deployment, or scaling. This isn't a real production application; I'm not going to be getting rounds of funding for this or anything. (You never know, it's a pretty cool app, but no.) I just want to play around and learn stuff, and that's what this is for. Lambda lets us ignore all of that and focus on our code and our use case: we write functions that get invoked either by an event in our system or, in our case, by API Gateway, so we can make requests just like against a REST API. We also want to store and retrieve data: when we create a new brew we want to store it, and when we load the application we want to pull all of that data down. We're going to use DynamoDB for that. Amazon has other database options, a few actually; the more traditional route would be a managed MySQL database through something like RDS, Amazon's relational data store. But Dynamo works perfectly for this app, because Dynamo is a non-relational wide-column store. That means it's really good for data you don't have to do a lot of joins on or model many relationships around, and we don't have many relationships. We have some, like a brewery has brews, but we're not really reflecting those relationships in the app. The only place we reflect them is sorting, and we handle that by taking all the brews and sorting them on the front end. We never actually query for just one brewery and its brews. We may do that later, and Dynamo would actually work for that too, but it's pretty much perfect for what we need, because we just want a list of all the brews. In fact we're doing what's called a scan of the table to pull down everything, rather than a query where we'd supply an ID and say "give me this brew." Maybe eventually we'll do that, but not for now. The other benefit is that I don't have to write my own Node server with Express, or, if you're in the Python world, something like Django, Flask, Tornado, or FastAPI. Amazon handles that portion: the hosting, giving us a URL, all that good stuff. The only thing we do is write the functions that run for a given route, and the code inside those functions, for example storing the data or inspecting the request. So let's get started.
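To make the scan-versus-query distinction concrete, here's a hedged sketch of the request parameters each operation takes in DynamoDB's DocumentClient. The table name and key are made up for illustration; we haven't actually created a table at this point.

```typescript
// Hypothetical request parameters illustrating the scan-vs-get distinction
// discussed above. "BrewsTable" and the id value are invented; these objects
// would be passed to DynamoDB DocumentClient's scan() and get() calls.
const TABLE_NAME = "BrewsTable"; // assumption: table doesn't exist yet

// Scan: read every item in the table. This is what the app does on load.
const scanParams = {
  TableName: TABLE_NAME,
};

// Get: fetch one item by its key ("give me this brew"), which the app
// doesn't need yet.
const getParams = {
  TableName: TABLE_NAME,
  Key: { id: "some-brew-id" },
};
```

Scans read the whole table, so they get expensive as data grows; for this app's "load everything" use case that trade-off is fine.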
The first thing I have to do, which is outlined here, is install the Serverless Framework. You don't strictly need the framework to go serverless on Amazon, but it gives us a really nice way to declare our Lambdas and our Dynamo table. Normally I'd have to go into the Amazon console and set up a Lambda through the UI. The framework instead lets us establish all of that through code, what's called infrastructure as code: we declare what our setup looks like, saying "I want this many Lambdas, these are the handlers, this is the database, these are the routes." Under the hood it uses AWS CloudFormation, which lets you specify via a YAML file, or in this case a TypeScript file, what your services should look like. It's easier to see than to talk about, so we'll dive into it shortly. First let me install the framework, and then we'll create a user, which is where the trickiness of Amazon comes in, but once it's all set up it's really great. I'm going to run `npm install -g serverless`, which installs the framework globally on my machine, so I can run the command `serverless`, or `sls` for short, anywhere. If you'd rather not install things globally, there are options for that too: if you like your development environment separated and containerized you could use something like Docker, or on Windows you could use the Windows Subsystem for Linux (WSL). I got a permission error because it's a global install, so I'll just rerun the command with sudo. While that's installing, I need to create a user. I've set up an Amazon account, but I want to create an IAM user; IAM manages permissions on an account. Let's go take a look. I'll open a new tab, go to AWS (not the shopping Amazon, this Amazon), and sign in. What IAM lets us do is create a user with specific permissions, so we can isolate which users can do what, and Amazon knows who's accessing a particular application. I could say this type of user, maybe a developer, has access to deploy, and maybe there's an admin who can do other things. You'll also see we can create groups, which is cool. Let me put in my credentials here... one second... there we go. Very secure, I know. But we do want some security here: we're dealing with real credentials, and at some point when I'm copying secrets I'll have to move the window off screen, because obviously I can't share those. Let me zoom in. You can see the absolutely crazy number of services in AWS, and it can be overwhelming, but don't get overwhelmed. We're going to navigate this together. I'm learning, you're learning, it's
going to be fun so the first thing we want to look at is iam and how you may say how did i know to do this well when i first looked at this example on the aws website they have this section about creating let me see if i go i think i got to go back about creating that so i'm not hosting a static website manage users this is a section where they talk about authentication but they also talk about iam roles so you have your user pool create an amazon cognito user pool this is for authentication which we're not doing at the moment but basically this is creating the dynamo table here it is skipped right over it creating the iam role and so you're going to see as we do this we're going to have to give this role different permissions and i i know the permissions i need to give it because i i tried it out before earlier but what i'm going to do is i'm going to pretend like i don't know and you can see how i kind of whack a mole where like it asks for one permission and we add it it'll air out say we need another permission we'll add it and so this way we're adding only what's needed and it's kind of a cool process that i went through and i kind of want to share that with all of you even though it might be boring but it's i think it's a really cool experience okay so i have my iam roles and so what i'm going to do is i'm going to go to groups i have groups developers and so this in developers i made this tester before i've already set some permissions but what i'm going to do is i'm going to go ahead and so you see i am full access lambda function full access and a few others what i'm going to do is i'm going to go ahead and just create a new a new role to try this out so what i'll do is i'll create a new group let's call it live coders and for now what we're going to start out with is we're going to want to give it access to lambda we're going to give aws slam to full access just to start just so we have something all right so we're going to create that group now we want 
to create a user so i have this brew book tester i'm going to go ahead and use this but i'm going to point it to a different group so i'm going to modify this here and i'm going to remove it from this group all right and we're going to add it to the new group and so all i did to create this was i said create new user it makes you name it and then you pick a group so just like i'm doing now except i'm just removing the old group i'm going to add a new group i'm going to add it to live coders add to groups so now we have live coders that's our user and so now what i'm going to do is i'm going to get the permissions for this user i'm gonna do that in a second because what i need to do is run this command so here if i go to serverless and you can see there's a configuration setup where we input our configuration so we installed serverless so here it says run the blow command and follow the prompt so we can do that in a second that we're not going to do because we don't have a pro account so we're just going to do a search for serverless set iam role typo there sorry about that i am roll so this is the infrastructure you'll see in a second but first what i want to do is i want to run let's see so i've stored a command here that i want to run to scaffold at our app let's try running that and see what happens so this command is going to actually scaffold out our application so right now i'm in my brew book react native folder but what i'm going to do is i'm going to go up a folder hopefully that's not too small there we go that should be better so i'm in my github repos it's where i put a lot of my stuff and i'm going to paste this command and so let's go through this command what it's doing is sls which is the server list command create i'm using the template aws node.js typescript i'm giving a path lambda test i'm going to change that we're going to call that this is going to be the actual folder that it's going i'm going to call it brew book api and i'm going to name 
the api also brew book api so create that all right so now i'm just going to do a quick command for sls config and so what i want to do is i want to now input my iam role into into the server list so that it knows when i do a deploy and things like that that it's going to use the correct role so let me just google how to do that again how to set i am role for serverless framework okay let's see customize sermous i am policy i had this here as well maybe it's in here so let's see i am no nothing in here about it i know there is a command i ran out before let me see let's see if i can find it here on the machine where i ran it let's see give me one second oh yes okay so there's a command serverless credentials so let me just pull that up serverless credential config credentials so this is a command i want so here what i'm going to say is serverlessconfig credentials my provider is going to be just aws my key is going to be my i am role key and my secret's going to be my i am role secret now you won't be able to see that because i'm going to have the secret off screen but i will show you what the key is here so i'm going to go to live coders this is the group name i want to go to the specific user so users brew book tester and then i have my security credentials and so you can see my access key here i can deactivate these if i want so i'm going to do that and what i can do well let's make this one active and i need to create a new access key so let me close that delete that i'll delete this one and i'll create a new key so i'm not going to show you my secret i'm just going to move this off screen really quick but i'm going to copy that so first i'm copying my key here and i'll type this part out so you can see it and then i'll just move it off screen to enter my secret so again it's serverless config credentials and then provider aws and then key looks like i'm going to post my key and then secret and i'm just going to move this off screen real quick while i post my 
secret. All right, it's set up. I'll clear the terminal and close that. My IAM credentials are now hooked up to Serverless, which will let me actually run commands. The first thing to do is look at the repo the scaffold created, under brew-book-api. You'd call it whatever you want, for whatever API you're building: maybe you want to store your favorite comic books, or a list of to-dos (like every other front-end tutorial), or wines, or your favorite foods, anything. And because we're using a non-relational database, we don't really need to define a schema in advance; we can put in whatever keys we want and change them whenever we need to, which I think is one of the super cool things about this. All right, let me adjust the window a bit... there we go. Let's look at the code it scaffolded. There's a webpack config, which we don't need to worry about right now, and since we're using TypeScript you can see the .ts files. One thing that stands out: usually Serverless generates a serverless.yml (YAML, like JSON or Markdown, is just a file format, in this case for configuration), but because we used the TypeScript template, it specifies the configuration as a typed object instead. The editor is complaining "cannot find module 'serverless/aws' or its corresponding type declarations," which we can probably fix by just installing the dependencies, so in the project folder I'll run `npm i`. While that's happening, let's look at what this configuration is actually doing. It's pretty verbose; there's a lot going on here, and I went through it for a while trying to figure it all out, so we'll go through it together. First there's the service name, in this case brew-book-api. There's a framework version that was scaffolded for us, some custom webpack handling, and the plugins list, which includes serverless-webpack. Then the provider: we're saying AWS, because with Serverless you could use other providers, like Google Cloud Functions or Azure Functions. Maybe we'll try those out sometime and see how they differ; AWS is just the one I'm most familiar with from having used it a little at work, and it's been around a while, so it's pretty robust. It might make a good video afterward to compare and contrast the pros and cons of the different providers. I like that Serverless as a framework abstracts a lot of the work, but you'll see once we start doing more that we'll be in the AWS UI clicking around, and credit to them, I don't know how they made a UI for that many services. It's not the best, but for what it has to do, it's pretty sweet. Next, the runtime: we're using Node.js 12, so we get the benefits of modern Node, async/await and import syntax, and you'll see that in a second. Under API Gateway there's a minimumCompressionSize setting; some of these properties we'll have to look up, but if I had to guess, it's the minimum response size before compression kicks in. Then environment: these are environment variables. Once we have a DynamoDB table we'll add its name here, and once we have an S3 bucket (I'll get into why we need one in a minute) we'll put that here too; anything we add here can be accessed in our code. Now here's the cool part: the functions. This is what tells Lambda when to hit your endpoints. We're fronting our Lambda with API Gateway, but you don't have to; a Lambda can just be a pure function, and really almost all Lambdas should be pure functions, because there's no running state. These workers are spun up quickly, given arguments, and they execute. They don't have in-memory state; you're not running Redis on a Lambda (maybe there's a way, maybe there's a use case, I'm not sure, but there's no in-memory state here). Anything we store goes into Dynamo, our database, and gets pulled back out. The functions are spun up on the fly, and that gets us into a different topic we can talk about, cold starts: because these things are spun up on demand, the first invocation takes a little while.
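To tie the walkthrough together, here's a trimmed sketch of the shape of that generated serverless.ts. I'm reproducing it from memory, so treat the exact field names and nesting as approximate; the real scaffold also includes the custom webpack section and more provider settings.

```typescript
// Trimmed sketch of the aws-nodejs-typescript template's serverless.ts.
// Field names are approximate; the real file is typed against 'serverless/aws'.
const serverlessConfiguration = {
  service: "brew-book-api",          // the name we passed to `sls create`
  frameworkVersion: "2",
  plugins: ["serverless-webpack"],   // bundles the TS handlers for deploy
  provider: {
    name: "aws",                     // could instead target azure or google
    runtime: "nodejs12.x",           // modern Node: async/await, import syntax
    apiGateway: {
      minimumCompressionSize: 1024,  // compress responses above this size
    },
    environment: {
      // table / bucket names go here once those resources exist,
      // and become process.env values inside the handlers
    },
  },
  functions: {
    hello: {
      handler: "handler.hello",      // exported function in handler.ts
      events: [{ http: { method: "get", path: "hello" } }],
    },
  },
};

export default serverlessConfiguration;
```

The functions block is the part doing the routing: each entry maps an HTTP event (method plus path) to an exported handler function.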
That first slow invocation is what we call a cold start. We're not worried about that here; not many people are going to hit this API, and if they did it would start to cost money, but we'll make sure we spin it down before it gets there. The free tier does give you a generous number of free Lambda requests per month (on the order of a million), as well as an allowance of compute, and we'll go into that once we start looking at the UI. They've done an amazing job with their billing, and rightfully so: so many companies use this that understanding the billing is a whole task in itself. I'm sure there are people whose entire job is understanding the billing of their infrastructure. Anyway, I digress. So in functions we have a function called hello, the standard scaffolded one. It has a name for the handler, and it has events mapped to it, saying: when there's an HTTP event of type GET on the path hello, execute this handler. What is that handler? It's in our handler.ts file; I'll pop that up to the side (you know how much I love doing that). Check it out: we have our handler file here, and you can see it imports APIGatewayProxyHandler. Again, API Gateway is what turns our Lambda into an API. A Lambda could just be a function, as I mentioned; you could give it two numbers and it adds them, and that would be a Lambda. But our Lambda is serving as an API, so it has API Gateway sitting in front to handle the POSTs, the GETs, the PUTs, the DELETEs, all of the HTTP request types we accept, and those events are what trigger our Lambda. So we have this hello, of type APIGatewayProxyHandler. It's a very specific type: an asynchronous function, which is really cool. We're using TypeScript and modern Node, so we benefit from async/await, which is fantastic. I've seen this stuff written in Python, and I know I'm biased, I love front end, I love JavaScript and TypeScript, but I really like the way this looks. I know Python has async/await too, and there are ways to do it; I just haven't. So the handler gets two arguments, the event and the context, and this basic example just spits out a 200 and a message: "Go Serverless Webpack (Typescript)! Your function executed successfully!" Let's leave that for now and give it a shot. We'll push this plain serverless scaffold up and see what happens; you're going to see it complain about some permissions, we'll adjust them, and we'll go from there. To do that I run `sls deploy`. Very easy. It kicks off the deploy, checks the IAM credentials I plugged in earlier off screen with the config credentials command, and tells me whether I have permission to do this. I'm going to need a few different permissions, and you'll see them one at a time. The first error says the user is not authorized to perform cloudformation:CreateStack. This is actually very cool: it's saying this role isn't allowed to perform that operation. What does that mean? CloudFormation is how Amazon handles infrastructure as code. It's how this serverless.ts gets converted from code into actual resources in Amazon: the Serverless Framework takes the config, passes it to CloudFormation, and says, hey, spin up these machines, spin up everything, configure it. But Amazon is smart: the role we gave the user doing the deploy doesn't have permission for that, and that matters, because doing it can cost money. So that's the permission we'll grant first.
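The scaffolded handler looks roughly like this. In the real file the types come from the aws-lambda package (APIGatewayProxyHandler); I've inlined a minimal result type here so the sketch stands alone, and the message string is paraphrased from the template.

```typescript
// Simplified sketch of the scaffolded handler.ts. The real template types
// the function as APIGatewayProxyHandler from "aws-lambda"; the minimal
// response shape API Gateway expects is reproduced inline here.
interface APIGatewayProxyResult {
  statusCode: number;
  body: string;
}

// API Gateway invokes this async function with the HTTP event (and a
// context argument, omitted here); the returned object becomes the
// HTTP response.
export const hello = async (event: unknown): Promise<APIGatewayProxyResult> => {
  return {
    statusCode: 200,
    body: JSON.stringify({
      message: "Go Serverless Webpack (Typescript)! Your function executed successfully!",
      input: event,
    }),
  };
};
```

Because the handler is just an exported async function, it's easy to unit test locally by calling it with a fake event, no AWS involved.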
going to go ahead and give it cloud formation create stack permissions and this is what i mentioned before we're going to kind of whack them all right a new permission error pops up we're going to pop it new one comes up keep going sweep hand on the wall you understand let's go all right so let's pull up our i am roll so i think i closed that window pull this up solutions no i don't want solutions go to my account i want to go to the console fancy console look at that loader bam i am which actually check this out that's what i was talking about they figured out a way to make it easy to navigate this plethora plethora of services i am right here let's go all right so let's find our user bam rubric tester that's our user part of the group live coders we're not going to edit the user's permission we're going to edit the group's permission so that any users in this group will get those permissions so let's go to the group live coders permissions attached policy so remember it was cloud formation create stack so we're going to go ahead and search cloud formation so we have cloud formation full access formation read-only access deploy role for cloud formation deep racer yo what easy full access that's it all right whack-a-mole time let's try deploy that again let's see what else pops up you can do this too you can go through this process if you'd like or you could watch me do it like a sucker and then once i put all the rolls in there you just copy them um i'll post i'll when i put this in the repo i'll put a readme i think that means i'll put a readme and put like the permissions that i had to give the iam roles i think that would be useful so we'll try that all right so now it's going to give us another error so it's got past that point right it has the ability now our role can use cloudformation to create the stack the next thing you're going to see is likely related to the logging that we're going to use with aws um i believe it's called cloud log let's wait let's 
wait and see um so yeah so it's like a little bit of a process to get these permissions set up but it's worth it right because it makes us feel secure knowing that we could give someone a role and let them do things let's say for example you have a developer there we go let's say you have a developer who you want to be able to make changes but you want to lock deployment down to certain people or you want to lock deployments that involve changing infrastructure to certain engineers you could do that right you could have a service reliability engineer group you could have a we can call them a devops group you could have a developer group and the devops can change the cloudformation stack using infrastructure as code but maybe the developers can't maybe the developers can deploy handlers but they can't do that or maybe they can't deploy new logging infrastructure only this type of person can or maybe you have admins that can do it all right so now what it's saying is it's not authorized to perform apigateway post on resource apigateway east one so we need to give it the ability to use api gateway so we're going to attach a policy here api gateway so we have api gateway invoke full access api gateway administrator api gateway push to cloudwatch logs so let's give it api gateway administrator this way they can create those and let's go ahead and run it again while that's running because i kind of have an idea of what the next one that's going to come up is i'm just going to go ahead and pull up services and i'm going to load up cloudwatch i believe because i think we're going to need that in a second so i wanted to go over cloudwatch because cloudwatch is pretty cool what it does um is it handles all the logs and monitoring for our services so it'll show your service and you can look at the logs so if you console.log you can view it in cloudwatch pretty cool but so now it's going to complain about not authorized to perform iam create role so we need because we
need to create a role so i'm going to give it access to that let's go here attach policy iam look iam full access we're going to give it iam full access and you probably want to be more specific with this stuff um we're actually doing the creation though so we're going to need full access but you know you want to be more specific right you don't want to give very general permissions you want to give permissions only for what's needed again i'm not an expert i know that full access worked uh and i think that's what we need so let's go read only definitely won't work right because we're doing some actual uh write operations some change operations um and if anyone has questions in the chat feel free to ask i'm happy to answer any questions that come up to my knowledge right so some stuff i'm learning so we'll figure it out together all right and so once this is up what we'll do is we'll actually try hitting this endpoint so i can do that via curl but what i'm going to do is i'm going to see if i have postman installed here and i don't so i'm going to install postman um because it's going to be useful later when we start doing post requests because i just don't like doing post requests in um curl it's just not great especially sometimes i have to post in the case here i'm going to be posting some base64 data and it just works better so here i'm going to go ahead and download this let's see i don't want to sign up for a full postman i can sign in i do have an account let's do that i'm going to sign in here and i just want to download the client also i can actually do it via web now that's pretty great i'm gonna download the desktop agent though because i really like it it um it's pretty great if you've never used postman i highly recommend it for testing apis uh you can create collections so you can have a collection for a different application you can have a collection for a different service let's say you work at a company or you're
just working on a project yourself and you have a few different projects you can group up services with collections and you can have all your example requests you can do tests with it you can set up fake responses you can save responses really really powerful we use it at my current job at electric ai and it's awesome we make sure we test all of our applications with it when the back end team is finished working on a particular service they'll send me the postman and i'll be able to run it and try it out it's fantastic so i'm going to just pull that i'm going to open up postman so that we can once this is up we can try hitting it so let's see what else to complain about nothing complained about nothing this is great look at that wow very good okay so we're not going to activate metrics thank you amazon nice try maybe in a little bit i do like the service if i like the service i'm willing to pay for it i'm getting what i like out of it all right but so what we want to do is let's try hitting this so here's our endpoint gives it to you right there it's really awesome and what we're gonna do is we're going to hit this end point and then we're going to go ahead and take a look at the console and look at the lambda um maybe we'll do that first maybe you want to see you want to see it let's be real i know maybe we'll maybe we'll view the lambda first come on you want to see the request let's open up postman where are you posting i just talked about how cool you were now you're not even opening what's going on let's go all right all right i get it let's see see where we're going with this i'm gonna have to do it from here on our all right i'm gonna do it from the web uh let's just call let's just not do this let's just click this paste and get all right ready block because of course of course see because we're from another website that's why i need the postman app to work so let me see maybe let's go ahead and do this let's remove this it's not clearly not working and i'm 
just going to go ahead and it's not open lies okay so postman is misbehaving that's okay we can curl it we can do it we talked about it before we're just gonna do it so the way we do that is we are going to use curl give it a method the method's get we're just going to paste that bam all right so we made a request and here you can see uh it gives us a bunch of information some headers right but here's the message go serverless webpack typescript version 1.0 your function executed yes this is great i think it's pretty great so we were able to execute our function we got our response now let's check out what sls deploy did for us so i'm going to go to my get out of here postman fail me fail me leave i'm going to go over to my console and we're going to look at our services so first it should have created a lambda so we're going to go over to lambda and you'll see what it spun up here so here we have our brew book api dev hello that's this that refers to this hello the function right there individual functions there's our hello function and check it out it gives you a little diagram we have our lambda and we have api gateway sitting in front and it even has our code you'll notice this is not very useful the reason it's not very useful is because it's minified right it's minified because we had written it in typescript and we ran it through the typescript compiler we didn't do it but when we ran sls deploy it ran webpack it ran our typescript through the typescript compiler it minified that code and put it right here now if you didn't go through that process or if you change your ts config so that it doesn't do this you could edit your code here now your code doesn't have to be minified here right it's sitting and being hosted on aws they don't care if your code is minified it's not being deployed to a client so if you wanted to do that what we could do is we can go modify our ts config and make it so that it doesn't minify and obfuscate the code however we're
not going to edit code here because we want the benefits of working in typescript and this editor just doesn't look as good as vs code sorry amazon microsoft has you there right i wonder if azure functions has vs code built in i bet it does i bet you i bet we'll find out but so for now what we're going to do is we're going to look at our logs let's see if anything got logged so down here we go to monitoring up here sorry and we have our cloudwatch metrics so cloudwatch is a service that handles logging it also shows you all this amazing stuff like normally we'd have to configure this ourselves right maybe heroku and other services would give us some of this i'm sure they would but look how much it gives you so it shows your duration how long it took success rate we had no errors look at that yo we have had no production errors on this api that's sick i mean we've made one request and it does nothing but at least we can say now when we give a talk at a conference we can go up there and be like we wrote an api on lambda and it had no runtime errors zero perfect perfect api you can do that you can say that now and here you have the log stream of the invocations so you can see the time when it happened 826 i don't know what time zone this is in because it's definitely definitely not 142.
so i don't know it doesn't help me but let's click the log stream and we'll see what happened so in our log stream here we can actually see the logs for the invocation and some of them are useful but if we add our own like console logs you'll be able to see those but so here you can see the duration that it started it ended and it did a report started here did it ended did a report so super cool and as we add console logs this will become more cool all right that part's out of the way now the fun part so we have our serverless function we have it running it doesn't really do anything we want to make it so i can post some data pop that into dynamodb so now we have to set up dynamodb and for this we're going to use an online resource and a tutorial because we're going to go through this together so let's see so here this is just an example here of an aws typescript so this is right from serverless this is an example of a typescript app with dynamodb we're going to kind of go in here and take a look they're not using the serverless typescript file they're using the serverless yaml that's fine we can convert this to typescript as long as you know how to read yaml and you know what's an array versus what's an object it's pretty easy to kind of understand so here is the part about dynamo so there's well there's this part what this is doing is this is creating some actions so you're saying uh your role should be able to handle these actions right we should be able to query um i'm just going to open this here as a reference we should be able to query we should be able to do a get a put an update a delete and then you have the resource and there we're specifying the exact table we don't have to do that we can specify like any resource like an asterisk and that's probably what we're going to do so i'm going to put this over here as a reference i'm going to put my code over here i'm going to go ahead and get rid of this get rid of this and we're going to open up
serverless.ts yes but not that okay and let's configure this a bit so first what i wanna do is i'm gonna change my handler um instead of calling it handler hello we're gonna call it handler uh save brew that's going to be the name of the handler the method is going to be a post and the path is going to be save that's it we don't need to call it save brew we'll just call it save and so here what we'll do is in our handler let's just go ahead and update that our handler will change const hello to save brew and right now it'll just do the same thing nothing's changed there and we won't deploy yet we're gonna make some more changes so again the first thing we have here is our provider aws environment api gateway we're going to add some configurations here so first what we need to do is create a dynamodb table so let's do that so to create a dynamodb table again we could have gone over to our aws console and created it but what we're going to do is we're going to leverage oh sorry i lost my example there we're going to leverage this to create our table so first we're going to create a dynamo table name and we'll put that up here and we'll call it something like brew book api dev for now because it's going to be our dev environment so we'll call it uh dynamo let's see table and we'll give it a name in this case they're calling theirs so they're referencing you can see here they're using yaml to reference self.service so that references the service up here the stage dev it's dev in this case and the provider stage which they've written here as provider stage is where is that where is that where is that um i thought it was here but i'm pretty sure it's okay so they're saying this or this so option stage or provider stage so we're going to use dev for that so what we'll name ours is we'll follow that same pattern so we'll do our service name is brew book api so we'll say uh we'll make a constant for that service name brew book api and we'll
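so the functions section of serverless.ts ends up looking something like this — a sketch, with the handler name and path taken from what we set up on stream (the exact shape of your template's config object may differ):

```typescript
// Sketch of the functions block in serverless.ts: one saveBrew handler
// exposed as POST /save through api gateway.
const functions = {
  saveBrew: {
    handler: "handler.saveBrew",
    events: [{ http: { method: "post", path: "save" } }],
  },
};
```

in the real serverless.ts this object sits inside the exported service configuration next to provider and resources.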
change this to be service name and we'll make this a template string of oops sorry service name and then we'll do dash dev i think that's good we could say db or something but it's the table name so that'd be weird so we'll just call it service name brew book api dev for now um and so that's referenced here and then down here uh we're gonna actually use that so first we're gonna create these iam role statements so you can see we don't have that here yet so we're going to create a new thing um also we want to expose we're going to want to expose our dynamo table environment variable so we're going to do that here we're going to say dynamo table and we're going to just not put it equal to anything because it'll be equal to itself so dynamo table equal to the value of dynamo table okay so now what we're going to do is add that iam role statement so you can see we have our environment here right and then below that we want this just like this so here they have environment and below that they have iam role statements so we're gonna do the same thing iam and you can see it actually gives us that auto completion and so here iam role statements actually takes an array and you can tell because they have this dash thing here that means it's an array in yaml so we're going to give it an array and inside the array it takes objects and so one of the things it's going to have is an effect and our effect is the string allow and you can see it had auto complete there an action and our action is going to be an array and you can see these are all the different actions that we have there so what we'll do is we will take these actions i'm just going to copy these and we're going to put them inside strings so we'll take all these actions because we're going to want scan where we want to get we're going to update we can remove some right now we're only going to use uh query scan well we won't use query yet we'll use scan and put but we'll do that in a second so what we're going to do is just
put these in strings all right there we go and then resource and for resource we're just going to use asterisks for now so it's anything all right let's close that out oh there's brew book so let's go ahead and do resource and we'll just make that oh sorry about that make that an asterisk all right so that's our iam role statement that's a tongue twister okay next we want to declare our resources so here you can see we don't have any resources right now we're just using serverless we're just using a lambda we need to add resources and one of the resources we want to add is a new dynamo table and this will tell aws to set up a dynamo table for us again i mentioned we could do it via the console but this is doing it through the configuration so that when we share this code we can deploy it anywhere if i wanted to run this code it would know everything needed for the deploy it's all contained in this what we call infrastructure as code or cloudformation configuration pretty cool a lot of words a lot of acronyms it's like being in like corporate in the 90s at like some enterprise and we just joined they're hitting you with like a book and like check out all the acronyms all the business acronyms it's like that but we're gonna get through it together don't worry so uh here we have resources so let's add that so resources sits at the same level as functions see here in the yaml so we can do the same thing we're going to just sit at the same level as functions and again we could convert this to yaml but i'm liking the intellisense liking it so i'm going to keep it like this let's stick with that resources so resources takes an object uh the object also is called resources and that's an object and in here we're going to put we're not going to call it todos dynamo table we're going to call it brews dynamo table and this is going to be our name of the resource you'll see it in a second when we go into the console and the type
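the provider additions described above can be sketched like this — the service name, the environment variable, and the iam role statements with the dynamodb actions copied from the example (a sketch, the runtime value and the wildcard resource are assumptions from the stream, not a recommended production setup):

```typescript
// Table name constant: serviceName plus the stage, following the example's pattern.
const serviceName = "brew-book-api";
const dynamoTable = `${serviceName}-dev`;

// Sketch of the provider block in serverless.ts.
const provider = {
  name: "aws",
  runtime: "nodejs12.x", // assumed; whatever the template set
  environment: {
    // expose the table name to the handler via process.env
    DYNAMO_TABLE: dynamoTable,
  },
  iamRoleStatements: [
    {
      Effect: "Allow",
      Action: [
        "dynamodb:Query",
        "dynamodb:Scan",
        "dynamodb:GetItem",
        "dynamodb:PutItem",
        "dynamodb:UpdateItem",
        "dynamodb:DeleteItem",
      ],
      // asterisk for now as in the video; in real life scope this to the table arn
      Resource: "*",
    },
  ],
};
```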
of resource sorry i messed up it's going to be like that and so the type is going to be the same type they have here exactly aws i don't know where yeah i don't know where these strings came from i literally have been copying this setup because it's a good example and this is what it told me to do on the website i think if you wanted to figure out where these come from you would have to look at cloudformation let me just pull it up just just to give you we don't have to dive into it it's going to be pretty i think it'll be pretty intense but let me just just so we're not like thinking it's magic i mean it is like let's just be clear it's magic right it's all magic um but just so we can see what it is this is cloudformation infrastructure as code look this is what i look like right now they knew i feel seen this is me this is exactly how i look i'm looking at this like cloudformation space i look like a stick figure with a floaty head exactly so it tells you how it works you write code infrastructure amazon puts your code into an s3 bucket i've seen that i'll show you that i'll show you what that looks like it's pretty cool runs that through cloudformation cloudformation does magic then puts it somewhere and it's awesome and it spins everything up look at all the people using it next door coinbase come on expedia all right but let's see uh let me see if we can look at the docs i can show you kind of what we're talking about here resources look check it out this is where this is how you get it you just like how do i know you just know you just have to know you should be in the know i'm sure there's show me less say less say less about this so i just know because i copied the other thing but so that's what cloudformation does takes code creates infrastructure all right i'm boring you okay moving on uh brews dynamo table why is it saying why does it say no oh because we have to add the other prop and the other prop we're going to get there in a second
so we have our type and we're going to do the same thing deletion policy retain what that means is um oh it's going to be a string i'm pretty sure what that means and we'll find out i think it means soft delete let's we can look it up in a second but if i'm going to infer it means soft delete meaning that instead of removing the row it marks it as deleted um but we'll see we'll see i don't want to say that and then you're in the comments and you're like tom he doesn't know what he's talking about the deletion policy retain it's not that yeah we'll figure it out for now we're just going to go ahead and skip over that not a big deal right now all this stuff we can dive into the docs we just want to get it spun up fast right all right properties uh so here these basically define the properties of our um database here right so we're going to say our attribute definitions attribute name this is like the name of the attribute its attribute type in this case we're using a string key schema id key type is gonna be hash your id is gonna be hashed read and write capacity units this is where we get into like what we get charged for we're gonna start with one we can always scale up so what i'm gonna do is i'm gonna copy this and then we're gonna modify it in here so first we're gonna say properties and you're gonna see in a second when i set up s3 we're gonna be adding more of this cloudformation jibber jabber so attribute definitions is going to be a list and our first object in here is going to be this all right and key schema is also a list and the first object in here is going to be this just doing some vim magic don't don't mind me all right there we go that's our list and then um provisioned throughput is just an object where we put this good good good separate that by a comma toss a comma here so that it stops complaining change this to this toss a comma here table name this part's pretty self-explanatory our table name is just going to be that constant which we call dynamo table
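putting those properties together the resources block looks something like this — a sketch following the example we copied on stream (id as a string hash key, read and write capacity of one); the resource name and table name are the ones from the video:

```typescript
// Table name constant from earlier in the config.
const dynamoTable = "brew-book-api-dev";

// Sketch of the resources block in serverless.ts: a dynamodb table
// declared via cloudformation.
const resources = {
  Resources: {
    BrewsDynamoTable: {
      Type: "AWS::DynamoDB::Table",
      DeletionPolicy: "Retain",
      Properties: {
        // id is a string attribute and the hash key of the table
        AttributeDefinitions: [{ AttributeName: "id", AttributeType: "S" }],
        KeySchema: [{ AttributeName: "id", KeyType: "HASH" }],
        // one read and one write capacity unit to stay in budget for now
        ProvisionedThroughput: {
          ReadCapacityUnits: 1,
          WriteCapacityUnits: 1,
        },
        TableName: dynamoTable,
      },
    },
  },
};
```

as an aside the DeletionPolicy Retain line means cloudformation keeps the table around even if the stack is deleted, so tearing down the stack won't drop your data.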
put a comma there whoo that was a lot a lot of typing right but hey we're in a good spot so let's go over what we have we've added resources we have a brews dynamo table it's a type it's an aws dynamodb table right it could have been something else remember we could be using some other provider we could modify things we're going to say it's an aws dynamodb table deletion policy retain we're creating some attributes you're going to see this in a second basically what this means is it's going to be key value pairs we're going to hash the id read and write capacity units we're giving it a limit of one this is going to keep us in budget for now and what is this complaining about where did it expect a comma let's format it and see if oh maybe we're missing a there we go bam all right so now let's run it and see if it's able to create our dynamodb table so to show you what that's going to look like i'm going to open up my console here and i'm going to go to services and i'm going to go to dynamo your best bet just search it just don't look through the list just search it ignore the list the list is absurd all right so here we have test api dev this is not the database i've created this is an existing one i was playing around with let's run the deploy and see if we can get that new database created so let's run sls deploy see what happens so again what this is doing taking my configuration putting it in an s3 bucket reading from that s3 bucket putting that into cloudformation right so serverless is definitely constructing like a cloudformation style thing from this i don't think it's using this code directly because again remember this could be used to do google cloud functions azure functions so it's going to convert it into cloudformation style code run that through cloudformation spin up the service and you can see that here uploading cloudformation file s3 right uploading artifacts uploading service brew book api zip to s3 i'll show you what that look
i'll show you that s3 bucket validating template updating stack checking stack update progress while it's doing that let's take a look at s3 so for s3 basically what we can do is go to our services we can search for s3 there's s3 and you can see the bucket that it created so i have another bucket here where i made a test bucket i'll probably delete that but here is the bucket that it created for our cloud configuration files now there's my serverless brew book api dev and here we have the zip file and compiled cloudformation template json you want to take a look let's take a look hope there's no secret stuff in here i'm going to be deleting this anyway so no harm no foul oh please don't open in so just a very similar thing to what we had before you can see it's got a very similar structure but it's a little different because it converted it right it converted it to that cloudformation style so pretty cool okay now let's see if our deploy finished so our deploy finished remember we moved our code to here and made it a post so we could try that out but what i want to show you is the dynamodb table that was created so services dynamodb and we're going to be doing this a lot services dynamodb services lambda services s3 because we want to check out what's going on so we have our tables and there's our brew book api dev table really cool so what's awesome is this console is great i actually do love this i use this a lot at work sometimes to check data if i want to look at some data to debug something and so you can go to items and you can see all the items we have right now we have nothing we haven't posted anything yet so we're going to write the code to do that in a second but here at least we have it set up where we can see our database pretty cool right i think it's pretty cool so what we'll do is we'll try storing some basic information so to do that we need to leverage the dynamodb api and i just so happen to have a good example of that uh somewhere i
believe i stored an article where i was taking a look at that that's the github service provider prop okay so we'll find it together so basically what i want to do is write the handler so that i can actually query um aws we'll pull down the dynamo bucket or we'll we'll pull down dynamo right and there's like a client-side api we'll use we'll pull that down we'll use that we'll do the upload so the way we're going to do that is in our handler so over here in our handler we're going to modify this function so we're going to want to accept an input right so we're going to post something accept that input and store that in dynamo and because dynamo is very freeform right it doesn't have like a it allows you to change the schema on the fly so we don't need to pre-define that or anything like that now we can eventually use libraries that help us with that but for now we'll just use typescript to kind of make it code safe it's not going to be safe in terms of allowing us to read and write things to the database we'll change that later so first things first what we're going to do is we are going to decide what we want to send so based on my app for now we'll just try sending the brew name the brewery name and the style how about that let's just try that for now so basically what we're going to do is we're going to write a handler to take a look at the event the event will have the information it's going to be a post event and so the way that that will come into play is as parameters so we can look up an example of that but i actually have one here that i've done before so i remember that here we have basically let's see let me pull up an example here aws lambda dynamo tutorial we'll pull up and see how to pull these out so here we go let's okay so there's this one this is the one i had up before this back end here's an example pretty cool oh good okay so this is a good example handler event request body all right so let's try this let's do so first what
we'll do is we will we're not going to check for authorization yet again there's no authorization on this api you'll probably be able to hit it if you're trying to do that i'm going to shut it down basically when i'm not using it so no use trying there but for now we'll try it out so what i'm going to do is i'm going to grab information from the request body and so to get the request body out of the event i'm going to json.parse event body so let's try that out so we'll say request body equals json.parse event dot body thank you typescript and what we'll want to say is we can give the response body a type or the request body a type so we can say type request params and we want to say it's going to have a brew name which is a type string it's going to have sorry get rid of that equal i'm forgetting this now type string it's going to have a brewery name of type string it's going to have a style of type string for now and we'll say that request body is of type request params and what that's going to allow us to do is destructure it so we'll say brew name brewery name style equals request body and we can log that to see if everything's working okay so we will do that as we'll just say console.log and we'll make a little template string here we'll say brew name and we'll say brew name comma brewery brewery name and this will again this will go into cloudwatch which is pretty cool and style will be of course just style and i'll show you that once we run this code in a minute so the next thing we want to do is we want to actually store this into dynamo so we need to create we need to basically hook into our dynamo database so to do that we're going to pull in a library just like they have here we're going to pull in this aws client library we're not going to use this style because we can write our nice new code so we're going to say import dynamodb from aws client and so we probably have to install aws client or sorry it's aws sdk and we might have to install that okay so we'll install
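the body-parsing part described above looks roughly like this as a standalone sketch — the field names brew name brewery name and style are the ones from the video, pulled out here into a small helper so the shape is easy to see:

```typescript
// Shape of the posted data (names from the video).
type RequestParams = {
  brewName: string;
  breweryName: string;
  style: string;
};

// Parse the raw lambda event body and destructure the fields we care about.
const parseBody = (rawBody: string): RequestParams => {
  const requestBody: RequestParams = JSON.parse(rawBody);
  const { brewName, breweryName, style } = requestBody;
  // this log ends up in cloudwatch when it runs inside lambda
  console.log(`brewName: ${brewName}, breweryName: ${breweryName}, style: ${style}`);
  return { brewName, breweryName, style };
};
```

note the typescript type only helps at compile time, it doesn't validate at runtime, which is why validation is still on the to-do list.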
that so let's do down here npm i dash dash save aws sdk so this will be pretty cool all right so let's see how this is going to go okay so now it's saying there we go it's there but its value's now red we're going to read it in a second and the way we're going to do that is we're going to follow along here let me just make this we can subtract that for now we're going to follow what they've done here we're going to create an instance of the database so we're going to say here we'll do const dynamo db equals new aws dot what we've just actually imported as sort of new dynamodb dot document client so now we have that dynamodb and then what they do here is down here where they're going to use that they're going to create these parameters so let's see they're calling it ddb so here they have this ddb put and they're putting these parameters and you can see in the parameters they have a table name we're going to use our environment variable to get that table name and we'll put the items and we'll pull that right out of our request body and we can do some normally we'd do some validation but for now we'll just kind of bypass that don't try to don't try to put some weird stuff in there we'll catch you no we won't right now we're just going to put it in there but we'll see let's go to record ride let's see what they're calling that so here they're using then we'll use async await for this but so basically we're gonna do the same thing so let's do a check let's do a check to make sure they're all strings first uh so what does it say about brew name it's saying that's not used ah brewery name good all right so let's just try storing it for now so what we're going to do is we're going to put this in a try catch we're going to use async await so we're going to do try um and we'll need to have a catch error and on error what we'll do is we'll console.
log the error oh console.error actually and what we're going to want to do is we're going to want to have a way to send an error back to the client so i'm going to just make a helper function here we're going to call this send error response or we'll call it get error response and this will be a function that takes in an error message of type string and simply returns a response that looks something like this and a lot of this boilerplate code you can reduce obviously if you're using a lot of these lambdas we're going to have utility functions and you can build that out you can easily make that happen actually at my current job they're talking about even deploying like a web framework onto lambda which i have not done but we'll see how that plays out but for now we'll just make a helper function here we're going to write all our code inside this handler to make it easy but then yeah we could always break this up we could have multiple files and we'll see that but so we're going to have this handler for the message we're just going to pass it error message we're going to ignore that we're going to ignore that and we don't need to stringify it oh well we'll stringify that but so we'll leave the body and status code will be 500 for now so that's just so we can easily get an error message and so here we'll do that in our error just so we can send something clear we'll say return get error response error so now let's do our happy path so what we want to do is we want to say data equals dynamodb so first we'll create our params that we want to use and so our parameters are going to be similar to what they have there we have our table name and our table name we're going to grab from our environment variables so the way we do that is process.env dot dynamo table so that's going to pull from environment dynamo table so we want to think about it here and then our item is going to be brew name brewery name style we're just going to use those keys i like that it's fine and
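the error helper described above is tiny, something like this — a sketch with the name we landed on in the video:

```typescript
// Wrap an error message in a 500 response the api gateway can return.
const getErrorResponse = (errorMessage: string) => ({
  statusCode: 500,
  body: JSON.stringify({ message: errorMessage }),
});
```

as mentioned, if you have a lot of lambdas this kind of boilerplate is a good candidate for a shared utilities module.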
So what we'll do here is pass the params to dynamo.put, and we're going to convert that to a promise so we can use await on it — we could have used .then, but we're going to use await — and if we have our data, we'll go ahead and return it. You can see in the reference they're doing something similar with a .then; we're going to do basically the same thing, but actually we don't even need to return the data — what we'll return is the representation of our item. So we'll say return statusCode — and I like the autocomplete we're getting there — it's a 200, and the body is going to be JSON.stringify; in this case we can just stringify params.Item, because we're just saying hey, we successfully stored that. We don't need to modify the headers right now, I think — I think we're good just doing this. Let me see what I did here... headers, console... so there they're handling the message, that's good, and then here they're doing... let's see... alright, we have message, status code, so this should be fine — let's stick with this and see if it works. We don't need data, so get rid of that, we're just not using it; we could look at data and return something from there, but for now we'll just handle this. Okay, so let's go over what we have — and we can ditch this now for sure, sorry about that. Alright: we have saveBrew. It takes an event, it pulls the fields out of the request body — we're going to add some validation in a second, but just to test that it actually works — we have our console.log logging it out, and we're going to try grabbing the item and putting it in our dynamo table. It knows the table name, our IAM role is associated with our account so it can hit this table, and we'll try to put those parameters in; if it's successful we return the 200, otherwise the error.
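Pulling those pieces together, the save handler's happy path looks roughly like this. It's a sketch, not the stream's exact code: the real version uses AWS.DynamoDB.DocumentClient, but here a minimal interface stands in and the client is passed as a parameter so the example stays self-contained:

```typescript
// Minimal stand-in for the DocumentClient's put(...).promise() shape.
interface PutClient {
  put(params: { TableName: string; Item: Record<string, unknown> }): {
    promise(): Promise<unknown>;
  };
}

const saveBrew = async (
  dynamo: PutClient,
  tableName: string, // from process.env.DYNAMO_TABLE on stream
  item: { id: string; brewName: string; breweryName: string; style: string }
) => {
  try {
    const params = { TableName: tableName, Item: item };
    await dynamo.put(params).promise(); // resolves once the item is stored
    // echo the stored item back, as described above
    return { statusCode: 200, body: JSON.stringify(params.Item) };
  } catch (error) {
    console.error(error);
    return { statusCode: 500, body: JSON.stringify({ error: String(error) }) };
  }
};
```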
That's why it's in a try/catch. So let's go ahead and sls deploy to put that code up into lambda land, as we're calling it — as I'm calling it now — and then we'll try running it. There are a few things we can do here. If postman worked — which I don't think it did — let's try opening it again; maybe it just doesn't want to work right now. In the meantime, since that's being annoying, I'm going to install another thing that's really nice: httpie. It's similar to curl but has a nicer interface — I've used it before, and I'll show you what it looks like. It's got a nice little API: you can do http post, the url, and then just pass parameters like this and it'll put them in the post body, so it's pretty cool. We'll use this for now until postman figures itself out. I'm not sure what's happening with postman — it seems to have failed; let's try reinstalling it, it might just be that my OS version is old, we'll see. So let that install, and let's see how our deploy is going. Our deploy is complete — here's our new url. So I'm going to wait till this is done... or let me check... let's move that to Applications... we're going to replace it... postman agent's in use... alright, let me try killing the postman agent... don't see it... I see postgres... not sure what's happening there, postman kind of just died on me. Okay, there we go, now it looks like it's opening, so let's wait and see... there we go, thank you postman. Alright, let's see how this behaves, and while that's happening let's go over here: we're going to look at our table and get ready to see if it actually writes to it. Before we do that I want to look at our logs, so I'm going to go to services, CloudWatch, pull that up over here, and let's pull dynamo
over here. So we're going to look at our lambda — our brew book api — and its monitoring; we'll see that in a second. And let's see, did postman open... no, postman failed once again — what is going on with you, postman? And homebrew is also not cooperating. Alright, let's just do it with curl. I'll have to google how — I know it's -X POST but I forget how to send the parameters, so let's see... ah, -d, so it looks pretty straightforward, let's try it, that's not too terrible. So we're going to send a post to our url here: we'll do curl -d, and it looks like when you specify multiple -d options curl will concatenate them and insert ampersands in between — I like that. So let's do brewName equals the crisp, then breweryName equals sixpoint — we don't need quotes — then style equals pilsner, and then we'll paste our url. Let's give that a shot... internal server error. Let me try something really quick — it might have to do with this, just want to see if that's what it's doing. Okay, internal server error, so let's look at our logs. Here we go — we refresh to get our new invocations; I can also click the refresh button down at the bottom. Alright, I think this is the old request... let's refresh... I think that's the old one... let's see if it even got our request. Oh — let's see if httpie is ready. Okay, great, so we can use this now. We'll do the same thing and make it a post — let's see if it's just me messing up curl, totally a possibility. Let me clear this: we're going to do http post, and the httpie documentation says we basically pass -f, that's for submitting forms. So we'll do -f post, put the url in, and submit like that. So let's try that post — we'll paste that and send it through
brewName equals the crisp, and then breweryName equals sixpoint, and style equals pilsner — oh... ah, okay, I've got to change that, so we'll just do it like this, see if this works... okay, sorry — breweryName... brewName equals the crisp, there we go. Okay, internal server error again, so let's take a look: error from cloudfront, bad gateway. Alright, let's see what's going on. This successfully deployed, and our function is hello — ah, so it might be that we have to change... let's make sure we properly mapped our function name. In our serverless config we have our hello function — okay, so it's named hello, but it's referring to our saveBrew handler, so that's good. Let's take a look at what happened in cloudformation. I don't see our invocation... so it says application error from cloudfront — internal server error, 502 bad gateway. It might just be an issue with our endpoint, so let's make sure we have everything set up right. The method is post — let me see — yes, the method is post, that makes sense. With our provider we have our name, our runtime, stage — oh, we could specify our stage as dev, I've done that before, let's do that, that might help. We can also specify a region — let's do that; it would be, I believe, us-east-1, I'm looking at my reference here. Then we have our environment with the dynamo table name, and PutItem, GetItem — did we use the wrong action? dynamodb:PutItem, that's correct — and the resource. We have the right handler, I think that should be good. Let me see: hello — there's that — hello should call saveBrew, that's correct; and this should be right — TableName, our dynamo table, the brews table — that should all be correct. And let me see — we might have to set cors: true here, maybe; that might be the issue, to allow cross-origin requests. Other than that, I think that's everything, so let's try deploying it again.
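For reference, the provider block being double-checked here looks roughly like this — the env-var name and the table reference are assumptions based on what's described, and on stream the IAM resource is scoped to the actual table's ARN rather than a wildcard:

```yaml
# Rough shape of the provider section being debugged above (names assumed):
provider:
  name: aws
  runtime: nodejs12.x
  stage: dev
  region: us-east-1
  environment:
    DYNAMO_TABLE: brew-book-api-dev
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:PutItem
        - dynamodb:GetItem
      Resource: "*" # on stream this is the table's ARN, not a wildcard
```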
I'm not sure why this wouldn't be working, but we'll find out — we'll figure it out. If not, I can check my reference: like I mentioned earlier, I tried doing something similar beforehand just to make sure I could get this stuff set up, and that version worked. Let's look at some other examples and make sure we're on the right track. The interesting thing too is — you can see the errors now; we lost our "no errors in production". I'm surprised we can't see the logs from this... okay, there we go, there's our invocation, great. So now we can at least look. Let's click this and see those invocations — see if the logs tell us anything. Okay, very cool. Syntax error — ah, awesome. Let's take a look at this: unexpected token b in JSON at position 0, webpack handler.ts line 23. So maybe we messed something up — handler line 23... ah, probably here: event.body — is that correct? Did we use the wrong thing? Let me take a look. We're parsing event.body as the request body — that should be good, JSON.parse — so maybe our body is messed up. Let me try this: did we mess up by adding the -f at the beginning? Is it not form data we want — do we just want to post it? So we have brewName — let's just send "thecrisp" as one word so we're not fighting quoting — we've got style pilsner — let's try this again. Ah — alright, alright, there we go, we got a different error this time. Now it's a ValidationException: one or more parameter values were invalid — missing the key id in the item. We didn't provide an id for the item — that's fair, so we need to provide one. We can create a unique id, or we could make an id from our brew name plus brewery combo like we did on the front end — that could be cool — but let's just create a unique id for now. The way I usually deal with unique ids is I'll import — I believe it's uuid — that's one thing I can use. So here, let's go down here and let's go to
the terminal — let's see, where's my terminal — and I'm going to install uuid. There's a library we can use, uuid, that'll help us create a uuid, and uuid has different versions; version 1 should be fine for us, it just creates a unique id. So up here in our handler we'll do import star as uuid from 'uuid', and now we'll actually leverage it: in our item we'll pass an id — and that matches our service, remember we said it has an id — and we'll do uuid v1... I believe this is what we need... maybe it's .v1... there we go. So let's try that now. First we'll have to deploy again — remember, we need to deploy each time we make a change. That's the one difference: we don't have an easy local test-it-and-try-it loop. There is a way to do that — you can set things up to test them locally, and it's very easy if you're just using pure lambdas, but because we have dynamo set up we'd need a local dynamo too, and there are ways to do that as well. One I've heard of is something called localstack, "a fully functional local AWS cloud stack" — cloudformation, dynamodb, all the different services you can test locally. I haven't tried it — I know we use it at my current place of work, Electric AI, but I have not tried it myself, so we will see. Okay, there we go, the deploy finished — let's try making that call again. Oh, did I type the wrong thing? I did — I messed up the brewName, breweryName, style again. Let's try that... internal server error, okay. Let's see what we're getting now — again, we know we can look at our logs and see what's going on. I'm going to go back here and refresh... this might be right here... okay: "event is not defined" — ah, so it's just a typo. Let's go find that... right here... ah, messed that up. Alright, let's do a quick scan and see if there are any other errors typescript is showing us... nothing, so we'll do sls deploy.
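The id step, sketched in code — the uuid npm package is what's used on stream; Node's built-in crypto.randomUUID (a v4 id) is shown as the runnable part here just to avoid the extra dependency:

```typescript
// On stream: npm install uuid, then
//
//   import * as uuid from "uuid";
//   const id = uuid.v1();   // time-based v1 id
//
// Node >= 14.17 ships a built-in generator, used here so the sketch
// runs without the dependency:
import { randomUUID } from "node:crypto";

const item = {
  id: randomUUID(), // the key DynamoDB was complaining about
  brewName: "The Crisp",
  breweryName: "Sixpoint",
  style: "Pilsner",
};
```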
A lot of people who haven't used lambda are hesitant about it because it might cost money. That is true — AWS Lambda does start to incur costs — but we're under the free tier; we're doing this as hobbyists, almost like a dev sandbox, so we're not going to hit that cost unless you all start hammering this api or something, which is why I'll shut it down when I'm done. Basically, if you're doing under 20,000 requests or so you're not going to hit that cost, and even once you get past it, it's not going to be a huge bill or anything like that. So if you're doing it as a hobbyist you can get most of this done for free, and the pricing page — the billing page — will tell you how close you are to the point where you start getting charged. For something like this it's going to be pretty tough to get there unless you're making a lot of requests, sending a lot of data, storing a lot — and I'll make sure I don't get to that point. But very cool, and something to consider. I could actually put these routes behind authentication, which would be the smart thing to do so no one can just hammer them. Okay — sls has deployed, so now let's try running that call again. We have our brew name, brewery name, pilsner... alright, there we go — looks like it was successful; it returned the brewery name, an id, and the style. Let's take a look at our logs and see if it logged that information. I'm going to go back to our logs, refresh, click the latest invocation — and you can see here: brewName undefined, brewery sixpoint, style pilsner. So it didn't get the brew name. Let's take a look and see why — it might just be that the console.log didn't log brewName, or for some reason it didn't come through at all. It might be the command I ran — let's see if I messed it up again... brewName, that should be correct... yeah, it didn't return it here either, so something happened there. Let's see... brewName, brewName...
we're passing a brewName. Alright, let's see why brewName didn't go through, and let's redo that post. I'm going to just re-send it — something's getting messed up here, so let's delete that brewName equals... let's just delete this here. Alright: brewName equals the crisp, breweryName equals sixpoint, style equals pilsner... missing authentication token — that's because I messed up the api url there. Ah, something's getting messed up — hold on, let me just type this over so we don't get it wrong. I'm just going to type it here: http post, the url, brewName equals the crisp, breweryName equals sixpoint, style equals pilsner — bam. Alright, we're good there, let's check our logs. We go back to our logs in cloudwatch, refresh, click the latest — and you can see here, brewName undefined. For some reason it's still not getting that; we'll have to look into it. Actually — this might be the old log. Let's go back, refresh again, refresh again...
So this might be a new one — sometimes it takes a little while to appear — but let's look at dynamo and see if it stored the right thing; then we'll really know. Over here we go to dynamodb, click our database — go to tables, we have our brew-book-api-dev — and go to items... and there it is: the crisp. There's the bad one too — let's delete that: actions, delete — and there's our item. Hell yeah, we've got one item stored in the database. Great, this is cool. We still have a few minutes, so let's try to pull items out, and then we can start using this. The other thing we're going to want is to store an image — that's going to get interesting, because that's where we start introducing s3. We'll get into that in a minute, but first let's try pulling our brews. For that we're going to need a new handler. Basically we'll create another entry here: instead of calling this one hello, let's call it saveBrew — or we'll just call it save for now — and we'll create a new one. We'll call this one get... or we'll call it brews, and keep that one save. For brews, the handler will be getBrews, it's going to be a method of get, and the path is going to be brews/get... or we can just do slash brews — that's probably the easiest — and cors is true. So now let's open up our handler file and create a handler for getting the brews. We're going to create a new function — we have to name it getBrews to match what our config expects — so we'll do export const getBrews, and it's going to be the same type, the same function signature, so we can just copy that type signature and put it there. So now what we're going to want to do is get all the brews from our endpoint.
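After those renames, the functions section of serverless.yml ends up looking roughly like this — the handler file path and exact names are assumptions based on what's described:

```yaml
# Sketch of the functions section after the rename (names assumed):
functions:
  save:
    handler: handler.saveBrew
    events:
      - http:
          method: post
          path: brews
          cors: true
  brews:
    handler: handler.getBrews
    events:
      - http:
          method: get
          path: brews
          cors: true
```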
So this is saying we still need to return something — we will in a second, but for now, just to appease the typescript gods, we'll return a status code of 200 and a message of "getting all brews". Okay, so how do we actually get the brews? We can use a scan on our dynamo table. Now, normally we wouldn't want to do a scan — scans are very inefficient; they get everything from the table, so if your table is long you're going to want pagination. But it's okay for us for now: we don't have a lot of brews, and we do want all of them. At some point we'd want a query instead — a query is something like a sql query, where you can say get me all the brews by this partition key, or give me the brew with this id. A partition key is something dynamo uses to group like things together, so you could imagine us partitioning on brewery — you could say get all the brews for this brewery. And again, if we did want to get all brews and the database got quite long, we'd want to paginate; dynamo has ways of handling pagination, and it's pretty nice, so we'll get into that later. For now we'll just do a plain scan and get all the data. So let's do that. We'll do a try here — we want to wrap it in a try/catch because we're hitting dynamo — and in the catch we'll take the error and do the same thing as before: again, we might want to move some of this to a utility, but we'll console.error the error and then return getErrorResponse with the error. Then in the try we're going to do const data equals... we're going to call dynamoDb, and I believe we need to give scan the table name, so we'll say const params equals — and
it's similar to what we have below — down here, params — and we might even be able to reuse the type of that. Here we're going to do process.env again to get DYNAMO_TABLE from our environment variables. And we might be able to — let's see if this is actually a thing — can we type this as some dynamodb params type or something? We could figure that out, but for now let's not worry about it. So this is saying: just go ahead and scan this table. We pass params here, and it returns a promise, which is why we have the await — oh wait, sorry, we don't want async there, we want await; this returns a promise, and the function itself needs to be asynchronous. Oh, sorry — I also realize we wrote this inside getErrorResponse and not getBrews, so let me move it out. Alright, there we go. So now we have our params, we have our try, we've got data; if it fails we return a failure, and if it's successful we return a status code of 200. We'll just copy this and modify it: status code 200, body — and similar to what we have below, where we do body JSON.stringify params.Item, here our body is going to be JSON.stringify(data). Alright, let's delete this and see if it works. I'm going to run sls deploy again to push this code up to aws. Again — I'm not managing this server, I'm not deploying to heroku, I don't have to worry about that; it's just on amazon. The beauty of that is, if this were a real application and we needed to scale — say we got a lot of load and went from no users to 10 users to 20 users to 20,000 users — it would know how to scale. We'd pay for that as it grows, obviously, but it scales efficiently and just handles load without me having to think about it.
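Putting the scan handler together, it looks roughly like this — again a self-contained sketch, with a minimal interface standing in for aws-sdk's DocumentClient:

```typescript
// Minimal stand-in for the DocumentClient's scan(...).promise() shape.
interface ScanClient {
  scan(params: { TableName: string }): {
    promise(): Promise<{ Items?: unknown[]; Count?: number }>;
  };
}

const getBrews = async (dynamo: ScanClient, tableName: string) => {
  try {
    // TableName comes from process.env.DYNAMO_TABLE on stream
    const params = { TableName: tableName };
    const data = await dynamo.scan(params).promise(); // full-table Scan
    return { statusCode: 200, body: JSON.stringify(data) };
  } catch (error) {
    console.error(error);
    return { statusCode: 500, body: JSON.stringify({ error: String(error) }) };
  }
};
```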
that's really one of the real benefits of lambda. The other benefit is I'm just writing functions — I didn't have to write all the routing layers; I mean, I had some routing here, you could say this is me writing the routing, but I didn't have to run express and do all that stuff, set up node, have a container to deploy, deploy it on heroku and configure all of it. It just lets me do this pretty easily, and it's pretty extensible: I can keep adding functions, and I can divide them up into different files so it's not all cluttered in one — that's something we might want to do. But now I have an api I can hit, I can add authentication to it, and I can start making requests for my data, hook this up to the front end, and make some of this data populate. Okay, so it gave us our new route — let's go ahead and hit it. There we go: it returns a Count of how many items came back, a ScannedCount — that's how many it actually evaluated in the entire scan versus how many it's returning — and then our Items, which is an array, and there's our one brew. Pretty cool — we have a database now for storing this stuff. Now, one of the other things we need to store is the image, right? We're going to want a way to store the image. We could hook our front end up to this right now, but I think we'll save that for next session — let's use the last few minutes to get image data in here. We can do this a few ways. For one, we could just upload raw base64 encoded data straight into the table — bad idea, probably not a good idea, right? The other option is to create an s3 bucket. An s3 bucket will host static content, and then when we take in a brew we can accept base64 encoded data — hey, here's your base64 encoded image — take that, pop it into s3, get a nice public url for the image, and return that to the client. So let's set that up now. The way we're going to do that is —
I actually have the code for this, because it was kind of a pain to figure out — I had to look it up a bunch — so I just stored it here. We need to add an s3 bucket. First we'll go to services, S3, create a bucket. I have one from before; I'm going to create a new bucket and call it brew-book. We don't need to copy any settings — keep those the same. We're going to make everything public, because we're going to want certain objects to be public so we can actually see our brews, right? So let's do that — create bucket. Alright, there's our brew book bucket. I could upload objects here by hand, but what I'm going to do now is start using it from the code. The way this works is — you saw I had this provider change here — I'm going to go ahead and add it; this is important to allow our role to actually upload to s3. The way I found this is I'd done some googling — oh sorry, this goes in here — I'd done some googling and saw this; remember I said I didn't know where this part came from? It just came from some googling — it's the cloudformation for setting up the s3 permissions. I'm going to go up here and add a constant for the bucket name: const s3Bucket, and that's just going to be "brew-book". The benefit is we have our s3 bucket name set in one place, and we can reference it down here in the resource — so that's saying, okay, we have access to this resource, we can get objects and we can put objects. We can even split it across lines like we did above so it looks the same... oh, it likes to do that — okay, that's fine, we'll let it stick with that. So now we have our s3 bucket configured; next we need to actually be able to upload to s3. For
that — and again, what we're going to do is take base64 image data from the user, upload it to s3, get the url of the image, and then store that, so that when the client gets it back it can just use the url to render things in place. That's the general idea. First we need an s3 object, so let's do that — we can get S3 from the same aws-sdk import as dynamo, and then initialize an s3 object here. We'll just call it s3: new S3() — it creates a new s3 instance, very simple. Then I'm going to create another function called uploadToS3 — or, to be clearer, uploadImageToS3 — and it's going to take the image data, which will be base64 encoded, plus an id, because we need an id to name the file. We can leverage the brew id or some other identifier — we could use the brew id because we have it, so we'll see. Okay, so, uploading the data: basically we get some encoded data and an id. First, a quick check — if there's no encoded image data, because sometimes we're not going to have an image and that's fine, or there's no id, let's just return, and that will return null. Then, when we go to store it in dynamo, null will be stored there — nothing, effectively — and that's okay by us. Otherwise, if there is image data, we're going to pull it out, and we expect it to be base64 encoded. This will definitely error out if it's not base64 encoded data, so maybe we want to wrap it in a try/catch — let's do that: we'll take the error, console.error it, and just return. Otherwise, here's what we're going
to do: we want to grab that base64 encoded data and create decoded data from it. Basically, what I want to use here is what's called a buffer. We can say decodedImage equals — this would be of type Buffer — and I think what we want is Buffer.from(encodedImage) with "base64". Let me see what this looks like... search "decode base64 image javascript"... alright, we want that portion... okay, let me see, I think I had a good tutorial on this — "upload base64 to s3" — there was a good tutorial I looked at... yes, thank you Devyan, he's the one who posted this; this is exactly what we want. I'm going to copy what they have here, because that base64 data has a header on it, and this replace makes sure we strip it first — that'll be good. So let's do that... I'm going to copy it — thank you for posting that, Devyan, let's give it some claps; we've got to sign in, because if we're going to leverage this we've got to give him the claps, right? Props where they're due — a few claps there. Okay, so that's exactly what we want, and yeah, we want the file type too, so I'm going to copy both of these. So here we have our decodedImage — it's a Buffer, I'm just adding the typescript types — and we don't have base64Image... we have encodedImage, okay. And here we do type — this will be our decodedImage... what is this saying? Hmm, I feel like this is just a weird issue here... I don't think I need new, I think I just want Buffer.from — yeah, there we go. Okay, and the type line gets us the image type — png, jpeg, and so forth. What I want to do now is create a file name for this.
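The decoding step just described, following the tutorial's approach — strip the data-URL header, base64-decode the rest, and read the image type out of the header (the sample payload here is made up):

```typescript
// Decode a base64 data-URL image, per the tutorial referenced above.
const encodedImage = "data:image/png;base64,aGVsbG8="; // sample payload

// Strip the "data:image/...;base64," header, then decode the rest.
const decodedImage: Buffer = Buffer.from(
  encodedImage.replace(/^data:image\/\w+;base64,/, ""),
  "base64"
);

// The header also carries the image type (png, jpeg, ...).
const fileType = encodedImage.split(";")[0].split("/")[1];
```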
So I'm going to say const fileName, and we'll give it a path — I'm putting mine inside brewImages — so let's make this a template string: brewImages, then slash, the id, and then the extension, which will be a dot and then the type. Alright, that's our file name. Now, to actually upload this to s3 we're going to create our s3 upload params. Let's log this too — "uploading", plus the file name — beautiful, beautiful, beautiful. So — getting close on time, let's see if we can get there — we'll say s3 upload params, call it uploadParams, and here we need to give it a Body, a Bucket, a Key... I think he has it here: Body, Bucket, Key, ContentEncoding, and ContentType. Let's do the Bucket first: our bucket is process.env.S3_BUCKET, and we're going to want to add that to our environment variables over here — there we go, there's our s3 bucket; oh, sorry about that. So now we have our Bucket. In this example we need our Key — what are we using for a key here? Our key will be our fileName, yeah. Body is going to be our decodedImage, ContentEncoding is going to be "base64", and ContentType — this is correct — we need to say "image/" and then the type we extracted. Okay, so we have the upload params, and then basically we need to call s3.upload with them. We'll wrap that in a simple try/catch — that's going to be important, because this could fail — and we'll do the same thing here if it does. So in the try we're going to use that s3 object we created and try to upload our data. You can see here — upload, promise — you can basically do the exact same thing, so we're going to await this s3.upload
with the upload params — we can just call them params — and I think we'll have to add .promise(). Okay, let's try this, see if it works. So we upload it — and what does it return? A Location and a Key — okay, that's kind of cool. We could take the Location and Key and return those, or we can just return the file path ourselves — that would probably be nice to have. The way the file path works — I have my notes here — is a template string: https://, then the name of the s3 bucket, which in this case is process.env.S3_BUCKET, and then — this is pretty common, it's just how s3 urls look — .s3.amazonaws.com/ and then the file path, in this case fileName. Alright, and we'll return that. So now we have this function for uploading to s3, and we want to call it down here in our regular save handler. We're a little bit over time, but let's see if we can get this going. Back down here in saveBrew we're going to pull out our encoded brew image data — we'll call it brewImage — and you'll see it's complaining, because that doesn't exist on type RequestParams, so we'll just update our type: it's going to be a string. So let's go down here — now we have our brewImage, and what we're going to say is const brewImagePath equals uploadImageToS3 with the brewImage, and we need to give it an id. The id — again — is going to be used for the file name, so what we'll do here is basically pass a brew id. We should take a brewId from the request, so let's modify our brew params, and eventually, because we have that brew id on the front end, we'll start passing it through. So we'll take a brewId, a string.
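The whole uploadImageToS3 flow, sketched end to end. On stream this uses new AWS.S3().upload(...).promise(); here a minimal interface stands in for the S3 client so the sketch stays self-contained, and the bucket name is passed in rather than read from process.env:

```typescript
// Minimal stand-in for the S3 client's upload(...).promise() shape.
interface S3Like {
  upload(params: {
    Bucket: string;
    Key: string;
    Body: Buffer;
    ContentEncoding: string;
    ContentType: string;
  }): { promise(): Promise<unknown> };
}

const uploadImageToS3 = async (
  s3: S3Like,
  bucket: string, // process.env.S3_BUCKET on stream
  encodedImageData: string | undefined,
  id: string | undefined
): Promise<string | null> => {
  // No image (or no id to name it) is fine: store null on the item.
  if (!encodedImageData || !id) return null;
  try {
    // Strip the data-URL header, decode the rest, pull out the type.
    const decodedImage = Buffer.from(
      encodedImageData.replace(/^data:image\/\w+;base64,/, ""),
      "base64"
    );
    const type = encodedImageData.split(";")[0].split("/")[1];
    const fileName = `brewImages/${id}.${type}`;

    await s3.upload({
      Bucket: bucket,
      Key: fileName,
      Body: decodedImage,
      ContentEncoding: "base64",
      ContentType: `image/${type}`,
    }).promise();

    // Return the public URL for the client to render.
    return `https://${bucket}.s3.amazonaws.com/${fileName}`;
  } catch (error) {
    console.error(error);
    return null;
  }
};
```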
take a brewId that's a string. Save that, go back down here, and pass the brewId along with the brewImage. We await that because, again, it's an async function; looking at uploadS3, we'll make it an async function so we can await it down here. Okay, so we have our async function: it awaits the upload and returns the path, and down here we await it, take our brewImagePath, and set brewImage equal to brewImagePath. We store that as the brew image, and when we return the brew it'll be included. So now we should be storing the brew image. Okay, that's a lot. Let's try uploading it. First let's check for any visible errors; let me clear this and look. I don't see anything, so let's go ahead and deploy it. Then we'll try to get Postman working again, because it's much easier to send base64 data with Postman; the encoded image ends up being a pretty big payload. I'm wondering if the reason Postman isn't working is that I'm on an old version of macOS or something. We shall find out. I'll try loading it up again. Okay, so it moved to the Applications folder. Let's see if it opens... Postman quit unexpectedly. I definitely don't want it launching at startup, and it won't even launch right now. Let's check for updates; maybe there's an update required. All
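Putting those pieces together, the saveBrew flow looks roughly like this. A sketch with the upload function injected so the S3 call (and AWS credentials) can be stubbed out in a test; the field names (brewId, brewImage, and so on) are this project's, not anything standard:

```typescript
interface BrewParams {
  brewId: string;
  brewName: string;
  breweryName: string;
  style: string;
  brewImage: string; // base64 data on the way in, S3 URL on the way out
}

type Uploader = (image: string, id: string) => Promise<string>;

// Mirrors the handler logic described above: upload the image, then swap the
// base64 payload for the resulting S3 path before the brew is stored.
const saveBrew = async (
  params: BrewParams,
  uploadS3: Uploader
): Promise<BrewParams> => {
  const brewImagePath = await uploadS3(params.brewImage, params.brewId);
  return { ...params, brewImage: brewImagePath };
};
```

Injecting the uploader is a testing convenience; in the real handler uploadS3 is just imported directly.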
right, it's not working. Let's quit that again. If this doesn't work I'll just get an alternative; it probably has to do with my operating system version. Let's search for a Postman alternative, I'm sure there are plenty out there. Ah, I've heard of this one: Insomnia, a REST client. Let's go to their site, check the pricing, and download it; the free tier is all we need. While that's downloading, let's see if our deploy succeeded. It looks like it did, so we'll try posting our data with Insomnia. I'd really like to get Postman working, but I'll debug that on my own time rather than waste yours. While Insomnia verifies, I'm going to grab a picture and base64 encode it so we can use it. I had this beer the other day and really, really loved it, so I'm going to use this one from Hudson Valley; shout out to them, they're great. I think I used this one in the Brew Book app. I'll save the image to my desktop, then go to an online base64 image encoder, drag the image in, and it encodes it. We'll do this programmatically in code on the app side when we get there. Copy the encoded image, open up Insomnia, new request, call it saveBrew, make it a POST request. Okay, for the body we'll select JSON
for now, and we'll say our brew name is Monomyth, our brewery name is Hudson Valley Brewing, our style is sour double IPA, and our brew image, which I believe we're calling brewImage (let me double-check that in our types... yes, brewImage), and we'll send the brewId too. The brewImage is going to be our base64 encoded data. It takes a little while to paste; bear with it, and let's see if Insomnia can handle it. Uh oh, we might have crashed it... okay, there we go. You can see it's a lot. There are other ways to do this, but Insomnia is kind of choking on it; Postman didn't, although Postman didn't open this time, so Insomnia has that going for it. It's a big payload. We also need the brewId, which in hindsight I should have typed first. Let's give it an id like we build on the front end, brewery name then brew name: hudson-valley-brewing-monomyth. Okay, great. Now add a comma, paste our URL, and save. All right, moment of truth: send it. It's going... whoa, look at this! You probably can't see this, I apologize, it's probably super small on your end; hold on, let me make it bigger. So the request was sent, and it generated an id, the brew name, the brewery name Hudson Valley Brewing, and this brewImage URL. Let's see if it actually stored everything. First, let's look at S3 and see if the image uploaded. Refresh S3... brew_images... there's our beer. Click it, let's open it. So you're going to get
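When we get to doing this programmatically on the app side, the online encoder's job is one line of Node. A sketch:

```typescript
import { readFileSync, writeFileSync } from "fs";
import { tmpdir } from "os";
import { join } from "path";

// What the online base64 encoder is doing for us: read the image bytes
// and base64 encode them. This string becomes the brewImage request field.
const encodeImage = (path: string): string =>
  readFileSync(path).toString("base64");
```

The client would then drop this string into the JSON body exactly as we pasted it into Insomnia.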
access denied, and that's okay: what we can do is make this folder public. I'll go to the brewbook bucket, select it, and make it public, and now we'll actually be able to access these brews. So when I open that brew in a new tab, there it is. Obviously we won't be storing huge images like this; we'll be storing thumbnails and the like, but this is pretty cool. Let's check Dynamo and see if it actually stored the item. There we go, there's our item, and we can delete the old one that doesn't have the image. Awesome, so we have data being stored in Dynamo.

That's it for tonight, so let's walk back through what we did really quick. We started with the idea of uploading data to an AWS Lambda API and storing that data in DynamoDB, and we did it all using Serverless. We scaffolded the project with the Serverless Framework; it's scaffolded with a serverless.ts file, the TypeScript equivalent of serverless.yml, which is the configuration. We went through that configuration, followed some tutorials, got our AWS user set up with permissions (whacking those permissions into place one by one), got it deployed, and set up two routes: one for saving the brews and one for getting the brews. Again, you can repurpose this however you want; this is just to show how to get AWS Lambda set up with Dynamo. There are some weird things, and there's still a lot for me to figure out with the CloudFormation stack stuff, but the tutorials online help and there are a lot of useful resources, so thanks to anyone who put those out there. We got the ability to get the brews, using a scan to grab all the items from our Dynamo table; we saw how to set environment variables so we could leverage them; and we saw how to look at the logs. All of this runs in Amazon, so we don't have to think about the servers, and it will scale up naturally as requests
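The Dynamo side of that recap boils down to two DocumentClient calls: a put for saveBrew and a scan for getBrews. A sketch of the params we hand them, assuming the table name comes from a BREWS_TABLE environment variable; in the v2 SDK the actual calls are docClient.put(params).promise() and docClient.scan(params).promise():

```typescript
// Params for DocumentClient.put: writes one brew item to the table.
const buildPutParams = (tableName: string, brew: Record<string, unknown>) => ({
  TableName: tableName,
  Item: brew,
});

// Params for DocumentClient.scan: reads every item in the table.
// Fine for a small catalog; a real app would paginate via LastEvaluatedKey,
// since a single scan response is capped at 1 MB of data.
const buildScanParams = (tableName: string) => ({
  TableName: tableName,
});
```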
start coming in. We're not expecting real traffic; this is just for testing. But now we can hook up the app we made: instead of using that data locally, we can pull it from Dynamo, which means we can get it from any device and sync to the server, and that'll be really cool. So in the next series we'll try hooking our app up to this API and see if we can pull that data live. That will give us the chance to send and receive data, write the code to handle the asynchronous fetching of those brews, and leverage Recoil to see how Recoil's asynchronous story works. So again, this was setting up an API with AWS Lambda and DynamoDB, all using Node, TypeScript, and the Serverless Framework. I hope you enjoyed it. I'll be posting this on YouTube, and we'll definitely dive in more in future iterations. Below you can see my Twitter URL; follow me there and you'll get updates when I go live. The GitHub URL is there as well, where I host the repos for this stuff, and the YouTube channel is just under my name, Tom McGurl. I've got my GitHub, Twitter, and Twitch URLs, so if you liked what you saw, feel free to follow me on Twitch, and if you'd rather watch on YouTube you can watch at whatever playback speed you want. We're going to do more one-off sessions like this. I had fun with this one, setting up AWS Lambda with Dynamo and Node, and I'm excited to try some others: we're thinking about a one-off on writing a memoize function, and one on building a portfolio site with Gatsby JS. I think this was a good one to start with because it ties into the Brew Book application we've been building in the series that came before this; all seven episodes are posted on YouTube. We might pick that back up next week and attach it to this API. I think that'd be pretty
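For reference, the two-route setup from the recap looks roughly like this in the functions section of serverless.ts. A sketch with assumed handler paths and route names; the real config also carries the provider block, plugins, and the environment variables discussed earlier:

```typescript
// Sketch of the functions block of serverless.ts: two HTTP-triggered lambdas.
// Handler paths (src/handlers.*) are assumptions for illustration.
const functions = {
  saveBrew: {
    handler: "src/handlers.saveBrew",
    events: [{ http: { method: "post", path: "brews", cors: true } }],
  },
  getBrews: {
    handler: "src/handlers.getBrews",
    events: [{ http: { method: "get", path: "brews", cors: true } }],
  },
};
```

Each entry becomes its own Lambda function, and the http event is what wires it to an API Gateway route on deploy.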
cool. But yeah, I hope you found this useful. Again, I'll be on Tuesdays, 9 p.m. to 11 p.m. Eastern; today we went a little over, but I'm pretty pleased with the result. So yeah, thanks everybody, and have a good night.
Info
Channel: Tom McGurl
Views: 3,706
Rating: 4.909091 out of 5
Keywords: twitch, games, AWS, AWS Lambda, DynamoDB, Serverless, S3, NodeJS, Typescript, Serverless API, Lambda Functions, Serverless Framework, Javascript, live coding
Id: ifdV3NuyoBI
Length: 135min 41sec (8141 seconds)
Published: Tue Aug 25 2020