Create Amazon S3 Presigned Url using Serverless framework and demo

Captions
Hi there, my name is Girish Jaju. In this video we will discuss S3 presigned URLs and also implement a serverless solution to generate them.

What is the need for S3 presigned URLs? We know that when we create an S3 bucket, the bucket is private, meaning only the owner or creator has access to the bucket and its objects. What if we need to share those objects with other users or applications, which may or may not have IAM accounts or roles? How do we do it? This is where S3 presigned URLs come in handy. These URLs are generated using the S3 APIs and are accessible for a limited duration; we set the expiration time. Users or applications can use these URLs to access or upload objects in the private S3 buckets that you own. A presigned URL only carries the permissions you associate with it when creating it, such as GetObject or PutObject.

In this tutorial we will implement a serverless application that exposes an endpoint which can be called to generate a presigned URL. Any application that can call an S3 API can generate these, not just a serverless application. In this demo, as I mentioned, we will expose an API endpoint via Amazon API Gateway. A user calls the API endpoint to generate the presigned URL. The endpoint is integrated with a Lambda written in Node.js, which uses the S3 APIs to generate the presigned URLs. Once the URLs are generated, they can be used to access or even upload objects in the private S3 bucket that we own. Let's implement this now.

We are ready to create our serverless application. I am in Visual Studio Code and I have a folder called youtube-s3-presigned-url-demo; it could be any name. Let's initialize an npm application so we get our package.json file here. Then we'll use the Serverless Framework to create the application: sls create with the Node.js template, aws-nodejs, which gives us the boilerplate serverless.yml. Let's clean it up and start modifying this file for our needs.

If you scroll to the bottom of the file you will see a resources section; we'll uncomment it. This is where you can put any CloudFormation code that your application needs. Let's give the bucket a name, which must be globally unique; I'll just say my-cloud-tutorials-youtube-s3-presign-demo and hope that name is available. AWS also recommends that the bucket and its policies not be publicly open, so we'll add a public access block configuration: BlockPublicAcls set to true, BlockPublicPolicy set to true, IgnorePublicAcls set to true, and lastly RestrictPublicBuckets set to true. Let's name the resource; we could call it MyS3Bucket, but let's call it DemoBucket.

Now let's go to the provider section. This is where we add the stage; by default the Serverless Framework deploys to dev, so we'll keep it as dev. We'll also set the AWS region; I want to be in us-west-2 (Oregon), which is the closest region for me. The configuration so far is sketched below.
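A minimal sketch of the serverless.yml pieces described so far. The service name and runtime are assumptions not stated in the video, and the bucket name shown is only the one mentioned in the narration; it must be globally unique, so substitute your own.

```yaml
# serverless.yml (partial) -- provider basics plus the private demo bucket
service: youtube-s3-presigned-url-demo    # assumption: any service name works

provider:
  name: aws
  runtime: nodejs14.x                     # assumption: any recent Node.js runtime
  stage: dev
  region: us-west-2

resources:
  Resources:
    DemoBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: my-cloud-tutorials-youtube-s3-presign-demo   # must be globally unique
        PublicAccessBlockConfiguration:   # keep the bucket and its objects private
          BlockPublicAcls: true
          BlockPublicPolicy: true
          IgnorePublicAcls: true
          RestrictPublicBuckets: true
```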
Once we have that, our Lambda function needs to be able to connect to this bucket and call the S3 APIs against it, so let's give it some IAM permissions with iamRoleStatements. The effect should be Allow. What are we allowing? The actions s3:GetObject and s3:PutObject. And are we allowing them on the entire account? No, let's restrict them to the particular resources we want to allow. The resource takes the format of the bucket ARN followed by a slash and a star. How do we build that? We can use the intrinsic function Fn::Join, with the pieces separated by a slash. And what is the resource ARN? We can use another intrinsic function, Fn::GetAtt, which takes the resource name; our resource name is DemoBucket, so Fn::GetAtt DemoBucket.Arn, plus the star so that it applies to all the objects in that bucket. What this does is resolve to the bucket ARN followed by /*, meaning all the objects within the bucket, which is exactly what we need.

Let's go to the functions section, where we define our function; let's name it generatePresignedUrl. It will need the bucket name to work on, so we pass that in as an environment variable, BUCKET_NAME, again using an intrinsic function, a Ref to DemoBucket. The function should be invoked from an API, so under events we say it should be invoked by an http event, with the path presign and the method post, so that it can take a body and generate a presigned URL for us. We don't need the rest of the boilerplate, so let's clean it up a little bit. The IAM statements and function definition are sketched below.
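A minimal sketch of the IAM permissions and function definition described above, continuing the same serverless.yml. The Fn::Join/Fn::GetAtt construction and the http event follow the video; the handler export name is an assumption.

```yaml
# serverless.yml (continued) -- least-privilege access to the bucket and the API-triggered function
provider:
  # ...name, runtime, stage, region as above...
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:GetObject
        - s3:PutObject
      Resource:
        # Resolves to "<bucket ARN>/*", i.e. all objects in DemoBucket
        Fn::Join:
          - "/"
          - - Fn::GetAtt: [DemoBucket, Arn]
            - "*"

functions:
  generatePresignedUrl:
    handler: handler.generatePresignedUrl   # assumption: export name in handler.js
    environment:
      BUCKET_NAME:
        Ref: DemoBucket                      # Ref on an S3 bucket returns its name
    events:
      - http:
          path: presign
          method: post
```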
Now let's go to the handler and write the function there. In handler.js I don't need the hello boilerplate, so I'll remove it. Let's start by getting the libraries we need: we need the S3 client, which you can get by requiring aws-sdk/clients/s3. Our bucket name will come from the environment variable, so let's call it bucketName and read it from process.env.BUCKET_NAME, and then initialize the S3 client with new S3().

In our function, let's start with a try and a catch. If there is any error, we console.log it, and in the error case we return a status code of 500 and a body with a message, say "please check the logs". In the try block, we first parse the body that is coming in with JSON.parse(event.body). This body has certain elements that we send in: one of them is the object key, taken from body.objectKey; we also need the S3 action from body.s3Action; and we can have an optional content type from body.contentType, which is needed when we create a presigned URL for put requests. The generated URL takes an expiration time, so by default let's set the expiration to 60 seconds. Then we create the parameters object that will be passed to the S3 API: it takes Bucket as the bucket name, Key as the object key, and Expires as the expiration time. We might want to give a longer expiration time if it's a putObject request, because we are going to be crafting some curl commands, which takes a while, and we don't want the generated URL to expire before we can even act on it. So if the S3 action equals putObject, we set the ContentType parameter to the optional content type passed in the request, and we set Expires to five minutes, 300 seconds.

So far so good. Now we make the S3 API call to generate the URL: signedUrl equals s3.getSignedUrl, which takes two parameters, the S3 action and the params object we created with the bucket name, key, and expiration. Once everything is good, we return from here with a status code of 200 and, in the body, the stringified value of the generated URL using JSON.stringify. Let's try to deploy this, and if there is any error we'll troubleshoot. The complete handler is sketched below.
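Putting the pieces described above together, the handler might look roughly like this minimal sketch. It assumes the AWS SDK v2 that is bundled in the Node.js Lambda runtime, and an export name of generatePresignedUrl to match the handler referenced in the serverless.yml sketch.

```javascript
// handler.js -- generates S3 presigned URLs for getObject / putObject
const S3 = require('aws-sdk/clients/s3');

const bucketName = process.env.BUCKET_NAME; // injected via the serverless.yml environment block
const s3 = new S3();

module.exports.generatePresignedUrl = async (event) => {
  try {
    // The caller posts JSON like:
    // { "objectKey": "sample.html", "s3Action": "getObject", "contentType": "text/html" }
    const body = JSON.parse(event.body);
    const objectKey = body.objectKey;
    const s3Action = body.s3Action;        // "getObject" or "putObject"
    const contentType = body.contentType;  // only needed for putObject

    // Default expiration of 60 seconds for the generated URL
    const params = {
      Bucket: bucketName,
      Key: objectKey,
      Expires: 60,
    };

    // For uploads, sign the content type and allow a longer expiry (5 minutes)
    // so the URL does not expire before the client runs its PUT request.
    if (s3Action === 'putObject') {
      params.ContentType = contentType;    // the S3 parameter name is ContentType, capital C
      params.Expires = 300;
    }

    const signedUrl = s3.getSignedUrl(s3Action, params);

    return {
      statusCode: 200,
      body: JSON.stringify({ url: signedUrl }),
    };
  } catch (error) {
    console.log('Error', error);
    return {
      statusCode: 500,
      body: JSON.stringify({ message: 'please check the logs' }),
    };
  }
};
```

Calling POST /presign with a body like the one in the comment returns the presigned URL, which the client can then use directly against the private bucket.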
To deploy is simple: you just run sls deploy. Let's head over to my management console, where I have some CloudFormation stacks; a new stack creation has just started, youtube-s3-presigned-url-demo, and we'll wait for it to finish. You can see the new bucket is created; let's filter on "youtube" and wait for everything to complete. If you look at the resources section of the stack, it has already created the API Gateway, the DemoBucket, and a log group (we did not change the original function name there, so it still says hello in the log group name), along with the Serverless deployment bucket and so on.

Let's go back to Visual Studio Code. While the deployment is still in progress, we can create a couple of sample files: a simple HTML file called sample.html, which we will use to test the getObject presigned URL, and one more called sample1.html, which we will use to test the putObject presigned URL.

Our endpoint is ready; as you can see, the CloudFormation stack creation was successful. Let's go to our bucket and refresh. We have the demo bucket, and I also want to show you the permissions we set: all public access and ACLs are blocked, so our bucket and its objects cannot accidentally be made public. For this demo we'll upload the sample.html file we just created. The upload was successful, and if I take the object URL and try to open it in a new tab, I get access denied, because all the objects in the bucket are private.

Now let's use the API we created to generate a presigned URL and try to access this file with it. Back in Visual Studio Code, this is the endpoint we have, and I have a couple of Postman requests ready, so I'll replace the URL in the request with the endpoint we just got. In the request body I am sending the object key for our file and the S3 action. Let's generate the presigned URL. The URL is generated, and you can see it contains the access key ID, the expiration time, and the signature. Remember that for get requests we set the expiration time to 60 seconds. If we open the presigned URL, we are able to see the sample file that we uploaded. If we wait a few more seconds and try to refresh, the link will expire and we should see the error again. Let's wait a couple of seconds... still good... we gave it 60 seconds, so there are a few seconds left... and yes, once the expiration time has passed you can no longer access the file from that URL.

That was the getObject presigned URL; now let's try to upload a file using a presigned URL. There can be situations where clients need to upload files into your S3 bucket, and of course you don't want to give them permissions on the bucket itself, but you can give them a presigned URL so they have temporary credentials to upload the files. Let's see how to do this. Going back to Postman, let's create an S3 presigned URL for a putObject call: in this case the object key is sample1.html, the S3 action is putObject, and the content type is text/html. If we look at the S3 bucket one more time and refresh the page, we still have only the one file. Now let's generate a URL for this; oh, sorry, I first have to copy in the API endpoint we created, and then generate it. We got an error, so let's go to the function, then Monitor, and then CloudWatch: "unexpected content type". I think I have a typo; the parameter should be ContentType with a capital C. Let's fix that and redeploy.

The stack update is finished, so let's try the request again to generate the presigned URL for putObject, and yes, we got it. Now we'll write a simple curl command to upload the file (see the sketch below): with -H we set the Content-Type header to text/html, the request is of type PUT, we upload the file sample1.html, and at the end we give the entire presigned URL. It returns with nothing, which means success. Let's check our S3 bucket: if I refresh, I see that sample1.html got uploaded. If I try to access it using the object URL first, I obviously get an error, because the objects are not public. So let's generate a getObject presigned URL for the file we just uploaded, and if I access the file using that URL, I get it; this is the sample file we created.

That concludes the demo, which showed how we deployed a serverless application to generate S3 presigned URLs for both getObject and putObject, and how we can use these URLs to get or upload objects in our private S3 buckets. The URLs expire based on the expiration time that we provide.

Now let's do some cleanup. Before we delete the CloudFormation stack, we first go to the S3 bucket, because CloudFormation will not allow us to delete buckets that still have content in them. So we delete the objects, typing "delete" to confirm, until the bucket is empty. There is another S3 bucket where the Serverless Framework stores its deployment data, so let's empty that bucket as well. Once the buckets are empty, we can go to the CloudFormation stack and delete it. This way we don't get accidentally charged for storage, CloudWatch logs, or anything of that sort. The delete is initiated, it should complete in a few seconds, and if we check the events, yes, the delete is completed.

I hope you liked this demo, and I look forward to seeing you in subsequent videos; I'll be creating some more demo videos. Please subscribe and like this video, and thank you so much for watching. See you next time.
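For reference, the upload command described in the demo might look roughly like this; the exact flags are not spelled out in the captions, and the URL below is a placeholder for the presigned putObject URL returned by the API. The Content-Type header has to match the content type that was signed into the URL.

```bash
# Upload sample1.html using the presigned putObject URL (placeholder URL shown)
curl -H "Content-Type: text/html" \
     -X PUT \
     --upload-file sample1.html \
     "https://my-cloud-tutorials-youtube-s3-presign-demo.s3.us-west-2.amazonaws.com/sample1.html?X-Amz-Signature=..."
```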
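The video performs the cleanup in the console; if you prefer the command line, an equivalent (with a placeholder bucket name) might look roughly like this. Emptying the custom bucket first matters because CloudFormation cannot delete a non-empty bucket.

```bash
# Empty the demo bucket, then remove the whole stack
aws s3 rm s3://my-cloud-tutorials-youtube-s3-presign-demo --recursive
sls remove   # deletes the CloudFormation stack, including the deployment bucket
```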
Info
Channel: My Cloud Tutorials
Views: 988
Keywords: aws, s3, serverless, nodejs
Id: JDwpJI_Tiqs
Length: 25min 19sec (1519 seconds)
Published: Mon Apr 05 2021