Upload to S3 From Lambda Tutorial | Step by Step Guide

Captions
What's up guys, welcome back to AWS Simplified. Today's video shows you how to insert a JSON file into S3 from your Lambda function in three easy steps. This video pairs well with my previous one on reading an S3 file from Lambda, so be sure to check that one out as well.

The first step is to create an IAM role with the necessary permissions. The permission we need to grant this role is the S3 PutObject permission. Typically I do this using one of the policy templates when creating my Lambda, but unfortunately there is no pre-existing policy template with this permission, so we need to create one ourselves. If you're confused, don't worry: I'm going to provide you with a JSON policy document in the description of this video so you can use it in your project.

Let's head over to the IAM section and create the role that we need. Type in IAM, go to the Roles section, and click Create role. Our Lambda is going to be using this role, so click Lambda, then click Next in the bottom right for permissions. We need an S3 policy now, and if we type in S3 here we can see that the existing policies don't quite fit our project: AmazonS3FullAccess would work, but it grants way too many permissions for this Lambda function, so we don't want to use it, and the S3 read-only policy won't fit either, since we want to write. So we need to create one ourselves.

Click Create policy up here; it opens a separate window. For the service, type in S3 and select it, then search for PutObject, which is the permission we want, and tick that box. Scrolling down a bit, we need to fill in the Resources section, so expand that. This section asks whether you want to apply the policy to a specific S3 bucket or resource. Typically you do want to do that, especially on a production system, but I'm going to choose All resources here just in the interest of time. Leaving everything else blank, go back to the bottom right and click Review policy. I'm going to name this s3-put-policy, leave the description blank, scroll down, and click Create policy. We can see that it was created successfully, so we can close this window.

If we click the reload button here, we should see the new policy, and there it is: s3-put-policy. Tick that checkbox. Out of curiosity, let's look at what's inside: we have the s3:PutObject action, permitted on any resource, represented by the star. Minimizing that, there's also a second policy I want to add so that we can see CloudWatch logs if we want to debug: type in "lambda basic execution role" and tick that checkbox as well. Expanding it, you can see it has the necessary actions to interact with CloudWatch, which is great.

Everything is done here, so let's click Next to progress. We don't want to specify tags, so click Next again, and then give the role a name: s3-put-object-role. Down in the summary section we can see the policies we're applying, s3-put-policy and AWSLambdaBasicExecutionRole, which is perfect. Click Create role. The role is being created, and that is it for the first step. (A sketch of the policy document we just built is shown below.)
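For reference, here is a minimal sketch of what that JSON policy document looks like, matching what we built in the visual editor (s3:PutObject allowed on all resources). In a production system you'd replace the "*" in Resource with your bucket's ARN:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "*"
    }
  ]
}
```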
Now let's do the second step, which is just verifying the name of the S3 bucket we're going to be uploading our files to. Mine is called aws-simplified-transactions; whatever you're using, just make sure to take note of it.

All right, let's move on to the third step, where we're actually going to create the Lambda. Type in Lambda and click Create function at the top right. For the function name, let's just name it something like transaction-put-object, and we're going to be using Python 3.6 here. The Permissions section is where we need to specify the role that we just created, so expand that and scroll down: we're going to use an existing role, and in the drop-down box we can see s3-put-object-role, which is the one we just created, so select it. Now we can click Create function in the bottom right, and that was created pretty quickly. Perfect.

Now we need to actually code up our Lambda, so let's take the boilerplate, go into Sublime, paste it in, and get rid of the junk we don't need. Before we write any logic, we need to do some imports: we're going to import the boto3 library, which is used to interact with S3 from Python. The next step is to define our S3 client from the boto3 library, boto3.client('s3'), and then we can write our actual logic.

Let's assign a temporary variable to our bucket name; we saw mine was aws-simplified-transactions. Perfect. Now we want to build a Python dictionary that we're going to convert into JSON so we can upload it to S3. Call it transaction_to_upload and assign it an empty object for now, then fill in the keys: we'll set transactionId to '12345'; type to PURCHASE (in my examples this is usually either PURCHASE or REFUND); amount, which is not a string but can be an int, so let's just say 20 dollars or so; and the customerId that this transaction belongs to, CID-11111.

Before we go any further, I want to specify the file name, and I want it to be based on the customer ID we just used, so that when I'm in S3 I can easily figure out which transaction belongs to whom. So let's assign a local variable called file_name and set it to 'CID-11111' (I probably should have assigned the customer ID to a local variable and used it throughout, but I did not) plus '.json'. That's the file name we're going to be using.

The next step is to create a byte stream out of this dictionary object so that we can actually upload it to S3. Let's do that by assigning a local variable, upload_byte_stream: we call bytes, but we need to convert the dictionary object to JSON first, so we use json.dumps on the variable we just created and encode it in UTF-8.

Now that we have a local upload_byte_stream variable containing the bytes of this Python dictionary object, we need to actually call the S3 PutObject API: s3.put_object, with a few parameters. Bucket is the bucket variable we specified earlier that contains our bucket name; Key is our file name, the customer ID plus .json; and Body, which is just another word for the content, is the byte stream we created in the previous step. (A sketch of the finished handler is shown below.)
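Putting those pieces together, here is a minimal sketch of the whole handler as described in the video. The lambda_handler signature is the standard Python Lambda entry point; the bucket name and field values are the ones used above, so swap in your own:

```python
import json

import boto3

# S3 client from the boto3 library, used to call the PutObject API
s3 = boto3.client('s3')

BUCKET_NAME = 'aws-simplified-transactions'  # the bucket verified in step two


def lambda_handler(event, context):
    # The dictionary we'll serialize to JSON and upload
    transaction_to_upload = {
        'transactionId': '12345',
        'type': 'PURCHASE',        # usually either PURCHASE or REFUND
        'amount': 20,              # an int, not a string
        'customerId': 'CID-11111',
    }

    # Base the file name on the customer ID so it's easy to see in S3
    # which transaction belongs to whom
    file_name = 'CID-11111' + '.json'

    # Convert the dictionary to a JSON string, then to a UTF-8 byte stream
    upload_byte_stream = bytes(json.dumps(transaction_to_upload).encode('UTF-8'))

    # Call the S3 PutObject API: Bucket, Key (file name), Body (content)
    s3.put_object(Bucket=BUCKET_NAME, Key=file_name, Body=upload_byte_stream)

    print('Put complete')
```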
One final step: let's print "Put complete", just so that when we're looking at the logs we know the upload was actually successful. I'm going to be providing this code in the description section below, so if you need it, feel free to check it out later.

Now let's take all this code, go back to the console, paste everything in, go to the top right, and click Save. We want to test it, so click Test. Since we just created this function, we need a test event, so let's create a dummy one; you can leave the input as the default, since we're not reading from the input, so it doesn't matter at all. Go to the bottom right and click Create. Now we can click Test, actually run the thing, and see if it worked.

We can see that the result was successful: scrolling down into the body, we see that the request started and "Put complete" was logged, which is great. Let's go over to S3 now to see if the file is actually there, and there's the file name we just specified. Click on it, click Open, and there are the contents we specified in our Python code (a sample of the uploaded object follows after these captions).

If you enjoyed this video, please don't forget to like and subscribe. If you want to learn more about S3, I'm putting a link to one of my S3 playlists on the right-hand side of your screen. Thanks so much, and I'll see you next time!
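For reference, based on the dictionary in the code above, the object you'd see when opening CID-11111.json in S3 should look like this:

```json
{
  "transactionId": "12345",
  "type": "PURCHASE",
  "amount": 20,
  "customerId": "CID-11111"
}
```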
Info
Channel: Be A Better Dev
Views: 91,044
Keywords: python, aws, aws simplified, awssimplified, lambda s3 python, lambda s3 file upload, lambda s3 upload example, s3 lambda upload, s3 lambda file upload, lambda s3 upload, lambda function to upload s3 file, write s3 file lambda, json file s3 lambda, lambda upload json to s3, s3 lambda python, aws s3 file upload, aws lambda s3 python, aws s3 lambda python, aws s3 lambda api gateway, s3 trigger, s3 lambda trigger, s3 amazon, aws s3, serverless, serverless nodejs, aws developer
Id: vXiZO1c5Sk0
Length: 9min 13sec (553 seconds)
Published: Tue Nov 12 2019