AWS S3 + Lambda + DynamoDB | Upload S3 Data into DynamoDB with a Lambda Function Triggered by Data Upload

Captions
In this video we will create a Lambda function and trigger it with an S3 event. There are many events that can trigger a Lambda function, and S3 is one of them; in this YouTube playlist I will discuss all the event types, but this particular video is about S3. Whenever a file is uploaded to the S3 bucket it will trigger a Lambda function, and to extend the discussion we will load that S3 data into DynamoDB.

Let's jump straight into S3. First we create the bucket where all the files will be uploaded. I will create all my buckets in eu-west-1, which is the Ireland region, and name this bucket s3-trigger-for-lambda; the name is available, so the bucket is created.

Now I will create the Lambda function. A Lambda function can be written in .NET, Go, Java, Node.js, Python, or Ruby. We will write all our Lambda functions in Python, because that is my expertise and it is hugely used in the market right now, so it is a very useful language to work with. I go straight to Create function and name it s3-trigger-for-lambda (you should use a meaningful name for your own projects), selecting Python 3.8 as the runtime. For permissions I will have a new role created along with the function, although you can also use an existing role. I will discuss the IAM role that needs to be attached to every Lambda function we create; this role has to have access to S3 and DynamoDB, because we are going to load data into DynamoDB tables, and we will come back to granting that access shortly. For now, select the first option, which creates a new role with basic Lambda permissions, and click Create function.

Whenever we create a function like this, a template handler is added in the code body. The handler expects an event and a context: every Lambda function is triggered by an event, whether that is an S3 event, a Glue job, or any of the other available event types. So I will add a trigger and choose S3. You can see there is a whole list of triggers available; we will discuss the useful ones and when to use them in real-time scenarios throughout this playlist, but today's topic is S3. When I select S3 it asks for a bucket; because I have only one bucket in my account it is selected by default, otherwise this dropdown would list every available bucket. For the event type I will select all object create events (PUT or POST); I don't want a copy event to trigger this Lambda, nor a delete, although those kinds of triggers can be selected, for example to send a notification from Lambda code whenever a file is deleted. We will talk about those later in more real-time scenarios. To keep this video quick, I will not add any prefix filter for the uploaded files, but I will add a suffix of .csv, so this Lambda function is triggered only when a .csv file is uploaded to this bucket.
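For reference, the same setup done here in the console (a bucket in eu-west-1 plus an object-created notification filtered to .csv keys) could be scripted with boto3. This is a minimal sketch, not what the video runs: the account ID in the function ARN is a placeholder, and when scripting it yourself S3 also needs permission to invoke the function (lambda add_permission), something the Lambda console adds for you when you create the trigger there.

```python
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")

# Create the input bucket in eu-west-1 (Ireland); regions other than
# us-east-1 require an explicit LocationConstraint
s3.create_bucket(
    Bucket="s3-trigger-for-lambda",
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)

# Notify the Lambda function on object-created events, .csv keys only
s3.put_bucket_notification_configuration(
    Bucket="s3-trigger-for-lambda",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": "arn:aws:lambda:eu-west-1:123456789012:function:s3-trigger-for-lambda",
            "Events": ["s3:ObjectCreated:Put", "s3:ObjectCreated:Post"],
            "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": ".csv"}]}},
        }]
    },
)
```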
The console also shows a recursive invocation warning: using the same S3 bucket for both input and output is not recommended, and I have to acknowledge it. I don't want this bucket to be both the input and the output bucket, because that would be a cyclic call: the function reads the upload from S3, does some processing, and if it wrote its output back to the same bucket, the Lambda function would be triggered all over again. So I tick the acknowledgment and click Add, and the trigger is added.

Now, because the handler receives an event, I will print the event so we can see in the log what comes out. Every time I make a change I need to deploy. I open the bucket in a new tab and quickly upload a CSV file; that should have triggered the Lambda function. Under Monitor and then CloudWatch logs, which is where all Lambda logs are stored by default, a log has been created right away, but it flags an error: I had misnamed the event parameter. I rename it and deploy; the function will not run again until I upload the file, so I upload the same file once more. Re-uploading overwrites the object and fires the trigger again, so for our testing we will keep re-uploading after every code change. The second log still doesn't pick up the change, which happens sometimes, so I add a new line, deploy, and upload again; the third log finally shows the record we want.

In the printed event, the eventSource is aws:s3 and the region is eu-west-1, followed by a whole lot of bucket and file related data: the bucket name, and the key, which is the file name. Those are the two things we need. Note that Records is an array of dictionaries containing just one record here, so to access the bucket name I first wrote event["Records"][0] followed directly by the bucket name key, and the file name from the object's key; that raised a KeyError, because I had missed a level. Under each record there is an "s3" key, and the bucket and object sit inside it, so the correct paths are event["Records"][0]["s3"]["bucket"]["name"] for the bucket and event["Records"][0]["s3"]["object"]["key"] for the file name. After fixing that, deploying, and uploading once more, the log prints the bucket name and the file name. Remember this part of the code: this is how you get the bucket name and the file name out of an S3 event.
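Putting that together, the handler at this stage looks something like the following minimal sketch; the json.dumps call for readable log output is my addition, the video may simply print the raw dict.

```python
import json

def lambda_handler(event, context):
    # Dump the full S3 notification so it is visible in the CloudWatch log
    print(json.dumps(event))

    # Records is a list; a single upload produces exactly one record,
    # and the bucket/object details sit under the "s3" key
    bucket_name = event["Records"][0]["s3"]["bucket"]["name"]
    file_name = event["Records"][0]["s3"]["object"]["key"]
    print(bucket_name, file_name)
```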
Now, to do something with this data, we will upload it into DynamoDB. I go to Services, then DynamoDB, and create a table; I will call it customers and keep it simple. A table needs a partition key and optionally a sort key, and I will come back to why we need these when we discuss DynamoDB in more detail later on. For now I create the partition key, day, as a string, leave everything else as the defaults, and create the table.

Next, the code to upload the data into DynamoDB; I paste it in, and let me explain it. We create a boto3 client for S3 and a boto3 resource for DynamoDB; using the resource we get a handle on the customers table, and with that table we upload each record using put_item, looping through the file contents split on the newline character. One more thing: the data has a header row, and to ignore it we call customers.pop(0) to remove the first record; otherwise the header would be treated as a data record and the code would try to insert it.

We deploy, but there is one thing to check first: the IAM role that was created for the Lambda function (the s3-trigger role). At the moment it only has the basic Lambda execution access, but we need to grant DynamoDB access as well, otherwise the write will fail. Let's prove that by trying without it: I upload the file again, check the log, and yes, it shows access denied. Because the code uses both the S3 client and the DynamoDB resource through boto3, the role needs access to both services. In IAM I choose Add permissions and search for S3; there is a full access policy, but I will attach read-only access for now and see whether that is enough to read the file and push it into DynamoDB. With only S3 read access granted, and still no DynamoDB access, I upload again and check the newest log: it still doesn't work, so I attach S3 full access as well and try one more time.
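Here is a sketch of the complete handler as explained above, based on the video's walkthrough. The CSV column names and their order are assumptions, since the file's schema is never shown on screen; only the "day" partition key and the "customers" table name come from the video.

```python
import boto3

s3_client = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("customers")

def lambda_handler(event, context):
    bucket_name = event["Records"][0]["s3"]["bucket"]["name"]
    file_name = event["Records"][0]["s3"]["object"]["key"]
    try:
        # Fetch the uploaded CSV from S3 and split it into rows
        response = s3_client.get_object(Bucket=bucket_name, Key=file_name)
        customers = response["Body"].read().decode("utf-8").split("\n")

        # Drop the header row so it is not inserted as a data record
        customers.pop(0)

        for row in customers:
            if not row.strip():
                continue  # skip a trailing blank line, if any
            print(row)  # the debug print added while troubleshooting
            customer_data = row.split(",")
            # Hypothetical column layout: "day" (string) is the partition
            # key from the video; the other attribute names are guesses
            table.put_item(Item={
                "day": customer_data[0],
                "customer_id": customer_data[1],
                "name": customer_data[2],
            })
        print("End of file:", file_name)
    except Exception as err:
        # With no DynamoDB permission on the role, put_item raises
        # and execution lands here instead
        print("Could not load data into DynamoDB:", err)
```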
Checking the next log, there is a problem in the code: a wrong variable name; it should be the customer data variable. It should work now, so I deploy, upload, and check the log again: the request starts, the bucket and the file are printed, and then "end of file" is printed. It reached that line through the try/except: the try block failed because the role still has no permission to write to the DynamoDB table, so execution fell into the except block. If I check the table with Explore table items, it shows zero records.

So now I grant this role DynamoDB access: in IAM I search for DynamoDB and attach full access, since we are going to write data there. The change takes effect immediately, so I upload the file again and take a look. At first it still appears to have the same problem and no data shows up, so to debug I print each customer row as it is processed; the latest log shows the data printing just fine, and now it is working. It simply took a little while for the access to take effect. All the data has been loaded, and at the end of the log the last record and the file name are printed.

So from this walkthrough you can see that whenever I upload a CSV file to this bucket, the Lambda function is triggered, and this code loads the data into DynamoDB. That completes my short video; thank you.
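As a footnote, the two permission grants performed in the IAM console could also be scripted with boto3. A minimal sketch, assuming the auto-created execution role is named s3-trigger-for-lambda-role; the console actually generates a name with a random suffix, so check IAM for the real one.

```python
import boto3

iam = boto3.client("iam")

# Assumed role name; use the actual name generated with the function
role_name = "s3-trigger-for-lambda-role"

# Let the function read the uploaded CSV from S3
iam.attach_role_policy(
    RoleName=role_name,
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)

# Let the function write items into the DynamoDB table
iam.attach_role_policy(
    RoleName=role_name,
    PolicyArn="arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess",
)
```

In a real project you would scope these down to the single bucket and table rather than the broad managed policies used here for speed.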
Info
Channel: Cloud For Free
Views: 3,597
Id: Mi86AdywOPk
Length: 24min 31sec (1471 seconds)
Published: Sun Jul 23 2023