API GATEWAY DEMO | Create API ENDPOINT to fetch S3 DATA using LAMBDA FUNCTION | PYTHON BOTO3

Captions
Hello everyone, and welcome back to the channel. In the last session we discussed API gateways at length, and at the end I told you we would have a hands-on API Gateway demo in the next session. But to spice things up, why not design a small and simple service that helps us understand API Gateway and AWS Lambda in a better way? So today we will work on a small project where we learn how to use AWS API Gateway along with AWS Lambda to fetch data from Amazon S3. If you're ready, let's begin.

Before moving to the AWS console, let's understand the requirements. Initially, the users in team A were uploading files to an S3 bucket, and the users in team B downloaded and used them directly from the bucket. Then a requirement came in that all the data the users download should be base64 encoded. So we made a design change: for the base64 encoding we introduced a simple Lambda function, and to give the users some convenience we decided to provide them with an API endpoint resource so they can pull the data they need.

Now let's see how to get this done. The first step is to have an S3 bucket; if you have already created one, that's fine, you can use it. Click on S3 in the services list, or just type "s3" in the search bar. I already have a lot of buckets in my account, so I'm going to use this one, my-new-web-bucket. It already has plenty of files I can use to simulate our scenario of users downloading files; I'll be uploading files to this bucket, and you can of course create your own bucket instead. That's the first resource we need, so we are done with the S3 part.

The next thing is to create a Lambda function, so let's go to Lambda: click on Services, type "lambda", and pick it from the dropdown. Since I have created functions before, I'm taken straight to the page listing them, but if you are new and haven't created any Lambda functions yet, you'll see the landing page instead, and it has a very nice interactive demo that I want you to check out. The first thing you read there is that AWS Lambda "lets you run code without thinking about servers", and that is the most important and most beautiful aspect of using Lambda: you pay only for the compute time you consume, and there is no charge when your code is not running. With Lambda you can run code for virtually any type of application or backend service, all with zero administration.

So let's see how it works with the simple demo we can run here. I'm going to use Python, because I know a bit of Python, but you can use any of the languages you're familiar with. Here is the lambda_handler; this is the function that is going to be executed, and by default it should return "Hello from Lambda!". I'll just run it.
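For reference, the Python starter code the Lambda console generates looks roughly like this; it simply returns that greeting:

import json

def lambda_handler(event, context):
    # Default starter body generated by the Lambda console
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }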
It printed "Hello from Lambda!". Notice that we did not create any instances or install any Python packages to run this code; we just wrote it here and it executed for us, and that's the convenience of serverless.

Next, let's see how Lambda responds to events. Click through the demo and you'll see there are many event sources Lambda can process. There is a mobile phone, representing mobile or IoT backend data; if you click on it, the message passes through the mobile/IoT backend. There is streaming data, which can come from Kinesis or any message broker you have. There is a file event, which can come from your S3 bucket as its data is processed. Lambda takes these streams of data from everywhere and processes them, and whenever the application stream or data flow increases, it scales up automatically. As I click repeatedly, the input rate goes up and you can see the number of instances, the background processing power, increasing with the traffic. As the demo puts it: once you create Lambda functions, you can configure them to respond to events from a variety of sources; try sending a mobile notification, streaming data to Lambda, or placing a photo in an S3 bucket.

The next screen shows how it scales seamlessly: here is the number of invocations, and here is the price you're going to pay. Suppose I continuously click on streaming analysis and watch the cost. After 1 million requests, or invocations, the price starts to increase, but if I keep pushing data in, you can see that with higher invocation counts the cost stays highly reasonable; that is why Lambda is considered such a cost-effective source of processing power. As it says here, Lambda scales up and down automatically to handle your workloads, and you don't pay anything when your code isn't running: your first 1 million requests and 400,000 GB-seconds of compute per month are free. We already discussed the Lambda pricing model at length, so I won't repeat it here; you can read the details in the documentation.
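As a rough worked example of that pricing model (assuming the rates published around this time, roughly $0.20 per million requests and about $0.0000166667 per GB-second beyond the free tier), take a function configured with 128 MB of memory that runs for 200 ms and is invoked 3 million times a month:

compute: 3,000,000 invocations × 0.2 s × (128/1024) GB = 75,000 GB-seconds, fully covered by the 400,000 GB-second free tier
requests: 3,000,000 − 1,000,000 free = 2,000,000 billable, at $0.20 per million = $0.40

So the whole month costs around $0.40, which is why the per-invocation cost looks so reasonable at scale.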
The next thing is to create the function itself, so click on "Create function". I'm not going to use a blueprint or the serverless app repository; I'll author it from scratch. Give your function a name; mine can be anything, and for the runtime I'm going to use Python 3.7. There is also a permissions section where you choose the execution role, and I want execution permissions for S3. There are three options: create a new role with basic Lambda permissions, use an existing role, or create a new role from AWS policy templates. Creating a new role from AWS policy templates is for when you either don't have an existing policy or have a separate requirement for the new function you're creating. I'll create a new role here and give it a name; I'll just copy the function name and add "-role" to it. I want Amazon S3 access, so I'll use the "Amazon S3 object read-only permissions" template; if you don't find it, just type "s3" and both S3 templates will come up. Object read-only permissions are all we need, since we are only going to read S3 objects; if you ever need delete or other permissions, you can add them explicitly through IAM. To recap what we have done: we selected "Author from scratch", gave the function a name, chose the runtime (the language I want to use, Python 3.7), chose "create a new role from AWS policy templates" for permissions, and picked "Amazon S3 object read-only permissions" as the template. Now click "Create function".
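To give an idea of what that template grants, a read-only S3 policy of this kind boils down to something like the following. This is an illustrative sketch only; the actual managed template may name more actions and scope resources differently:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "*"
        }
    ]
}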
Your function is created: "Successfully created the function mys3-function-demo. You can now change its code and configuration. To invoke your function with a test event, choose Test." Before we write code, let me walk around this page and explain some of the things that are important to us. The panel at the top is the Designer: you can add trigger points and destinations to it and build a workflow. Click "Add trigger" and you can see all the services that can trigger AWS Lambda: API Gateway, AWS IoT, Alexa Skills Kit, Application Load Balancer (yes, you can invoke it from a load balancer too), CloudWatch, CodeCommit, DynamoDB, Kinesis, S3, SNS, SQS, and many other integrations. Selecting one adds it as the trigger point from which you'll invoke this function. There is also a destination concept: the destination types are an SNS topic, an SQS queue, another Lambda function, or EventBridge. You choose the source condition, whether it applies to asynchronous invocation or stream invocation, and one more condition: whether the destination fires when your function fails or when it succeeds. Those are things you can explore, but for now I'll focus on the task at hand.

Below that is basically your IDE for writing the Lambda function. Note that if you want to use Python libraries that are not available by default, you have to create a deployment package and upload it to Lambda; we'll do that some other time, as it's not required now. And here is the lambda_handler. When you create a Lambda function, you specify a handler, the function in your code that AWS Lambda invokes when the service executes your code; in Python that is the lambda_handler function, within which we write our code. The lambda_handler normally takes two parameters: the first is event and the other is context. The event parameter is used to pass event data to your handler; it is mostly a dictionary, but it can also be a list, string, integer, or None. You might ask why it differs: because the trigger points differ. Some sources send string data, some send integers, some send JSON, so how the handler receives its input depends on the trigger, whether that's Cognito, API Gateway, Amazon Lex, Kinesis, or anything else that can act as a trigger. The context parameter provides runtime information to your handler through a context object. One of its methods is get_remaining_time_in_millis, which returns the number of milliseconds left before the execution times out; similarly, aws_request_id gives you the request ID of the execution, and there is a whole list of members on context that you can read about in the documentation.

Now that you have a good idea of events and contexts, let's move on. To execute this program, click "Test". You have to configure a test event first: give it a name of your choice, keep the default parameters, and click "Create". Now select it and start the test. The response comes back with status code 200 and "Hello from Lambda!", exactly what the lambda_handler returned.

Now let me show you something about timeouts. I'll add import time at the top and time.sleep(5) inside the handler, save, and run the test. It fails: the task timed out after three seconds. Why did that happen? Look at the execution details and you'll find it timed out because of the default setting under Basic settings: the default timeout is three seconds. You can increase it to whatever your requirements demand; I'll increase it to three minutes and save. So the next time you execute something and it times out, come here first and then try again. If I click "Test" once more, it should not fail even though the execution now takes five seconds, and indeed it passes.

Simple, isn't it? Now let's see how the logs get generated. In the execution summary you have the request ID (the aws_request_id I mentioned, which you can also print using the context object), the configured memory of 128 MB, the max memory used, and the log output.
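A minimal sketch of that experiment, combining the context lookups with the sleep; this assumes the function timeout has already been raised above five seconds as described:

import time

def lambda_handler(event, context):
    # Runtime information exposed through the context object
    print("Request ID:", context.aws_request_id)
    print("Time remaining (ms):", context.get_remaining_time_in_millis())
    time.sleep(5)  # exceeds the default 3-second timeout unless it is raised
    return "done"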
Above the log output there is a link to the CloudWatch log group. Click it and you land in CloudWatch, where you can see a log group has already been created, /aws/lambda/mys3-function-demo, and under it the logs for our runs have been generated. But how did they get generated? Through the permission that comes by default with the execution role. Click on Permissions and you'll see Amazon CloudWatch Logs listed, along with the resources and actions it can use: by action it can create log groups, create log streams, and put log events, and by resource it can act on this log group and its log streams. Those three actions are exactly what created the log group and the log streams you see here and pushed the log data into them.

The next thing we wanted to do was pull the data we have in our S3 bucket. Go back to S3: we have the bucket with the data, and we have just created the Lambda function, so now we write the code that pulls the data from S3. This is the bucket I want to access; the object is inside the pythonic folder, where I already have sample01.txt, so that is what I'll extract. For this I'll be using the boto3 module, and I'll show you exactly how it works. We don't need the starter code anymore, so delete it and start typing.

The first thing is to import the boto3 module, and then import json to format the response. To perform any operation you call boto3.client() with the name of the client you want; we are using S3, so s3 = boto3.client('s3'). Next we create the lambda_handler with the two parameters we need, event and context. Then we need the bucket information: the bucket I have is my-new-web-bucket, and the key is basically the path to the object, pythonic/sample01.txt. If you look in the console, the file sits inside the pythonic folder, so the key includes that prefix; if your file sits at the top level of the bucket, you can just provide the file name. The next step is to fetch the data, and there is a method for that, s3.get_object, which pulls the object from S3. I'll open a try block and write the data-pull code inside it: s3.get_object(Bucket=bucket, Key=key), passing the two parameters, Bucket set to our bucket and Key set to our key. That's the method you use to pull the data, and we already have the permission to read files from S3 added to the role, so don't worry about that.
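Consolidating the snippets narrated here, together with the Body read that comes next, a minimal sketch of this first, hard-coded version of the function:

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Hard-coded for now; made dynamic later in the session
    bucket = 'my-new-web-bucket'
    key = 'pythonic/sample01.txt'
    try:
        data = s3.get_object(Bucket=bucket, Key=key)
        # Body is a stream; read() returns bytes, so decode before returning
        return data['Body'].read().decode('utf-8')
    except Exception as e:
        print(e)
        raise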
The next step is json_data = data['Body'].read(): the Body field of the response holds the object's contents, and read() pulls them out. Then close it off with except Exception as e: print(e). I hope the code is done now, so I'll save it and run the test. The first attempt fails with a "bytes has no ..." error, because read() returns bytes, so I'll remove that part, and there we go: I have the data now.

Next I'll create a new file locally, paste in this disclaimer content, and save it to the desktop as sample02.txt. Back in S3 I'll upload it: Upload, Add file, select sample02.txt, then click Upload. Now sample02.txt is in the bucket, and when I go back to the function, change the key to sample02.txt, save, and test, I get that file's data. No problem.
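As an aside, if you prefer the command line to the console, the same upload can be done with the AWS CLI; the bucket and prefix here are the ones from this demo:

aws s3 cp sample02.txt s3://my-new-web-bucket/pythonic/sample02.txt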
So this looks pretty simple; we have got the data from the S3 bucket. But this doesn't add up to what we set out to do: our users want base64-encoded data, which is why we created the Lambda function, and we want to give them an API gateway. So now let's integrate API Gateway. To do that, we can simply add a trigger here: click "Add trigger". (This can also be done from the API Gateway side, by going to API Gateway and integrating the Lambda function there; that's fine too.) Create an API Gateway trigger point: we'll create a new API, and it will be a new REST API. Provide IAM as the security mechanism and click Add; there are additional settings you could configure, but I won't right now. Once you've clicked Add, you can see the topology has changed: the trigger point is now API Gateway, so whenever any REST invocation is made through API Gateway, it calls our Lambda function.

Now let's go to that invocation in the API Gateway console; we'll start by invoking the Lambda function from the trigger point we just created. To recap: first we uploaded the file we need to S3; then we created a Lambda function with the code that fetches the data from the S3 bucket; now we add the trigger point. The REST API has been created with a method that accepts any request, but for our specific purpose we'll create a GET method: click Actions, then GET, then the check mark. The invocation goes to the Lambda function, so for the integration type select Lambda Function, and don't select "Use Lambda Proxy integration" for now. The Lambda region is ap-south-1, and you have to provide the function name, so go back, copy the function name, paste it here, and save. "You are about to give API Gateway permission to invoke your Lambda function": yes, that is what we want, so click OK, and it is saved.

Your GET API is now created. You can see the four stages we have discussed before, and I hope you remember them: Method Request, Integration Request, the Lambda function mys3-function-demo, Integration Response, and Method Response. Now click on Test. You can pass query strings here, and header values (for example application/json), but the main thing to check is whether the gateway can invoke the function at all, so click Test. You get a response, and it comes from your Lambda function: status 200 and the text we uploaded (it says "disclaimer", but really it's just the sample text we uploaded). The gateway is executing our function, and the function is fetching that particular file from that particular bucket.

But this is not that interesting, because the bucket and key are hard-coded, and we don't want that: we want our users to be able to pass the bucket name and the key value so they can fetch whatever file they want from the S3 bucket. So we have to change some things in both places, and I'll tell you what we are going to change. In the function, these two parameters, bucket and key, are what we want to make dynamic. First, reconfigure the test event so that it passes a bucket and a key (pythonic/sample02.txt) as input; these are the inputs we want to give to our function, so save that. (There was a stray comma at first; sorry about that.) Then, as I told you earlier, the event parameter is how you receive input values: the event is a dictionary, and we access data in a dictionary by key, so the bucket becomes event['bucket'] and the key becomes event['key']. Now the function has become dynamic; save it.
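As a sketch, the handler after this change:

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Bucket and key now arrive through the event payload
    bucket = event['bucket']
    key = event['key']
    try:
        data = s3.get_object(Bucket=bucket, Key=key)
        return data['Body'].read().decode('utf-8')
    except Exception as e:
        print(e)
        raise

A test event shaped like {"bucket": "my-new-web-bucket", "key": "pythonic/sample02.txt"} exercises it from the Lambda console.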
Now let's go back to API Gateway and see whether we get the result: click Test. It fails, because the invocation doesn't carry the keys the function needs; now we see the real test. The Lambda-side test still passes, since the configured test event supplies the values, but from the API Gateway side the inputs are not arriving anymore, and that is the problem we have to solve: we need those values to reach the event from API Gateway. (I'll click Test in Lambda once more to show you that, yes, it works there; so the function works individually, but for it to work from API Gateway we need some modifications.)

Go back to Method Execution, then Method Request. What do we need? Query string parameters: add one named bucket and one named key; these are the query parameters callers are going to pass. Then enable validation: check the two "Required" checkboxes, but note that checking them alone doesn't make the request validator validate these parameters. Next to "Request Validator" there is a pencil icon; click it and select "Validate query string parameters and headers". That's it: you've added the request validator and the two query string parameters. Click back to Method Execution, paste in the parameters, and let's see whether we can execute the code. No, still failing. What's the problem now? The method is getting the values it needs, the bucket and the key, but the Integration Request, where the data has to be formatted into the shape the function will receive, hasn't been set up. Click on Integration Request and go down to Mapping Templates; we don't have any templates yet. Click "Add mapping template" and set the content type to application/json, because the content I'm passing on as parameters is JSON; type application/json and click the check mark to create it. A prompt appears: "Your current passthrough behavior will pass all request payloads directly to the endpoint without transformation, unless there is a match for the incoming content type. Do you want to secure this integration to only allow requests that match one of your defined content types?" Yes, we want application/json only.

We have declared the content type, but what should the JSON contain? We want to capture the query parameters from the request and send them on to the function: first collect the information from the request's query parameters, then pass it to the function, so we have to add the template body as well. You can adapt the template to your own requirements, but this is a basic JSON template: map the bucket from the input parameters with $input.params('bucket'), and the key the same way with $input.params('key'). (I first typed "param"; it's params, sorry.) This is the format our function expects the data in, so close the braces: bucket comes from $input.params('bucket') and key from $input.params('key'). Now just click Save.

Go back to the GET request and click Test. We have the function, mys3-function-demo, and these are the parameters I want to pass; let's see whether we can run this. Click Test: now it works. We passed the inputs we wanted through the query parameters, bucket=my-new-web-bucket and key=pythonic/sample02.txt. This is very interesting, because now we have the data in the S3 bucket, the functionality in Lambda, and the access point, the resource, in API Gateway.
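For reference, the full mapping template described above, using API Gateway's standard $input.params() accessor:

{
    "bucket": "$input.params('bucket')",
    "key": "$input.params('key')"
}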
But what we actually wanted was the base64 encoding, so let's modify our Lambda function to return base64. Only slight changes are needed in the code. First, import base64; this module ships with Python by default, so we don't have to worry about packaging it. Next we encode the data we're getting into base64: base64_bytes = base64.b64encode(json_data), encoding the object data we fetched, and then we return it. Save this and watch the magic. I get the string back, but with an error: it is "unable to marshal response", meaning it cannot serialize the raw bytes. Not a problem: I'll return the encoded string itself instead, and now the data comes back as the base64-encoded string. (A consolidated sketch of this final handler follows at the end of these captions.)

So now everything is complete: our users in team A upload the data to S3, and team B accesses it from API Gateway, which takes the data from AWS Lambda, which in turn picks it up from Amazon S3, converts it to base64, and passes it on as the response to the users. This was a very simple example of how to integrate Amazon S3, AWS Lambda, and API Gateway as the trigger point for Lambda. I wanted to build this demo so we could cover both API Gateway and Lambda, and above all the thing people struggle with most: passing inputs to a Lambda function from a REST API using parameters. That is the point I wanted to cover so you don't face any challenges with it; I could have done a plain code walkthrough, but we had to do something different. I hope it was as fun and exciting for you to learn as it was for me to design and run this demo.

For the next topic I'm shifting away from AWS serverless and jumping straight to AWS VPCs. We'll cover that first because it is a long-pending and very important topic: the next AWS video on the channel will be the introduction to VPCs. I hope you're excited for that. I'll meet you in the next one, on AWS VPCs, and until then, it's Pythoholic signing off.
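As promised above, a consolidated sketch of the final handler; the bucket and key arrive via the API Gateway mapping template, and the names match the ones used in this demo:

import base64
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Populated by the API Gateway mapping template from ?bucket=...&key=...
    bucket = event['bucket']
    key = event['key']
    try:
        data = s3.get_object(Bucket=bucket, Key=key)
        raw = data['Body'].read()
        # b64encode returns bytes; decode so the response can be serialized
        return base64.b64encode(raw).decode('utf-8')
    except Exception as e:
        print(e)
        raise

A caller then hits the GET method with the two query parameters, for example ?bucket=my-new-web-bucket&key=pythonic/sample02.txt, and receives the base64-encoded file contents.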
Info
Channel: Pythoholic
Views: 10,502
Keywords: RoadToAWS, AWSSolutionsArchitectAssociate2020, Pythoholic, api gateway lambda, api gateway, pythoholic aws, aws, aws api gateway lambda python example, aws api gateway lambda tutorial, lambda function, aws api gateway lambda, aws lambda python, API GATEWAY DEMO, Create API ENDPOINT to fetch S3 DATA using LAMBDA FUNCTION, boto3 aws, boto3 lambda example, boto3 lambda tutorial, boto3 lambda function, boto3 lambda create function example, boto3 aws lambda, aws rest api tutorial
Id: _RpGkToww2M
Length: 37min 0sec (2220 seconds)
Published: Sun Sep 13 2020