How to build ML Architecture with AWS SageMaker + Lambda + API Gateway | HANDS-ON TUTORIAL

Video Statistics and Information

Captions
Hello, and welcome to this AWS tutorial. In this video we will build a machine learning cloud architecture using API Gateway and AWS Lambda. That will be fun!

First of all, let's think about the machine learning model. The first thing to do is to train the model. Once we have completely trained our machine learning model, we will have two very important components: the first is the machine learning model binary file, and the second is the machine learning model endpoint, which we will use later. All of these will be made with Amazon SageMaker.

Then let's talk about AWS Lambda. AWS Lambda will trigger the endpoint with some task, and our task will be "make a prediction". The machine learning model endpoint will give a response, and the response is the answer from the endpoint: a predicted value.

Now let's look at the right side. Here are the users, and the users want to input some data and get insights from artificial intelligence. You may ask: can they send their input data directly to AWS Lambda? The answer is no. Then maybe they can send their input data directly to the machine learning model endpoint inside Amazon SageMaker? The answer is also no. We need one extra component in our architecture between these, and it is Amazon API Gateway.

So look at this example. Here is the user input data, and here is the Amazon API Gateway functionality. One of the features we can use in Amazon API Gateway is a REST API. The REST API can receive user input data and pass it directly to AWS Lambda. In our architecture, AWS Lambda has a direct connection with the machine learning model endpoint, so Lambda is able to pass the predicted values back from the machine learning model to the REST API. So our action plan has three steps, one, two, and three, which you can follow along with in this video. Here you go.

So now we are on the AWS console, and here we need to create our machine learning model, train it, and create an endpoint for it. In order to do this we need to use Amazon SageMaker, because SageMaker is built to train and deploy machine learning models at scale, which is exactly what we need.
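Before we go into the console, the request path just described can be sketched with plain Python functions standing in for the AWS components. All names and the returned prediction here are illustrative only, not real AWS APIs:

```python
def sagemaker_endpoint(payload):
    # Amazon SageMaker: the trained model behind an endpoint answers
    # the "make a prediction" task with a predicted value.
    return {"predicted_value": "B"}

def aws_lambda(payload):
    # AWS Lambda: triggers the endpoint and passes the response back.
    return sagemaker_endpoint(payload)

def api_gateway_rest_api(user_input):
    # Amazon API Gateway: the only component users talk to directly;
    # its REST API forwards the input to Lambda.
    return aws_lambda(user_input)

print(api_gateway_rest_api({"data": "13.49,22.3,86.91"}))
```

The key point of the diagram is that users never call Lambda or the endpoint directly; every request enters through the REST API.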
OK, and in here we need to create an instance, because the best way to run training, specify hyperparameters, load the initial data, and create the endpoint that will be used by API Gateway and Lambda is a notebook instance.

So let's create one. Let the notebook instance name be "machine-learning-model-instance", and for the notebook instance type let's take ml.t2.medium. It's not very powerful, but for this video it's quite good and it is enough. Then we need to create an IAM role, and I'm creating a new one, because this role allows SageMaker to call S3 buckets, and we will need that. So I create this role; right now it is creating... yes, it succeeded: "You created an IAM role". Finally I can finish creating my notebook instance by clicking this button. "Success! Your notebook instance is being created." So far it is pending, and I need to wait a couple of minutes until it becomes active, so let's wait four or five minutes.

Finally my notebook instance is in service, so now I can open Jupyter. Perfect. At this moment my notebook is empty, so I can use one of the SageMaker examples, and I think the perfect example for this video is breast cancer prediction. So I'm going to use this one: "Create a copy"... perfect, now the notebook is loaded in my instance. Here you go.

I'm not going to go very deeply into the notebook itself, because that's not the point of this video, but the main things are: we import some libraries, boto3 and sagemaker, and initiate a session, so I execute this cell. Then I import the Python libraries: pandas, numpy, matplotlib, time, json, and also sagemaker.amazon.common. After this I need to load my data for training; it comes from this CSV file, and this is the list of features that I'm using for training the model and making predictions.
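A rough sketch of the loading step, assuming a CSV laid out like the notebook's breast cancer file: an id column, the B/M diagnosis target, then numeric features. The inline sample rows and the "label" column name are stand-ins for illustration, not the notebook's exact code:

```python
import io

import pandas as pd

# Stand-in for the notebook's CSV; in the video the real file has
# 569 rows and 32 columns. Only two feature columns are shown here.
csv_text = """id,diagnosis,radius_mean,texture_mean
842302,M,17.99,10.38
842517,M,20.57,17.77
84300903,B,19.69,21.25
"""

data = pd.read_csv(io.StringIO(csv_text))

# Linear Learner needs a numeric label, so map the B/M target to 1/0.
data["label"] = (data["diagnosis"] == "M").astype(int)

print(data.shape)
print(list(data.columns))
```

Knowing this column layout matters later: the API Gateway payload has to carry the feature values in exactly the order the model was trained on.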
So I load my data and see what happens... perfect. As you can see, my training data has 569 observations and 32 features. This is how the data looks, and it's quite important, because we need to know how to prepare our payload in API Gateway when we make new predictions, so keep it in mind. This is the target, the diagnosis, B or M, that our model will try to predict, because it's a classification problem.

Then I need to create the features and labels for our training job. That means I'm splitting the data into 80 percent for training, 10 percent for validation, and 10 percent for testing. Here you go, perfect. Then I prepare my model for training. As you can see, I'm now using an S3 bucket; this is why we needed to create the IAM role a minute ago. I'm using the Linear Learner algorithm with the validation data, and here you go, I'm going to train my model. One thing I need to highlight is the hyperparameters: you can specify your own; these are mine, and I'm using 10 epochs, and the loss is absolute loss. So I execute this cell. The job name is "DEMO-linear" plus today's timestamp, and then I start training. The training is in progress, and I need to wait a little, because the notebook instance is not very powerful; for this video let's wait, I think, two or three minutes.

Finally, after four minutes, my training job is completed. Once I have my machine learning model trained, I can host it. I'm hosting it in S3 with all the artifacts, so I execute this line, and here is the ARN of my artifacts. Once we have the model set up, we can configure what our hosting endpoint should be, and here we specify the EC2 instance type to use for hosting, the initial number of instances, and the hosting model name. What's important to note is the instance type: it says ml.m4.xlarge; this is the endpoint instance type. Then the initial instance count: I'm using only one, because this is just for demonstration.
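The 80/10/10 split can be sketched like this; the helper name and the fixed seed are my own, and the notebook's exact shuffling code may differ. The hyperparameter dictionary shows the values mentioned in the video (10 epochs, absolute loss); `feature_dim` and `predictor_type` are assumptions based on the payload size and the binary classification task:

```python
import numpy as np

def split_indices(n_rows, seed=0):
    # 80% train / 10% validation / 10% test, as in the video.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_rows)
    n_train = int(0.8 * n_rows)
    n_val = int(0.1 * n_rows)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

train_idx, val_idx, test_idx = split_indices(569)

# Hyperparameters named in the video for the Linear Learner job.
hyperparameters = {
    "feature_dim": 32,                    # assumption: matches the payload width
    "predictor_type": "binary_classifier",  # assumption: B/M classification
    "epochs": 10,
    "loss": "absolute_loss",
}
```

With 569 rows this gives 455 training, 56 validation, and 58 test observations.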
But if you are expecting heavy traffic, you should use a larger instance type or a bigger number of instances. The model name is this variable, and then, yes, I execute. So this is the "DEMO-linear" endpoint configuration with today's timestamp, and the endpoint configuration ARN. I also execute this line: yes, my endpoint is being created, because up there we only specified some information for our endpoint, and in this cell we actually create it. So let's wait another couple of minutes and come back.

Finally the status of our endpoint is "InService", which means we have just created our endpoint and it is ready for use. Now we can play around and make some predictions; for example, right here we can test our model. Here are the performance metrics of our model on the validation set; the prediction accuracy is 45 and the baseline accuracy is 64 percent, which is quite normal. I'm not deleting the endpoint right now, because we will use it in our API.

So now we can go back to SageMaker. We need to look under "Inference", and here are the Endpoints, and this is the one we will use for our API and AWS Lambda. Here are the creation time, the status "InService", the ARN, and the name. Perfect. Now I can close my notebook, because I will not use it anymore.

Next we need to create a Lambda function. Remember why we need a Lambda function? Yes: the Lambda function will trigger our endpoint with a payload delivered by API Gateway. So let's go to Lambda; I open it in a new tab. I need to create a new Lambda function by clicking this button, just like this. Let my Lambda function name be "machine-learning-model-lambda", and for the runtime let's take Python 3.6; this is a good option for our example. About permissions: by default, Lambda will create an execution role with permission to upload logs to Amazon CloudWatch Logs, and you can customize this default role later when adding triggers.
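The endpoint-configuration cell boils down to a request like the following, sketched as the dict you would pass to a boto3 SageMaker client. The naming pattern and the variant name are assumptions based on the DEMO-linear names shown in the video:

```python
import time

def endpoint_config_request(model_name):
    # One ml.m4.xlarge instance with initial count 1, as chosen in the
    # video; raise the count or the instance size for real traffic.
    timestamp = time.strftime("%Y-%m-%d-%H-%M-%S", time.gmtime())
    return {
        "EndpointConfigName": "DEMO-linear-endpoint-config-" + timestamp,
        "ProductionVariants": [
            {
                "VariantName": "AllTraffic",  # assumed variant name
                "ModelName": model_name,
                "InstanceType": "ml.m4.xlarge",
                "InitialInstanceCount": 1,
            }
        ],
    }

# In the notebook this dict would be sent with something like:
#   boto3.client("sagemaker").create_endpoint_config(**endpoint_config_request(name))
```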
So I create a new role with basic Lambda permissions, and once this Lambda role is created I will need to modify it a little; I will show you how. For now I just create my function. Perfect, here it is, my function named "ml-model-lambda". When you scroll down, there is the Lambda function's Python code, and you can modify this code to customize your functionality. In our example we use a predefined Lambda function, so I will show you how it looks. Here it is; I'm just copying and pasting it from the AWS documentation website. You will find this code example at the link in the video description. Looking good; I need to save my Lambda function now... and deploy it... it's deployed.

One more thing I need to do right here: you see, there is an endpoint name in the code, and it should be specified as an environment variable. So I copy the environment variable name from the code and go to Configuration; as you can see, here is the Environment variables section. I add one more environment variable: the key is ENDPOINT_NAME from the code, and the value is this endpoint. That means my Lambda function will trigger this endpoint with the payload that comes from API Gateway. I save it, perfect, and come back to the code.

Another thing we need to handle for Lambda is permissions. I go to the IAM service and find the role which is responsible for the Lambda execution. Here it is, the ml-model-lambda role. I open it up, and here are my permissions; the trust relationship looks good. If I open this one, here's the policy that my Lambda function uses, and here I can edit it. I'm editing it in JSON format, and again I just copy and paste the code from the official AWS documentation, and it goes right here.
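For reference, the Lambda function from the AWS documentation looks roughly like this. This is a sketch, not the verbatim code: the optional `runtime` argument is my addition so the handler can be exercised without AWS credentials, and the M/B mapping assumes the breast cancer labels:

```python
import json
import os

def lambda_handler(event, context, runtime=None):
    # In Lambda, boto3 is preinstalled and ENDPOINT_NAME is set as an
    # environment variable, as configured in the console above.
    if runtime is None:
        import boto3
        runtime = boto3.client("sagemaker-runtime")

    payload = event["data"]  # CSV string of feature values from API Gateway

    response = runtime.invoke_endpoint(
        EndpointName=os.environ.get("ENDPOINT_NAME", ""),
        ContentType="text/csv",
        Body=payload,
    )
    result = json.loads(response["Body"].read().decode())
    score = result["predictions"][0]["score"]

    # Map the Linear Learner score back to the diagnosis label.
    return "M" if score == 1 else "B"
```

Because the endpoint name comes from the environment, the same code works against any endpoint without redeploying the function.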
What do these lines of code do? They tell the Lambda function that it is allowed to invoke the SageMaker endpoint. If I don't put this into the policy, my Lambda function will not be able to invoke the SageMaker endpoint. So I press "Review policy", and that's it, I save it. Perfect, no errors here, and with the Lambda function I think we are done. Here is my Lambda function, so we are ready to move to the next step: creating the API Gateway.

I go to API Gateway. I need to create a new API for my machine learning model; the Lambda function will then trigger the endpoint with the payload delivered from this API. I think in our example we can use a simple REST API. I build a new one: "Create your first API", perfect. Let the API name be "ml-model-api", the description "my machine learning model API", and the endpoint type Regional. Let's create it.

Now we are in the Amazon API Gateway console, and here I need to create a new resource. You can name your resource whatever you want; in my case let it be "api-machine-learning-model". So I create this resource, and inside it we can create a method. The method needs to be POST, because we have to pass our payload to the Lambda function. Here it is; the integration type is "Lambda Function", and remember how our Lambda function is named? Yes, it's "machine-learning-model-lambda", and I need to specify it in the "Lambda Function" field; you can select it from the list. Perfect, I save it. "Add permission to the Lambda function"? Yes, I agree with that. Perfect. As you can see, this diagram shows what is connected to my API Gateway, and now I am able to test it: you can test your API Gateway by passing a payload to it, and the payload should be pasted right here, in the request body.
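The policy statement pasted from the documentation amounts to the following, sketched here as a Python dict printed as JSON. The Sid value is an assumption, and Resource is left as a wildcard, though scoping it to the endpoint's ARN would be tighter:

```python
import json

# Extra statement for the Lambda execution role: allow the function to
# call SageMaker endpoints via sagemaker:InvokeEndpoint.
invoke_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowSageMakerInvoke",
            "Effect": "Allow",
            "Action": "sagemaker:InvokeEndpoint",
            "Resource": "*",
        }
    ],
}

print(json.dumps(invoke_policy, indent=2))
```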
OK, let's specify our payload for making a prediction. We need to give the name of our payload, which is "data", and then the actual values that are going to be pushed into the model endpoint; based on these input values, the model will return a prediction for our machine learning classification problem. So let our payload be like this. One thing I need to mention: these are 32 feature values, the same number of features that we used in model training, because the model has to understand the payload the same way it was trained. So here are the actual values that we push into our model endpoint. Let's test it now, and here it is: status 200, which is good, here's the latency, and the response is "B". This is the prediction for our payload, a "B", so the classification model endpoint is working. Perfect.

And I almost forgot to show how to deploy our API. This is how we do it: we create a new stage, for production for example, and from that point we can invoke our API's invoke URL from outside. That means you can bring it into your Python code; for example, in Python 3.7 I import the json and requests packages, and here I set up my payload. This is the URL of my API, "api-ml-model", and then the "data" payload, which is the same one I used in the testing phase; it is right here. Then I get a response back from the API: it's the prediction "B", the same as in testing.

So that's it for this part; that's all I wanted to show you. I hope this video was useful for you, and give me ideas for what I should cover next if you want to learn something. See you there, bye bye!
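The closing Python snippet can be reconstructed roughly like this. I use the stdlib urllib instead of the requests package shown in the video so the sketch has no dependencies, and the invoke URL is a placeholder you must replace with your own deployed stage URL:

```python
import json
import urllib.request

# Placeholder: substitute the invoke URL of your deployed stage, e.g.
# https://<api-id>.execute-api.<region>.amazonaws.com/prod/<resource>
API_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/api-machine-learning-model"

def build_payload(features):
    # Same shape as the console test payload: {"data": "<comma-separated values>"}.
    return json.dumps({"data": ",".join(str(f) for f in features)})

def predict(features):
    # POST the payload to the REST API; the body travels through
    # API Gateway to Lambda and on to the SageMaker endpoint.
    req = urllib.request.Request(
        API_URL,
        data=build_payload(features).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```

Calling `predict` with the same 32 feature values used in the console test should return the same prediction as the testing phase.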
Info
Channel: Data Science Garage
Views: 24,576
Keywords: sagemaker, api gateway, aws lambda, lambda function, rest api, ML Architecture, AWS Machine Learning, AWS ML, Deploy ML model, deploy machine learning model, deploy ml model, lambda function code, REST API AWS, IAM Policies, Machine Learning Workflow
Id: stD47vPDadI
Length: 21min 44sec (1304 seconds)
Published: Fri May 20 2022