Tutorial 2 - Build, Train, Deploy Machine Learning Model In AWS SageMaker - Creating Notebook Instance

Video Statistics and Information

Captions
Hello all, my name is Krish Naik and welcome to my YouTube channel. If you remember, in tutorial 1 we covered some basic information about AWS SageMaker, and we also created an account in the AWS console so that we can use the AWS SageMaker services. If you have not seen that video, the whole playlist link is given in the description.

This is the second tutorial. We will be building, training, and deploying a machine learning model in AWS SageMaker. You can do this in two ways: through a notebook instance or through Amazon SageMaker Studio. We will discuss both; initially we will start with the notebook instance and go step by step. Building, training, and deploying the model will consist of five to six steps, but before that, in this step I am going to show you how to start a notebook instance in Amazon SageMaker.

First of all, search for the AWS console login. I hope you have already created the free tier account. As soon as you reach the login page, sign in. After logging in, search for Amazon SageMaker; once you click it, the SageMaker page opens. As I mentioned in the previous class, we can either use Amazon SageMaker Studio or something called a notebook instance. Right now I am going to show you the notebook instance, and in the upcoming sessions I will also show you Amazon SageMaker Studio.

Now let's create the notebook instance. A notebook instance basically means that, just like we run Jupyter Notebook with various environments on our local machine, Amazon SageMaker gives you the same kind of functionality in the cloud. First I will create a notebook instance and give it my project name. Suppose I am working on a bank application; I am just giving an example here, and I will tell you about the actual project in the next video. After naming it, make sure that you select the notebook instance type ml.t2.medium.

Then there is Elastic Inference. If you don't know what it is, click "Learn more": by using Amazon Elastic Inference you can speed up the throughput and decrease the latency of getting real-time inferences from your deep learning models that are deployed as Amazon SageMaker hosted models. So whenever you use deep learning models, you can select it; here too you have various options like medium and large in different versions, which basically means you get a powerful GPU accelerator for most of your tasks.
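The console steps above (instance name, ml.t2.medium, optional Elastic Inference accelerator) can also be done from code. Below is a minimal sketch using the boto3 CreateNotebookInstance API; the instance name, region, and role ARN are placeholder assumptions, and the role must already exist, which is exactly what the IAM role step described next takes care of.

```python
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")  # assumed region

# Same settings chosen in the console: name, instance type, execution role.
# The role ARN below is a placeholder and must point to a SageMaker execution role.
response = sm.create_notebook_instance(
    NotebookInstanceName="bank-application",
    InstanceType="ml.t2.medium",
    RoleArn="arn:aws:iam::123456789012:role/SageMaker-ExampleRole",
    VolumeSizeInGB=5,
    # AcceleratorTypes=["ml.eia1.medium"],  # optional Elastic Inference accelerator
)
print(response["NotebookInstanceArn"])
```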
After this you will also see an option called IAM role. This is pretty important, so let me give you a small overview of IAM roles in AWS. In a company there will be many projects running in parallel, and at times you will need different AWS services, like an S3 bucket, across different regions. An S3 bucket is a global storage unit and can be accessed from any region, but sometimes you want one S3 bucket to be specific to one IAM role, that is, to one specific project. So it is very important that we provide an IAM role, so that only that project can access the S3 bucket, which acts as the storage unit for that specific project. For example, if I am working on a bank application and I have created an S3 bucket for it, that bucket should only be accessible by the bank application; for that I can create this IAM role. Once I create the IAM role and give it permission to access only that S3 bucket, then from this particular notebook instance you will be able to read and use all the data present in that bucket.

In this case I will select "Create a new role". Here you will see two options: any S3 bucket or specific S3 buckets. I will select "Any S3 bucket" because I currently have just one application going on. If I go back to the console and search for S3, which is nothing but scalable storage in the cloud, you can see the buckets that have been created; I have already created one for this bank application. If I select "Any S3 bucket", this role will also be able to access that bank application bucket, and you can see that objects can be public. Based on the role, we can provide different permissions: if I want to restrict access to a specific S3 bucket, I can give my bucket name there, and then only that bucket will be accessible and not the others. For now I will just select "Any S3 bucket" and create the role. You can see that a new role has been created.

After this, go and create the notebook instance. Once you click "Create notebook instance" it will take around two to three minutes. If I click on "Notebook instances", you can see that I have created a bank application instance. Initially the status will show Pending; once it shows InService, that means your instance has been created, and you can upload your files and start working in it.
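If you prefer to check this from code rather than refreshing the console, boto3 exposes the same status. A small sketch, assuming the same placeholder instance name and region as before:

```python
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")  # assumed region

# Block until the instance leaves "Pending" and reaches "InService".
waiter = sm.get_waiter("notebook_instance_in_service")
waiter.wait(NotebookInstanceName="bank-application")

desc = sm.describe_notebook_instance(NotebookInstanceName="bank-application")
print(desc["NotebookInstanceStatus"])  # "InService"
print(desc["Url"])                     # endpoint behind the "Open Jupyter" button
```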
If you want to open the instance, click "Open Jupyter"; it will open in Jupyter. There are two options, "Open Jupyter" and "Open JupyterLab", and both features are provided by Amazon SageMaker. Once it opens, you will see some files; these files were uploaded by me. If you want to create a new file, just click "New" and select the environment you want. Here you have different environments: Python, PyTorch, TensorFlow 2 with Python 3.6, and so on. In my case I am not using PyTorch; the first project I am going to do is a simple machine learning project, so I will just go with conda_python3. Once I start it, you can see this conda_python3 notebook that I have created. If there is any Jupyter notebook on your local machine, you can also upload it here by clicking the Upload button: go to that specific path and upload it from there. Any kind of dataset can be uploaded as well, and it will be present on this specific instance.

Once you have done this, you are ready for the next session, where we are going to solve a simple machine learning project and follow all the steps: build, train, and deploy the machine learning model. After that we will expose it as an API to a front-end application using a Lambda function and so on; there are many more steps to come. So this was the first step, where we created an instance, and our instance is now ready.

I hope you liked this particular video. If you have not seen part 1, please do watch it; the playlist link is in the description. If you are new to this channel, please do subscribe and hit like for this video. I will see you all in the next video. Have a great day, bye!
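Once the conda_python3 notebook is open on the instance, a quick sanity check like the one below confirms that the notebook has picked up the IAM role created earlier and can reach the S3 bucket. This is only an illustrative sketch; the bucket name is a made-up placeholder.

```python
import boto3
from sagemaker import get_execution_role

# Inside a SageMaker notebook instance this returns the IAM role
# that was attached when the instance was created.
role = get_execution_role()
print("Execution role:", role)

# List the contents of the project bucket (placeholder name -- replace with yours).
s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket="bank-application-example-bucket")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```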
Info
Channel: Krish Naik
Views: 97,280
Keywords: data science, machine learning, deep learning, artificial intelligence
Id: HuDzjNTgS2A
Length: 8min 41sec (521 seconds)
Published: Sat Aug 29 2020