#2- Complete End To End Generative AI Project On AWS Using AWS Bedrock And AWS Lambda

Captions
Hello guys, we are going to continue our generative AI on AWS Cloud series, and in this video we are going to build a complete end-to-end generative AI application using several services available in AWS. I'll start with a brief architecture of what we are going to implement and then show, step by step, how to use each service. The use case is blog generation. Our main aim is to create an API: we will use Postman to hit that API with a request body containing the user's query, i.e. which blog we want to generate. In AWS, APIs are created with Amazon API Gateway, so we will use that. As soon as the API is triggered, it hits our Lambda function (we will also look at what a Lambda function is), and that Lambda function in turn interacts with Amazon Bedrock. Amazon Bedrock is where all the foundation models are available; Amazon provides many different models, such as Llama 3, Llama 2, and the paid Claude models from Anthropic, and we will see which models exist and how to use them. The Lambda function is responsible for taking the query, hitting the chosen foundation model, and getting back the response, which in this case is the full blog of roughly 200 or 300 words. Finally, we take that blog and save it in AWS S3 as a text file (you could also convert it into a PDF), named with the current timestamp. That is the entire pipeline we are going to implement. This kind of use case even appears in the AWS documentation, and many companies implement generative AI applications in exactly this way. As the series continues we will also build applications related to RAG, document Q&A, and vector databases, and we will use the whole ecosystem provided by AWS Cloud. This also matters for interviews: most companies definitely want you to explain how you would implement a generative AI application on AWS Cloud. So first of all, go to your AWS account; you need one to be able to use any of these services. The first thing I'm going to do is open AWS Bedrock and give you an idea of what the foundation models look like and which ones Amazon Bedrock provides.
So I'll click on Amazon Bedrock and, once it loads, click on Get Started. Here you can see the foundation models, and you can do chat, text, and image generation, so multiple modalities are supported. Which companies' models are available? There is AI21 Labs with the Jurassic-2 series, Titan by Amazon, Claude by Anthropic, Command by Cohere, Llama 3 and Llama 2 by Meta, Mistral, and Stable Diffusion, so you can also generate your own images. If I want to use the Llama family, I can see variants such as Llama 2 Chat 13B and Llama 3 here, and you can also open them in the playground, chat with them, and check the output. The first step, before anything else, is to request model access. Access is granted per region, so in each region you have to request the models you want. In my region, us-east-1, you can see I have been granted access to all these models; the Anthropic Claude 3 models are the exception, because for those you have to submit some use-case details before Anthropic grants access. To get access, click on Manage model access, select the models you want, and click Save changes; after some time the model access will be granted. That is the first step you really need to do. Once that is done, I'll show you how to invoke these foundation models.
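As a side note, once access has been granted you can also check which foundation models are available in your region from code instead of the console. A minimal sketch using boto3 (the client and call are standard boto3; the loop just prints the model IDs):

```python
import boto3

# The "bedrock" client is the control-plane client (model listing, access);
# "bedrock-runtime", used later in this walkthrough, is the one that actually invokes models.
bedrock = boto3.client("bedrock", region_name="us-east-1")

models = bedrock.list_foundation_models()
for summary in models["modelSummaries"]:
    # e.g. meta.llama2-13b-chat-v1 (Meta), anthropic.claude-v2 (Anthropic), ...
    print(summary["modelId"], "-", summary["providerName"])
```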
So Bedrock is the first service we are going to use; the next one is the Lambda function. What exactly is AWS Lambda? If you look at the definition, it says "run code without thinking about servers or clusters". Instead of Lambda I could also spin up an EC2 instance, install all my requirements and libraries there, develop the whole application, expose it through API Gateway, and invoke AWS Bedrock from inside that EC2 instance; that also works, and I'll show it in upcoming examples, but right now we are going to use AWS Lambda. AWS Lambda is a compute service that runs your code in response to an event: when I hit the API, the function gets triggered and then calls AWS Bedrock. The benefits are that there are no servers to manage, it scales automatically, you pay as you go based on the number of requests, and performance optimization is handled by Lambda itself; many companies use it, and for a smaller application it is a very good fit. Let's create one. In the Lambda console I click Create function, choose Author from scratch (you could also use a blueprint or a container image), give it a function name, say awsappbedrock, pick Python 3.12 as the runtime so I have a recent Python, leave the rest at the defaults, and click Create function. After a moment, depending on your internet speed, the function is created, and you can see options such as layers, add triggers, and add destination. The idea is that as soon as I create an API Gateway and hit the API, this function should get triggered, and inside it I will write the whole code that invokes the AWS Bedrock foundation models. Below the overview you will also see tabs for Code, Test, Monitor, Configuration, Aliases, and Versions. Under Configuration there is a timeout setting, and there are environment variables, just like the .env file we create in VS Code with keys such as an OpenAI API key; you can define those here and read them in your code. As one general configuration I'll increase the timeout to 3 minutes, just to be on the safer side so the function does not get cut off. I could write the code directly in the inline editor, but that is not very comfortable because I don't get good suggestions there, so it is better to write the code in VS Code and then copy and paste it into the Lambda editor.
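For reference, a newly created Python Lambda function starts out with a default handler that looks roughly like this (our pasted code will replace it entirely):

```python
import json

def lambda_handler(event, context):
    # Default stub generated by the Lambda console for a new Python function
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
```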
Let me open VS Code and a terminal, and first create an environment: conda create -p venv python=3.12, matching the 3.12 runtime I chose for Lambda. I'll also create a requirements.txt for whatever the project needs, which later has to be made available to the Lambda function as well, plus a file called app.py. The first library I need is boto3, and specifically a recent version, because only the recent releases of boto3 include the functionality to invoke foundation models from Amazon Bedrock; the version bundled with Lambda by default is older, which will matter later. In app.py I start with two imports: import boto3 and import botocore.config; boto3 is what we use to invoke the foundation model. Then I activate the environment with conda activate venv/ and run pip install -r requirements.txt so that boto3 gets installed.
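Putting the local setup together, this is roughly what the project looks like at this point (the conda commands are the ones used above; the requirements file holds a single dependency):

```python
# requirements.txt contains one line:
#   boto3            <- a recent release; older versions lack the Bedrock runtime client
#
# Terminal, as run above:
#   conda create -p venv python=3.12 -y
#   conda activate venv/
#   pip install -r requirements.txt

# app.py -- starting point
import boto3              # used to invoke the Bedrock foundation model
import botocore.config    # used to configure timeouts and retries for the client
```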
While that installs, let me start writing the code. My aim is to generate a blog, and I'll go step by step; I have noted down from the documentation which steps are required and which services need to be created. First I'll create a function called blog_generate_using_bedrock, which takes a blog_topic as a string and returns a string, the generated blog. Before writing the prompt, let's check which Amazon Bedrock model I'll use: Llama 2 Chat 13B. In the Bedrock console you can see the model ID, the content type application/json, and, most importantly, the request body format: what the prompt looks like and what other information goes in the body. Similarly, for Llama 2 Chat 70B you can see where you place your input text, and for Llama 2 13B you get the corresponding details; for each model, how the API request should look is documented right inside AWS Bedrock. Whatever model you want to use, check its request format first and build your body accordingly. Now I'll create the prompt as a multi-line f-string. Following the Llama 2 convention, it starts with the <s> token and the [INST] keyword, then a Human turn that says: write a 200 words blog on the topic {blog_topic}, where blog_topic comes from the input I provide whenever I hit the API.
After the human instruction comes the Assistant turn, and then I close with [/INST]; this is the keyword format used by most of the Llama models. So the prompt is simply: write a 200 words blog on the topic {blog_topic}, followed by the Assistant marker. Next comes the request body. From the model documentation, the body needs a prompt, a max_gen_len, a temperature, and a top_p, so I set those parameters: prompt is the prompt text I just defined, max_gen_len is capped at 512, and then temperature and top_p. That is my body. With the help of boto3 I'm now going to call the foundation model, the Meta Llama 2 13B chat model. Before calling it, I wrap everything in a try/except block, because it is always good to catch whatever errors we get. Inside the try block I create the client: bedrock = boto3.client("bedrock-runtime", ...). If you have seen my earlier Bedrock videos, you know we have to use the bedrock-runtime client; the region for all the foundation models I'm calling is us-east-1. I also pass a config: botocore.config.Config with a read_timeout of 300 seconds and retries set to {"max_attempts": 3}. These are the basic configuration values for timeouts and retries, and in short this client is what I'll use to call the foundation model. To actually call it I write bedrock.invoke_model and pass the body as json.dumps(body), which means I also need to import json at the top.
Along with the body I also have to specify which model I'm calling, via the modelId parameter, and you can copy the ID from the Bedrock console. Although the 70B model is available, I'll go with the 13B one, since I pay based on the number of requests; so the modelId is the Llama 2 13B Chat v1 model. Once invoke_model returns, I read the response: response_content = response.get("body").read(). Then I parse that content with json.loads and store it in a variable called response_data; I'll also print it so I can inspect what comes back. Remember that this response is whatever the foundation model we invoked returns, and the generated blog text lives under a key called "generation", so I pull out blog_details = response_data["generation"]. In other words, whatever the model generates is available under that key of the JSON you get after reading the response body.
Finally the function returns blog_details. I also add the except block: except Exception as e, where I print an error message like "Error generating the blog" along with the exception, and return an empty string. One thing to understand: whatever we print inside AWS Lambda ends up in the CloudWatch logs, which is a big advantage because you can always go back and see what happened. So that is the complete function responsible for generating the blog.
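Here is a sketch of that generation function, following the steps above. The request format matches the Llama 2 chat models on Bedrock; max_gen_len is assumed to be 512, and the temperature and top_p values are illustrative defaults rather than values stated in the walkthrough:

```python
import json
import boto3
import botocore.config

def blog_generate_using_bedrock(blog_topic: str) -> str:
    prompt = f"""<s>[INST]Human: Write a 200 words blog on the topic {blog_topic}
Assistant:[/INST]"""

    body = {
        "prompt": prompt,
        "max_gen_len": 512,     # assumed cap; the walkthrough sets a small limit here
        "temperature": 0.5,     # illustrative default, not specified in the video
        "top_p": 0.9,           # illustrative default, not specified in the video
    }

    try:
        bedrock = boto3.client(
            "bedrock-runtime",
            region_name="us-east-1",
            config=botocore.config.Config(read_timeout=300, retries={"max_attempts": 3}),
        )
        response = bedrock.invoke_model(
            body=json.dumps(body),
            modelId="meta.llama2-13b-chat-v1",
        )
        response_content = response.get("body").read()
        response_data = json.loads(response_content)
        # For the Llama 2 chat models the generated text is returned under "generation"
        return response_data["generation"]
    except Exception as e:
        print(f"Error generating the blog: {e}")
        return ""
```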
Now for the entry point. By default a Lambda function comes with a handler called lambda_handler(event, context), and that is what my API Gateway will hit whenever it sends a POST event, so I have to capture the event (and the context) and retrieve the query from it; the blog topic we send will arrive inside that event. I write event = json.loads(event["body"]), because whatever we send in the POST request is available under the body key, and then blog_topic = event["blog_topic"]; so in the request I will send a JSON body with a key called blog_topic, for example blog_topic = "machine learning". Once I have the topic, I call generate_blog = blog_generate_using_bedrock(blog_topic=blog_topic); this call is what produces the entire blog by invoking the Llama 2 model. Then, if generate_blog is non-empty, I want to save the blog into an S3 bucket as a .txt file so that I can access it whenever I want. For that I import datetime (from datetime import datetime), because I want each text file named after the current time: current_time = datetime.now().strftime(...) with hours, minutes, and seconds, and then the S3 key, i.e. how the file gets saved, is an f-string like blog-output/{current_time}.txt, so everything lands inside a blog-output folder. I also set the S3 bucket name; I'm going to create a bucket for this course, something like aws-bedrock-course1 (remember bucket names have to be globally unique). If no blog came back, the else branch just prints "No blog was generated". Next, with the key and bucket ready, I write another function, save_blog_details_s3, which takes the s3_key, the s3_bucket, and the generated blog. Inside it I create s3 = boto3.client("s3"), and inside a try/except I call s3.put_object with that bucket, that key, and Body set to the generated blog text; in short, I'm putting the entire generated blog into that bucket under that file name. Back in lambda_handler I call save_blog_details_s3 with those three arguments. Finally, the handler should return something so I can verify the execution: if there is no error I return a status code of 200 and a body of json.dumps("Blog generation is completed"), just to get a confirmation message.
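Continuing the sketch above, here is the S3 helper and the handler as described. The blog-output folder name and the bucket name are placeholders taken from the walkthrough; the bucket string must match the bucket you actually create later:

```python
import json
from datetime import datetime
import boto3

def save_blog_details_s3(s3_key, s3_bucket, generate_blog):
    s3 = boto3.client("s3")
    try:
        # Store the generated blog text under the timestamped key
        s3.put_object(Bucket=s3_bucket, Key=s3_key, Body=generate_blog)
        print("Blog saved to S3")
    except Exception as e:
        print(f"Error when saving the blog to S3: {e}")

def lambda_handler(event, context):
    # API Gateway (HTTP API, POST) delivers the request payload as a JSON string in event["body"]
    event = json.loads(event["body"])
    blog_topic = event["blog_topic"]

    generate_blog = blog_generate_using_bedrock(blog_topic=blog_topic)

    if generate_blog:
        current_time = datetime.now().strftime("%H%M%S")
        s3_key = f"blog-output/{current_time}.txt"
        s3_bucket = "aws-bedrock-course1"   # placeholder: use your own globally unique bucket name
        save_blog_details_s3(s3_key, s3_bucket, generate_blog)
    else:
        print("No blog was generated")

    return {
        "statusCode": 200,
        "body": json.dumps("Blog generation is completed"),
    }
```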
That covers everything the function needs. I keep the response variable from invoke_model and remove a stray variable I no longer need. Now I copy the whole thing and paste it into the Lambda code editor; writing it directly there would have been difficult, but pasting it in is easy. Then I click Deploy. Once it is deployed I can go and create my API Gateway, but there is one more important thing first: this Lambda function requires the recent version of boto3, and the boto3 version that is present in Lambda by default is an older one, from before Amazon Bedrock and its foundation models existed, so we need to update the package. Here is how, and this step will be helpful for you. I created a folder called boto3_layer, and inside it a folder named python, and inside that python folder I installed boto3; if you look inside it you will see boto3, botocore, and all the related packages. How do you install a library into a folder? You run pip install boto3 -t python/, which installs the entire boto3 library into that folder. Then I compress that folder into a zip file; you can name the zip anything, and here I named it boto3_layer.zip.
This is the general recipe for installing any package into Lambda: create a python folder, install the packages you want into it, for example pip install pandas -t python/ if you needed pandas (which is not available in Lambda by default), and then zip it up; the zip file can have any name, but inside it there must be a python folder containing the packages. So I'm going to use the boto3_layer.zip I created. Back in the Lambda function I scroll down to Layers and click Add a layer; a layer is exactly this mechanism for adding extra packages. You can choose AWS layers, custom layers, or specify a layer ARN. I'm going to use a custom layer, and you can also create your own: one layer already visible here is boto3_bedrock, which I created earlier. To create a new one, go back to Layers, click Create layer, give it a name such as boto3_updated_layer (I'll skip the description), and upload the zip file, so I browse to boto3_layer.zip and select it.
Then I choose the compatible runtimes, Python 3.12, 3.11, and 3.10, and click Create. Adding the layer to my Lambda function is just to make sure all the packages it uses are up to date, which is the main reason we are doing this. Creating the layer takes a little time depending on your internet speed. Once it is created, I go back to my function, awsappbedrock, click Add a layer, select Custom layers, pick boto3_updated_layer, choose version 1, and click Add; the layer now shows up on the function, so one layer has been added. My code is ready and my layer is ready; all that is left is to create the API Gateway so that it gets integrated with my Lambda function and can trigger it. So I open API Gateway and create a new API, using an HTTP API. It asks whether I want to add an integration and for an API name; I'll name it bedrockDemoAPI, then Review and create. Once it is created, I start adding routes: I create a POST route called /blog-generation. When configuring the route there are two options; one is an authorizer to protect the API against unauthorized requests, which I could attach if I wanted some kind of key for authorization, but right now I just want to show how the POST request gets handled, so I go straight to Attach integration. For the integration I choose Lambda function and select awsappbedrock, the function I created, and click Create. With that, the Lambda integration is done.
Now, one more important thing: whenever we do a bigger project there is a dev environment, a QA environment, a UAT environment, and a production environment, and you can model those with stages. Go to Stages; there is a $default stage, but I'll create my own stage named dev (I don't need any stage variables) and create it. Inside the dev stage you get an invoke URL, and that URL is what I have to call to reach my API, so I deploy the API to the dev stage; once the deployment is active, I copy that URL, because this is the API I have to use for interacting with my Lambda function, with /blog-generation (or whatever the route name is) appended to it. Going back to API Gateway you can see all the APIs I have created and that the integration with the Lambda function is in place; this is how the entire interaction takes place, and it is really more about writing the code and understanding each service. Clicking into the API, the POST /blog-generation route is there, and under Stages, dev gives me my API URL. Everything on that side is done and wired up to my Lambda. One thing I still have to do is create the S3 bucket, because in the code I gave a specific bucket name, so let me create a bucket with exactly that name, something like aws-bedrock-course1. Keep in mind that the name has to be unique across the region, effectively globally unique, so pick a name of your own and update the bucket name in the code to match, then deploy the Lambda function again. Now the S3 bucket is created, the code that saves the files is written, and everything is ready, so let's quickly test it; I hope everything works, and if I get any errors I'll definitely show you. I open Postman and create a new request, which should be a POST request.
The URL is the dev-stage invoke URL I already deployed, with /blog-generation appended. Back in Postman (you have to be patient here, because several things come together now), I set up the body: a raw JSON body with one key, and the key name has to match what the Lambda function expects. Opening the Lambda function again (it takes a moment to load, and remember you also have to have deployed the code in order to use it), I can see that the key inside the body is blog_topic, so in Postman I write blog_topic with the value "machine learning and generative AI". The whole payload arrives as the event, the event has a body field, and inside that body we read blog_topic. I click Send, and the response says "message not found" with a 404, so something is wrong. To debug, go to the Lambda function's Monitor tab and click View CloudWatch logs; all the logs end up in CloudWatch log groups. I'm not going to cut this part of the video, because I really want you to see the logs too. At first the log group does not even exist for this account, meaning the function was never invoked, so I hit the API once again: still blog_topic "message not found", status 404. I suspect the API Gateway: the route /blog-generation looks correct, the POST integration is saved, the dev stage exists and the deployment is active, so that all looks fine, yet the request still returns "message not found". The actual mistake turns out to be simple: when I pasted the URL into Postman, an extra space was added at the end, and Postman does not trim it, it keeps the space as part of the URL. So I remove the space and set the topic to generative AI, asking for the 200 words blog again.
I click Send, and this time the response says "Blog generation is completed". Let's verify that everything actually worked: I reload the log groups (again via Monitor, then View CloudWatch logs) and also check the S3 bucket. In the bucket nothing has been created yet, so there must be an error somewhere; looking at the log for awsappbedrock, it says "No blog was generated" and, for the generation step, an error occurred when calling the InvokeModel operation: the role is not authorized to perform bedrock:InvokeModel on the resource because no identity-based policy allows the bedrock:InvokeModel action. So the role that was created for the Lambda function does not have access to Bedrock, and we need to extend it. Go to the function's Configuration tab, select the execution role name, and add more permission policies to it; right now it only has the basic AWS Lambda permissions. I click Attach policies and, depending on how your IAM console is set up, you can grant exactly what you need; to keep it simple here I attach AdministratorAccess, but if you are working in a company you should grant narrower permissions scoped to the services you are using, such as just the Bedrock invoke permission. Once that policy is attached, I reload and confirm that the role now has the access. Back in Postman I hit the same request again; this time it takes noticeably more time, which is a good sign, because it means the model is actually generating the 200-word blog and the result is being stored as a .txt file in S3. Reloading CloudWatch, a new log stream has been created, and it contains the entire generated blog.
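Now that the endpoint works end to end, the same request can also be scripted instead of going through Postman; a minimal sketch (the invoke URL is a placeholder for the one shown on your dev stage, and the requests library is just one convenient HTTP client):

```python
import requests

# Placeholder: replace with the invoke URL of your HTTP API's dev stage plus the route
API_URL = "https://<api-id>.execute-api.us-east-1.amazonaws.com/dev/blog-generation"

payload = {"blog_topic": "machine learning and generative AI"}

resp = requests.post(API_URL, json=payload)
print(resp.status_code)  # 200 once permissions and deployment are in place
print(resp.text)         # "Blog generation is completed"
```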
Back in the CloudWatch log, the blog is titled "Generative AI: The Future of Creativity and Innovation", and the logs also show that the blog was saved to the S3 bucket, so everything is working. Now let's look in the S3 bucket: inside it there is the blog-output folder, and inside that the text file. I download it (when working purely with AWS services you would normally just read it from there) and open it, and here is the entire blog generated by the Meta Llama 2 13B model: "Generative AI is a revolutionary..." and so on. The main aim here was to make you understand the entire flow; in the upcoming videos the projects will be more advanced, because there we will also use things like vector databases, but all in all this is an amazing project, and these services are used step by step, so if you follow along you will be able to build it too. I hope you liked this video; that was it from my side. I'll see you in the next video. Have a great day, thank you all, take care, bye-bye.
Info
Channel: Krish Naik
Views: 13,860
Keywords: yt:cc=on, end to end genai in aws, aws cloud tutorials for llm, aws bedrock tutorials, amazon api gateway, geenrative ai tutorials, data science tutroials, krish naik rag tutorials
Id: 3OP39y4dO_Y
Length: 54min 35sec (3275 seconds)
Published: Fri May 10 2024