Amazon Bedrock Knowledge Bases: Build an e-Learning App with Knowledge Base, AWS Lambda, API Gateway, Claude FM

Captions
Welcome to another video on Amazon Bedrock and generative AI. In this video I'm going to show you how to build a simple serverless e-learning application using Amazon Bedrock Knowledge Bases, a Claude foundation model, an AWS Lambda function, and Amazon API Gateway. This video is part of my bestselling Udemy course on Amazon Bedrock, which you can check out from the link below. You can also check out some of the other videos in this YouTube playlist on how to build a chatbot, an HR Q&A app using RAG, text summarization, and more. Now let's go ahead and check what we're going to build as part of this use case.

In this section I'm going to show you how to build a serverless e-learning app using Knowledge Bases for Bedrock, a Claude foundation model, AWS Lambda, and Amazon API Gateway. Let's say you have a large organization that is trying to move its applications to AWS, and it wants to build an e-learning app that will help its employees understand more about AWS services. So we have an employee who's trying to learn more about AWS, and we're going to build this application on the AWS cloud. The first thing we're going to do is create an S3 bucket, where we'll load all the documents about the various AWS services the employee needs to learn to carry out the migration. Once we have created this S3 bucket, the next thing we'll do is create a knowledge base. A knowledge base helps you build RAG-based applications, and it carries out certain tasks behind the scenes: chunking the documents, creating vector embeddings, and storing them in a vector store. If you're not familiar with these concepts, please look at some of the previous videos I've created, where we build the HR Q&A app. Once we have created the knowledge base, we're going to use a Claude foundation model for this use case.

Now let's see how the entire flow works. We have an employee here, and he could pose a question to the e-learning app; say he asks which EBS volume he should use if he requires high IOPS. This question goes as a prompt to Amazon API Gateway, which passes it as an event to the AWS Lambda function. The Lambda function then invokes what is called the RetrieveAndGenerate API, provided by the knowledge base, along with the user's prompt, and generates a response from the knowledge base, which is the contextual information stored in the PDFs in the S3 bucket. That contextual information is passed to the Claude foundation model, and based on this context and the intelligence of the model, you get a response, which is passed back to the Lambda function; the Lambda function then sends it back to the user via API Gateway.

Now let's take a quick look at a demo of what we're going to build. As we discussed in the architecture section, the first thing we create is an S3 bucket. You can see here I've created an S3 bucket, udemy-knowledge-base, and I have uploaded a lot of different PDFs, such as the EBS FAQs, the EC2 FAQs, and some other documentation the employee might need to help him learn AWS. Next I created a knowledge base, which you can see here, udemy-bedrock-knowledge-base; it uses this S3 bucket as a data source. Then I created a Lambda function, udemy-knowledge-base, in which I'm using the RetrieveAndGenerate API to retrieve the context from the knowledge base, as sketched below.
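The Lambda code itself isn't shown on screen at this point, so here is a minimal sketch of what such a handler might look like, assuming the prompt arrives as an API Gateway query string parameter; the knowledge base ID and model ARN are hypothetical placeholders:

```python
import json
import boto3

# Hypothetical identifiers -- substitute your own knowledge base ID and model ARN.
KNOWLEDGE_BASE_ID = "XXXXXXXXXX"
MODEL_ARN = "arn:aws:bedrock:us-west-2::foundation-model/anthropic.claude-v2"

# The RetrieveAndGenerate API lives on the bedrock-agent-runtime client.
client = boto3.client("bedrock-agent-runtime", region_name="us-west-2")

def lambda_handler(event, context):
    # API Gateway delivers the "prompt" query string parameter in the event.
    prompt = event["queryStringParameters"]["prompt"]

    # Retrieve relevant chunks from the knowledge base and have the
    # Claude foundation model generate an answer grounded in them.
    response = client.retrieve_and_generate(
        input={"text": prompt},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                "modelArn": MODEL_ARN,
            },
        },
    )

    return {
        "statusCode": 200,
        "body": json.dumps({"answer": response["output"]["text"]}),
    }
```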
Next I created the REST API, e-learning-knowledge-base, using Amazon API Gateway. Now let's go ahead and test this out using Postman. I have the invoke URL here; let me just copy it. If you've not used Postman before, it's basically an API testing tool that you can download for free from the internet. I'll set the method to GET and paste the URL. Now I can provide the key, and our key is prompt. Let's ask it a question: which is the best AWS compute service for event-driven workloads? And let me do a send. I've got a response, and it says Elastic Container Service would be the best compute service for event-driven workloads, since it allows you to run containerized applications. So basically it's getting this context, probably from the ECS FAQ, based on the question we posed.

Now let's try another question: which EBS volume is good for high throughput? Let me do a send. I've got a response, and it says the Throughput Optimized HDD and Cold HDD volumes are good for high-throughput workloads, and then it gives me some more details. Again, the data source for this is probably this document here, the EBS FAQs.

What we can also do through this RetrieveAndGenerate API is get not only the answers but also citations of where each answer was derived from. I made some minor changes to the Lambda function, and now you can see it provides a more detailed answer along with the source from which the answer was derived. I'll keep the question the same and do a send. Now it's giving me a lot more information: here is the text we got earlier about Throughput Optimized and Cold HDD, but now you can see it's also giving the citation of where it derived this text from, including the URL. It seems to be the EC2 FAQs; if I just go here, it's basically this EC2 FAQs document. A sketch of how those citations can be pulled out of the API response follows below. So this is what we're going to build as part of this use case. I'll see you in the next lecture. Thank you.
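The exact changes made to the Lambda function aren't shown in the video, but the RetrieveAndGenerate response carries a citations list alongside the generated text, so a plausible sketch of returning both looks like this (the S3 key in the comment is a hypothetical example):

```python
def format_with_citations(response):
    """Return the generated answer plus the sources it was derived from."""
    sources = []
    for citation in response.get("citations", []):
        for ref in citation.get("retrievedReferences", []):
            sources.append({
                # The chunk of source text the answer was grounded in.
                "snippet": ref["content"]["text"],
                # e.g. s3://udemy-knowledge-base/ec2-faqs.pdf (hypothetical key)
                "uri": ref["location"]["s3Location"]["uri"],
            })
    return {"answer": response["output"]["text"], "sources": sources}
```

The handler from the earlier sketch would then return json.dumps(format_with_citations(response)) in the body instead of the bare answer text.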
Now let's go ahead and create the Amazon Bedrock knowledge base. Before we do that, there are a couple of things I want to tell you. First, the region I'm going to use for this section is US West (us-west-2), so if you're following along, please use this region, because some of the services are not supported in US East at this point in time. A few months down the line they might be available in all regions, but for now I'm going to use us-west-2. Second, you cannot create an Amazon Bedrock knowledge base if you have logged in as the root user, so you'll have to create an IAM user; you can see here I have created an IAM user, rahul-admin. If you don't know how to do that, let me quickly show you. In the search bar, search for IAM and go to the service, click Users on the left, then Create user; name it, say, demo-udemy, and click Next. You can either add it to a group or attach the policy directly; I'll click Attach policies directly, give it administrator access, click Next, and then Create user. Now you can see I have this demo-udemy user. Click on it, go to Security credentials, and click Enable console access; select Enable and do an Apply, and you can see it has given you a console sign-in link. Copy it, or download the CSV file; those three data points (the sign-in URL, username, and password) are all in the CSV. Then sign out and log back in through that console sign-in URL with the username and password. Once you have done that, you can go ahead and create the knowledge base.

So, as I already said, I have logged in as an IAM user, not the root user. Let me go to the Amazon Bedrock service. I've also created a step-by-step guide on how to build this e-learning application. The first step is identifying the data source: we have to create an S3 bucket and upload all the relevant files we want the user to be able to access as part of this e-learning platform. I'm already in Bedrock, so let me open a duplicate tab and search for S3. On the S3 service, let me create a bucket; you can see I have to set the region to us-west-2 and give the bucket a name, say udemy-knowledge-base. I initially put an underscore in the name, but it flags that as an invalid character, so I remove it and do a Create bucket. My S3 bucket has now been created. For this use case I've downloaded AWS documents such as the Amazon EC2 FAQs, ECS FAQs, Lambda documentation, and other documents we'll use as our internal data source. So let me go back to S3, search for the udemy-knowledge-base bucket I just created, click Upload, then Add files, select all the files from my desktop folder, and do an Upload. You can see it's uploading all these documents; now we're done. The same bucket-and-upload steps done programmatically are sketched below.
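A minimal sketch of the same bucket creation and upload in boto3; the bucket and file names here are hypothetical stand-ins for the ones used in the video:

```python
import boto3

s3 = boto3.client("s3", region_name="us-west-2")

# Bucket names allow only lowercase letters, numbers, dots, and hyphens,
# which is why the underscore attempted in the console was rejected.
bucket = "udemy-knowledge-base"  # hypothetical; bucket names are globally unique

# Outside us-east-1, the region must be given as a location constraint.
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)

# Upload the FAQ PDFs that will serve as the knowledge base's data source.
for doc in ["ec2-faqs.pdf", "ecs-faqs.pdf", "ebs-faqs.pdf", "lambda-faqs.pdf"]:
    s3.upload_file(doc, bucket, doc)
```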
Now let me go back to Amazon Bedrock, click the three lines, and under Orchestration on the left you can see Knowledge base; I'll click on that. Before I create this knowledge base, if you are following along, I want to warn you that there is some cost involved, because we're going to create an Amazon OpenSearch Serverless vector store, which costs about 50 cents per hour, plus small costs for S3, RDS, etc. So for every hour the knowledge base and OpenSearch are running, you can assume an hourly charge of roughly 50 cents to a dollar. Now I click Create knowledge base and give it a name; let me call it udemy-bedrock-knowledge-base, with a description like "e-learning use case". You also have to set an IAM role: you can either create a new service role or use an existing one, and I'll create and use a new service role, leaving it as is. Then there are tags, which I'm not going to set, so let's click Next.

The first step, as we discussed earlier, is the data ingestion pipeline, which is essentially what a knowledge base is: you first have to choose an S3 location. I'll click Browse S3 and select the udemy-knowledge-base bucket we created. Once we have selected the data source, the next step is to select the data transformation, or chunking, strategy. If I click on Advanced settings, you can see one option is for the KMS key, but there's also the chunking strategy, which controls how the text is broken down into smaller segments before embedding: you can use default chunking, which is about 300 tokens, or select fixed-size chunking or no chunking. We'll go with the default, and I'll click Next. The third step is to pass these data chunks through an embeddings model; right now it supports Titan Embeddings from Amazon, Embed English from Cohere, and Embed Multilingual from Cohere. I'll go with Titan Embeddings. Finally, you have to select the vector database. You can use the default option, which creates an Amazon OpenSearch Serverless vector store (the roughly 50-cents-per-hour serverless store I mentioned), or choose your own: if I click here, you can see Amazon Aurora, Redis Enterprise Cloud, Pinecone, etc. I'll go with the default, so it will create the vector store for me; then I click Next, review everything, and click Create knowledge base. It's going to take about four to five minutes, so I'll pause the video for now.

I waited about five minutes, and the knowledge base is now ready. One important thing to do once it shows the knowledge base was created successfully is the sync: just click Sync, and it's again going to take a few minutes. After a few minutes you can see it's showing "Sync completed" for the data source. The easiest way to access the knowledge base is from the console itself, and since this is a RAG-based solution, as we discussed earlier, we also have to select the foundation model to use. On the right I can click Select model; if you are accessing it through the UI, for now it only supports Anthropic, so you can see the model provider is Anthropic. I select a model type and click Apply. Now I can ask it some questions, and through the combination of this Claude foundation model and the data in our knowledge base, it's going to be able to respond.

Let me ask it a question: which EBS volume in AWS can I use for high throughput? Remember, we uploaded the EBS FAQ to the S3 bucket; let's see the response. You can see "retrieving and generating response", and it says the Throughput Optimized HDD and Cold HDD EBS volume types can be used for high-throughput workloads, which is correct. Now let's ask another question: which EBS volume for high IOPS? It says Provisioned IOPS SSD (io1 and io2) volumes are designed for workloads that require high IOPS, like transactional or database workloads. It will also show you, via Show source details, where it got this answer: under Source details you can see source chunk 1 and source chunk 2, telling you where the response came from. The same chunks can be fetched programmatically, as sketched below.
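For completeness, the console test presumably uses RetrieveAndGenerate behind the scenes, while the lower-level Retrieve API returns just the matching source chunks, much like the Source details panel. A minimal sketch, assuming a hypothetical knowledge base ID:

```python
import boto3

client = boto3.client("bedrock-agent-runtime", region_name="us-west-2")

# Fetch the raw chunks that match a query, without generating an answer.
response = client.retrieve(
    knowledgeBaseId="XXXXXXXXXX",  # hypothetical knowledge base ID
    retrievalQuery={"text": "Which EBS volume for high IOPS?"},
)

for result in response["retrievalResults"]:
    # Each result carries the chunk text and its S3 source location.
    print(result["location"]["s3Location"]["uri"])
    print(result["content"]["text"][:200])
```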
If I just click on one of these chunks, it shows the source of this response: it is the Amazon EC2 FAQs document that we had uploaded as our source data. I hope this is useful. I'll see you in the next lecture. Thank you.
Info
Channel: GenAI with Rahul Trisal
Views: 3,039
Keywords: aws bedrock, aws bedrock tutorial, aws bedrock ai, aws bedrock workshop, aws generative ai, aws bedrock mini project, aws bedrock hands-on, aws GenAI, amazon bedrock, amazon bedrock tutorial, aws, lambda apigateway, amazon bedrock ai, aws lambda, knowledge base bedrock, aws knowledge base, bedrock aws tutorial, bedrock knowledge base, aws serverless project, aws mini project, lambda api gateway bedrock, aws bedrock knowledge base, amazon bedrock knowledge base
Id: wN3wmbqLTX8
Length: 17min 30sec (1050 seconds)
Published: Fri Mar 15 2024