Must Frameworks To Learn In Generative AI #langchain #llamaindex #chainlit #awsbedrock

Video Statistics and Information

Captions
Hello all, my name is Krish Naik, and welcome to my YouTube channel. So guys, many of you are working in the generative AI field and learning multiple things: you have probably seen tons of LLM models, you have seen fine-tuning techniques, you have developed document Q&A applications, and you have created multiple applications using different APIs — say the OpenAI API, or LLM models like Google Gemini Pro, Mistral, and Llama 2. But one of the most common questions I keep getting is about the different generative AI frameworks. Some of the frameworks, to name a few, are LangChain, LlamaIndex, Chainlit, and Hugging Face, and people have a lot of confusion about which framework they should go ahead with, because some of the functionality in these frameworks overlaps. Considering this, in this video we are going to talk about the various frameworks, and I am also going to show you the documentation pages of these frameworks, which will help you decide which one to use when you are given a specific problem statement. So please make sure you watch this video till the end, because if you are really interested in understanding generative AI from its core — and tomorrow, if you are working in this industry and are given a problem statement — this is the first thing to decide: how you are going to implement the entire project, which libraries will be required, and which framework is better to use.

Let me share my screen. I have written down some of the generative AI frameworks here; again, there are more frameworks out there — Hugging Face is one as well — but let's consider these as some of the important frameworks that many people use. Before going ahead, let's talk about some of the companies behind the LLM models. There are various LLM models, and the companies behind them are in an amazing race: there is OpenAI, there is Google, there are LLM models from Meta (Facebook), there is Mistral, and there are many more — let's also consider Claude 2. All these companies are racing to create the best LLM model. There are also models like Stable Diffusion doing text-to-image, and OpenAI has recently come up with the Sora model, which is text-to-video. So every company is in this race to create the best LLM model, where based on some metrics it can be decided that, fine, this is the best model for most tasks.

Now, to use all these models more efficiently, we definitely require frameworks. It's not that I cannot use the OpenAI API directly — I definitely can. If I consider the OpenAI API, or any of these LLM providers' APIs, the core functionality is very simple: you send a request and you get a response. For example, I might say, "take this paragraph and summarize the entire text in 100 words." I send a request with that prompt, and as soon as the request, along with the text, reaches the LLM model, I get back a response summarizing the information.
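As a concrete illustration of that request/response pattern, here is a minimal sketch using the official openai Python client to summarize a paragraph. The model name and the sample text are placeholder assumptions, not anything specified in the video.

```python
# pip install openai  -- a minimal request/response sketch; assumes OPENAI_API_KEY is set
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

paragraph = "Generative AI frameworks such as LangChain and LlamaIndex ..."  # placeholder text

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name; any chat-capable model works
    messages=[
        {"role": "user",
         "content": f"Take this paragraph and summarize it in 100 words:\n\n{paragraph}"}
    ],
)

print(response.choices[0].message.content)  # the summary comes back in the response
```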
Similarly, for all the models we have: Google has Gemini Pro (Gemini Pro 1.5 has also come now — I'll soon make a video on that, and on Sora as well, since a lot of new technical terms are coming); from Meta you have Llama 2, and soon Llama 3 will also come; from Mistral you have the Mistral 7B model; and you have Claude. All these models are designed so they can perform most of the different NLP use-case tasks, and those tasks are very important for creating applications like chatbots — a chatbot can also just be one module inside a bigger application — so there will definitely be many different applications.

Now understand that in a real-world use case, just sending a request to the LLM and getting the response is not the whole task when we consider a large application. A data pipeline will be there, and we may also have to fine-tune these models. I have already started a playlist on fine-tuning, and soon I'll be covering techniques like LoRA and QLoRA — these are important mathematical concepts you really need to understand; right now I have just completed quantization, so give me some time, because it is better that I create a deep-dive video on each of these. So the LLM model provides the request/response functionality, but as I said, when you are creating a bigger application there are a lot of different dependencies: fine-tuning, building applications like document Q&A, and applications that depend on third-party tools like databases. At that point you require frameworks. You cannot rely only on the OpenAI or Google Gemini models, because they are just responsible for taking a request and giving you a response. When you have those kinds of dependencies, these frameworks come in very handy.

There is a saying: if LLM models are gold — let's say we have found gold in a mountain — then everybody will start digging that mountain, and for digging it you require tools and hammers. Those tools and hammers are like these frameworks. Whenever we use frameworks like LangChain, LlamaIndex, Chainlit, or Hugging Face, we are using them for multiple purposes. Take the example of LlamaIndex: if I want to build a document Q&A application, or if my dataset is present in some external data source, I can use LlamaIndex in between to perform tasks like data ingestion, because it has multiple tools for that. In the case of data ingestion, let's say I want to read a file — CSV, PDF, plain text, anything — and this same functionality will also be available in LangChain.
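To make that concrete, here is a minimal sketch of data ingestion plus document Q&A with LlamaIndex (0.10+ style imports). The `data/` folder path and the question are placeholder assumptions, and an OpenAI key is assumed only because it is the library's default for embeddings and the LLM.

```python
# pip install llama-index  -- a minimal document Q&A sketch; assumes OPENAI_API_KEY is set
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Data ingestion: load PDFs, text files, CSVs, etc. from a local folder (placeholder path)
documents = SimpleDirectoryReader("data").load_data()

# Build a vector store index over the documents (embeddings are created under the hood)
index = VectorStoreIndex.from_documents(documents)

# Ask a question against the ingested documents
query_engine = index.as_query_engine()
answer = query_engine.query("Summarize the main points of these documents.")  # placeholder question
print(answer)
```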
Now, after taking this dataset, suppose I want to perform vector embeddings. For vector embeddings I have different vector stores or vector databases: I can use Chroma DB, I can use FAISS, I can use Cassandra — different databases. For connecting to these databases there are different techniques, and different APIs are available inside the framework. So for each task that comes up in most applications involving LLM models, these extra capabilities are very helpful when we use this kind of tool or framework. This is completely different from the LLM itself: these frameworks actually help us build the generative AI application that is required, in the bigger picture, to solve the use case we are trying to solve. Along with this there are other pieces too — you will have agents, tools, and other dependencies available while you work with these frameworks.

Let's take an example: I want to create something called text-to-SQL, as shown in the sketch below. I give my text — say, "tell me, in the data science class, which person has the highest marks." This text needs to be converted into a SQL query. That SQL query will then interact with one of our databases, get the response, and based on that response I will be able to see the result in my front end. Who creates this query? The query is created by the LLM model. To get the LLM model to create this query there are again different dependencies, because I need to set up my prompt, and if the LLM model is not able to handle most of the queries, then I need to fine-tune the LLM itself. For that fine-tuning I need to create my dataset, and for that dataset I may use different tools — data annotation, labeling, and more. Once we create the dataset, we need to use it along with the LLM model; if it is in CSV format, I can go ahead and use LangChain, or I can use LlamaIndex. So multiple frameworks can be used. At the end of the day, think of the larger picture: the LLM alone gives you only the request/response task and the fine-tuning — but who will do all the other tasks? There are multiple things over there.
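As a rough sketch of that text-to-SQL flow, here is one way it could look with LangChain's `create_sql_query_chain`. The SQLite file, its table contents, and the exact question are placeholder assumptions used only for illustration.

```python
# pip install langchain langchain-openai langchain-community  -- a text-to-SQL sketch
from langchain_openai import ChatOpenAI
from langchain_community.utilities import SQLDatabase
from langchain.chains import create_sql_query_chain

# Placeholder database: assumed to contain a table of students and their marks
db = SQLDatabase.from_uri("sqlite:///data_science_class.db")

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)  # placeholder model

# The LLM writes the SQL query from natural language, using the database schema as context
write_query = create_sql_query_chain(llm, db)
sql = write_query.invoke(
    {"question": "Which student has the highest marks in the data science class?"}
)

print(sql)          # the generated SQL query
print(db.run(sql))  # execute it against the database and show the result
```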
Now let me talk about one more thing: there is also a framework called Amazon Bedrock — I don't know whether you have seen my tutorials or not, but I have already created videos on it. You need to understand what Amazon Bedrock is basically solving. Amazon Bedrock says: "I will provide you all the LLM models — there are multiple LLM models which I have — and you pay as you go." You don't even have to worry about deployment; the models are hosted in the cloud, and all these LLM models are given to you in the form of an API. If it is given in the form of an API, you just need to send the request and you get the response. Now this is super amazing with respect to LLM models: here is your API, you send the request, you get back the response, and you don't have to worry about scalability or deployment. But again, this fixes only the LLM request/response part. It is not fixing the other parts — say I want to upload a dataset; Bedrock will also help you with fine-tuning, but with respect to the other things like data preprocessing, data ingestion, and data transformation, it is not helping you out. So for that part I can still use these kinds of frameworks. I hope you now have an idea of the importance of a framework compared with the LLM model itself.
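To show what that managed, API-only request/response looks like, here is a minimal sketch using boto3's `bedrock-runtime` client. The region, model ID, and request body are placeholder assumptions (each Bedrock model expects its own body schema), so treat this only as the shape of the call.

```python
# pip install boto3  -- a minimal Amazon Bedrock request/response sketch (AWS credentials assumed)
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

# Placeholder body; the exact schema depends on which Bedrock model you pick
body = json.dumps({
    "inputText": "Summarize the benefits of a managed LLM service in two sentences."
})

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # placeholder model ID
    body=body,
    contentType="application/json",
    accept="application/json",
)

print(json.loads(response["body"].read()))  # the model's response payload
```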
Now let's discuss some of the important frameworks we have, starting with LangChain. The best thing about LangChain is that it is probably the most impressive framework that has come up here. If you open the documentation, there are multiple things in LangChain. Initially only the core library was there; now they have also come up with templates, LangServe (chains as REST APIs), and LangSmith. LangSmith is basically for deployment: it is a platform for building production-grade LLM applications. Amazon Bedrock already does part of this, but if you want to use all these functionalities and deploy everything in one place, LangSmith helps you build the entire production-grade LLM application — it lets you test, evaluate, and monitor chains. In short, LangSmith is taking the further step: just as machine learning has MLOps, this takes up LLMOps, the next step. These frameworks will keep moving in that direction, because they really need to focus on the pipeline — how each component is connected when you develop an application — and that is the future; right now MLOps exists for machine learning projects.

As an example, if I look at Model I/O in LangChain, you can see what it provides: concepts, prompts, different types of prompts, different types of LLMs — everything is there. The best thing about this framework is that LangChain supports multiple LLM models (LlamaIndex also supports multiple LLM models): you can work with Google Gemini Pro, you can work with open-source LLM models like Llama 2, and you can work with OpenAI — the option has been kept completely open.

If I talk in terms of retrieval, these are the main steps that happen whenever we have some external dataset: we have the source, we load, we transform, we embed, we create a vector store, and then we retrieve — and this retrieval is again based on the cosine-similarity concept; see the sketch after this paragraph. Here you have techniques like document loaders — look at how many different ways of loading documents there are (LlamaIndex may have even more): CSV, file directory, HTML, JSON, and Markdown are all there. Then you have text embedding models — how you create these vector stores — and the retrievers: vector-store-backed retriever, ensemble retriever, long-context reorder, multi-vector retriever, parent document retriever, self-querying — everything is there. With respect to indexing, different kinds of indexing are there too. Now you may be thinking, "Krish, which one should I use?" That depends completely on your use case, and that is where more of your time will go: the framework has already been built; selecting the right piece is the most important part.
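Here is a rough sketch of that load → transform → embed → store → retrieve flow using LangChain with a Chroma vector store. The file path, chunk sizes, and query are placeholder assumptions, and OpenAI embeddings are assumed only because they are the simplest default to show.

```python
# pip install langchain langchain-openai langchain-community langchain-text-splitters chromadb
from langchain_community.document_loaders import TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import Chroma

# Load: read a plain-text file (placeholder path)
docs = TextLoader("notes.txt").load()

# Transform: split the document into overlapping chunks
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

# Embed + store: create embeddings and keep them in a Chroma vector store
vectorstore = Chroma.from_documents(chunks, OpenAIEmbeddings())

# Retrieve: fetch the chunks most similar to the query (cosine similarity under the hood)
retriever = vectorstore.as_retriever(search_kwargs={"k": 3})
for doc in retriever.invoke("What does the document say about vector stores?"):  # placeholder query
    print(doc.page_content[:100])
```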
Along with this, you can also see that we now have LangServe. LangServe helps developers deploy LangChain runnables and chains as REST APIs — the same kind of thing I have already shown you with Amazon Bedrock, and yesterday I also showed you a platform, Vertex AI, which helps you create the entire pipeline and then exposes it in the form of an API. LangServe will be able to do the same thing; it is coming soon, and I'll create a video once it is out. Then you have LangSmith, which helps you evaluate your language models and intelligent agents and move from prototype to production.

Similarly, you have LlamaIndex — again, it has many different functionalities, and the best part is how you can combine all these frameworks to create the right LLM application for your generative AI use case. Then you have Chainlit, which also has integrations with LangChain and others. The main purpose of Chainlit is that it is an open-source Python package to bring production-ready conversational AI. At the end of the day, you look for the best features each framework has — it is better to know multiple tools. Tomorrow in a company, if six months are being taken to complete a project, the first one to two months of discussion will be about what to use and when — which techniques will give you the most efficient generative AI application. Here you can see Chainlit helps you build fast and has Copilot and data-persistence functionality — every framework has some things it is particularly good at. It has support for LangChain, AutoGen, the OpenAI Assistants API, and LlamaIndex, and tomorrow it may add Google Gemini Pro and more. It also gives you a custom front end — you don't even have to write separate front-end code; I think it provides a custom React front end, so with the help of React JS you will be able to build it. Then, if you are more into fine-tuning techniques, Hugging Face is there: in Hugging Face you have lots of different open-source models you can call, and even paid models, and it also provides the kinds of features LangChain has — whatever deployment techniques you want, or any LLM pipeline you want to create.

So at the end of the day, it is about using the best tool for the right purpose. There will be multiple frameworks created over time; try to identify the importance of each framework and use it accordingly — that is the key to developing any LLM application. So yes, that was it from my side. I hope you liked this video. Please hit like, share it with all your friends, and subscribe to my channel. I'll see you in the next video. Have a great day. Thank you, take care, bye-bye.
Info
Channel: Krish Naik
Views: 29,064
Keywords: yt:cc=on, aws bedrock tutorials, llamaindex tutorials, langchain tutorials, chainlit tutorials, generative ai tutorials, krish naik generative ai
Id: nQV5_Eb-mhk
Length: 17min 42sec (1062 seconds)
Published: Tue Feb 20 2024