Semantic Kernel in Azure Prompt Flow For Your Plugins

Video Statistics and Information

Captions
[Music] Oh, that's good, you're going to crush my brain. What are you doing? I need to massage my brain. I just found another open-source library for creating LLM applications, called Semantic Kernel, and I'm confused, very confused. It's pretty easy. You see this USB charger that connects the power of electricity to charge all these different devices? That is exactly what Semantic Kernel does: it connects the power of your large language model, let's say a GPT model, to your own functions and code, your plugins, so it can execute them. For example, you have a plugin to write an email, a plugin to play music, or anything else. Semantic Kernel is an end-to-end orchestrator that lets your LLM application execute tasks, or better to say, create your own copilot. But wait, we do have Prompt Flow in Azure ML, and I can create LLM applications there too, so how do these two get along with each other? Well, you can actually code Semantic Kernel inside Prompt Flow. Would you like me to show you a demo? I can do that easily. What? Oh, I would love to see a demo. You know what, I need a copilot for massages; can you try my back a little bit, just here? Let's go. Before we start, make sure you subscribe and hit the bell icon so you get notified about the next video. Thank you.

All right, before we get into Semantic Kernel with Prompt Flow, let's start by talking about what Semantic Kernel is. This is not going to be a dedicated video just about Semantic Kernel, because I believe Semantic Kernel by itself deserves multiple dedicated videos. So if you're interested in more details about Semantic Kernel, I would love to have your comments; write them in the comment section, and if there is enough interest I'll do my best to dedicate more videos just to Semantic Kernel. For now, we need a quick intro. Long story short, it is something similar to LangChain, and I have created multiple videos about LangChain. Semantic Kernel is another open-source SDK, mainly initiated by Microsoft but pretty active in the open-source community. You can use it to orchestrate the connection between your model, for example a ChatGPT or GPT-4 model, and your own code in programming languages like Python or C#. Let's say you have created a plugin. What is a plugin? A plugin is just a bunch of functions, for example a bunch of Python functions, that do something: they calculate math, they write an email, or they call another API. Those are action items that you have defined as code, in Python or C#, and now you need to call and execute them based on the inputs, or the order, that your GPT model, the OpenAI model, generates. So you need an orchestrator that says: okay, I see GPT-4 told me to write an email, so I have to call the APIs, code, or functions that belong to the email-writing plugin, as an example. That is the job of Semantic Kernel. Again, this is a very high-level definition; there are many more capabilities we could talk about. This is the official documentation of Semantic Kernel, which I went through, and it mentions a couple of other capabilities, but simply put, it is something like LangChain, as I said, and it's becoming pretty popular. It is similar to what you have seen with Microsoft Copilots: they are connected to plugins and they execute tasks. So it is like creating your own copilot, and Semantic Kernel is the piece that talks to all those APIs, functions, plugins, and so on.
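To make the plugin idea concrete, here is a minimal sketch of a native plugin in Python. It is not code from the video's repo; it assumes the 2023-era 0.x semantic-kernel Python package, where the decorator is called sk_function and registration is import_skill (newer releases renamed these to kernel_function and add_plugin), so treat the names as illustrative and adjust to your installed version.

```python
# Minimal sketch of a Semantic Kernel "plugin": just a class of plain Python
# functions the kernel can invoke. Names follow the 2023-era 0.x Python API
# (sk_function, import_skill); newer releases use kernel_function / add_plugin.
import math

import semantic_kernel as sk
from semantic_kernel.skill_definition import sk_function


class MathPlugin:
    """Native functions that do real arithmetic instead of asking the LLM."""

    @sk_function(description="Takes the square root of a number", name="Sqrt")
    def square_root(self, number: str) -> str:
        # Native functions in the 0.x API exchange strings with the kernel.
        return str(math.sqrt(float(number)))

    @sk_function(description="Multiplies two numbers given as 'a*b'", name="Multiply")
    def multiply(self, expression: str) -> str:
        # Kept to a single string input to stay within the simple 0.x signature.
        a, b = (float(x) for x in expression.split("*"))
        return str(a * b)


kernel = sk.Kernel()
# Register the plugin so the model, via a planner, can route math requests to it.
kernel.import_skill(MathPlugin(), skill_name="MathPlugin")
```

The description strings matter: they are what the model reads, through the planner, when it decides which function to call.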
Now, why do we need Prompt Flow with Semantic Kernel? Again, what is Prompt Flow? I dedicated a video just to Prompt Flow and what it is: it's a new service in Azure ML that enables you to create large language model applications. It gives you code plus a low-code visualization of the inputs and steps you have in your LLM application. For example, you receive an input for your chatbot, you call Azure OpenAI, you call some functions, and then you have the output of that chat. It's a very nice and recent tool. Again, this video is not about Prompt Flow; if you don't know Prompt Flow, I definitely recommend you check the video I created about it before continuing with this one. I will add that video to the top right of the screen here and also to the video description.

So now we have Prompt Flow as a tool to create LLM applications, and we have Semantic Kernel to orchestrate the connection between my foundation models, or large language models, and my programming languages like Python. But then why do I need to combine these two? Or better to say, what is the value proposition of using Prompt Flow with Semantic Kernel? Well, Semantic Kernel is an open-source library, it's code, and Prompt Flow is a tool, so Semantic Kernel, as code, can be used inside this tool. But again, what is the value? As mentioned here, the main value is that Prompt Flow is a tool that goes beyond just developing the application: it's about evaluating the application, deploying the application, and monitoring the application. So when you have built a copilot or a chatbot that uses Semantic Kernel inside, you need to evaluate this solution before you deploy it to production. Maybe you have to ask thousands of questions, as a sample dataset, of this solution and then evaluate the results, to make sure Semantic Kernel, the Azure OpenAI models, everything, is working properly. And when you're done, you need to deploy it, let's say as an endpoint, like you would for a machine learning model: you create a real-time endpoint and an API out of it so people can use your solution. For these capabilities Prompt Flow shines, and that's why we add Prompt Flow as extra salt and pepper on top of Semantic Kernel. And this is what we're going to do.

So what are we going to do in this video? I'm going to create a prompt flow that receives a question as input from the user; it's a math question. Then I have Semantic Kernel, which is going to call a function that does the math calculation. It uses Python for doing the math, not OpenAI, and then the result goes back to the user. So: I receive the question, the math question from the user; I call the Azure OpenAI model to interpret that question, to recognize that it's a math question and understand what the request is; and based on the OpenAI model's output, it is then capable of calling a math function that I define, something like a plugin in Semantic Kernel, calculating the answer, and getting the result back to the OpenAI model, or to the user. And then we're going to do a batch evaluation, testing this solution over multiple math questions; this is a capability of Prompt Flow, and then we'll see the aggregated results.

Okay, so let's do that. By the way, when I showed you Prompt Flow in the other video, you saw that I opened Azure ML, but this time the nice thing is, let me show you this, there is a new Prompt Flow extension for VS Code. That means I can have a similar experience to the one I had back then with Azure ML Prompt Flow, but this time in VS Code: I can even test it locally and then deploy it to the cloud.
This is great: people who love VS Code will love this extension if you're developing LLM applications with Prompt Flow. So let me tell you where to start in order to see what I see here. First of all, clone the repository that I have created. Well, I didn't develop that repo from scratch; I cloned it from the Semantic Kernel documentation GitHub and also from the Prompt Flow GitHub, then completed the tasks and merged them together to make it clean and ready for you. You just need to clone it, you don't need to build it from scratch, but I'm going to show you how I did it from scratch. So there is that repo; I will add my repo link to the Discord channel and to the video description, in the references section. Click on it and you will see the GitHub repo that you can clone. Let me show you what I'm talking about: this is it, the Semantic Kernel + Prompt Flow repo under my GitHub account. Again, I already have it cloned on my desktop, and I will show it to you in VS Code and go through the files to see what they are. I just wanted to show you first that this is the repo I'm talking about, the one I will add to the Discord channel.

Okay, going back to VS Code. The first thing you need to do is install Prompt Flow: you run pip install promptflow. By the way, as you can see, I am inside the folder of the repo I cloned, so make sure you go to the path where you cloned the repo I showed you, and then run pip install promptflow. I'm not going to install it again because I already did. After you're done, install another one called promptflow-tools. So, two libraries: promptflow and then promptflow-tools. When you're done, also make sure you have installed the Prompt Flow VS Code extension. How? Just go to the Extensions view, type "prompt flow", and you will find it. I have already installed it, and as you can see I have this nice Prompt Flow icon added here, and the Prompt Flow icon added here too, which is great. To make sure everything is installed successfully, you just need to type pf, which stands for Prompt Flow, and -v, which is version. Let's run it to make sure I have it installed on my machine. There you go, it's already there; if you see something like this, you're good to go.

Now, to create your first flow: as you can see, this is my flow that calculates the math and so on; I'm going to come to that shortly, but let's forget about it for now. And by the way, you can see I have cloned the repo I showed you, and these are all the files. To create a flow in Prompt Flow, you don't need to code everything from scratch, because behind this flow there is code that I'm going to show you. The nice thing is that if you just type pf, then flow, then init, and tell it you want to initialize a flow with a name, for example perform_math, which is what our flow is going to do with Semantic Kernel, it will create a bunch of boilerplate code, sort of a blueprint for an empty, raw flow that you can start modifying. When I executed that, that's how I created all the files you can see here under perform_math. I'm going to show you what these files are, but that's how I created them. Because you have already cloned my repo, you don't need to execute this command, because I have already run it.
So you don't need to run it again; I just wanted to show you how I created these files. After I ran that command, these files were generated, so let's see what they are. The main one is flow.dag.yaml: that is the code behind the flow I showed you. My flow has an input, an output, and just one Python node, which is the math planner. The nice thing is that I can change my flow in code here, or click on the visual editor for that nice drag-and-drop-style experience you saw earlier. So what you saw is the YAML file behind this flow: what the input is, what the output is, all in the YAML file. There are a bunch of other files too; for example, under .promptflow there is a flow-details file, yeah, this one, which holds some metadata about each node of the flow we have here, plus a bunch of source files; I actually had to delete some extra ones. And there is requirements.txt: it lists the libraries you need to run this flow; here we just need promptflow, and that's it.

So now I have my initial flow created. But again, when I created it with the command line I showed you, I had nothing useful here; well, there was a hello.py example, and I had to delete it: I clicked on it, clicked delete, and then I created this node myself. How? You come all the way up here, click on the plus sign, say you're going to have a Python node, click on Python, and then this appears. What I did was name it math_planner, and math_planner.py was created for me. Then, of course, I had to change the code inside math_planner.py; I took it from the Semantic Kernel reference repository. So what does it do?

Let me explain what this math planner, which we're going to call, actually does. In Semantic Kernel, when you have a plugin, and we do have one, we'll call it the math plugin, which calculates math as I told you, you need a planner, because the planner decides whether to call that math plugin or not. It plans for me; that's what Semantic Kernel does. And because we're going to call this as a tool inside Prompt Flow, we need to use this decorator here to indicate that this is a function Prompt Flow is going to call. You can see it imports semantic_kernel to connect to Azure OpenAI: if it's a chat completion deployment, it uses this class for connectivity; if it's a text completion deployment, it uses this other one to connect to Azure OpenAI. How does it know my API key and API base for connecting to my Azure OpenAI? I'm going to talk about that shortly. So we have the planner defined, but we also have to give it a description to let the model know what this planner is going to do, or how it's going to call the math plugin. I'm saying: hey, I have a math plugin (I'll show you later what it is; it's actually here), and you need to use it to solve the problem given by the input. That's it. So what happens afterwards? If the planner decides it needs the math plugin, because it's a math question, it goes under the plugin folder; that's how it actually imports my math function: under plugin there is a math.py file.
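As a rough illustration of that wiring, here is a hedged sketch of what a Prompt-Flow-callable planner node could look like. It is not the exact math_planner.py from the repo: the plugin import path is hypothetical, and the Semantic Kernel calls (add_chat_service, import_skill, BasicPlanner with create_plan_async / execute_plan_async) follow the 2023-era 0.x Python API, so verify them against the version you actually install.

```python
# Hedged sketch of a Prompt Flow tool that hands a math question to a
# Semantic Kernel planner. API names follow the 2023-era 0.x semantic-kernel
# Python package and may differ in newer releases.
import asyncio

import semantic_kernel as sk
from promptflow import tool
from promptflow.connections import AzureOpenAIConnection
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.planning.basic_planner import BasicPlanner

from plugins.math_plugin.math import MathPlugin  # hypothetical path to the native math plugin


@tool  # marks this function as a node that Prompt Flow can call
def math_planner(question: str, connection: AzureOpenAIConnection, deployment_name: str) -> str:
    kernel = sk.Kernel()
    # The AzureOpenAIConnection carries api_base / api_key from the connection
    # you create in the Prompt Flow extension, so no secrets live in the code.
    kernel.add_chat_service(
        "chat",
        AzureChatCompletion(deployment_name, connection.api_base, connection.api_key),
    )
    # Register the native math plugin so the planner can route math questions to it.
    kernel.import_skill(MathPlugin(), skill_name="MathPlugin")

    async def run() -> str:
        planner = BasicPlanner()
        plan = await planner.create_plan_async(question, kernel)  # LLM decides which functions to call
        result = await planner.execute_plan_async(plan, kernel)   # the chosen Python functions run locally
        return str(result)

    return asyncio.run(run())
```

The flow.dag.yaml then maps the flow's question input and the Azure OpenAI connection onto these parameters, which is why the node shows them as inputs in the visual editor.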
Now, about that math.py: look at this, instead of OpenAI solving the math problem, because as you know these large language models are not good at calculation and math, I have it done by Python. We need to describe what each function does, to let the planner decide and let OpenAI actually help resolve the request. For example, if the user asks for the square root of a number, I have to clearly define that description and the input; then there is Python code that calculates the square root, Python code that adds a couple of numbers, Python code, as you can see, that does the sum. Why are we doing this? Let me show you something. If I simply ask OpenAI, say the GPT-3.5 model here, "hey, what is this weird number I picked randomly multiplied by this one", I get this answer. This is pure ChatGPT answering a math question. You might feel like it's a good answer, but actually it's not, because look: I used the calculator in Windows and asked the same thing, multiplied the same two numbers, and the number is not the same. The calculator result starts with 55, then 761, but here it is 55,778, so there's a difference; that's why LLMs are not accurate at this. But now, with this solution, we use the planner to offload that execution to a plugin, the math plugin, which uses Python for a proper calculation. So let's see if that really works.

Going back to my initial flow: now you know how I created it, I showed you, and it is connected to this math_planner node. Because of math_planner.py, it understood that it has to expose some inputs here. So what is the input? The input is the user's question at the top. Look, I asked the same question that I asked vanilla ChatGPT, which is here, to see whether it's going to be close to the calculator or make the same mistake. Then it's a chat completion, and this is the deployment I have created for GPT-3.5; if you don't know what that is, go to your Azure OpenAI resource and make sure you use the same deployment name. And remember I told you I would explain how it knows how to connect to my Azure OpenAI? This is the place: I created an Azure OpenAI connection. How? Just click on the Prompt Flow extension after you install it; there is a section called Connections; choose Azure OpenAI and click on create connection. When you click create, it actually generates a YAML file for you and asks you to provide your key. Actually, let me show you, if I can update it; there you go: it creates this for you, and you just need to replace the base with your Azure OpenAI base, keep the type as azure, keep the version, put your key in, and give it a name, test; that's why "test" appears, and that's why I have "test" selected here. Then click on create or update, and that's it.

Now I was ready to actually ask my question, so let's run this. Let me click on Terminal so you can see the outcome. Again, I'm going to ask the same question we asked here, where is it, there you go, to see what the difference will be. I copied it, pasted it here, and clicked Run All. So it is running my whole flow: it received the input, it's calling the planner, and based on the question it will decide to use the math plugin and execute the math, but this time not with the LLM, with Python code, and we'll see what the answer is. Okay, let's wait a bit. There you go, I just got the results, and as you can see it is 55,761... Let's open the calculator. There you go: this time I got exactly the number the calculator gave, because we offloaded that math calculation from the LLM to the math plugin.
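Returning briefly to the connection step above: the video does it through the extension's Connections panel; as a hedged alternative, the promptflow Python SDK can create the same Azure OpenAI connection programmatically. This sketch assumes your promptflow version exposes PFClient and the promptflow.entities.AzureOpenAIConnection entity; the connection name test mirrors the video, and all other values are placeholders.

```python
# Hedged sketch: create the Azure OpenAI connection from Python instead of the
# VS Code Connections panel. Assumes PFClient and the AzureOpenAIConnection
# entity exist in your installed promptflow version.
from promptflow import PFClient
from promptflow.entities import AzureOpenAIConnection

pf = PFClient()

connection = AzureOpenAIConnection(
    name="test",                                            # connection name selected in the flow node
    api_key="<your-azure-openai-key>",                      # placeholder
    api_base="https://<your-resource>.openai.azure.com/",   # placeholder endpoint ("base")
    api_type="azure",
    api_version="<api-version>",                            # placeholder; keep the version your deployment supports
)
pf.connections.create_or_update(connection)
```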
So that was the hello world of Semantic Kernel inside Prompt Flow. And just think about this: how many other kinds of tasks, not just math calculation, you can offload out of your LLM. LLMs are good at reasoning, good at decision making, deciding which task should get executed with the help of Semantic Kernel and a planner, but they are not good at tasks like calculating math or executing an API. They cannot do those themselves, but they can help you decide which API to call to fetch some data; here, for example, the decision was to call my math plugin.

Okay, so what is the value proposition of Prompt Flow again? Now that I have this flow, I can test it with a batch run. That's why, in the repo you cloned, I have something called data.jsonl. If you click on it, you'll see what I would call labeled samples: here is a math question and here is the answer, another math question, another answer. And you'll notice the questions are not phrased directly as math questions; they talk about pizza, they talk about some weird things. The reason for this is to make sure, first of all, that the Semantic Kernel planner works properly and understands that it needs to call the math plugin, and then here is the expected output. So I'm going to feed all of these to the flow I created, to check that if the answer is eight, my flow also calculates eight, and to see how many of them it calculates properly. That's another value of Prompt Flow: it lets you do batch evaluation. How? You come to your flow again, you see this one, I think, yes, Batch Run, and it asks you for your file; you click on it, choose data.jsonl, which is this one, and it starts the batch run for you. Because I already did it, I won't run it again, but when it is done you can see the results. How? Just come back to the Prompt Flow extension; under Batch Run History you see I already have one created and completed; click on that and you will see yours appear as well. Then, when you click on it, click Visualize and Analyze, and it visualizes the results over all the batch data. Let's give it a moment and see what it shows us. There you go: these are all my questions, remember, my sample questions, with the ground truths and what we predicted. For example, the answer was five, we predicted five, perfect, we got a score of one. The third one, I think, predicted nothing; I have to check why it didn't get a good score. But for the rest we calculated almost everything properly, except the one before the last; you can see there is a huge difference there. And you can have hundreds of different tests, right?

So now I was able to create this batch run and execute it over the whole sample dataset. But how about this: I want to do an evaluation, another value proposition of Prompt Flow. Let's say I want to state that the accuracy of this solution for calculating math with Semantic Kernel and Azure OpenAI is 80% or 90%. How can I do that kind of evaluation, like we do for machine learning models? Well, in Prompt Flow, depending on the type of work you're doing, it can actually do the evaluation for us. There are different ways; I fully explain the different types of LLM application evaluation in Prompt Flow in the video I added to the description, so check that out. But here, for this specific math calculation, I have grabbed an evaluation flow from the Prompt Flow repo; that's why, when you open the repo you downloaded, there is something called eval-accuracy-maths-to-code.
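Before looking at that evaluation flow, a quick aside on the batch run step above: the video triggers it from the extension's Batch Run button, and the promptflow Python SDK can do the same thing programmatically. The sketch below is hedged: it assumes the flow folder is perform_math, the dataset is data.jsonl with a question field, and the flow input is named question, which matches what the video describes but may differ in your copy.

```python
# Hedged sketch: run the perform_math flow over the labeled dataset with the
# promptflow SDK, equivalent to the extension's Batch Run button.
from promptflow import PFClient

pf = PFClient()

base_run = pf.run(
    flow="./perform_math",                            # flow folder containing flow.dag.yaml
    data="./data.jsonl",                              # labeled sample questions and answers
    column_mapping={"question": "${data.question}"},  # feed each row's question into the flow input
)
pf.visualize(base_run)  # opens the same visualize/analyze view as the extension
```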
So, coming back to it: here I have another flow, which is my evaluation flow. I click on its flow YAML, click on the visual editor, and that's how you see it: it receives the batch run results as input, aggregates them, and gives me the final accuracy. There are a bunch of Python files behind each node, available here; they are pretty simple, they just sum up all your correct predictions against the ground truth from the sample dataset we looked at. Now that I have this evaluation flow, how did I apply it to my batch run results? Very simple, because you now have this evaluation code in the repo. What you need to do: go to the terminal and make sure you are in the parent path that contains this eval-accuracy-maths-to-code folder; the parent path I have is the one with perform_math, and I'm already there. What you need to run is this, let me show you: you simply tell pf that you want to create a run, and it is going to be a flow, but this time the evaluation flow; this here is the path to my evaluation code, so copy the path you have, up to this folder; the folder will have the same name for you when you clone my repo, but use the path on your machine; this one is mine. Then we tell it that the data we used for the batch run is here, data.jsonl; you can skip this part, you don't need to change anything here, we are just telling it which column is the ground truth and which is the prediction, and the prediction comes from the previous run, the batch run. But where is that batch run? That is the name of the batch run; yours will be different. (A hedged Python SDK equivalent of this whole command is sketched after the wrap-up below.) That run name is the last thing you need to change, and you can give your evaluation run a name, for example perform_math_eval, any name you want, it doesn't matter. But how did I grab that run name? Come back to the Prompt Flow extension; remember, I ran this batch flow whose results I showed you, this one, right? Just right-click on the perform_math run, and this time click on View Logs. So I click View Logs again, and there you go: when the batch run was created, it printed the run name; that's how I grabbed it and put it into my command. Let me show you again, there you go, that's it. So you run this in your terminal; I ran it before, you paste it in, and what happens is that your evaluation, your end-to-end evaluation, is created as a new run.

So that's mine; let me visualize this one instead this time. You click on it, click Visualize, and let me show you how it looks. There you go, this is what happens: now it is telling me that, based on the end-to-end batch evaluation, I got an accuracy of 0.5 with an error rate of 0.1, because of course I did have some errors, as we discussed, right? And this is the more detailed breakdown. So the aggregated result appears here, produced by the aggregator flow, which was the other flow we ran.

To wrap up: we used Semantic Kernel as a planner to decide when we need to call the math plugin, which is Python code. Because the user's question was a math question, Prompt Flow was able to call my Semantic Kernel code and get it executed. Then we evaluated this with a batch run using Prompt Flow, and we called the evaluation flow to assess that batch run and give me the final accuracy. So we saw how Prompt Flow and Semantic Kernel came together nicely to give me an end-to-end LLM application; beyond just development, we can take it further and deploy it as an endpoint, giving you a chatbot that does math perfectly, or anything you want based on your plugin definition.
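For completeness, here is the hedged SDK equivalent of the pf run create command the video runs in the terminal. The evaluation flow folder name, the column mapping (groundtruth, prediction, answer, output), and the batch run reference are assumptions based on what the video describes, so adjust them to your repo and run names.

```python
# Hedged sketch: run the evaluation flow against an earlier batch run, the SDK
# counterpart of the `pf run create` command shown in the video.
from promptflow import PFClient

pf = PFClient()

eval_run = pf.run(
    flow="./eval-accuracy-maths-to-code",          # evaluation flow cloned from the Prompt Flow examples
    data="./data.jsonl",                           # same labeled dataset used for the batch run
    run="<your-batch-run-name>",                   # the run name shown under batch run history / View Logs
    column_mapping={
        "groundtruth": "${data.answer}",           # expected answer from the dataset
        "prediction": "${run.outputs.output}",     # what the perform_math flow produced for that row
    },
)
pf.visualize(eval_run)  # aggregated accuracy plus the per-row details
```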
So again, if you're interested in dedicated videos just about Semantic Kernel, I would love to know; make sure you write that down in the comment section. As always, I welcome and respect your comments, and they help me create content you'll love to hear about. Thank you so much, my friends, and I hope you enjoyed this video. That's all, my friend. Are you tired? Are you willing to give up? Well, you have reached your limit, and this is the time you need to push. Don't waste this beautiful opportunity. God is with you. Dream big, my friends, believe in yourself, and take action. Till the next video, take good [Music] care.
Info
Channel: MG
Views: 2,031
Keywords: AzureML, PromptEngineering, Azure, MachineLearning, LargeLanguageModels, AI, DataScience, DeepLearning, PromptFlow, AzureMLPromptFlow, ArtificialIntelligence, DataEngineering, NLP, AIModels, MLPlatform, MG Azure, MG Cloud, MG AI, MG ML, mg ml, prompt engineering, chat gpt, coding prompts chatgpt, semantic kernel tutorial, semantic kernel planner, semantic kernel python, semantic kernel skills, semantic kernel copilot chat, semantic kernel build, semantic kernel langchain, semantic kernel demo
Id: rUfTaZDL_z8
Length: 27min 59sec (1679 seconds)
Published: Tue Oct 03 2023