Building HelloDallE - Connecting Azure OpenAI with Semantic Kernel

Captions
Captions
Hey, good morning and happy Friday. Yes, March 8th, 2024. Thanks for joining us on The Dev Talk Show. I'm Chris Gomez, along with Rich Ross.

Hey, Chris. Yeah, this week went fast. It was good, though, a busy week, but it's the end of the work week and hopefully we're going to do some fun code here.

Yeah, I think so. Yesterday we were working on... oh look, we have a guest. We have my dog, who sometimes sits in the frame, though actually not very often, but today has decided to, at least while we're on full screen side by side. Live production, what can you do, right? Especially without a producer to say "dog's on set." In any case, yesterday we were working on an app that, I guess this is your name for it, HelloDallE, right?

Yeah, it was my name. It's the project name, and project names are always cooler than the real name.

Right. And DALL-E, I think I'm correct in saying, is a text-to-image model by OpenAI. And again, I don't know that I have all the terminology yet, because all of these generative systems and models are kind of new, and as developers we're learning from the beginning too. ChatGPT is an example of a natural language... I don't know if I'd call that the model; there are large language models, and then this is a text-to-image model. But in some respects we're using a little bit of both, as you showed in yesterday's show. To get from a prompt, your app takes some words as descriptors, maybe your mood, maybe a color, maybe some other descriptors that you might add later. That's going to get turned into a prompt, and then that's going to get turned into an image, and all
hosted in a Blazor app, so it's really bringing a lot of stuff together.

Yeah, definitely, and I think that... oh, I wonder if you hung up, or if I hung up. Hard to say. Let me see if I can figure out what happened. It looks like I'm not the one frozen, so what I'll do is go to full screen until we figure out what happened there with Rich. Oddly enough, only one of the screens is frozen. It does look like you're on, Rich, just that the screen you were talking on is frozen; in the other view I can see you moving around. In any case, we'll see what happens there.

So yesterday, if you watched the show, you'll see we did a little bit in code, and then we were running out of time and it didn't seem likely that we'd be able to implement the rest of it in the Blazor app. So we went straight out to the Azure OpenAI service, and Azure AI Studio might be the name of the product. Rich deployed some models: a predictive model with the GPT model that basically took these descriptor words. For example, let's say the mood was happy and the color was blue. In fact, in this case I think the mood was happy and the color was red, and then I think he might have added in "winter" or something like that, and it created a prompt suitable for DALL-E. And that prompt was to create an image of a snowman drawn with a red crayon, I believe, and sure enough that's what DALL-E did. We ran that a couple of times, and that at least showed that we can do this. Even though we were doing it in the portal, in the Studio, we showed we can do this. Now the trick is going and getting it implemented in Blazor. And Rich is back, we hope.

I don't know what happened; it just froze all of a sudden. Hopefully my description was okay. I don't know if you could hear any of that.

I did not, I was troubleshooting, but I'm sure it was spot on. So
yeah, I kind of went through what we did and how we were implementing it in Blazor: working with components, and how, when a component's state changes, that gets passed as an event up to... I don't necessarily want to call it a parent component, but another component that's responsible for saying, "I know when you click the submit button, and I know how to call out to these services." But we were running out of time, so we just went ahead and used the services in the portal, because I think you wanted to tie it all together, like, "here's where we're heading."

Yeah, show the vision of where we're going, right? So take that away from being the manual bit we did yesterday, which was pretty cool. We got some interesting images out of it just by throwing a rough prompt up there, and that's something we can refine over time as well: how do we make the prompt stronger, because we just wrote it on the fly. The other part that's really cool, when we think about the AI components, is that everything doesn't have to point to one OpenAI large language model. I could use one model, like we did with GPT-3.5 Turbo Instruct, to do the completion, and then use a separate model to do the image bit, and chain those things together. That foundation is what we're going to put into the project today. We're going to start to wire up the connections to OpenAI, but we're going to use an SDK. It's open source, but it's created by some teams at Microsoft. It's called Semantic Kernel, and what it does, essentially... I should probably share my screen, because there's a GitHub repo out here.

All right, we'll get that moving... oh, I think that's how it freezes you, whatever you just did there, because I can still hear you. And in fact, there you are, trying to come back. Well, that's the
other one.

I know it is, but anyway. So while the other monitor's frozen there, and I'm not sure what seems to set that off, we still have Rich; he's still live. I don't know if you want me to pull you off this monitor or not.

Yeah, because I want to do the other... it's my gear.

Oh, okay. All right. So, a little bit of a technical difficulty as we're live: a show without a producer, and hopefully the hardware all holds up. Right now it's fighting us a little, but I know Rich is trying to get back on stage here and also show his system. Let me see. I don't know that I have the latest code, but I can always bring that up. While we're waiting, let me see if I can get that going. Whoops, not what I meant to do there at all. Okay.

So now you see this is actually my system here, and I'm sitting in PowerShell inside Windows Terminal, with a prompt extension I like to use called Oh My Posh, which apparently has an update, which is fine; I'll update another time. The nice thing about Oh My Posh is I get it through winget, so you can see it's already trying to tell me how to update it, but I'm not going to do that right now. I know there's a GitHub repo for this, which I don't believe I actually have on my system, so if I clone that down, I now have it, which is great. It's one of the nice things about how the world's evolved, to just let me go and grab code even if Rich isn't here to help me out. Now he's back in. Let me see, do you want me to bring you back in here? Looks like maybe you're okay. Possibly you are back on stage.

Yeah, can you hear me?

We can hear you, yes, absolutely.

Awesome. So yeah, we're playing around with some new hardware and that caused some things to crash, so I am bypassing those things and basically taking my camera feed straight as it comes into the box. So we should be now
all right, I hope.

Yeah. So anyway, we were switching over to your screen, right? Which we can do.

Yeah. All right, we'll see if that happens. Let's see: settings, present, screen share, do a window, this one, share. All right, so far so good, right?

Cool. All right, that looks like it's working. Oh, there you go, you've got the good screen going.

Okay, so this is slightly larger.

Slightly larger, exactly as designed. So this is what we're going to use to connect to the OpenAI service that we set up yesterday, and share that out through the application that we're building. What's great about this is that it's an SDK, but it gives you the ability to use the language that you're interested in, whether it's C# or Java, and I think in the last month or so they just released a Python version because people were asking for it. But it also abstracts away the connections to those backend LLM services. So if you want to use Azure OpenAI, great. If you want to use the company OpenAI's service directly, you can do that too, or Hugging Face, or others as they continue to come online. You've got the ability to bring any one of those services into your application, essentially writing it once, and then, by using different connectors, bring in the right service for the right thing that you're calling. That's essentially what this provides. There's some great getting-started material here, whether you're using C#, Python, or Java, and there's a getting-started console app that you can use to get up and running. We're not going to do any of that; we're going to go right into our project and start putting the pieces together to make all this happen. So this is the GitHub repo; this here is the documentation around it. There's some great getting-started material, and then where we're going to focus is
in this section here around prompt engineering, because what we want to do is put together the prompt like we did yesterday, send it off to Azure OpenAI, and then use what comes back, a complete prompt, as what we'll then pass to DALL-E. If you saw yesterday, all that makes sense; if not, hopefully before the end of today we'll get to show most of it.

Yeah, I have to admit that even though we had talked about this project offline, it wasn't until you showed how we were going to take these little descriptions and turn them into a full-blown prompt, and then put that into DALL-E for an image, that I was like, "oh, I get it now."

Yeah, and that's another reason for building this out, because then people start to see that we can incorporate all these things together and make something really cool happen. So let's start going down through the process of getting our system wired up so we can talk to OpenAI, which is what we want to do. I'm probably going to have to go dark on this when we do it, but for right now...

So you're using the ASP.NET user secrets feature here, so that you don't have to put this stuff into source control or anyplace else, especially since this project is on GitHub. So if you need me to pull away here as you fill it out...

Yep, I'm not going to put the values in, but I'm just kind of showing...

Go ahead. I'm curious how close the completion suggestions here are to what you're actually going to do. And again, we've used secrets.json on the show before. If you're an ASP.NET developer and you're not aware of it: this is not a Visual Studio feature. You saw him right-click on the project file and say "Manage User Secrets," and yes, that is the easy button for
getting a secrets.json file, not just referenced in your project but also created for you to save this stuff. But this is available from the .NET CLI. It doesn't matter what your IDE is, doesn't matter what your editor is, doesn't matter if it's Neovim: this is a feature of .NET, not a Visual Studio feature. So what you're doing is putting in settings for Azure OpenAI, kind of like a parent object, and there's a resource name and an API key that we need for this. And I can see on the right you've already got an interface kind of ready to go. It's just the implementation now of picking this stuff up, right?

Yep. So we should be able to load... this should be in the Program file, I'm pretty sure. Let's see. Yep, so we actually have our AddUserSecrets, and it's coming from AISettings, and we can bring that in. There are probably going to be a couple more things we have to add to our secrets.json, because I think Semantic Kernel asks for a couple of other things, and that's where we start putting these pieces together. I want to get things wired up on the Semantic Kernel side so that we can then come back and have all those values in there. That's kind of where my head is around doing that.

So one thing I noticed: you're using AddUserSecrets<T>, the generic version of AddUserSecrets. Now, that feels to me like you're combining a feature of settings, which is POCOs, plain objects, as the target for the settings to be dropped into, along with AddUserSecrets. So when you added user secrets to the app, did you go and add line 15 yourself?

Yes.

Okay, fair enough, because I believe you can say "Manage User Secrets" in Visual Studio, or create the secrets file and wire it up in your project separately, and you don't actually have to add it to configuration, because it's in configuration by default. But you're going one step further and saying, "I'm telling you that you can go get this stuff out of a plain old object."

Yes.

Okay, cool. That's neat; I've never done that before.

That's cool, right? So yes, and then by getting them, we can put the hierarchy in there: this is our OpenAI settings, and if we have another higher-level group, we could set up a different class that does the same thing.

Okay, so this is the part where I'll be honest: I've done this once, a little while ago, so we're going to take our time and walk through wiring this component up. I think the first thing we have to do, which we didn't do yet, is in Tools... and I always forget the name of those packages, but you can always go out to... there we go: Microsoft.SemanticKernel.Core, or
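As a sketch of the user-secrets wiring just discussed: `AddUserSecrets<T>` pulls the secrets store into configuration, and a plain settings class can then be bound to a section. The class and section names below are illustrative stand-ins; the show's project may use different ones.

```csharp
// Illustrative settings POCO; property names must match the keys in secrets.json.
public class AzureOpenAISettings
{
    public string DeploymentName { get; set; } = string.Empty;
    public string Endpoint { get; set; } = string.Empty;
    public string ApiKey { get; set; } = string.Empty;
}

// In Program.cs: AddUserSecrets<T> uses T's assembly to find the UserSecretsId,
// and the matching section can then be bound to the POCO.
var builder = WebApplication.CreateBuilder(args);
builder.Configuration.AddUserSecrets<Program>();

var settings = builder.Configuration
    .GetSection("AzureOpenAISettings")
    .Get<AzureOpenAISettings>();
```

As noted on the show, web apps load user secrets by default in the Development environment; the explicit call matters mainly when you target a specific assembly or a non-default setup.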
Yeah, use that one: Microsoft.SemanticKernel. I caught a glimpse of some of the multi-targeting there; it looks like there's .NET Standard support and also .NET Framework support.

Yes, which is kind of interesting when you think about it, to have a version that goes back and talks to Framework. That's pretty awesome: we can take some of those older applications that still exist, and are still probably part of your business, and give them some intelligence as well.

Right. Every day I talk to customers that still have mission-critical Framework apps. They're still supported, as long as you're on one of the versions that's still supported, and that's mostly the very latest plus a few somewhat obscure older ones that go very far back in time. But it's not a horrible list, and it shouldn't be too hard to take your Framework app and at least bring it up to date with the most current standards. There are some edge cases where I think some developers feel a little stuck, but most people should be okay if, for some reason, you have to stay on Framework. And there are use cases; I do see them.

Yeah, definitely. Those are the "it's great, it's working, let's not mess with it" situations.

Yeah. Or maybe the company that made the device we're using doesn't exist anymore, so there's no update coming for the SDK, something like that. Who knows.

All right, so we have... go ahead, sorry.

So what I'm doing here is, we've already got some IntelliSense around Semantic Kernel, so it brings in that using statement. We're going to create an internal object inside of our class here, and then we're going to make that equal to CreateBuilder. The kernel object is essentially the object you use to communicate with your
other backend, basically whichever API or LLM service you're connecting to. It's kind of like what we do in the Program file, where we put together all the services and things that need to go into the build object; we're doing something very similar here. So we can do CreateBuilder, and then off of here we're going to do .AddAzureOpenAI... we don't want text generation, we actually want the chat completion capability, because that was what we were doing yesterday, or last time, essentially. Let me just double-check that. Sorry, pretty sure that's what we want to do. Okay, that's what we're going to start with, because I'm pretty sure that's where we need to be.

Then this needs some items to make it work. We need to point to either some kind of configuration object, or we pass in things that set up the object. So we set up something like the deployment name. Remember from the session we did yesterday that we had to create model deployments, and we named them when we did that. We're explicitly telling this service, "when you connect, connect to this specific deployment," because you could have multiple GPT-3.5 Turbos deployed, each with different names, for resource leveling: our applications are hitting this one, yours are hitting the other one, that kind of thing. We also need an endpoint, which is essentially the URL that we point to to get to our service. So now that we have the endpoint to the service, and we know the deployment name that we're using, we also need our API key. And once we have all that, we can essentially build it, and then we're done.
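The builder call being described might look like this minimal sketch, assuming the Microsoft.SemanticKernel package; the three values here are placeholders standing in for the ones the hosts keep in user secrets.

```csharp
using Microsoft.SemanticKernel;

// Build a Kernel wired to an Azure OpenAI chat-completion deployment.
// All three arguments are strings: the deployment name chosen in
// Azure AI Studio, the service endpoint URL, and the service-level API key.
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "gpt-35-turbo",               // placeholder deployment name
        endpoint: "https://example.openai.azure.com", // placeholder endpoint URL
        apiKey: "<api-key>")                          // placeholder key
    .Build();
```

Each deployment under the same Azure OpenAI resource shares the endpoint and key, so only the deployment name changes when you add, say, a second kernel for the image model.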
Now, to do this the right way, this is where we do that dependency injection bit with IConfiguration. We've talked before about how, in our Program file, we already have, or will have, the configuration values needed to communicate with our service. Because they're created at startup, we can inject them at runtime into our class with all the values pre-populated for us, so we don't have to go and talk to the configuration service every time; it's already been done for us, if that makes sense.

I think so, unless I have it totally wrong.

And maybe it's not here because... well, we probably should bring that up. That should be fine, because we should be able to go back to these settings and add the pieces that we're missing. So this is actually going to be the deployment name, we've got that there and there, and we've got the API key; we also need our endpoint URL, or our endpoint, essentially. So let's do... oh, that was so helpful. Okay, cool. All right, so all of that basically comes across; we're good there, we've got all three of those pieces.

Yeah. And this is the stuff that probably should have been put in place ahead of time, because the setup and the config always take forever. I'm happy to hear your thoughts if you have other suggestions about how to do this.

No, I'm new to Semantic Kernel here, and I was looking at some of their code, and what you're doing right now is double-checking back and forth. You're adding AddAzureOpenAIChatCompletion, which takes a model, an endpoint, and an API key. And it doesn't actually take those as objects; they're
strings, where you're basically telling it, "here's the model name, here's the endpoint, here's the API key." And AddOpenAIChatCompletion is another one in this sample; I'm not saying that's what you're using, I'm just looking at how these different samples go. Okay, I keep waiting for when you need me to pull your screen off.

Yeah, why don't we, because with just these three values I should be fine. I can basically close this up and not see it anymore, and then everything else can happen in the class.

Okay, that was what I was hoping to get to.

So what I'm doing is going back into my AI Studio from yesterday, to the same project.

So when you're in AI Studio and you create one of these, is the API key at the deployment level, meaning there's one per deployment, or...

No, it's a little bit higher level than that. For the service itself there's an endpoint, the URL that we hit to get to it, and then there's a key that we have to include every time we make a call, and that happens at the service level. So once we have it defined, every deployment would have the same information. We could have a separate section with multiple deployment names inside, all of our Azure OpenAI deployment names, but then it brings in the settings from the Azure OpenAI service to reuse every time we wire up a Semantic Kernel instance. So yeah, we might think about changing that up a little bit, but let's see.

Well, "we" being not you and me. When you say we might change that up a bit...

It's my project, so it's something to think about if you think about redefining it.

Oh, okay, I understand, because what I was trying to get to while you were working, and I didn't mean to interrupt, I was just thinking in my head: okay, so when I
think about the Twitch developer experience, you go in and you create an application, which is just a fancy way of saying, "listen, I want a client ID and a client secret to talk to the API." But I might want to segment those and say, "hey, I want one for this app and one for that app," and make slight tweaks to the different permissions they have. So I was wondering whether, in this case, this API key covers your whole access to the Azure OpenAI service and all the deployments underneath it, or whether it's per-deployment. And sometimes those things evolve over time with different services, as you begin to see how people use them. So we'll see. But all those secrets are getting dropped in there. All right, one last paste. And I suppose, unlike some of the other demos, even while we're sitting here working, somebody could grab those keys and run up a nice AI generation bill, right? It doesn't take long.

Yeah, that would be very bad if that happened. Although, I think... so it's not that smart; it's not seeing those. Okay, interesting. All right, so we've got that: we've got a deployment name, we've got those pieces in there. All right, let's go back to the project. And we're fine if you want to bring my screen back up. I just want to make sure, because I can't really tell, that the secrets file is closed and we're good. And here we go, at least I think so. There we are.

All right, awesome. So this is where I may need a little bit of your help, because what we're doing here is really just .NET stuff. We should be able to load those Azure OpenAI settings from our application, bring them into the configuration, and then reference them via the configuration dependency injection. I
should be able to reference them here and use them for my application.

Yeah, so what you're saying is you have a class definition that essentially matches the JSON that you expect, right?

Right.

So I'm trying to think... there's a way to do this where you... I mean, you can do it manually, which I don't think is the whole point; I think the point is you want to bring it in automatically. So if I ran this... where is our page? We sit down here, since we still have our submit prompt up and running. If I just said "new"... what? No, why did you do that? There we go. That probably needs the on... oh, well, that's interesting. See, this is the part where I get stuck. If I was to do this, that's got to come from somewhere; it's not in this class. And what are we in right now? We're in Home.razor.

So I suppose you could always use dependency injection, I think, and have this injected.

Right. But if the configuration isn't used directly here in Home.razor, why do I have to inject it here?

Well, mainly because you newed up a class that takes the configuration, right? It's part of the constructor.

Right. And this is the other thing: if I got rid of this, I need some way to inject that into this class so that it runs. That's kind of where... and maybe we're going off on a tangent here.

Okay, so I think you can, for now, leave this alone. What's happening is, I think your SemanticKernelHelper is going to be a service; in fact, you put it in the Services folder, so think of it that way; it seems like that's where you're going. So I think you could say that SemanticKernelHelper is something you new up that takes configuration, and we can always look at that later. I think the way you had it was fine. The problem is, now what you want to do is, you
want to decide: do you want a kernel per user? Is it scoped?

It's not scoped. It's more that it's going to come and go depending on what's being called. This one particular instance, to start off with, is going to have a kernel that knows how to talk to chat completion. I'm going to have to have another one that talks to the image service, so that's got to be wired up differently.

I'm just trying to figure out if the helper is going to maintain any state, or if the calls to it are stateless and it's okay if it's a singleton.

The point is, I wouldn't make it a singleton, because then it would be there across everybody.

Yeah, it just depends on whether there's any state from call to call, I think. But the point is, I guess, that what you want to do is... let's see, we already have it as a service, so you're going to want to go back up to Program.cs and say builder.Services and add it to dependency injection. So... oh look, it's even trying to help you. Well, that's the OpenAI one; Semantic Kernel is the one you're doing, right? And it's "when somebody wants an ISemanticKernelHelper"... actually it doesn't have to be an interface. You're saying, when somebody wants a SemanticKernelHelper...

All right, did I spell this wrong? I did. That's terrible.

Oh, is it spelled wrong in the class?

That's all right, you can fix it.

Yeah, I really botched that. Naming is hard, and spelling is even worse. Let's see. Okay.

Okay, so you have this stuff, and you're going to go back to dependency injection and say, "hey, when somebody creates one of these..." Okay, now go back to the SemanticKernelHelper. So it doesn't have access to config yet, right? Or does it?

It does not; that's just a local variable.

So what we're saying is... okay. Yep, and then we go back to here, save that, go back to this one, and then we wire in something that gives me IConfiguration. I think you're just going to say builder.Configuration... I forget, because you actually are resolving it here.

Okay, so I know we need to do this, because we're going down a path of trying to make sure we extract things out and build them in the right way, by injecting our configuration settings at runtime, which is the right way we should be doing this. If I understand the way this goes, user secrets should create an instance of OpenAISettings that has those values in it, because we have that OpenAISettings class, which is here, with all the appropriate property names that match the file I can't show you anymore. So if we run this, we should get essentially this object with our configuration values inside of it. We then just need to be able to reference it in the other place. So let's make sure of that.

Oh, because it's just a class that we don't new up anywhere. That's why we should be able to look at the configuration object at the end of this and see that it's there. You just have to be careful, because those are the values you're trying to protect from everybody, right?

All right, let's see... they were too quick for me. All right, redo this. Our app has a configuration with 122 sections in it.

Okay, yeah. The default configuration brings in a lot, and the only thing you've got to be careful of is that the values might be in there. So even if you're scrolling by them real quick, they certainly may be there, and you also may have to drill in further to see them. In any case, that's what I'm thinking. Although it does look like it shows you everything there. Maybe drop the screen for a second.

Okay. So, interesting: they are there... wait a minute, where's... yeah, okay. It's interesting: there is an OpenAISettings:AOAIEndpoint, so they do exist.
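The dependency-injection step being worked through could be sketched like this. SemanticKernelHelper is the show's class; the registration lifetime and constructor shape here are assumptions, since the hosts debated scoped vs. singleton without settling it on air.

```csharp
using Microsoft.Extensions.Configuration;

// The helper receives IConfiguration through its constructor, so the
// values loaded at startup (including user secrets) are available to it.
public class SemanticKernelHelper
{
    private readonly IConfiguration _config;

    public SemanticKernelHelper(IConfiguration config) => _config = config;
}

// In Program.cs, registration would look like:
//   builder.Services.AddScoped<SemanticKernelHelper>();
// Scoped is a reasonable default for a per-user Blazor service; a singleton
// would be shared across all users, which only works if the helper is stateless.
```

With this in place, a component can receive the helper via `@inject SemanticKernelHelper Helper` instead of newing it up and passing configuration by hand.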
We're going to try one thing, and if that doesn't work... you've got to go at the top of the hour, right? So we've got to do something.

Yeah, that's right, I do have to go at the top of the hour today.

All right, let's just do this. All right, we're still good. You see what we're doing here, right?

No, because the screen is just really small for me. It's way down there in the studio; it's not something I can easily look at.

Okay. I can't show you this, but this is a really cool feature: if you highlight something and hit quote, it wraps the selection in quotes.

Oh, if you have text highlighted? I don't think I knew that.

No, I did not know that either. That's pretty cool.

All right, so if we do this... this now is just going to be... and you can show my screen, I'm good. Okay, endpoint. Okay, awesome. All right, so I hardwired it into the file. We're not going to check this in, but we'll get past the configuration stuff afterward. Essentially, we've got a way to kick off the building of the kernel. And then from here, we do want that to return a string... oh, really, what's it complaining about? It's complaining that Kernel does not include a method called InvokePrompt. What does Kernel include? It does; it's InvokePromptAsync, I think, or InvokePrompt.

Yep, okay, there you go. Copilot can be immensely helpful, and it can be immensely intrusive as well.

All right, InvokePrompt; might as well be consistent, because that's kind of what we're doing. Oh, because the function doesn't return a string; what's coming back is a FunctionResult. Okay, now everybody's happy. This is good. And then, inside of here, we do want to... so the prompt... I'm trying to think of the right way to say this. "Response" is getting overused here, so
'userAnswers'? This is the string that you build, right? Yeah, exactly; just to keep things straight between all the different responses and things we're passing around, this is the actual user answers collection. Well, not a collection; technically it's comma-separated values, a CSV. Okay. And then inside of here we need to start building out what our prompt is. So let's make this method private, because we don't want to expose it. What we're doing here is starting to create our prompt, and there's some structure that Semantic Kernel provides to help with that. So let's do this; this is where we can start to do what we did last time, when we manually went through and created it: this is where we say what it is we want the AI to do for us. And actually, sorry, this isn't going to go here; it's going to go here, and you'll see why in a second. This should be 'prompt'. Okay... you're just seeing a static image of where things are; give me one second, because you might remember... sorry, it's okay, you can't see what I'm doing. I have a trick now so that we can do some of this stuff off screen. All right, back to this; now you can see my screen. Essentially what we're doing is telling it: here are some instructions that you, the LLM, need to be aware of when you're doing this. Fortunately I can word wrap, so we can see things a little better. This is essentially the same prompt that we used yesterday, right? But the difference is that instead of this, we're going to include... oh, because of the dollar sign; sorry. Oh, I see, you've got it as an interpolated string. Exactly, so we're dropping that value inside of there, and then, much like yesterday, this is what we want the model to complete and return back to
us. Yeah, that looks pretty close. Okay, so let's see: we should build a prompt, and then we'll get a FunctionResult back. But we shouldn't really return a FunctionResult, because otherwise the home page has to know about Semantic Kernel, and right now we're abstracting that out so it doesn't. We should be able to return essentially the prompt that got generated. It's just an object, though; it could be anything. Oh, okay, we could do FunctionResult.GetValue and send that back. Oh, I see: 'this, or a base type, is the type expected to be passed as the generic argument to GetValue'. So GetValue probably requires a type, right? From the sound of that, you need to supply a type parameter. All right, we should be able to pass that in as well. So then, if I understand this right, we're going to do this, and what we're getting is that result... does GetType work? No, well, it's the same... the value type, sorry. And from that we can get... does that really not work? It keeps forcing me away from that. Yeah, you have to know the type; you're expecting a string back. I would think it's a string, but I guess what I'm trying to say is: will this tell me what the type is? I'm not sure you can do that without reflection, or something similar to reflection. That's harsh. You could always say GetValue of string, since I'm pretty sure we're getting a string. GetValue<string>, there we go. Okay, we could try that, and then InvokePrompt returns the FunctionResult. All right, we don't need this anymore because we're figuring that out later. Okay, let's do this. All right, let's try it out; what could go wrong? Why is that still there? There we go. Huh, interesting. Sure, okay, no break
points, so: happy and blue. Where's home? Home.razor, yeah, there we go; grab that. All right... oh, we never call it, because you commented it out. And now the kernel... what's coming back from there? You meant the SemanticKernelHelper, which is misspelled. Okay, and then does that take arguments? Oh yeah, that too. Okay, and then you have to call it, right? Yep, kernel dot... Invoke? Is that the public one? Wait a minute, you did have one called InvokePrompt... yeah, they're backwards. Boom. All right, well, that's fine; the prompt will be persistent because it's here as part of the thing, so we can pass that in. I'll just call both methods for now, save that, and that should give us a response, although we'll probably see it way before then. 'Cannot assign void to an implicitly typed variable'... well, I think because InvokePromptAsync returns a Task, you have to await it, right? I mean, you don't have to, but it's certainly the easier way, and that's fine since this was already an async method. I did put the await at the end, but I guess that's not what it needs. Okay. And then does InvokePromptAsync need... no, because prompt is okay. Yep. All right, not elegant, but... now, does that value have to go somewhere, or is it automatically going to be picked up by the DALL-E call, in the four minutes we have left? It will, but right now, no. All right, let's try this. Okay, so this is basically just going to wire up our prompt. Yep, so it did include those values in there at the end, almost at the edge of the screen. Okay, let's see what happens. Hmm, we're not... I guess... huh, interesting; I kind of lost the thread. So we're here; I wonder if it blew up. That's what I'm thinking. Where are the errors? Oh, because of this. Just write 'ex equals ex'; that's my trick when I just want to see what it does, and then I'll
figure it out later, but at least I can stop here. And... okay: 'the chat completion operation does not work with the specified model; choose a different model.' Huh. Well, at least we know it was something; and we also know it connected, right? So we got to the place we were looking for; I just wired it up with the wrong model. So let's see, can I do this here? We're going to have to restart anyway. So I can do text embedding, text-to-audio, text-to-image... we're not there yet; that's interesting, that's new... text generation. All right, I'm game; it seems to like it. Dude, we're one minute over. Oh, that's cool. Okay, a little better, yeah: it generated your prompt. Right, and then next time what you would do is pass that prompt to the DALL-E generator. Exactly, which means we've got to come in here and wire up a separate service; this one's an AI text generation service, and we want one that's text-to-image, essentially. So, fun. Cool. Yeah, it generated a prompt; I couldn't quite read what it said, but it was something like 'generate an image of a smiling face'. It's actually pretty cool... oh, and because you had 'happy' and 'blue', it was 'generate an image of a smiling face surrounded by a beautiful blue sky'. All right, so it definitely incorporated happy and blue; we kind of got there. A bit of a detour, but cool. That's the first time I've ever seen Semantic Kernel, and while you were working I was trying to pick up what I could from that GitHub repo, which is interesting, because besides having documentation there are also some .NET Interactive-enabled notebooks that I think you could use to get started: you can run them, the code executes, and you can just see what it does. I didn't follow it too closely, but there are some interesting
examples: running prompts from files, a DALL-E sample, and a ChatGPT-with-DALL-E sample. Yeah, I'll be honest, I kicked the tires on some of it probably late last year, and they've continued to pull the documentation, the examples, and the notebooks together; the notebooks are a really good way to step through and see what's going on. We jumped through quickly here because we had an hour to wire this up, but this actually got farther than I thought, because this prompt component can become much more complex. Some of the things I tried previously were much more involved, creating a much bigger prompt that had additional values in it as different variables, so that you can shape the response that comes back out of the LLM you're talking to; the things we talked about yesterday, temperature and top-p and so on, can be injected as well when you send that prompt in. So if you hit F10 a couple of times, we'll step past that GetValue... because I just want to see what it does. If you hit it, this should get you back out; one more time should get you back out to... okay, and then let's exit and see where we pick it up. Oh, we're not picking it up... oh wait, yeah we are. Okay, so what you wanted happened: you got your prompt, and from here you can move on. What I'm curious about now is, before we go, I'd like to see you run it again and see how different the prompt is. Oh yeah, because this time it was an image of, I think it said, a smiling face and a beautiful blue sky. So if we just click the button again... we should copy this. Okay: 'surrounded by a beautiful blue sky'. All right, that's what we got the first time, so if I just hit F5 we should get back to the page. Yep. Now, its interactive server components got a little concerned. Yep,
but okay, yep. All right, we can F5, and now our result is... although, and maybe this is what you wanted to do, I switched the values and we didn't mean to, but as you can imagine, it's going to generate 'an image of a cautious green chameleon camouflaged in a tree while keeping a watchful eye on its surroundings', and that's because you said 'cautious' and 'green', right? Yeah. And let's see, I think this is still up and running: the playground... oh, sorry, not here... images... where did we do this yesterday? Well, no deployment is selected right now. Oh, there you go... oh, because that's not... can I just... I can't take that, shoot... there we go. I'm kind of curious what this looks like; that's pretty interesting. Yeah, it's neat stuff; it's pretty impressive. All right, so the prompt's being generated; that's a good thing. I've got to work on some configuration stuff, but once I nail that down we can come back and look at pushing that prompt up into the image generation, and then what we do with the image once we get it. So, some fun stuff coming. All right, cool. Well, hey, everybody watching, thanks for joining us; hopefully you learned something, and now you know about Semantic Kernel and DALL-E, and we learned a little bit about the Azure OpenAI service. Like everything on the show, sometimes we're just trying stuff out and seeing where we get, but a lot of people say that helps them understand how to figure something out instead of just being stumped, because I've certainly been there, where I look at docs or don't even know where to begin, and that can be really frustrating. So we got quite a long way. There's some interesting stuff they do in Semantic Kernel that I really want to explore more. In my opinion, I'm interested in how they return that FunctionResult, which is something I kind of
like, because instead of doing what I think programmers traditionally did, where if something didn't work they might return null, I've personally tried to get away from that and return something more meaningful, with more context, like a result object. It's not always the easiest thing to promote at work, but maybe if we do more of it here, it might be something you take to your job. So, in any case... do you have the ability to send this out, or is that me? If you could... actually, I can do it now because we're not using the other thing. Okay, I can take care of it. All right, well, hey, thanks to everybody for joining. If you missed any part of this, it's on YouTube at youtube.com/thedevtalkshow; we have a morning-hour playlist with everything we've been doing in these mornings, just before standup, which is something I have to run to. So, for Rich, I'm Chris; thank you all for joining, and we'll see you next time on The Dev Talk Show.
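Editor's note: the flow pieced together on stream — binding user secrets into an OpenAISettings class, building a Semantic Kernel instance, invoking a prompt, and unwrapping the FunctionResult with GetValue&lt;string&gt;() — can be sketched roughly as below. This is a reconstruction, not the show's actual code: class, property, and method names (KernelHelper, DeploymentName, the prompt text) are assumptions based on the conversation, and only AOIEndpoint is mentioned on air.

```csharp
// Sketch only: assumes the Microsoft.SemanticKernel and
// Microsoft.Extensions.Configuration.Binder NuGet packages.
using Microsoft.SemanticKernel;

// Settings class whose property names match the user-secrets keys;
// the show mentions an "OpenAISettings:AOIEndpoint" configuration section.
public class OpenAISettings
{
    public string AOIEndpoint { get; set; } = "";
    public string ApiKey { get; set; } = "";
    public string DeploymentName { get; set; } = ""; // must be a text-generation/chat deployment, per the error hit on stream
}

public class KernelHelper
{
    private readonly Kernel _kernel;

    public KernelHelper(OpenAISettings settings)
    {
        // Wire Semantic Kernel to the Azure OpenAI deployment.
        _kernel = Kernel.CreateBuilder()
            .AddAzureOpenAIChatCompletion(
                deploymentName: settings.DeploymentName,
                endpoint: settings.AOIEndpoint,
                apiKey: settings.ApiKey)
            .Build();
    }

    // Takes the comma-separated user answers (e.g. "happy, blue") and asks
    // the model to write a one-line image prompt from them.
    public async Task<string> GeneratePromptAsync(string userAnswers)
    {
        // Interpolated string drops the user's descriptors into the instructions.
        string prompt = $"""
            You write prompts for an image-generation model.
            Using these descriptors: {userAnswers}
            Return a single, vivid one-sentence image prompt.
            """;

        // InvokePromptAsync returns a FunctionResult; GetValue<T> needs the
        // expected type as its generic argument, hence GetValue<string>().
        FunctionResult result = await _kernel.InvokePromptAsync(prompt);
        return result.GetValue<string>() ?? "";
    }
}
```

In Program.cs, rather than hard-wiring the endpoint as in the on-stream hack, the settings would come from something like `builder.Configuration.GetSection("OpenAISettings").Get<OpenAISettings>()`, with the values supplied by user secrets in development so they never get checked in.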
Info
Channel: The Dev Talk Show
Views: 63
Id: dcctuIX5uZw
Length: 85min 40sec (5140 seconds)
Published: Sat Mar 09 2024