Okay. So one of the biggest challenges as you're building LLM apps is both testing it yourself and getting it into the hands of users to get feedback. So I think there are two main things that you want to think about here. You want to think about building quick UIs that people can test things in, which I'm going to talk about in this video. And then you also want to think about automated testing, which perhaps we'll save for a future video. So the first way of doing this is you
want to basically get your idea, your app, your product, your model, into the hands of end users as quickly as possible. This allows you to get feedback very quickly on what people like and what people don't like. Often when you're coding and making something for the first time, unless you're making it for yourself (and often even when you are making it for yourself), when you actually start to use it, you realize that a lot of the assumptions you had were perhaps totally wrong. So this is where you're looking
for a good, quick framework that basically allows you to build a UI quickly to test something out. Up until now, we've had Streamlit and we've had Gradio. And now a group of coders at Google have released this framework called Mesop. Now, Mesop is not an official Google product; this is basically a product that a number of the coders have been working on in their 20% time, and probably in their spare time as well. The whole idea of this is to be able to build web UIs quickly with Python. And really it's aimed at engineers who
don't have a lot of front-end skills. If you think back to Streamlit, Streamlit has become very popular; Snowflake bought Streamlit for a lot of money. Early on, Streamlit was actually used inside of Google, which was one of the early adopters of Streamlit. Now, my guess is that now that another big company owns it, they probably don't want to use it as much internally. And it also does seem that Streamlit has gone in a bit of a different direction since Snowflake acquired them. And then on the other side, you've got Gradio, which was bought by Hugging Face. So it makes sense for people inside Google to want to have their own thing that they can control, and this is where Mesop comes in. Now, they've open-sourced this,
so you can use it straight away. It's still a bit off being a version one; they've got it to version 0.8. But let's have a look at it. What I'll do is go through the key elements of it, and then we'll also look at building a chatbot with LangChain and Groq, with some memory, to see how you would actually use this. Okay. So in their blog post, they basically
talk about why they're doing this: the whole idea that a lot of developers just don't have the front-end skills for this. The other thing is they wanted something that is very easy and quick to put together. And I think this is where it becomes really interesting: the whole idea of having something that's already got components you can just use, as well as being able to make your own low-level kind of thing in here. So if we come in and have a look at the
components that they've got already, we've got really high-level components like this chat UI here, where we can basically just type in something and we can see that it can respond back. Now, how does that actually work? Here's the code for it. You can see there's not a lot of code in there. Basically, you're just setting up a page, et cetera, and then you've got a chat implementation, and you give it a function that will determine what to do with the input. We'll look at that in a moment when
we look at the LangChain example. And you should notice that one of the key things in here is that this is actually building a Flask app on the fly. So you've got all these elements that are basically being built for you in the backend, and you just get a nice, simple UI at the front. So they've got some
high-level components. So we've got chat, we've got text-to-text examples, we've got text-to-image examples if you wanted to make some kind of image manipulation tool or something like that. You've also then got lower-level stuff, where you've got boxes, as you can see, if you want to go and design your own thing a bit more. You've got sidebar navigation, you've got text, you've got Markdown. You've got a whole bunch of these different things. You've obviously got things like buttons, text inputs, all these sorts of things that you can use. And on top of this, you've also got
a bunch of demos that they've done. So you can see in here, if we look at the LLM playground, we've got something now where we can select the model, we can select the region, we can play around with the temperature, and we can set a whole bunch of different things in here. And we've got a full example of the code for how to do this. So they've got a number of these different
demos where you can try things out. So if you wanted to build something that's text-to-image, you can just come in here, take that code, and reuse it. If you want to build small elements, you can see exactly how they're doing that inside of these bigger demos. So I think in many ways it's self-explanatory how you could put these things together. They've got a getting-started Colab; let me just run through this. One of the nice things that you can do in
here is that you can run all of these in Colab. So they've got a special helper where you can do `me.colab_run()`. Mesop is just imported as `me`, and you've got `mesop.labs` to basically give you the different components in here. And you can see that this will basically set up the Flask server running in the background. And then we can just add pages to this. So coming in here, we
can define a page with a decorator. We give it the path of the page and just pass in a function for that page. You can see that setting up a simple chat interface is pretty easy in here: we basically just define that we're going to have a chat, and we set that chat up with `mel.chat`, which is where we're getting the components from. The main thing we need to define for the chat is the function that will handle the input from the UI and do something with it. Now, you can see in this case, all it's going to do is basically just return "Hello" and then whatever I ask it. So if I ask it "hi, how are you?", sure enough, it just returns "Hello hi, how are you?".
Now, we'll look at actually hooking this up in a second to do some proper things. Overall, though, you can set up multiple pages quite easily and quite quickly, and put a bunch of things in them for when you want to test something out. So this makes it a great way just to get going, to get something that you can test out quickly, and to build a very quick prototype, not just in Colab, but you can also run it locally in VS Code, et cetera. All right. Let's jump in and have a look at
hooking this up with LangChain and Groq, just to make a simple chatbot with some memory in here. Okay. So in this example, I'm just going to walk through how you could set this up with LangChain and with Groq to basically have a chat that's got memory, that's retaining the memory as it goes through, and that we could then test out. Okay. I've got some packages
that are brought in here: I've brought in the Mesop package, the LangChain packages, and LangChain Groq. I was going to do something with DuckDuckGo; maybe we'll do that in the future for this. So, all right, I basically start off just by getting my Groq API key, and then I'm going to set up the actual Groq chat part in here. So I'm going to set up a Groq LLM, and then set up a prompt format for this. Okay. I've got my system prompt that basically
says: you are a helpful assistant. Please be brief, concise, and to the point; answer in as few words as possible, but still give the user the key information. Then: your name is Isabella and you are 28 years old. So that's coming from the system prompt here. The model that I'm actually going to
use is Llama 3 70B on Groq. And then I've got a human input here where we're going to say: use the conversation memory below to help answer the most recent query from the user. So then we're going to pass in a memory, which is just going to be a list of everything that's been said, and then we're going to pass in the user query, and then we're going to prompt it to get the assistant message back. So, all right, once I've got those, I basically just bring those into
the chat prompt template with `from_messages` and set up a LangChain Expression Language chain. I've called this `conversation_chain`. You can see now I can basically invoke this by passing in a text message from the user and then the history, which I've now changed to be a string converted from the list of messages that we've had so far. So if I set up that chain, you can see that we asked it "how are you today?" and it replied "I'm doing great, thanks", and then we've got all this other data that came back. Now, we don't need all of that. What we need is the actual message, and the message is going to be under `response.content` in there.
All right, so next up we want to build the actual Mesop part. We're going to bring in Mesop as `me`, and we're going to bring in the components, so Mesop labs as `mel`. We're going to run that just to get it started, and you can see, sure enough, it has now started a Flask server. You can see I've been running it already, but okay, we've got this going. If we were running this on a local machine, we could just click it and open it up straight away; in Colab, it'll actually run inline with the cells. All right. Making our chat is pretty simple: we've got the chat interface, and we need to pass it a function. So the function is going to be
called `transform` in this case. And what it actually passes in is two things: it passes in what the user just pressed submit on, so the last user query, and it passes in a list of chat messages that it's calling `history` in here. Now, unfortunately you can't easily debug this in Colab, but you can see what I'm doing here: I'm going to set up a new string, because when I pass it into LangChain, I'm not going to pass in a list of chat messages; I'm going to first convert it to a string. So what I'm going to do is go through this list of messages in `history`, and for each one we're just going to add the role and then the content of that message. So the role will be user or assistant as we go through this. Then we're going to run the conversation chain, which is going to call Groq. We're going to pass in the prompt that we set up above, and we're going to pass in that history string that we've just built, and this is going to give us a response back. Then really, we just want the `response.content`, and that's what we'll return back to the UI.
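That history-flattening step is plain Python and can be sketched on its own. The `Msg` dataclass here is just a hypothetical stand-in for `mel.ChatMessage`, which likewise exposes `role` and `content` fields:

```python
from dataclasses import dataclass


@dataclass
class Msg:
    # Stand-in for mel.ChatMessage: the speaker's role plus the message text.
    role: str
    content: str


def history_to_string(history: list[Msg]) -> str:
    # Flatten the chat history into "role: content" lines so it can fill
    # the memory slot of the prompt as a single string.
    return "\n".join(f"{m.role}: {m.content}" for m in history)


example = [Msg("user", "Hi there"), Msg("assistant", "Hello, I'm Isabella.")]
print(history_to_string(example))
# user: Hi there
# assistant: Hello, I'm Isabella.
```

Inside the real `transform`, you'd pass `history_to_string(history)` into the conversation chain as the memory string and return the reply's `.content`.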
Now, it will handle the state for this. You'll see when I run it that it handles storing the different chat messages in there. So this is one of the good things about Mesop: it can handle the state for you. So you can see, if I come in
here and I say "hi there", sure enough, we get "Hi, I'm Isabella. Nice to meet you. How can I help you today?" Let me try and test it. All right, so let's say I tell it: "I'm Sam. My favorite color is blue." Let's see if we can get that. Okay, so it says "Nice to meet you, Sam. Blue is a lovely color. What's on your mind today?" Okay, let me ask it something like: "Tell me about the company that made you." All right, it basically says "I'm an AI assistant, so I don't have a specific company that made me; I was created through a process of machine learning and natural language processing." It's interesting that it's not saying Meta here. All right, now we've gone past this. So you can see now, if I ask it something like "What is my name?", it responds. Remember, we asked for very succinct answers back, so it's obviously getting that from the memory in the conversation, right? If I ask it "What was my favorite color?": blue. Okay, so we can see that now we've basically
got a chatbot going. Now, if we wanted to log these out somewhere, we could just add that in here, where I'm doing this print part. We could put in something to call another function, to do a logging function, to save it to a database, to do a whole bunch of different things.
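As a sketch of that logging hook (the JSON-lines file format and field names are my own choices, not from the video):

```python
import json
import time


def log_turn(user_input: str, response: str, path: str = "chat_log.jsonl"):
    # Append one chat turn as a JSON line; swap this for a database insert
    # or an external logging call if you need something sturdier.
    record = {"ts": time.time(), "user": user_input, "assistant": response}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```

You'd call something like `log_turn(user_input, response.content)` from inside the transform function, just before returning the reply to the UI.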
It's very simple to get something going really quickly. So you can see, in just this small amount of code, we've got a UI going for our chatbot. If we wanted to test out a RAG setup and have it do some things, maybe in the future what I'll look at doing is having a chat going where we can actually see the RAG outputs on the side, and stuff like that as well. So there's a whole bunch of different ideas that you can try with this. So hopefully this gives you a
taste of what you could do and how you could get started with this. I'll probably use this to show some local bots and stuff like that, running with Ollama, in the future. I do feel this is a nice, simple way to get a UI going, to get something that you can start testing. And you could then put this on something that you run in the cloud quite easily as well, so that other people could test it and you could see the results there. All right, any comments or questions, please put them in the comments below. If you found the video useful, please click like and subscribe, and I will talk to you in the next video. Bye for now.