ChatGPT was a game changer in AI. Many tools like ChatGPT have been developed
in recent years. Today, I'm going to talk about the Groq API,
a new free AI tool. Let's take a look at what we'll learn in this
video. First, we'll discuss what the Groq API is. And then we'll cover how to use this API with
LangChain. Next, we'll build an app using this API with
Gradio. At the end of the video, you will learn how
to use the Groq API. Okay, we've seen the topics we'll cover. Let's go ahead and start with what the Groq API is. As you know, latency is an important metric for optimizing real-time AI apps. This is where Groq comes in. Groq is a GenAI company that aims to reduce
the latency of LLM responses. When we look at the ArtificialAnalysis.ai leaderboard, we can see that Groq shows about 15 times faster LLM inference performance. Groq currently makes models such as Llama 2 70B and Mixtral 8x7B available via its API. The Groq API is similar to the OpenAI API, allowing you to work with language models in the same way. To test this API, you can use the Groq Playground. If you have used the OpenAI Playground, this
interface is similar to it. You can set the model parameters in the right-hand
menu. We want to select the Mixtral 8x7B model and use the default settings. To generate a response, let's write a system message: "You are a helpful assistant." After that, let's type a user message: "Tell me a joke." There you go. This is very fast, right? Let's generate another response: "Write a Python script to output the numbers 1 to 100." Let me click the Submit button. As you can see, the playground works very well.
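By the way, if you'd rather call the API from code than from the playground, the same request looks roughly like this with Groq's Python client. This isn't shown in the video, so treat it as a sketch: it assumes the groq package is installed (pip install groq) and that you plug in your own API key.

```python
from groq import Groq  # assumes: pip install groq (not shown in the video)

client = Groq(api_key="YOUR_GROQ_API_KEY")

# The same system and user messages we tried in the playground
completion = client.chat.completions.create(
    model="mixtral-8x7b-32768",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a joke"},
    ],
)
print(completion.choices[0].message.content)
```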
You can also use the Groq API to build AI apps. Let me show you how. To leverage the API, we're going to use LangChain. Let's open Colab and then start coding.

First of all, we're going to install the dependencies. Let's write !pip install, use -q for quiet mode and -U to upgrade, and then list the tools we'll use: langchain, langchain_core, and langchain_groq. For building the app, let's also add gradio. Let me run this cell. Awesome, our tools are ready.

Let's go ahead and set our Groq API key. To do this, let me click on the Secrets tab and then press the "Add new secret" button. Let's write GROQ_API_KEY in the name field. To get this key, let's go to GroqCloud
and then click on API Keys. We can create a secret key for free. To do this, let me click the Create API Key button and then give it a name. Let's say my_key. After that, click on the Submit button and then copy this key. Don't worry, I'll delete this key before publishing this video. Next, paste the API key here. Okay, we've stored the API key as a secret in Colab.

To use this API key, we're going to import userdata from google.colab. To do this, let me copy this code and paste it here. Let's assign the API key to a variable. groq_api_key = userdata.get('GROQ_API_KEY') Let me rename the variable. Nice, our API key is ready to use.
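To recap the setup so far, the first cells of the notebook look roughly like this (my reconstruction; the package list and the GROQ_API_KEY secret name are the ones used in the video):

```python
# Cell 1: install the tools (quiet mode, upgrade)
!pip install -q -U langchain langchain_core langchain_groq gradio

# Cell 2: read the GROQ_API_KEY secret we added in Colab's Secrets panel
from google.colab import userdata

groq_api_key = userdata.get('GROQ_API_KEY')
```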
What we're going to do now is create a chat object. To do this, let's import ChatGroq from langchain_groq. from langchain_groq import ChatGroq Next, let's create an object named chat. chat = ChatGroq( First, let's set our API key. api_key=groq_api_key, After that, let's set the model we'll use. model_name="mixtral-8x7b-32768") Let me run this cell. Nice, our chat object is ready.

Now, let's create a prompt. To do this, we're going to use ChatPromptTemplate. First, let's import this class. from langchain_core.prompts import ChatPromptTemplate After that, let's create a system message. system = "You are a helpful assistant." Next, let's create a text variable for the human message. human = "{text}" Now we can create our prompt using these variables. Let's say, prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)]) Great, our prompt is ready.
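Put together, that cell looks roughly like this; the model name mixtral-8x7b-32768 and the messages are the ones typed in the video:

```python
from langchain_groq import ChatGroq
from langchain_core.prompts import ChatPromptTemplate

# Chat model backed by the Groq API, using the model picked in the video
chat = ChatGroq(api_key=groq_api_key, model_name="mixtral-8x7b-32768")

# Prompt: fixed system message plus a {text} placeholder for the user's input
system = "You are a helpful assistant."
human = "{text}"
prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])
```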
To generate some text, we're going to use a chain. First, to get the output as a string, let's import StrOutputParser. from langchain_core.output_parsers import StrOutputParser What we're going to do now is create a chain to connect our components. chain = prompt | chat | StrOutputParser() Nice, our chain is ready. To test it, let's generate some text. To do this, let's call chain.invoke( and pass the input text. Let's say, {"text": "Why is the sky blue?"}) There you go. Our chain works very well.
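Here's that chain cell in one piece as I understand it; the pipe operator is LangChain's expression-language syntax for connecting the prompt, the model, and the output parser:

```python
from langchain_core.output_parsers import StrOutputParser

# Compose prompt -> chat model -> string output parser with the pipe operator
chain = prompt | chat | StrOutputParser()

# The key "text" matches the {text} placeholder in the prompt template
print(chain.invoke({"text": "Why is the sky blue?"}))
```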
What we're going to do now is build an app with Gradio. Gradio is a tool that allows you to create ML apps. First, let's import this tool. import gradio as gr To generate the text, let's create a function. Let's write, def fetch_response(user_input): After that, let's create our chat variable. Let me copy that code and paste it here. Next, let's add our messages. To do this, let me copy these lines and paste them here. After that, let's add our chain. Let me copy this variable and paste it here. Next, let's set the output variable. To do this, let's write output = chain.invoke({"text": user_input}) Finally, we want to return this output. return output Let me fix this line. Nice, our function is ready. Let's test this function. To do this, we're going to define a user input. user_input = "Why is the sky blue?" After that, let's pass this text to our function. Let's write, fetch_response(user_input) Let me run this cell. There you go. Our function works very well.

Let's go ahead and create an interface. To do this, let's write, iface = gr.Interface( First, let's pass our function. fn=fetch_response, Next, let's set the inputs. inputs="text", And then we're going to set the outputs. outputs="text", Next, let's give our app a title. title="Groq Chatbot", Lastly, we want to write a description for our app. description="Ask a question and get a response.") Nice, our interface is ready. Let's start our app with the launch method. iface.launch() Let me run this cell. There you go.

Now we can write text here: "Why is the sky blue?" We're going to click on the Submit button. The output is being generated. Here you go. You can see the generated text. Awesome. Our app works very well, and it's fast. Let's try another input. Let me click on the Clear button and write, "What is 4+4?" Let's submit this input. There you go. As you can see, it is very simple to create an app with the Groq API.
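For reference, here's the whole Gradio part assembled into one sketch. In the video the chat, prompt, and chain are recreated inside the function; reusing the chain we already built, as below, behaves the same way, and the title and description are the ones typed in the video:

```python
import gradio as gr

def fetch_response(user_input):
    # Pass the user's question through the chain built above
    return chain.invoke({"text": user_input})

# Simple text-in / text-out interface around the function
iface = gr.Interface(
    fn=fetch_response,
    inputs="text",
    outputs="text",
    title="Groq Chatbot",
    description="Ask a question and get a response.",
)

iface.launch()
```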
In this video, we covered how to use the Groq API with LangChain. Let's take a look at what we did. First, we installed our tools and then set our Groq API key. Next, we initialized the Mixtral model and
created a prompt. To generate some text, we used a chain. Lastly, we built an app using Gradio. The app we created works very well. Yeah, that's it. The link to this notebook is in the description. Hope you enjoyed it. Thanks for watching. Don't forget to subscribe, like the video,
and leave a comment. See you in the next video. Bye for now.