Secure OpenAI Keys in FlutterFlow Applications

Captions
Hello! My name is Vlad, I'm the founder and tech lead at RushApps. At RushApps we create scalable and reliable mobile and web applications, and if you have a particular project in mind, feel free to contact us.

The objective of this tutorial is to learn how to really secure your OpenAI keys, or any API keys that are crucial for your application. Why is that necessary? API keys play a crucial role in almost any web or mobile application, especially those that pull information from third-party sources such as OpenAI. As you know, OpenAI charges you per request, and if your key gets exposed it can become a real financial burden for your business. And if you do not secure your API keys properly, it will be very difficult to replace them inside your application, so that burden can carry over for a long time. In this tutorial I want to show you how you can actually secure your API keys in your FlutterFlow applications. Let's get started!

To demonstrate how the different ways of defining API calls in FlutterFlow behave, I created a small application. It executes the same OpenAI API call each time (the "Completion" call), asking OpenAI for details about a particular meal. The prompt is "tell me about this meal", and the meal name is passed in. There are three options. "Not secure" is a plain API call defined in FlutterFlow: you click "create API call" and pass your token right there in the header. The second one, "secure", uses FlutterFlow's "deploy private API" option; later on I'll explain what is actually done under the hood and how it affects the completion call. The third one is an Edge Function in Supabase, which is essentially the same idea as the Cloud Function in Firebase used by "deploy private API", and I'll explain how it differs from the previous options and why this last approach is the better one.

To show what is actually transferred when an API call is made with each of these three options, I opened the Network tab in Chrome so we can see what is contained in each request. Let's call the "not secure" option first to see what is passed to OpenAI and how. Here it is; let me clear the tab and call it again so it is easier to follow. Here is the completion API call. First of all, the request URL shows that we are calling the api.openai.com completions endpoint. There is also an x-request-url field, which is again just the plain API endpoint, api/openai/completions. The most interesting part is the Authorization header that is passed along: "Authorization: Bearer" followed by the key we defined. Even if we store the key in a variable inside FlutterFlow, it makes no difference: it still ends up in the network traffic, because the request has to use it.
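To make the problem concrete, here is a minimal sketch of what the "not secure" call boils down to on the wire. This is not FlutterFlow's generated code; it assumes OpenAI's legacy /v1/completions endpoint shown in the Network tab, and the model name and the describeMeal function are illustrative.

```ts
// A client-side call like the "not secure" option: the key travels with every
// request and is fully visible in the browser's Network tab.
const OPENAI_API_KEY = "sk-..."; // whatever key is defined in the API call header

async function describeMeal(meal: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      // This header is exactly what shows up under "Authorization: Bearer ..."
      Authorization: `Bearer ${OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo-instruct", // illustrative completions model
      prompt: `Tell me about this meal: ${meal}`,
      max_tokens: 256,
      temperature: 0,
    }),
  });
  const json = await res.json();
  return json.choices[0].text as string;
}
```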
So, as you can see, this approach is very insecure, simply because the API key can be grabbed with network interception, as I just demonstrated. And if you download the code that FlutterFlow generates (the actual code you deploy to the App Store or Play Store) and navigate to the API calls folder, where all the API calls are stored, you can see that this Authorization header is also stored in plain text. If someone decompiles your application, they can see this bearer token, recognize that it is an OpenAI key, and use it at will.

The most important part is that the key is "baked" into the code. By "baked into the code" I mean that putting the key into the client code itself makes it much harder to rotate. It is not uncommon for API keys, even in secure workflows, to get exposed, and in that situation you want to be able to rotate the key: change it and replace it easily. When the key sits inside the client application that is deployed to the App Store or Play Store, replacing it becomes much harder, because you need to ship an update every time you want to change it.

So, as you can see, this option is not really secure. It is fine while you are testing an application, because it is a very easy way to set up the API call, and if the testing stage does not need much security, you can define it like this. But when you actually deploy your application to the App Store or Play Store, you need something much more secure, in my opinion.

Let's try something different: the private API option provided by FlutterFlow. To understand what this option actually does, I have this schema; let's take a look at it. When you toggle "deploy private API", FlutterFlow does quite a lot of work. It creates a Cloud Function in Firebase, and that function executes the API call I showed you earlier. So the client application no longer asks OpenAI directly: it asks this Cloud Function, hosted on your server, to form the request and query OpenAI; OpenAI answers the Cloud Function, and the Cloud Function returns just the answer to the app. In theory this is much safer, and it is how mobile applications generally handle API keys and execute such API calls.

Let's look at the Network tab when we call the Cloud Function created by FlutterFlow's "deploy private API" option and see whether it is really secure or whether it still needs improvement. Let me find this private API call; it is called "privateAPIcall". The request URL points to a Cloud Function, which is good, because it is not easy to tell what is actually going on inside that function.
For authorization, the bearer token of your client application is transferred, and that is also good: this bearer token confirms that the client actually authenticated into your application, so this user really logged in and the application server issued a time-limited key, the bearer token, for communicating with the backend. With this token we simply ask our server to do something, and in this case we ask it to execute an API call for us. So it looks pretty legitimate, and in theory it works like this: your server serves as a proxy for the API request, and someone who wants to steal your API key has no easy way, especially through network interception, to find it.

But the most interesting part is the payload. The payload, in our case, is the list of variables passed to this function, and as I said, we have one variable, the meal prompt, which asks OpenAI to describe some meal that is dynamic. We can see that the request passes this meal prompt, "Italian pasta", but it also passes the API key, which is not the intended behavior when we ask a Cloud Function to do something. Of course, the headers are obfuscated, so for someone relying purely on network interception it is hard to tell what is going on. However, if an attacker has access to your application, opens this Network tab, notices that clicking this button sends something to OpenAI, and sees this API key, I think they will understand that it is indeed an OpenAI key.

So we can redraw the schema a little, because the key is supposed to be stored inside the server, yet FlutterFlow sends the key along every time it asks the Cloud Function to run, and that is not the behavior we expect when deploying private API functions. Moreover, if we look at the generated code, the "create completion call", we can see that the key is also baked into the app itself, so we lose a very important part of our security: the ability to easily rotate the API key if we realize it has been exposed. We would still need to update the application if the key is intercepted. It is not obvious that this key is used to call OpenAI, but if someone decompiles your application, they might guess that it is an OpenAI key, especially if they inspect the app and see that it has OpenAI features. So I think this approach of deploying a private API is good, but it misses one important part: the key should not be passed in the payload, and it should not be baked into the app itself.
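To illustrate where the key ought to live instead, here is a minimal sketch of a proxy that keeps the key in server-side secret storage rather than receiving it from the client. This is not the code FlutterFlow generates; it assumes Firebase Functions v2 with defineSecret and a Node 18+ runtime with global fetch, and the names completionProxy and OPENAI_API_KEY are illustrative.

```ts
// A proxy Cloud Function: the client sends only the prompt, never the key.
import { onRequest } from "firebase-functions/v2/https";
import { defineSecret } from "firebase-functions/params";

const OPENAI_API_KEY = defineSecret("OPENAI_API_KEY");

export const completionProxy = onRequest(
  { secrets: [OPENAI_API_KEY] },
  async (req, res) => {
    const prompt = req.body?.prompt as string | undefined;
    if (!prompt) {
      res.status(400).send({ error: "Missing prompt" });
      return;
    }
    const openaiRes = await fetch("https://api.openai.com/v1/completions", {
      method: "POST",
      headers: {
        // The key is read on the server; it never appears in the client payload.
        Authorization: `Bearer ${OPENAI_API_KEY.value()}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "gpt-3.5-turbo-instruct", // illustrative
        prompt,
        max_tokens: 256,
        temperature: 0,
      }),
    });
    const json = await openaiRes.json();
    // Return only the answer text to the client.
    res.send({ text: json.choices[0].text });
  },
);
```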
The key should be stored on the server that executes the API call, and that brings us to the most important part of this tutorial: how to write a Cloud Function in Firebase or an Edge Function in Supabase that addresses the concerns I mentioned earlier, with the key kept safely on the server in an environment variable, while the function executes the call and returns OpenAI's answer to your client application.

Writing the Cloud Function in Firebase ourselves is not part of this tutorial. I actually think the FlutterFlow team did a very good job generating those Cloud Functions from the definitions you provide in your API calls. The only thing missing, in my opinion, is the use of Firebase's secure key storage. The key should not be passed in every request, even to the Cloud Function. In my opinion it would be better if FlutterFlow limited this functionality a little and said explicitly that you need to add the key to Firebase's secure key storage, with the generated code reading the API key from there. Obviously that would be more time-consuming for you as a developer, but it would be much safer for your API keys. Coming back to writing Cloud Functions, I can recommend a couple of articles on how to write Cloud Functions in Firebase: this one and this one. These two articles explain how to create a Cloud Function in Firebase, deploy it, and then easily call it from FlutterFlow. I will put links to those articles in the comments on YouTube, so feel free to read them and create the Cloud Functions yourself.

Now let's come back to our idea of creating an Edge Function in Supabase. It is essentially the same as a Cloud Function, but it has quite a few prerequisites, which I covered in the article on our website, rushapps.io, where everything you need to install before creating the Edge Function is explained in much more detail. You will need the Supabase CLI, and to use it you will need to install Docker. Docker is an application that runs so-called containers, small virtual machines that hold all of the applications needed to run your code. If you are on Windows you will also need Ubuntu, a Linux distribution available in the Microsoft Store, because Docker will not work without Linux. Third, you will need Node.js, because some of the packages rely on Node to run. Finally, you will need your OpenAI key from the OpenAI API dashboard; I have already received mine and placed it in an environment variable used by the code.
Another very important thing I want to mention: take a look at the video provided by the Supabase team. It is really important for understanding how to prepare your environment to deploy an Edge Function, and it also shows how to create an Edge Function that asks OpenAI to do something with a provided prompt. I will put that link in the description below as well; it will help you understand in more detail what the code in my Edge Function means.

Let's come back to the code. I am using Visual Studio Code as my IDE, and if you use it too, please install the Deno extension so the editor understands the code shown here. Create the project as described in the Supabase team's video I mentioned: create a folder on the disk where npm is installed and run the command supabase init. It creates a supabase folder with the config for your local Supabase development environment. After running supabase init, run supabase functions new followed by the name of the function, in our case openai. This creates a subfolder with an index.ts file for the function, prefilled with some information: a sample curl command, comments, a setup guide, and so on. That is how you get all the prerequisites for your local Supabase Edge Function.

Now let me explain what is going on in this code. I have already written it, and you can access it through the article on our website, rushapps.io, which I will link in the comments below; here I just want to briefly describe it. The imports look a little different from what you are used to seeing: they are Deno-specific and reference each resource by an actual URL, and the Supabase team's video explains in more detail why they look like this. The other imports you can simply leave as they are. The part with the CORS headers and the CORS-handling function is very important for production. You can write your function without CORS handling, because when you run the function on localhost, CORS does not check your request. However, if you try to invoke the Edge Function from your phone, or from any device on the internet, it will not work and will throw a CORS error. To avoid that, I included a part that answers the preflight "OPTIONS" request, while all the necessary information is passed in the actual "POST" request handled below.

The main logic of the function is here. In the completion config you specify the model, the input variable defined here, the max tokens, and the temperature; zero means the output will be much more deterministic and the model much less creative. Then this part makes the actual request and sets the authorization token.
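Condensed into one place, the Edge Function discussed here could look roughly like this. It is a sketch along the lines of the Supabase example rather than my exact file: it assumes the Edge Runtime's built-in Deno.serve, a plain fetch call instead of the openai client library, a secret named OPENAI_API_KEY, and a body field called "query"; the model name is also an assumption.

```ts
// Shared CORS headers so the function can be invoked from any device, not just localhost.
const corsHeaders = {
  "Access-Control-Allow-Origin": "*",
  "Access-Control-Allow-Headers":
    "authorization, x-client-info, apikey, content-type",
};

Deno.serve(async (req) => {
  // Answer the CORS preflight request, then handle the real POST below.
  if (req.method === "OPTIONS") {
    return new Response("ok", { headers: corsHeaders });
  }

  const { query } = await req.json();

  const completionConfig = {
    model: "gpt-3.5-turbo-instruct", // illustrative
    prompt: query,
    max_tokens: 256,
    temperature: 0, // more deterministic, less "creative"
  };

  // The key comes from the environment (.env.local locally, a secret in production);
  // the client never sends it.
  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${Deno.env.get("OPENAI_API_KEY")}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(completionConfig),
  });

  return new Response(JSON.stringify(await res.json()), {
    headers: { ...corsHeaders, "Content-Type": "application/json" },
  });
});
```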
For the authorization token it grabs the OpenAI key that you need to create as an environment variable. To create it, add a .env.local file, place your OpenAI key there, and use that variable name in your code; a little later I will show how to add the same key to your production Supabase project. And that is basically it: this request is quite simple. In your case the requests will probably be much more complex and need much more code, but you can just copy and paste this code into your IDE.

To run it, you need the command supabase start. Before running it, make sure Docker Desktop is open, then run supabase functions serve openai (or the name of your function). You will see something like "serving functions on localhost/functions/<name>". To check that the request executes, copy the sample curl command and run it in a bash shell; as you can see, it executes, so we know the function works locally.

Now we need to deploy it, which takes several steps. First we log in to our Supabase account, then we link the project we want to deploy the function to with this local development setup, and then we deploy. When you run supabase login, it opens the browser and logs you in automatically; in my case the project is already linked, but in your case you will just need to open your browser and allow Supabase to log you in. Actually, as far as I remember, there is an easier way for the next step: just type supabase link and select the project. We can skip the password, and now our local Supabase project is linked to our production Supabase project. Let's deploy the function: supabase functions deploy followed by the name of your function, in our case openai. It has been deployed, and we can confirm that in the Supabase dashboard by navigating to Edge Functions: we can see it there, along with its URL.

One last thing you need to add: as you remember, we passed our OpenAI API key through an environment variable, and for our code to work in production we also need to set up the same variable name in our Supabase project. To do that, navigate to Project Settings, then Edge Functions, enter the secret name, and provide your OpenAI secret there.

The last part of this tutorial is to show how you can create a custom action to invoke this Edge Function. The code is provided here, and you can grab it on our website; I will just explain how it works. Basically, it works just like the example provided by Supabase: you call supabase.functions.invoke with the name of the function and the body of the request. In our case we do exactly that: we use the Supabase client's functions.invoke("openai"), and in the body of the request we pass the variable that we set up here.
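The custom action in the video is written in Dart inside FlutterFlow, but it mirrors the supabase-js example; a TypeScript sketch of the same invoke-and-parse flow might look like this (assuming the function returns the raw completions JSON as above; the project URL, anon key, and the askAboutMeal name are placeholders).

```ts
import { createClient } from "@supabase/supabase-js";

// Placeholder project URL and anon key; the logged-in user's bearer token is
// attached to the request by the client automatically.
const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR_ANON_KEY");

async function askAboutMeal(mealName: string): Promise<string> {
  const query = `Tell me more about this meal: ${mealName}`;

  // Only the query travels in the payload; the OpenAI key stays on the server.
  const { data, error } = await supabase.functions.invoke("openai", {
    body: { query },
  });
  if (error) throw error;

  // The function returns the raw completions JSON: the answer sits in
  // choices[0].text, which is why the custom action reads exactly that field.
  return data.choices[0].text as string;
}
```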
As you remember, that query variable carries our argument, the prompt, and the prompt is based on the meal name plus some text along the lines of "please describe this meal", with the meal being dynamic. In the next part we just parse the answer: the OpenAI response is of type FunctionsResponse, and to get the actual JSON returned by OpenAI we access its data and then parse it. To understand how that works and why we take the first value of a list and then its text, let's go back to the code and look at the answer that was returned; let me make it a little bigger. As you can see, the JSON has the following structure: it has "created", "model", and "choices". Your answer is provided in the "choices" parameter, and since it contains just one value, we access the very first item in that list (it is indeed a list), and inside that list item there is a "text" parameter, which is why we access it explicitly with this construction. Then we return the text as a String, and if there was an error we simply catch it. You can copy this code and try it; let me show how it is connected as an action here. We call the custom action and provide a text combination as the prompt: we explicitly say "tell me more about this meal" and append the meal name, so in our case of Italian pasta the final prompt will be "tell me more about this meal: Italian pasta".

Finally, let's navigate here and call the Edge Function. Let me clear the Network tab again, because there were a lot of requests. We can see that it calls the openai Edge Function, we have the result, and it is displayed inside the app. Let's see what is going on: here we can see that the payload safely contains only the query we want to send to OpenAI, and the headers look just like those in FlutterFlow's private API deployment: we have the resource name (our Edge Function), a POST request, and everything else is the same. We just pass the bearer token so that Supabase understands that this action was performed by a user who logged in before.

And that's it: that is how you can create secure Cloud or Edge Functions, in our case to execute API calls such as the OpenAI one, and I hope this will allow you to save a lot of money and time when using API keys like OpenAI's, because they cost a lot of money. I hope you liked this tutorial, and if you did, please click the like button below and subscribe to the channel. We are planning to post many more videos on FlutterFlow, AppSheet, and other services that can help you enhance your business operations. And if you have a particular project in mind, feel free to contact us. Thank you, bye!
Info
Channel: RushApps
Views: 463
Id: MJSliww60e4
Length: 37min 41sec (2261 seconds)
Published: Fri May 03 2024