How To Automate Data Entry Using AI - Claude 3.5 Sonnet!

Video Statistics and Information

Captions
Just this week, Anthropic released Claude 3.5 Sonnet, which is arguably the best large language model out there right now. It outpaces OpenAI's GPT-4o model across various benchmarks in coding, mathematics, content generation, logic, and more. The Claude 3.5 Sonnet model is so powerful that it can fully code out an 8-bit-style game within a couple of minutes, which you can see on screen right now.

Another area where the model excels is data. This is where most models struggle: when dealing with large amounts of data they tend to hallucinate and can't really keep track of a long context window. With the new 3.5 Sonnet model, however, data management and extraction just got way easier. As you saw from the title of this video, I'm going to showcase a practical use case with the new Claude 3.5 Sonnet model, using it as the backbone of my automation, where I'll show how to extract data with Claude 3.5 Sonnet. With the power of the large language model, it will easily elevate your data-extraction workflows.

To help me build this, I'll be using a no-code platform that lets you automate any workflow with the help of AI. This is where I'd like to introduce VectorShift, a tool I love creating workflows with. On this channel I've made multiple tutorials on VectorShift, where I automate email flows, create AI agents that can carry out various tasks, and much more; I truly recommend you watch those videos, which I'll link in the description below. VectorShift is a great framework for building AI solutions like an AI search engine, an AI assistant, an AI agent, a chatbot, and various other automations that can solve many problems in your life as well as within your business. So with that thought, guys, let's get started and showcase how you can use the new Claude 3.5 Sonnet model with the help of VectorShift to automate data extraction.

Before we get started, I'd like to introduce World of AI Solutions. This is a really big update that has launched for my channel: I've put together a team of software engineers, machine learning experts, and AI consultants, and we'll be providing AI solutions for businesses as well as personal use cases, whether that means automating certain tasks or helping with business operations. If you're interested, take a look at the Typeform link in the description below.
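The video itself never writes any code, but to make the extraction idea concrete before we open VectorShift, here is a minimal sketch of what a single extraction step looks like as a direct call to the Anthropic API. This is my own illustration, not something shown in the video; it assumes the official `anthropic` Python SDK, an `ANTHROPIC_API_KEY` in the environment, and a hypothetical `contract.txt` input file.

```python
# Minimal sketch (not from the video): one extraction pass with Claude 3.5 Sonnet.
# Assumes the official `anthropic` SDK and an ANTHROPIC_API_KEY environment variable.
import anthropic

client = anthropic.Anthropic()

# Hypothetical input document; in the VectorShift pipeline this arrives via the input node.
contract_text = open("contract.txt", encoding="utf-8").read()

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1024,
    system=(
        "You are a data-extraction assistant. From the contract you are given, "
        "extract only the duration of each client agreement."
    ),
    messages=[{"role": "user", "content": contract_text}],
)

print(response.content[0].text)  # the extracted durations as plain text
```

Each of the four LLM nodes placed later in the tutorial plays roughly this role, just with a different system prompt per category.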
So let's get started. What I want you guys to do is head over to the VectorShift website, where you'll be greeted with this nice web page, and click on the Get Started button. You'll be prompted to create an account with your email address, a Google account, or GitHub. Once you've done that, we'll move on to the next step. Before we do, I definitely recommend taking a look at the Patreon page so you can access the new subscriptions releasing this week, and if you'd like to book a consulting call with me, you can do so with the link in the description below.

Once you've signed in, you'll be taken to the Pipelines page, where you can manage all your workflows. On this page you have a Marketplace, where you can access community-built automations and various workflows created by other members of the VectorShift community. You have a storage system, an Automations section where you can manage the different automations you create, a Chatbots section where you can manage your chatbots and evaluate their metrics and analytics, and Transformations, which let you transform various kinds of workflows.

What I want you to do is head over to the Pipelines page and click on the New button, where we're going to create a new workflow for this data-extraction automation. Once you've clicked New, you'll see various templates you can use for different use cases; for example, if you're automating your email, one of these templates can help you get started with that, and there are templates for chatbots, assistants, productivity, finance and strategy, and many other areas. I definitely recommend trying these templates out, because they're an easier way to get started with the automations you'll create in VectorShift. For our use case, though, we're going to create a basic pipeline from scratch by simply clicking on this button, which sends us over to the dashboard, the drag-and-drop UI we'll use to create our automation.

In essence, we're going to use the new Claude 3.5 Sonnet model's data-handling capabilities to extract data, have the large language model structure it into a desired format, and then send it out through one of the integrations, such as Google Sheets. VectorShift will automate this full process and can run it recursively at whatever interval you want: monthly, weekly, daily, or even every minute.

To set that up, we'll first place a large language model node, then a general input node where the queries will be sent into this automation, and we'll also place a Google Sheets integration. First a query is sent in, then it's processed through the different nodes, which I'll explain later on.
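VectorShift handles the recurring schedule mentioned above for you once the pipeline is deployed. Purely as a point of reference (this is my own sketch, not anything VectorShift exposes), a hand-rolled version of that scheduling in Python with the third-party `schedule` package might look like this, where `run_extraction_pipeline()` is a hypothetical stand-in for the upload, extract, and write steps we build in the tool:

```python
# Rough scheduling sketch, assuming a hypothetical run_extraction_pipeline() that performs
# the upload-extract-write steps built in VectorShift. Requires `pip install schedule`.
import time
import schedule

def run_extraction_pipeline():
    print("pipeline run triggered")  # placeholder for the actual extraction run

# Daily at 09:00; swap in .every().monday or .every(1).minutes for other intervals.
schedule.every().day.at("09:00").do(run_extraction_pipeline)

while True:
    schedule.run_pending()
    time.sleep(60)
```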
The results are then sent over to two Google Sheets nodes (I've placed the wrong node for now). Like I said before, a typical large language model has a hard time processing large amounts of data, but the 3.5 Sonnet model can handle large amounts of data without hallucinating. To showcase this, I have a contract that is quite lengthy; it's a contract for a company that has multiple client agreements to work with or buy from the company. The document contains a lot of text and a lot of numbers, and I'm going to have the Sonnet model extract four categories: first, the duration of the agreements for certain clients; second, the limitations of liability for particular clients; third, the billing start date; and lastly, the contract value, where I'll ask it to break down the full pricing structure for each particular client. That way I'll get a good extraction of different values and figures from the Sonnet model.

To process all four of these categories, I'm going to place down four large language model nodes, one to handle each criterion in detail, which I'll explain further in the next step. For now, let's implement four large language model nodes from Anthropic, select the model to be the new 3.5 Sonnet model, give each one a system prompt, and connect them with the input node. And there we go: all four of these Anthropic large language model nodes have their own specific system prompt, each focused on extracting one particular category. For example, the system prompt given to the first node is to extract the duration of the contract. The first portion of the system prompt is exactly the same for all four; they just each extract a different category: this one extracts the limitations of liability, this one the billing start date, and lastly this one focuses on the contract value. In each prompt I simply created a variable, named "contract" or whatever you want, wrapped in brackets so it connects to the input node. I connected all of these Anthropic large language model nodes to the input node where the user query comes in, and then I connected each large language model node to an output, because you need an output for each response; for example, this extraction needs an output, and each of these Anthropic large language model nodes needs its own.

Next, we'll set up a Google Sheets node and a Gmail node, because we're going to have all of our responses output to Google Sheets as well as Gmail. Set the Google Sheets action to "write to sheet"; if you don't know where to add the integration, you can go over to Integrations and add Gmail and Google Sheets. You'll want to keep the action as "write to sheet", though other options are available. In this case you also need to connect your Gmail account. Once you've done that, click on "configure" and select a sheet; you can pick any sheet, but in this case I'm going to create a new sheet, select it, and click save.
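For anyone curious what the "write to sheet" step boils down to, here is a rough sketch of the equivalent using the third-party `gspread` library. The spreadsheet name, the service-account credentials, and the four response values are all my assumptions for illustration; VectorShift handles this step natively.

```python
# Rough equivalent of the Google Sheets node (my assumption, not VectorShift's internals).
# Assumes a gspread service-account JSON key and a sheet with matching column headers.
import gspread

gc = gspread.service_account()                  # reads the service-account credentials
sheet = gc.open("Contract Extractions").sheet1  # hypothetical spreadsheet

# Placeholder values standing in for the four LLM node responses.
duration_text = "24 months from the effective date"
liabilities_text = "Liability capped at fees paid in the prior 12 months"
billing_start_text = "2024-08-01"
contract_value_text = "$120,000/yr, billed quarterly at $30,000"

# One column per category, in the same order as the headers set up in the sheet.
sheet.append_row([duration_text, liabilities_text, billing_start_text, contract_value_text])
```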
On the Google Sheet itself, you should specify beforehand which categories you're going to extract; for example, set up these columns for your categories, and once you've set them, select them on the Google Sheets node when you configure it so that you can connect your nodes together. In the same manner, you can connect your Gmail node so it sends the result as a draft; it will basically create the draft and send it to whoever you want it automated and sent to. For example, for the node focused on extracting the duration, you click on its response and connect it to the duration column, and you do the same for all the other nodes. Make sure you set this up beforehand so you can connect your Google Sheet to each of these separate nodes. So I'm going to go ahead and do this, and once that's done, you can click on Deploy Changes, and there you go: you have a fully automated data-extraction workflow.

So let's get started and showcase how you can extract the data. After you've deployed the changes, you can deploy this as an automation or a chatbot and then run the pipeline, but in this case I'm going to showcase an example where we go over to Run Pipeline and upload the file we downloaded earlier. We send the document in and have it extract all the different components we specified beforehand. Simply click Run, and you can see it slowly working through all these different outputs; everything gets extracted to our Google Sheet, which we can see over here: it extracted the duration for one of the clients, the limitations of liability, the billing date, as well as a breakdown of the pricing structure. All of this was done using the Claude 3.5 Sonnet model, and I was able to do it in under 10 minutes. That's the capability of the new large language model as well as of VectorShift: you'll be able to automate various things across various sectors, so I definitely recommend you try this out with the link in the description below, as well as checking out our other practical use-case tutorials, also linked in the description.

But with that thought, guys, I hope you enjoyed this video and got some value out of it. I'll leave links to everything I used in today's video in the description below. Make sure you follow me on Patreon, because it's a great way to access the different subscriptions, and follow me on Twitter to stay up to date with the latest AI news and various tips and tricks. Lastly, make sure you subscribe, turn on the notification bell, like this video, and check out our previous videos so you can stay up to date with the latest AI news. But with that thought, guys, have an amazing day, spread positivity, and I'll see you guys fairly shortly. Peace out, fellas.
Info
Channel: WorldofAI
Views: 12,515
Keywords: claude 3.5, Claude 3.5 Sonnet, Anthropic, AI, Data Extraction, Automation, Vectorshift, No-Code, AI Agents, Chatbots, AI Assistants, LLM, Artificial Intelligence, AI Coding, Data Analysis, AI Innovations, How To Extract Data, how to extract data with ai, automate data entry, automate data entry with ai, chatgpt, zapier, openai, ai automation, gpt tutorials, chatgpt education, ai automation agency, ai for business, ai service, zapier tutorials, saas, saas tutorials, software business
Id: FZI4JcBDVWs
Length: 12min 31sec (751 seconds)
Published: Sat Jun 29 2024