Building Generative UI with Next.js (Jared Palmer)

Captions
Hey everybody! My name is Jared Palmer and I'm the VP of AI at Vercel. I joined Vercel about two years ago after they acquired my build system startup, Turborepo. Many of you might also know me from my React community contributions like Formik or TSDX. Once I joined Vercel, I led the Turborepo team for a bit, and then all of our frameworks division (that's Next.js, Svelte, React, and our internal dev tools teams), but earlier this year I got AI-pilled and transitioned full-time to lead our AI efforts. Today I want to talk about building generative user interfaces with Next.js, and basically what my team has been working on for the past few months. We recently launched a product called v0, which generates code for UIs from just natural language prompts. Let's see if we can watch a video of how this works. As you can see, you type in a prompt and it streams React code with Tailwind, shadcn/ui, and your favorite tools that you probably all use. You can click and re-prompt, and it will augment the interface on demand ("make it black and white"), and when you're done, you can go over and just copy the code and paste it into your app. The idea with v0 is not to be pixel-perfect; that wasn't the goal. The goal is to get you, as the name implies, to your first version, and to the point where you can just copy, paste, and ship. In the rest of this talk I'm going to show you how we built v0 with Vercel's AI stack and how you can use these tools to build your own AI-native products. Not a surprise: v0 is built with Next.js. It's built with Server Components and Actions and Middleware and Edge Functions, and it's backed by Vercel KV. What's cool is that work on v0 has helped the Next.js team dogfood new features, many of which were announced today. Using Next.js and Vercel allowed us to take advantage of a streaming-enabled architecture. Something that was initially
designed for fast e-commerce and SaaS websites turned out to be absolutely perfect for streaming long-lived AI responses, just like the ones used in v0. As you saw in the keynote, if we didn't have streaming and tried to build a traditional blocking UI like the one over here, our users would find themselves twiddling their thumbs, waiting 10, 15, 30 seconds for a response. Gross. Streaming UIs solve this problem by displaying parts of the response incrementally, like you see over here, and this is exactly the technique we use in v0, popularized by ChatGPT. But building this isn't very easy. There's quite a bit of streaming boilerplate (if you've ever worked with streams, it's not so fun), and different AI model providers have different response payloads, so it gets even more complicated. So a couple of months ago we introduced the AI SDK, which abstracts all this complexity for React, Svelte, Vue, and Solid, and gives you the UI hooks and helpers you need to add AI features natively into your applications in the fewest lines of code you could possibly imagine. It works with LangChain and OpenAI and Anthropic and Hugging Face and basically the rest of the AI stack you're probably already using. The AI SDK helps you finish the swing and build interfaces in your actual application, not just your Python notebooks. You can see it in action in our model playground at sdk.vercel.ai.
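The incremental-display idea described above can be sketched without any framework or SDK. This is a toy illustration, not the AI SDK itself: the fake stream and its chunk contents are made up, and a real provider would yield tokens over the network.

```typescript
// Toy stand-in for a model provider's token stream (contents are made up).
async function* fakeModelStream(): AsyncGenerator<string> {
  for (const chunk of ['Stream', 'ing ', 'beats ', 'blocking.']) {
    yield chunk;
  }
}

// Consume the stream, surfacing every partial result so a UI could paint it
// immediately instead of blocking until the full response has arrived.
async function readIncrementally(
  stream: AsyncGenerator<string>,
  onPartial: (text: string) => void,
): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk;
    onPartial(text); // in a real app: update state / re-render
  }
  return text;
}

// Paints "Stream", then "Streaming ", then "Streaming beats ", and so on.
readIncrementally(fakeModelStream(), (partial) => console.log(partial));
```

The user sees something after the first chunk rather than after the last one, which is the whole point of the streaming architecture the talk describes.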
You can compare up to about 20 LLMs side by side and see the streaming in action. You type a prompt, and here we're comparing GPT-3.5 to Meta's Llama model, and you can see both stream back the response at various speeds, so there's an immediate interaction there. Once we're ready, in the model playground we can click "copy code" and grab code with the SDK that's ready to drop into our applications, whether that's the Next.js App Router, SvelteKit, etc. So let's look at this code. This is a code snippet from the AI SDK that gets generated, and it's really straightforward. First, you get these helpers called StreamingTextResponse and OpenAIStream. What we're looking at here is a Next.js Route Handler in the app directory. It receives a messages payload; you send that to OpenAI, take the stream that's given back to you, pass it through this little converter function called OpenAIStream, and return a StreamingTextResponse. That's how you get the streaming text effect; again, if you've used ChatGPT, it's very familiar. Then on the client, we switch things up and give you these helper utilities, useCompletion and useChat, depending on whether you want an instruction-style or chat-like UI. Either way, the APIs are very similar. You pass the endpoint to the hook (the default is /api/chat), and you get back your messages list, input, handleInputChange, handleSubmit: all the utilities you need to wire up a chat app. Then, in React here, we render out those messages in a list that discriminates between what's from the user and what's from the AI, and render the string (or what feels like a string). The code you just saw is basically the same code we used in the playground, and we also took that code and built a full-featured example of ChatGPT
that you can deploy to Vercel; you can interact with it at chat.vercel.ai. So why am I bringing this up? v0 is actually a fork of this application, so we really are drinking our own champagne, so to speak. v0 started as a fork of this chat app, and I'm very proud of the fact that we built this stack of both libraries and applications that we were able to extend to go to market with the product. But again, it's a fork. All of this is amazing, but it's not enough for generative interfaces. Text is pretty limited. If you ask ChatGPT or Claude a question, they respond in plain text, like MS-DOS in 1981. We knew we had to push further, and the problem we found ourselves wanting to solve can be boiled down to a simple question: what if ChatGPT could just respond with React components? That'd be pretty neat. But what would that look like? Well, we kind of have some examples of this already that we use every single day. Google Search instant answers (those special situations when you Google something and get a different UI) and the Facebook news feed are great examples of similarly generative interfaces that we wanted to draw inspiration from. In order to build v0, my team realized we were going to need some sort of rendering system like this. The naive approach would be using MDX or something like that to prompt the language model to render components, but these approaches all suffer from a bunch of different issues. For one, you have to load all the components up on the client, and that's just not going to scale. Imagine if Google were an SPA, for example, and you had to load the Chuck Norris component, the calculator component, the F1 schedule component, when all you really want to do is translate "we are so back" into Spanish. It's not going to work; not going to
make it. We're at Next.js Conf, so we're very server-pilled here, and the solution is to lift things to the server. That's what we did: Next.js and the Vercel AI SDK have re-entered the chat, and Server Components and Actions are the solution. The key unlock was bugging the Next.js team to get Server Actions to return streams. Today I'm excited to announce that we're updating the AI SDK, which was previously built with SWR, to take advantage of Server Actions. This is going to allow developers to create scalable, rich, streaming, AI-native, component-based interfaces. So let's look at how this works and upgrade the little chatbot we were just looking at to render React. You'll see the directive here: we changed that Route Handler into an action, and now it's just a function. It takes messages as an argument, and we pass that to our AI, in this case OpenAI chat. We have that same stream conversion, and then this is the key part down here: experimental_StreamingReactResponse, which is brand new. All you do is render a React component. We're bringing back the render function, which I find very, very fun, and you can do whatever you want; in this case it's just font-bold, but you can combine this with OpenAI functions and actually let the language model decide which component to render, and do all kinds of stuff. It's going to be amazing for agents and other types of interactive applications. It's very exciting. Then on the client side of the world, instead of passing the URL to useChat, you pass the action, and instead of rendering message.content, you're going to render message.
ui, and this is going to act kind of like children, but it's going to be the React component that came back in response from the LLM. So, to put this in perspective, what can you do with this? What we're saying is that now, when somebody asks "when's the next F1 race?", it's not just going to respond with text. You can decide to render an F1 schedule component, and that component can use all the tools that Next.js gives you. It can use Suspense, it can use React Server Components; it's just JavaScript, so you can render it in place: truly interactive, full fidelity. It's React. So with Next.js 14 we finally have the foundation upon which we can build this next iteration of human-computer interaction and user interfaces. Going back to v0 and watching this again, you can see this in action. We have the prompt, and we're streaming React code; in response to that prompt, the action was triggered. But one of the big differences between v0 and what we just showed is that instead of picking from a finite set of pre-made components, v0 is the next evolutionary step: it uses AI itself to construct the React components on the fly, just in time, before streaming them back to the user. The end result with v0 is fully generative, on-demand user interfaces with infinite possibilities, without compromising on performance or bundle size, which we just discussed. Today we're open-sourcing the underpinnings of that, and I'm very excited. As I mentioned, we're launching support for RSC in the Vercel AI SDK today, and we're going to continue iterating on this. I hope this talk has shown you that Next.js 14 is a great place to be and lays the foundation for the next wave of generative user interfaces, which in my opinion starts today. To get started, all you need to do is npm i ai; just follow the billboards all around San Francisco. You know, it's
funny: I look at the download stats and I'm like, "oh, it's really taking off," and then I'm like, "wait, it has billboards, so we're cheating." But anyway, to get started, just npm install ai and go. I hope that today we've shown you a first glimpse of what the future holds, and the future starts with React and Next.js 14. So thank you very much, and happy coding. [Applause] [Music]
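The talk's closing idea (let the model's function call decide which React component streams back) can be sketched framework-free. This is a hedged toy, not the AI SDK's experimental_StreamingReactResponse API: the registry, component names, and props here are hypothetical, and the components are plain functions returning strings instead of JSX.

```typescript
// Hypothetical component registry: in the real pattern these would be React
// components rendered on the server and streamed to the client.
type Component = (props: Record<string, string>) => string;

const registry: Record<string, Component> = {
  F1Schedule: ({ race, date }) => `Next F1 race: ${race} on ${date}`,
  PlainText: ({ text }) => text,
};

// Stand-in for an OpenAI function call emitted by the model (name + arguments).
interface ModelFunctionCall {
  name: string;
  args: Record<string, string>;
}

// Resolve the model's choice to a component, falling back to plain text when
// the model names something we don't have registered.
function renderFromModel(call: ModelFunctionCall): string {
  const component = registry[call.name] ?? registry.PlainText;
  return component(call.args);
}

console.log(
  renderFromModel({ name: 'F1Schedule', args: { race: 'Las Vegas GP', date: 'Nov 18' } }),
);
// → "Next F1 race: Las Vegas GP on Nov 18"
```

In the real SDK the render function returns JSX and the stream carries UI, but the control flow (model function call, component lookup, render) has the same shape as this sketch.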
Info
Channel: Vercel
Views: 8,075
Id: cIzsQBbZNxk
Length: 13min 16sec (796 seconds)
Published: Fri Nov 03 2023