Code in the Flow with VS Code’s GitHub Copilot Chat

Captions
Olivia: Hello everyone, happy Thursday and happy VS Code livestream Thursday! I'm Olivia, and as always I'm excited to host another one of our amazing guests. We have a really exciting show in store today: Harold, a PM on the VS Code team, is going to talk about GitHub Copilot Chat and its VS Code integration and show off all the cool features. If you ever attend our monthly release parties you've seen a lot of these features come out month by month, so this is a great chance to get the full lowdown on everything you can do with it. Make sure you drop any questions in the chat, because we'll be watching for those. And while you're there, let me know where you're watching from; I love seeing how spread out our viewers are.

Before we kick things off with Harold, here's a TikTok throwback Thursday: "If you're a developer who works with a bunch of different languages, you've got to try Code Runner, because it makes setting up your environment so easy. I have a simple Python app here; I can drop in some Python code and execute it just by clicking the run arrow, and 'rise and shine' prints in Python. Right next to it I have some JavaScript: drop the code in, hit that same triangle, and boom, the JavaScript executes without any issues. Even with Java: drop in some Java code, execute it, and boom, 'rise and shine' in Java. There are so many more languages you can try, so go ahead and give Code Runner a shot."

Olivia: I love those TikToks that spotlight a single extension, because there are so many extensions you don't realize you need until you see them and think, "yes, that's exactly what I was looking for." In the chat I see lots of hellos: from Germany, Doug from New York (Doug, I know you use Copilot quite a bit, so I'm sure you'll have lots of great questions), from India, Anthony in Georgia, David in Seattle, Pakistan, Sweden, California, Portugal. I love seeing how global this is; thank you all so much for being here. I really think you're going to love today's show. With that, let's bring on our guest. Hey Harold!

Harold: Hey, a good hello from California here.

Olivia: And I'm coming from South Carolina, so even the hosts are bicoastal today. How are you doing, Harold?

Harold: Good, good. I've been connecting all the demos and realized it doesn't fit in an hour, so I think I have enough material to come back.

Olivia: That's a good problem to have: so much good stuff, how do you condense it into one hour? And if you don't already know, we stream every single Thursday, so if we don't get to something you want to see more of, let us know in the chat and we'll have Harold back on for a part two. So, Harold, do you want to talk a little about what you're going to show today? We can dive into the demo whenever you're ready.
Harold: Definitely. I work on VS Code and on Copilot Chat, so everything I'm showing is from that world. We started showing some of these things at GitHub Universe, and everything you'll see today is also in pre-release: if you have Copilot, check out the pre-release version of the extension, ideally running in VS Code Insiders, our pre-release version of VS Code. A lot has landed in the last month. This is a longer iteration than usual: we normally release monthly, but this one combines December and January because we don't release right over the new year and want to give everybody time to switch off. So there's more than usual, and it's all in pre-release, with most of it shipping at the end of the month. I'll try to point out each time something is pre-release only, but if in doubt and you don't see it in the release version, it's pre-release.

The theme today is really staying in the flow. That's a big part of AI in your coding flow: it can keep you at one layer of abstraction and away from distraction. Since this is a live stream, I'm curious what AI tools people in the chat use for coding, so I can go deeper into some parts than others. If you're totally new to this, let me know too. Other than that, it will be a whirlwind of all the things. And if there's any part of the demo you want to see more of, or a use case where you're wondering "would this actually improve my flow?", I'm happy to demo that as well.

Olivia: Nice, we have a couple of responses coming in, and so far 100% of people are using Copilot. And you can mention other products; we won't blacklist you if you say something that's not Copilot.

Harold: I also still use ChatGPT and constantly play around with other tools. It's exciting what people try out with AI, and it really levels everybody up.

Olivia: And it's an important part of it, too: you see what other tools do and think, "oh, this would be really cool to bring into VS Code."

Harold: Yeah, and a lot of them have free trials available, so I'd encourage you to try them out. They all move at different paces and try different ways to use AI, and I think they're converging: the features I'll show in Copilot Chat, you'll find more and more of them in other products, so there's definitely some looking at what each other is doing. The first questions are popping in as well, but let me start by asking Copilot a question myself: "Hey Copilot, what does coding in the flow actually mean for developers?" I love using Copilot for that kind of brainstorming. It explains that coding in the flow is really this state of deep focus and immersion, and that both context and focus are crucial. I like that it picked up on coding as a concept in development: basic, but true.
Harold: So, how do I get into this flow state? Any tips? What I'm using here is VS Code Speech. It's an extension that gives you this microphone in all Copilot inputs, and that's just the first round of giving you voice input. Really check out the keybindings: you can hit a keybinding, start speaking your command, and Copilot will pick it up and reply. There are many ways to use it, for example holding a key while you speak and then handing it over, so you can configure it without clicking that microphone button every time.

Olivia: And VS Code Speech basically adds that microphone anywhere you can interact with Copilot, right?

Harold: Yes, we're trying to get it into all kinds of places. Right now the model we use handles voice input, but it could also do output, and we're looking at other ways to improve it. Some people will say, "Why would I use speech? I'm great with my keyboard, I'm in the office, why would I talk to my computer?" We saw this a lot with GitHub Next's voice experiments: there are so many people who struggle with typing all day, and it's nice to lean back occasionally and let the machine do the typing for you. Others struggle with getting their words onto the page and just want somebody to brainstorm with. If you try voice mode with chat, there's something really magical about talking with an AI and letting your thoughts flow. But accessibility is really front and center here, and that's something the VS Code team cares deeply about.

Olivia: I love the focus the team has on accessibility. And to your point about coding in the flow and people asking "why would I want to do that": the point is it's your flow, and Copilot integrates into whatever your flow is. For a lot of people, being able to speak and have these brainstorming conversations is super helpful; if it's not your thing, you can keep typing. You have the options, and you can figure out what works best in the context of how you develop.

Harold: And to pick up a question from the chat: yes, this is the full side panel you have here in VS Code, and it's GPT-4 powered. But GPT-4 is a point in time. We run experiments with other models, and we have a big offline eval suite. One of the best practices emerging in AI engineering is this scientific approach to how you actually switch models. There's a lack of understanding here: the pitch is that you just drop in a new model and unlock amazing capabilities, but that's not really how these models work. They all have their sweet spots and their weak points. In our offline evals we run against different languages and measure how good a model is at writing code, documentation, and tests, so we have these big breakdowns of how good it is at each task
and at understanding your codebase. Interestingly, GPT-4 doesn't excel at everything compared to 3.5: some models are suddenly better at C, some are suddenly better at another language entirely, which you wouldn't expect; why would one language differ so much? If you only look at the headline benchmark graphs, everything somehow improves, but the details are all over the place. You can open the logs in Copilot Chat to see which model is being used; it's pretty noisy, and I wouldn't over-index on which model is used. People like to see it because that's what they pay for with OpenAI, so they're trained to want the latest and greatest. We're doing the work in the background to pick the best model for the job. For chat we're experimenting with some more models right now, which I'll show later, and at this point in time GPT-4 is what's in the side panel, always updating. There are other factors as well, like the knowledge cutoff: the newer GPT-4 Turbo and the new 3.5 have a knowledge cutoff from around the end of last year, I think. So there are more considerations, but it's really data-driven, running benchmarks and experiments.

Olivia: So there's lots of experimentation going on behind the scenes to find the best model, and that's also why it's not in your face: it depends on the context and where you're using Copilot, even within VS Code?

Harold: Yeah. It's partially about capacity, just which models are available in the pool we're using, but a big part is the user experience, because the models have different performance characteristics; GPT-4 is a lot slower, for example. In some cases you may want that control, so we're thinking about letting people pick the model, because in chat some people really want GPT-4 to be the model. Otherwise we let Copilot Chat steer the way it needs to go.

Olivia: That makes sense. And thanks, y'all, for all your questions. This is a great example of how VS Code is developed in the open: feedback like this is super helpful, and things like letting you explicitly choose a model come from your feedback.

Harold: Definitely. And if you have more feedback, there's a button up here to report it; you'll see me and the rest of the team in there. So, looking at the big themes: eliminating distractions, staying in the flow, and using the tools effectively. That's where I want to go: how do we use Copilot to do that, and what do I actually mean by it? If you look at ghost text, it has this magical motion, and that's where I see some people initially struggle: it's a little noisy to use. Suddenly it pops in and suggests an export here, another pop-in there, so there's a moment of training yourself around it. You have to understand when these ghost text suggestions pop in, when you actually need them, and when
you feel high confidence in the task, just want to code, and Copilot gets in the way. One thing I'd recommend: the trial, if you haven't used it yet, is one month long, so you have a month to see if you can build the habit. That's where I see people move from "this is annoying, this is in my way" to having built the habit. They describe sitting on an airplane in airplane mode, Copilot isn't popping in, and they're just sitting there waiting. That's when you know you have the habit: you put yourself into airplane mode and you're suddenly waiting.

Olivia: Right, it's like typing without IntelliSense: you're suddenly wondering "where did it go?", and then you realize it and switch back to the old typing mode where ghost text isn't popping up all the time.

Harold: Exactly, and that's where it goes from annoying, "this is noisy, please give me a toggle," to something that's just in your flow and you don't even think about it. A lot of this is about habit.

Olivia: With any new tool there's a little adjusting. And to your point, if you're really confident in a task, it can feel distracting. Depending on where you are in your development journey, some beginners love this straight off the bat, while someone working in a codebase they know really well may feel it's just adding noise. It's about honing the habit to work with whatever your flow is, to make the most sense for you.

Harold: So, to kick off a new session: one of the habits is "don't switch context." Use GitHub Copilot Chat as your replacement for a search engine. In this case I can ask a question like "how do I do functional programming in React?" There are a lot of good blog posts out there you could read, but you can also put the question directly to chat without switching around. It picks up more context from what you're working on, gives you an example, shows that you can write a hello-world function component with props, and then goes into the details with extra examples, in context, while you stay in your repo. You can pick those up right away, apply them to your code, and do a few more things with them. In some cases you want a more concrete example, something you wouldn't easily get from Stack Overflow or Google: "I actually want this specific example of a stopwatch component, that's what I'm working on and I'm struggling to wrap my head around it." You're combining two concepts, and that's what AI is great at: it knows what a stopwatch is, it knows what functional programming is, so it can give you a full component in this case.
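For readers following along in text, here is a minimal sketch of the kind of functional (hooks-based) stopwatch component that request is asking for. It is illustrative only, not the exact output Copilot produced on stream, and the names are made up:

```tsx
import { useEffect, useRef, useState } from "react";

// Illustrative sketch only: a functional React stopwatch using hooks,
// in the spirit of the component discussed above.
export function Stopwatch() {
  const [elapsedMs, setElapsedMs] = useState(0);
  const [running, setRunning] = useState(false);
  const startRef = useRef(0);

  useEffect(() => {
    if (!running) {
      return;
    }
    // Resume from the already-elapsed time and tick every 100 ms.
    startRef.current = Date.now() - elapsedMs;
    const id = setInterval(() => {
      setElapsedMs(Date.now() - startRef.current);
    }, 100);
    return () => clearInterval(id); // stop ticking when paused or unmounted
  }, [running]); // elapsedMs is intentionally only read when `running` flips

  return (
    <div>
      <span>{(elapsedMs / 1000).toFixed(1)}s</span>
      <button onClick={() => setRunning((r) => !r)}>
        {running ? "Stop" : "Start"}
      </button>
      <button onClick={() => { setRunning(false); setElapsedMs(0); }}>
        Reset
      </button>
    </div>
  );
}
```

For comparison, the response discussed next turned out to use a class component rather than hooks, which is part of why Harold simply moves on from it.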
Harold: To explain what happened to that response: when you use Copilot, as a value-add you get protection against copyright infringement, and that commitment comes from Microsoft and also from the GitHub side. To get it, you have to have public code detection enabled. What happens is that a certain number of characters, I think 150 by default, are matched against public code; if there's a match, the response is filtered. You can take that last prompt and try it again to see if you get an example that isn't licensed elsewhere; what we check against is licensed code that we know about. In this case I'll just continue on, because I can get enough of an idea from the example anyway: it's using a class, it's not a functional component, so I move on. We're constantly working on this, and if you see one of these and think "why would you snip that away from me?", you can click the feedback button and we'll make sure to get some eyes on it. If you go into the settings and click the "learn more" link: it's controlled by your organization by default, so if you're in an enterprise it's a GitHub Enterprise setting. I'm in the Microsoft org, which is why it's on by default for me. We want to give you control: if you have it off, or set to "not blocking," you will in the future actually see which licenses apply, so you get an idea of how to proceed if you need to make a decision as a developer about using that code.

Olivia: So it's kind of like it gives you the information, and then you proceed with what you're comfortable with.

Harold: You could totally have expected that to happen, and it's worth talking about as a value prop. It's something I don't see in other AI tools out there, but it's really key to using AI: having that security. And we're doing more. New filters for insecure code are rolling out: if there are patterns in the code that we know you shouldn't use in production, we'll tell you. We won't block it, but you'll get notified. Licensed code is part of that too, so there's a lot of AI responsibility happening in these responses that protects your code and what you're working on, which is especially important in large enterprises, where it's an enterprise setting.

Olivia: That feels like a whole deep dive we could do. Responsible AI is very important, and everyone is trying to figure out where the onus should be: is it on the user, is it just informing them, is it blocking? So this is all really great context for people to know.

Harold: And if you're unsure why you have it in your settings, check your Copilot settings on the GitHub homepage and that should explain it as well.

Olivia: We do have a few questions. One on the license copyrights again: someone asks, "is the copyright of the license included?" Could you speak a little more about that? You mentioned seeing the license.

Harold: That's something that's already in motion: content exclusion and some other related projects are in preview, and I already see them in my own tools on GitHub, so it's being worked on. Basically, right now we have a setting that blocks and just tells you the response included licensed text, but we actually have the
license information, so we'll pull it into the UI as we go. If you don't have it set to blocking, you'll get a notification of what license the code is under and how we found it, so you can make that call. That's not in yet, but it's something we're working on.

Olivia: Awesome. Someone also asks, "when will this be in VS Code?" Harold is demoing in VS Code, so it's in VS Code; some of it is pre-release. Just get the Copilot Chat extension and sign up for the trial if you don't already have it, and you can do this exact same thing.

Harold: Basically, your starting point is to install Copilot. You'll get completions and chat, it will ask you to sign into GitHub, and then you're guided into your trial.

Olivia: And there are a couple of questions around chat history: how much history can Copilot Chat keep, and can one thread work through a whole project?

Harold: Let me give a quick UI tour, which will help later too. You can quickly switch between past chats. One thing I like is that you can pop a chat into the editor, open another one, and, now that we have floating windows, even pop it out into a new window and keep it as your assistant throughout, especially on large screens. Avoiding the big hassle of going to a search engine and reading long articles, having that window open is part of the solution. Part of the question is also how much memory it has, and that's hard to say, because it depends on how much other context you throw in. We prioritize anything you add in the moment: if you reference a file, or you have a big chunk of code selected, that will use up a lot of the AI's token window. We're continuously expanding how many tokens we allow, but we also know that models aren't good at making good use of massive amounts of tokens, so we prioritize how we push more context in. If you're really only asking questions, like I just did, and everything you see on screen is what's in context, you can have longer conversations. So keep in mind how much context you throw in on top of the conversation. And if you notice that a question went off track and you don't want to keep it in memory, you can hit the little "x" on the question to take it out of memory. People do that on ChatGPT and elsewhere too: they deeply curate what the AI knows. I can recommend that habit, so you keep a mental model of "the AI knows this question and this context, and that's what informs the answer."

Olivia: Awesome. We have a lot of other questions, so I'll pick a couple and then you can continue the demo. One follow-up on the licenses: if you were to paste in a snippet of code, would Copilot license-check it to see if it's copyrighted?

Harold: I don't think it runs on your input, because we're not trying to prevent you from working on open
source. We're trying to prevent the AI from regurgitating open source; it's not that the AI goes into GitHub and copies over licensed code, it's part of the training data. And licensing systems are messy: not every piece of code you find has a clear license file attached, and some are mislicensed, so it's not a 100%-confidence space. So the check is only on the output. But often, when you give it open source code and ask it to do something with that code, you end up with code that again looks like the licensed code, so if you work in open source you'll hit it more often, and for that case I'd recommend switching it off.

Olivia: OK, that makes sense. There's a question asking whether Copilot has the same features in Visual Studio 2022. They do have a lot of Copilot integration; I'd say definitely check out the Visual Studio YouTube channel and their blogs to keep up to date with what's released there, because it's not an exact one-to-one between VS Code and Visual Studio. Harold, anything to add?

Harold: If you're comfortable with Copilot chat and completions in VS Code, you should also feel at home on the Visual Studio side. A lot of what I show, how it answers questions and how it uses models, applies there too, because it's the same service in the back. We talk a lot about aligning the UX: if it looks very different, we did something wrong, but mostly each team is trying to make the best product in its own editor, integrating the way you'd expect, and that's where some of the divergence happens: making it work really well for users of that product.

Olivia: Cool. A couple more. Ed has a question about the sparkle icon: the little stars in front of suggestions like "how can I set up a basic Flask application." Do you want to talk about that sparkle icon real quick?

Harold: There are more of these, and I'll show some later if I get to it. Any time you see these sparkles in VS Code, they tell you there's something Copilot can generate; one other place is in the input where I was typing before. We have them in more inputs now as well, like the commit message box, which I meant to show later but can show here: if I hit it, it looks at the diff and what I've worked on, maybe past commits as well, to get context for how we write commit messages, and gives me a draft. In many cases these are generated with Copilot; in this case they're contributed by Copilot Chat as an extension, which is a sign it was developed by the VS Code team. We're trying to weave AI deeply into everything people feel is a chore. Commit messages are a good example: you either spend no time on them ("fixed this", then "fixed it again") or a ton of time.

Olivia: And "no time" is not descriptive whatsoever. You're just pushing again, and then you look at the commit history trying to find something and it's "fix again, fix again, fix again."

Harold: Right, and we give you a simple one-click way to do this. That goes back to flow:
these sparkles usually mean you don't have to do any prompting or think about anything; you can just automate away a thing that would take your time, or at least get started with a draft. It cuts the time short without the typical AI chore of going into prompting mode: you don't go into chat and write "please write me a commit message for the things I changed." Prompting became the new coding chore, and I see people with prompting experience spending a lot of time fine-tuning their prompts, where now it's just a sparkle button. You'll also see other places where slash commands cut this short. So the long-winded answer is: sparkles became the new AI affordance; you see them in Office as well. They just became the universal "AI is here" icon.

Olivia: Good eye, whoever asked that question. OK, one more question, from Doug: is the token max 8K? Do you know what the token max is?

Harold: That's always changing. There are two places where we play with token sizes: the number of response tokens we allow and the number of input tokens. Most people ask about input tokens, meaning how much context you can provide. If you open the Output panel and go to the Copilot Chat output channel, you can actually see it logged; I'll open it, though it's messy. That's where we show what context was used. We try to keep it dynamic, because the more context you pre-allocate, the more carefully you have to curate it. We're expanding it, mostly within the capacity of what we have, because a lot of users are coming in, and if we overload capacity it becomes slow for everybody. So we walk a fine line: how can we make it larger while making sure every part of Copilot Chat adapts to what you actually use? You don't want to just say, "you asked a question, let me throw in the terminal output and the current file," and then the model is confused about what you're talking about. People expect "just give it all the context and magic will happen," but that also makes the response slower; if you plotted how much token context we provide against how slow the response gets, you wouldn't want that. Going back to flow, this is where AI can be really anti-flow: you ask a question, sit there waiting for the answer, lose your flow, start thinking about lunch, maybe run off to make a coffee.

Olivia: It shouldn't be that.

Harold: Right, so we try to keep that balance. We're also trying to be more transparent about what's in context and what's not; I'll show some of the extra context being used later, and that should help people understand what's in and what isn't.

Olivia: Totally makes sense. I'll let you get back to what you had planned next. There are a couple of requests in the chat:
could you also show Copilot Chat with project-specific questions? We've been asking general questions so far; could you actually have a project open and ask Copilot Chat questions about that project?

Harold: This is my last general question, actually. Often you want a sounding board: "hey, I'm working on a project, what should I use?" In this case I asked about a web-based recipe app, mentioning that I know some Python and work with Azure Functions. Given that, it goes in and suggests Flask or Django, Jinja2 for templating, Bootstrap (in some cases it also proposes Tailwind), and gives you a list. Now I can also ask whether there's anything for this in VS Code. We have these agents, which I'll talk about more; they're faster, more specialized ways to hand over a task. Here, @vscode is an agent that has knowledge of VS Code. I gave it Python, Bootstrap CSS, and PostgreSQL, and I can continue the conversation: "how would I do this in VS Code?" The answer is grounded in the commands I actually have, and in some places it understands extensions as well. What's cool is that it recommends using a Dev Container, which is a great way (you've had it on the show as well) to use different stacks without thinking about all the pains of setup, so it even onboards you into some of that.

One last connection: if I wanted to do this whole thing from scratch and actually get coding, you don't want a big Q&A with chat; you can just ask it to do it for you. We have a new /new command in the @workspace agent: you describe what you want and you can iterate on it with feedback. In this case it knows Python and Flask. I had cleared the chat, so I didn't even give it the prior decision; it came up with its own stack and provided the basic scaffolding of what the project looks like. If you just want a proof of concept, "oh, I can build a recipe app," that's a great place to start, without having to think about all those choices and how to set it up.

Olivia: I love doing stuff like this because it takes out the brain work of scaffolding a whole project. "Copilot, can you do it for me?" It's just a lot easier.

Harold: Since people asked, let me go into the VS Code repo and ask some good questions there. First, another form factor for chat: the panel is nice, and I also like it on the other side if you want to keep chat always open, which a lot of people do. The other way is to go up to the title bar and open Quick Chat, with Cmd+Shift+I (Ctrl+Shift+I on Windows). That's a different form factor: any time I open it, the input is focused, and the last question and answer are shown, so it's a really nice way to use chat in the flow without constantly having it on your screen. In this case I'm going to ask it a question with @workspace, the agent I showed before with /new. By default, @workspace
will pull in your whole codebase. @workspace is what people usually think of as semantic search, and that is there under the hood by default, but it really looks at your question and pulls in other information as well. It knows you're working in VS Code, it has the IntelliSense information and the language server at hand, so it knows a lot about what you're working on and what you have open. If I ask "@workspace explain disposables," it won't just explain "disposables" as a word; it actually goes and looks at where disposables appear in the code: are there symbols named disposable, are there file names with disposable in them, and it gives all of that to the AI. So it understands in general that the disposable pattern is about things like database connections and releasing resources, but it also looks at the tests and the linting rules, so it can explain disposables in the context of the VS Code repo. That's one way to use it, from a higher-level perspective: if you just want to navigate a repo, if somebody in a team meeting mentioned a concept in this repo that you don't understand, do you really need to go to a wiki? Just go to your code and ask Copilot the question. What's also cool is that we have a semantic index run by GitHub, so if you're in GitHub Enterprise and use GitHub Copilot Chat on github.com, you get a similar path to ask questions against a repo. It's not really chatting about your code; it's a more powerful way to navigate and search a codebase.
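As a concrete illustration of the pattern that answer describes, here is a minimal sketch of how disposables are typically used from the VS Code extension API. It shows the same register-and-release idea the core repo uses; it is illustrative, not code from the VS Code codebase itself:

```ts
import * as vscode from "vscode";

export function activate(context: vscode.ExtensionContext) {
  // Event subscriptions return a Disposable; keeping it lets us release
  // the listener (and whatever resources it holds) later.
  const listener: vscode.Disposable =
    vscode.workspace.onDidChangeTextDocument((e) => {
      console.log(`changed: ${e.document.uri.toString()}`);
    });

  // Pushing onto context.subscriptions means VS Code calls dispose()
  // for us when the extension is deactivated.
  context.subscriptions.push(listener);

  // A Disposable can also wrap arbitrary cleanup logic.
  const timer = setInterval(() => console.log("tick"), 60_000);
  context.subscriptions.push(new vscode.Disposable(() => clearInterval(timer)));
}
```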
Harold: Let's do another, more specific one. A good shortcut here: Ctrl+L clears the chat.

Olivia: I didn't know that was a shortcut, cool.

Harold: It works in most chat windows, I think. In this case: "how does search work in the Command Palette?" I really want to understand that; I see it all the time but have no idea how the magic happens. By the way, if you haven't adopted the Command Palette in VS Code, I highly recommend it. In this case it apparently couldn't find what it was looking for, and that happens: it chains together multiple AI calls to try to make sense of things, so if it fails once, don't be dismayed. People conclude "Copilot isn't good at that" because they asked once, but I recommend just trying again and giving it another chance to find the right information.

Olivia: Especially since we have a couple of people in the chat saying it seems like @workspace sometimes can't see what they want it to see, and maybe they're not getting the answers they expect.

Harold: That's the fun part of live demos. Once you're in AI, you notice in recordings and demos how people hyper-optimize the query to find the right thing, and I didn't do that. Mishaps are expected; this is real. And that's the thing with AI and prompt crafting in general: if you give it the wrong concept or the wrong wording, sometimes it can figure it out and sometimes it will totally mislead you into something else. That's a good thing to keep in mind; understanding how to talk about things is part of the habit building for flow, understanding how prompt crafting works and how to explain things to the AI. I can also go into specific files now. I asked about command filtering overall, and you can see what the answer is grounded in; it didn't pick the best files here, so I could go back and say "I meant the search in the Command Palette," maybe find better words to describe it. But I just want to show off some of the cooler stuff we have.

Olivia: There are a couple of questions. Ed asks: is there a concept like "the rest of my code"? Sometimes my question is about where something lives in my stuff. It sounds like the @workspace agent would be appropriate for that, where you just ask "hey, where do we define XYZ?"

Harold: Totally, and you can combine the different methods. @workspace looks at your question, figures out which files would match it, and the AI picks up the right context and explains things. That really is "the rest of my code": @workspace looks at everything in your folder, not just what's open in editor tabs. We're also working on controls, for example excluding a folder from indexing; we already respect your .gitignore and your ignore settings from VS Code, so you can guide what it looks at, and there will be more fine-grained control. The other way to control it: in chat you can give it specific files with #file, down here. That's pre-release only (I remembered to say it) but will ship in the next iteration. You can give it a specific file even if it isn't open, so I can ask, for example, "what is fuzzy search?" against the filters file with that file closed. And by default, if you have text selected, Copilot will look at that; that's the surefire way to curate what Copilot is explaining: have the file open and the code selected. The same goes for inline chat: if you know you want a refactor applied to a specific piece of code, select it and hand it to Copilot. In this case it pulled in the filters file, found the different functions, and explained them. You can even combine files: "use this function from that file and implement it over here," so using AI to combine things helps too.

Olivia: Awesome. There are a couple of questions around this. Doug asks: can you use #file for more than one file, or use a wildcard, for example?

Harold: We don't do wildcards yet; that's a cool idea. I could imagine us thinking about @workspace taking a folder in the future, or something like that, but today there's no way to really guide it like that. It's all very natural-language focused, and we're trying to find the middle ground between CLI-style control with high specificity and just asking a question that should work. Right now we're at the level of explicit control over context, one file at a time.
Olivia: That makes sense. And there are a couple more questions about how to narrow down the context and how to take certain files out of it. To summarize what you said: if you know exactly what code you want explained, selecting it is the surefire way to tell Copilot to look at it.

Harold: Yes, and this is something we're actively working on as well. We just had a big team discussion this week about what should be included by default. For new users it's great that a lot of context comes in by default, but if you just want to ask a general developer question and it suddenly gets answered in the context of your current file, that doesn't help either. And the AI still struggles because developer names are all kind of weird: one of the things in our tooling is called Monaco, the open source editor underneath VS Code, and if you ask "what is Monaco?" with no context, you get very interesting answers. So do you want it understood in the context of VS Code, because you're in the VS Code repo? Do you want it to look at the current file? That's what we're working through. Selecting a file and curating what goes in is the best path, and the direction we're taking is more explicit. And any time we do something implicit, something magical, I can show it quickly. Let's open this function in the filters code. It's a cool function that probably looks like aggressive pattern matching, which makes it a fun thing to explain. I'll run "Copilot: explain this"; I know TypeScript, I just want to get an idea of the function. What you see is that it doesn't just hand over the code: there are more references in here. When you use /explain, it adds more implicit context on top. It pulls in related definitions, other parts of the file where things used by this function are defined; the function uses fuzzyScore from somewhere else in the file, so that gets pulled in too. That's smarter context expansion: there's more we need in order to answer the question. You see the same with /fix, which I can show later. Instead of what you often do in chat, copying over the error and the code and saying "fix this code," /fix also looks up the definitions that are being used and are part of the error, expands to the stack trace if there is one, and pulls in other parts of the code; and if it tries a fix and there's still an error, it retries with the new error. So there are approaches already built in. If there's a slash command for something you want to do (you can always check the available commands here), use it: don't tell Copilot "please fix this error for me" in prose, use /fix. That's the explicit way.
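To make that implicit context expansion concrete: imagine /explain is invoked on a small helper like the one sketched below. The selected function calls fuzzyScore, which is defined elsewhere in the file, so the related definition gets pulled in as extra context rather than sending the selected lines alone. This is a simplified, hypothetical snippet in the spirit of the filters code shown on stream, not the actual VS Code source:

```ts
// Hypothetical, simplified sketch -- not the real VS Code filters code.

// Defined elsewhere in the file: /explain pulls this definition in as
// related context, because the selected function below calls it.
function fuzzyScore(query: string, target: string): number {
  let score = 0;
  let searchFrom = 0;
  for (const ch of query.toLowerCase()) {
    const idx = target.toLowerCase().indexOf(ch, searchFrom);
    if (idx === -1) {
      return 0; // a query character is missing: no match
    }
    score += idx === searchFrom ? 2 : 1; // reward consecutive matches
    searchFrom = idx + 1;
  }
  return score;
}

// The function the user selects and runs /explain on.
export function rankCommands(query: string, commands: string[]): string[] {
  return commands
    .map((cmd) => ({ cmd, score: fuzzyScore(query, cmd) }))
    .filter((entry) => entry.score > 0)
    .sort((a, b) => b.score - a.score)
    .map((entry) => entry.cmd);
}
```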
Olivia: OK, cool. And does that end up using the @workspace agent when you do /fix? I see it shows @workspace on the side too.

Harold: We put it into @workspace because that's the most natural fit: it does some of the @workspace functionality, reaching out and collecting more data. To clarify, though, it doesn't use the @workspace magic where you can give it a very broad description; it's a single, automated task.

Olivia: OK, so every command kind of maps to an agent. There was one question on that @workspace thread: Tom mentioned that sometimes he feels Copilot isn't seeing everything he wants it to see, and he thought maybe it was permissions-based. Is there any instance where @workspace might not have permission to read something or pull it into Copilot's response?

Harold: If it's not ignored in VS Code, it should be in there. The ignore feature only rolls out later; that's the content exclusion feature on the GitHub side, which is already supported in beta for completions but not yet for chat. Right now, if it's not seeing something, like the Command Palette search question (which worked five out of five times when I tested it before), it's mostly because we try to expand the search query to figure out which files could match. There are several LLM calls in the backend to make that work, so the probabilities stack up and sometimes it just doesn't find things.

Olivia: OK, makes sense. And we have one suggestion: you showed the #file command to narrow things down to a file; they'd love to see it for a directory as well.

Harold: Definitely. And in the meantime you can try natural language, because Copilot sees which files were found, so you can tell it to only look at certain files. The search isn't just semantic; it also matches likely file names, so you can steer the AI toward the files you want.

Olivia: Awesome. OK, let's do some inline chatting. So much good stuff, so many questions.

Harold: We should do a livestream that's just chat Q&A.

Olivia: Honestly, we could be here for hours.

Harold: So far I've gotten through about a quarter of my demos, so I'll have to come back.

Olivia: We'll definitely have you back. This is great, though, and it's awesome to have these questions.

Harold: I showed before how you can use ghost text to type this, but raise your hand in the chat: who's using inline chat? Chat in the side bar is cool, Quick Chat is cool for asking questions, but most of the time you don't want to chat about your code; you just want to code faster with AI. In this case: "calculate the nth Fibonacci number." I can just say "make this function," make it happen; I don't want to describe it in many words. You can see it picked up that everything else in the file is documented, picked up the style, and wrote it really nicely.
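As a point of reference for the back-and-forth that follows (loop version, then recursive, then back again), here is roughly what those two variants look like. This is an illustrative sketch, not the code generated live on stream:

```ts
/**
 * Calculates the nth Fibonacci number iteratively -- the loop-based shape
 * inline chat produced first in the demo.
 */
function fibonacci(n: number): number {
  let previous = 0;
  let current = 1;
  for (let i = 0; i < n; i++) {
    [previous, current] = [current, previous + current];
  }
  return previous;
}

/**
 * The recursive variant asked for as a follow-up: shorter, but exponential
 * in n, which is why the demo switches back to the loop afterwards.
 */
function fibonacciRecursive(n: number): number {
  return n < 2 ? n : fibonacciRecursive(n - 1) + fibonacciRecursive(n - 2);
}
```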
Harold: It's important here to stay in the flow. I didn't get to show how nicely it streams in, because it scrolled up, but as the response streams you can already start thinking: is this what I want, what's my next step? A big part is that it's iterative. Don't just accept it and keep working; think about a follow-up, like "make it recursive." This version uses a loop, but maybe you want to live on the dangerous side and make it recursive, so you can do that too. Now it rewrote it from scratch, recursive, much shorter; but maybe let's make it not recursive again. You can keep making these choices as you go, but keep the session open, otherwise you lose the history of how you got to this code (that's something we're going to fix later on). It's a lot of n's now, so: "make it readable." A trick I saw people really like in one of the livestreams: this field has history too. In any Copilot chat field you can hit Cmd+Up or Cmd+Down to navigate through your previous prompts, so I escape the cursor out, pull up the last one, "readable variable names," and accept it. And you see Copilot streaming through. That streaming is beautiful; just reading along keeps me in the flow. It's something we're really tweaking: if you ever feel Copilot is too noisy or too busy and kicks you out of your flow, look at the iterations of how we show the diff and how things stream in; so much focus has gone into that experience. It shouldn't kick you out of your flow; it should be something exciting that you read along with and get into.

Olivia: We should do an ASMR session: nothing but us typing and watching the streaming.

Harold: With no typing sounds, even. And as you saw in the preview, you have full IntelliSense; you can go in and edit it, because it's still your code. That's rule number one for using AI: stay in control. Don't just think "this code looks great," which is sometimes hard if you don't know the area well, but you also get linting and IntelliSense, so we make it as easy as possible. Now, if I had written this myself... I always struggle with showing the fix features of Copilot, because Copilot isn't actually great at making errors; it's really good at making error-free code, and when it goes wrong, it's wrong in different ways but it still lints fine.

Olivia: Right, not syntactically wrong.

Harold: So this is where I'll show it. This is another sparkle opportunity, another place where you see the sparkle. If you have one of your red squiggles (and errors really are part of flow too: they're not what you're working on, you don't win at the end of the day by saying "but I fixed so many bugs while I was working on this," they just kick you out of your flow), you can go into Quick Fix and get this menu; the same menu is behind the lightbulb.
You see the sparkle and then the Copilot fix; you can also explain the error, but in this case I just want to fix it and get it out of the way. The error message itself is mostly descriptive, but the fix can't be applied right at the squiggle: it knows the fix is actually up here, because it knew this value has to be a number in this context to be correct. And the nice part, which shipped recently, is that the AI explains what's happening and gives you more context behind the fix. It's not just "here's a fix, all good, move on"; it gives you an opportunity to quickly read it, learn from it, and say "yeah, that makes sense."

Olivia: That's really great functionality, because I know we'll see people say, "I don't want to be completely reliant on Copilot, just accepting suggestions with no idea what's going on." That's why I love that the fix comes with an explanation: you're making sure it's actually fixing things the way you want and that you understand what's going on, so you keep that context.

Harold: And just to expand the idea of what you can do: these are the agents, as we call them. If you hit @, you see them; @terminal I didn't get to show, maybe in the next livestream. They're ways we namespace VS Code and give you these really nice automation tasks that work out of the box, without you having to think about prompting. The same goes for the sparkles. But you can also make your own. I definitely want people to try the chat agent example: it lives in our extension samples repository, and there's a cool blog post I can recommend, "Pursuit of wicked smartness," from last November, which talks about this agent concept, how we're thinking about it, and how to get started. It's a proposed API, so you can try it out now, and we're definitely moving fast on it. It's a really fun way to extend Copilot with your own functionality. In this case the sample repo is all about cats; it's a cat-themed agent. Going through it quickly: there's the activate function, which every extension has; it registers a handler for chat requests; the handler gets the request and a progress object to report back with when it's done, the typical stuff; and it looks at which command it got and replies accordingly. If I run this, it will be a cat: it says things in the voice of a cat and teaches you computer science.

Olivia: That's awesome, which we all needed: cat chat. Doug says a livestream walkthrough of this concept and the extensibility would be great.

Harold: That's definitely on our list.
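If you want a feel for the structure Harold just walked through, here is a compact sketch of a cat-style chat handler. The chat extension API was still a proposed API at the time of this stream and its surface has evolved since, so treat the exact names (createChatParticipant, the handler signature, stream.markdown) as assumptions based on the later, stabilized shape rather than the sample's exact code:

```ts
import * as vscode from "vscode";

// Sketch only: a minimal cat-themed chat participant in the spirit of the
// sample discussed above. API names reflect the later stabilized chat API
// and may differ from the proposed API shown on stream.
export function activate(context: vscode.ExtensionContext) {
  const handler: vscode.ChatRequestHandler = async (request, _chatContext, stream, _token) => {
    // Dispatch on the slash command the user invoked (e.g. "@cat /teach").
    if (request.command === "teach") {
      stream.markdown(
        "Meow! Recursion is a cat chasing its own tail until it reaches a " +
          "base case (a nap). You asked: " + request.prompt
      );
    } else {
      stream.markdown("Meow! Ask me to /teach you some computer science.");
    }
    return {};
  };

  // Register the participant; it is a Disposable, so hand it to VS Code
  // for cleanup on deactivation.
  const cat = vscode.chat.createChatParticipant("chat-sample.cat", handler);
  context.subscriptions.push(cat);
}
```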
So now I have selected the whole thing, and I say with inline chat: what if I don't want to have a cat? That's where usually many sample apps start: here we give you some scaffolding with an example functionality, and you kind of go in, delete it all, and start from scratch. But I want to change this agent from a cat to a QA tester that drafts a test plan, and that's maybe the idea for the whole session: I just want to make it a change. That's where you can really expand from inline chat to just making some code or rewriting some function, to giving it a whole file and telling it what to do. We saw people rewriting from gulp to another automation engine, and Copilot did an amazing job of just: I know both these technologies, I know what you currently have, and it understands from the example code that's still in there how the patterns work. So in this case it made a QA tester, gave it a subcommand "draft", gave it a prompt that you're a QA tester, and it's a lot shorter, so it removed all the cat commands as well, and it has a follow-up for saying thank you to QA, which is amazing. Yeah, you should definitely thank the QA. You should. And so I can again go in, and now you see that it actually has accept buttons on every single one of these, so basically I apply the changes, and now I can go and actually run this. I see it is a QA tester here, but it just replies to the prompt, so maybe here I even say that I want to have more context, right, it should include more context from the current file. Actually, if you were having to adapt this yourself you'd probably be doing find and replace, just going through, and you'd end up missing something somewhere, and this is so helpful. Yeah, so it's cool, it put it up now, and that's something we're working on, that's where a lot of improvements are happening lately. So, one change is up here: that's where it picks the active editor, and then where it puts the user message here, it combines the user message with the active editor, if it has one, and gets the text. So it knows where to put it, and it did it, so we can actually try this out now, it's all live. Exciting, we'd like to see it. Yeah, but I think that's definitely its own session, like how people can adopt this within a team. If you have an extension you can give to the team, like: hey, you kept asking me about this thing, and it's not really part of my work, I love helping you, but please use this extension in VS Code to get some help from AI using my personalized knowledge. Maybe there's some external API you're hooking into; this QA tester could be connected to your QA system, could be connected to issues, to your creation process, so it even follows maybe a draft template. You can think about all kinds of ways, yeah, there's a ton of potential there. You can do this, so let me select it, and then, yeah, we have draft. Nice, that's awesome. Yeah, I think it did change the image from a cat, so I don't have a QA JPEG in here. Now it looks like this is a Vue.js component, and it gives me a great draft of which Vue components I should test, the state handling, and a test structure. So yeah, that's so cool; from doing it in two minutes I think I have a really good QA tester. So this is just as an idea; the samples repo is on GitHub, I think we'll link it. Yeah.
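The change described there, combining the user's message with the text of the active editor, might look roughly like the helper below. The function name and the prompt wording are invented for illustration; only vscode.window.activeTextEditor, document.getText(), and the ChatRequest.prompt field come from the actual VS Code API.

```typescript
import * as vscode from 'vscode';

// Hypothetical helper for the QA tester's /draft command: fold the active
// editor's contents (if any) into the prompt alongside the user's message.
function buildDraftPrompt(request: vscode.ChatRequest): string {
  const editor = vscode.window.activeTextEditor;
  const fileText = editor?.document.getText() ?? '';

  const parts = [
    'You are a QA tester. Draft a concise test plan for the code below.',
    fileText ? `Current file (${editor?.document.fileName}):\n\n${fileText}` : '',
    `User request: ${request.prompt}`,
  ];
  return parts.filter(Boolean).join('\n\n');
}
```

In the demo, the chat handler then sends a prompt like this to the model, which is the part that gets tried out live.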
And I feel like, you know, a lot of people's intro to Copilot is they might just see a really quick demo where they go, oh, that's really cool, and it kind of sparks their interest in it. There's so much potential with GitHub Copilot and VS Code and that integration, there's so much you can do, from these chat agents on; we barely scratched the surface and we've been doing this for an hour, basically. So I think it's really awesome just to see all the different use cases and to start thinking about, okay, how can I really bring this into every aspect of my development workflow to make it more helpful. Yeah, it's partially about changing habits. In the beginning it will feel awkward, like you have to remember to use inline chat; we try to remind you by putting up the Sparkles, so if you see sparkles, make sure there's a trigger in your brain: yeah, I can probably do this with Copilot. And really curate everything by having a text selection: inline chat is good at picking up that I want to add stuff here, or that I intentionally selected more code so it knows to put the change on there. So it's really about being intentional about what you have selected and how you use it. But the biggest part is just changing a bit how you code, and that eventually becomes a habit, you don't really think about it anymore. You just say, I have a refactoring, I can use ghost text for that, because it's just adding code, and maybe that's even faster because you don't open inline chat; but sometimes it's just: open inline chat, describe what you want to do, see the changes coming in, iterate with the AI, and not just accept it. Absolutely. I know we're right around time, unfortunately, but you have so much to show; is there anything else that you definitely wanted to show today that we can do? Yeah, I can show more next time, but one part, just one cool piece of advice: you will again find sparkles everywhere, we now even have them in the terminal. Hit the Cmd+I shortcut, Cmd+Shift+I, and you can ask there too. Play around with the different context variables, so here are the commands, and I haven't shown too much of that, but you can even ask about the terminal selection, for example. This is an area that's actively growing, so keep playing around with different shortcuts, and make sure you check out the /help as well if you want to see more of that; we put it all in there. So that's it for now, so I'll have to come back. Awesome, no, I think that this is great. I think that /help was also a great way to kind of end, because a lot of times we'll get people being like, okay, how can I figure out everything it can do? That slash help command is kind of the best way to do it, because it will be up to date, you know, instead of looking at documentation or a video from six months ago where obviously it might be a little different; using that /help directly in VS Code is how you're going to get the most up-to-date answer. What's hard, I think, for many people, and that's a typical thing with AI, is just asking the AI what it can help with; sometimes it works and sometimes it's misleading. We now had to tell it in the prompt that it's using GPT-4, because people kept asking if it's using GPT-4, and the vanilla model itself doesn't know what it's built on, like you can't ask it how the weather is in its data center.
It just doesn't have that context, it's not trained with that information, so people expect a lot of context around what the AI knows. So mostly, if you play around and ask it questions about coding-related things, we'll all be in a much happier place, versus interrogating the AI about what it knows. And that's something we're working towards: the AI should eventually know, oh, you're asking a question about the workspace but you haven't included that workspace, and eventually it will just do it. But for now, just understanding the right commands and getting the AI to just do the task is how you stay in the flow. Okay, that totally makes sense. All right, well, Harold, this was great. I feel like it was such a great... even, I mean, right, we scratched the surface, it goes by so fast. I know, it does go by so fast, there's so much; I was like, oh, we have a whole hour, we can get so much in. But I mean, y'all asked amazing questions, so thank you so much for that, and for your suggestions too, that's what we love to hear, getting that feedback. I know Harold showed at the start that in Copilot Chat there's a little provide feedback icon that you can click, so you don't even have to leave VS Code if you're seeing any issues or you want to just provide that feedback, you can do that directly in there. Definitely cool. All right, well, with that, Harold, thank you so much for being here. I'm sure we will have you on again to do the other, probably what, like 70% of what you had planned. But we had lots of great comments: Doug said thank you so much, learned new things; we had Ed say excellent, more, more, more. So yeah, lots of great feedback here. Thank you all for tuning in, thank you again Harold for being here and showing off some amazing features of Copilot. For those of you who maybe missed the start of the live stream or only came in for parts, the good news is this will all be available on demand on our YouTube channel, so make sure to subscribe to that and then check out the on-demand, and if you subscribe you'll also know when we have Harold back on to do another live stream. And we're also available on TikTok as well, so that's a great place if you want to find really, you know, 30-second videos on some cool new Copilot features, a great place to follow us so you can keep up to date. All right, well, with that, Harold, thank you again, thank you everyone for watching, and we'll see you next time. Thank you, Olivia, thanks. [Music] Bye
Info
Channel: Visual Studio Code
Views: 29,218
Keywords: Copilot, GitHub, Code, VSCode, Flow, AI
Id: UqX2hCpgUJ0
Length: 71min 15sec (4275 seconds)
Published: Fri Jan 19 2024