WSL and VS Code: code on Linux on a Windows laptop 💻

Captions
All right, hey everyone, welcome to the VS Code live stream. My name is Sana Ajani, and I'm a program manager on the team. I know normally I'm behind the scenes and you're probably expecting the ever-so-charming Burke Holland to be hosting this right now, but I'm filling in for him today, and I'm excited to host this live stream and bring on our guest. A couple of reminders first: join us on these live streams. If you go to code.visualstudio.com/livestream, that's where you'll see the list of live streams we have upcoming. I know we have a release party coming up next week or the week after, so stay tuned for that. Also, make sure you subscribe to us on YouTube so you get notifications about all the live streams and videos we're posting. And lastly, join us on TikTok, because that's where the real party is; that's where we host a lot of our short tips and tricks for getting the most out of your VS Code experience. So without further ado, I'm going to invite another program manager, Craig from the WSL team, to join us. Hello! How's it going? I'm good. So you're on the west coast, right? Yep, I'm all the way over near Seattle. All right, well, thanks for joining us bright and early. So tell me a little bit about what you're going to talk about today. So I'm going to be talking about how you can use the Windows Subsystem for Linux, or WSL, to do a really awesome Linux-based workflow inside of VS Code using remote containers. We're actually going to be teaching an AI, well, the AI is going to teach itself, how to play Atari's brick breaker, which is really cool. So just a quick primer for some people who might be new joining us: can you give us a quick primer on what WSL is? Yeah, so WSL stands for the Windows Subsystem for Linux, and it's a really powerful way for you to use Linux workflows, applications, command-line tools, and utilities all directly on your Windows machine. And where it's really
different from a traditional VM setup or dual booting is that it has a really high level of integration between Windows and Linux, so you can access everything back and forth and share things like even your GPU, which is what I'll be demoing today. All right, so truly cross-platform development. Exactly. Okay, cool, so I'm going to go ahead and get your screen up; take it away. Awesome. All right, so I'm jumping in over on my screen. I've opened up Windows Terminal, which you can check out here; if you haven't seen it before, it's a really awesome way to interact with the command line on Windows. Here I have a dropdown that shows all of the different shells I've installed, like Command Prompt and PowerShell. The one I'm looking at today is Ubuntu, so I'm going to open that, and from here I'm going to go to this folder, which is where we're actually going to be doing our coding today. It's called deep-q-learning, because we're going to be teaching a machine to play Atari using an algorithm called deep Q-learning. If you're not an expert on AI and ML, that's totally fine, you don't need to be; we just want to do a fun example. Normally I could go in here and do things like ls, or I could vim in and start editing stuff, but I really want to use VS Code because I absolutely love its editing experience. All I have to do is hit `code .` from here, and a lot of magic happens: we basically get set up using VS Code Remote. How we know that is, if I take a look at the bottom left, you can see this magical "WSL: Ubuntu" indicator that shows I'm actually running directly inside of my Linux instance; however, everything on the UI side is running on Windows itself. Why that's important is because all of my extensions and any visual themes I might use are all consistent across Windows, across Linux, even across all my different Linux distributions. So where
this really matters for me is that I actually have the Vim extension installed, and it lets me use keyboard shortcuts to program really fast. For example, if I wanted to select this line and then select ten lines up, all I have to do is hit some quick keystrokes without touching my mouse, which is why I use it. It basically transforms your keyboard to work in a different way, and that's consistent for me now across Windows, across Linux, across any project I'm using, which is awesome. And if I actually take a look at these files, it's probably going to be really tiny on your screen, so let's make it a little bit bigger so you can really see it up close: you can see that the path for this file is a Linux path, it's under /home. This is actually really Linux. Then if I hit Ctrl+` on my machine, that opens up the integrated terminal, and I get the exact Linux console I had before, which is pretty sweet. So it's a really powerful way to work directly in a Linux environment. What we're going to do is take it one step further: I'm going to use development containers to do my work. Why? Because AI and machine learning is really cool, but it's something I don't do that often, and I don't want to have all of this installed on my machine for the rest of time. I would rather package it up into a container, do my development in the container, and then afterwards just delete it and have a really clean base environment, which sounds preferable to me. And literally, to get this project set up to do that, all I have to do is hit Ctrl+Shift+P to open my command palette and hit "Reopen in Container". From here, the same kind of magic that happened when we ran VS Code Remote is going
to happen here: it's going to start up a container, and I can hit "show log" and see what it's doing. It looks really complicated, but at the end of the day, all it's doing is starting up this exact same environment, so everything looks exactly the same (we'll close these recommendations), but now we're inside an actual container. You can see that the path changed, because I changed it inside of my container, and when I hit Ctrl+` to open up my terminal again, I get this cool TensorFlow logo and I'm the root user. So I basically just changed where I'm developing, and this container has all the tools set up that I need, all the really intensive tools: it has CUDA, it has TensorFlow, it has everything I need to go develop my project without worrying about it actually being on my machine. And what's cool is I can go back here and run `docker container ps`, which just shows me what containers are running, and you can see this VS Code container running right here. If you haven't seen this before, with the power of WSL I can run `powershell.exe`, so now I'm in Windows land, the exact same as going to this tab, and then I can run `docker container ps` and I see the exact same container. So no matter what workflow I'm using with WSL, I can enjoy using this on Windows or on Linux; it all works exactly the same. Even if you're not really comfortable with the Linux command line, or with using a Linux shell, you can still boot up into remote containers really easily using Windows tools like PowerShell. Now I hit exit and I've jumped back to Linux in this terminal. But let's jump back over and actually run this. Before I jump into how everything works, let's do what every great developer does and just hit F5 and see what happens. So I get my Python file debug info, I hit F5, it starts a debug session here,
and then you're going to see a lot of text spew out in the output. This is because we're actually setting up WSL to use your GPU through TensorFlow. And here we go: we have Atari, and we have an AI playing it. I swear I'm not playing Atari, my hands are up here, and it's actually quite good, which is really cool. What's actually going on here is we have an app running inside of a Linux container, which is running inside of WSL, and I, on Windows, am remoting straight into that container and running my app there. On each frame, we interpret what is happening on the frame, and we choose an action for it. So at a base level, the AI is saying, "Hey, I see that I'm about to miss the ball, I need to output the action of moving left." And if I go and take a look at my system's Task Manager, you can see that my GPU usage has gone up; I'm actually stressing the GPU here, because I'm using my Windows GPU to power this in a performant way, which is really powerful if you wanted to actually do training on this machine, which you can do as well. And the last really cool part here: let's take a closer look at this window. For any Linux fans out there, you might notice that this window has GTK-based title-bar decorations; this isn't Windows styling. Why is that? It's because we're actually running this as a Linux GUI app; WSL has support for Linux GUI apps. I can show you really quickly what that looks like: if I run gedit, which is not installed, that's fine; if I run xeyes, the old standby, then you can see I have a Linux app here running straight from Linux, with Linux-style decorations at the top, and I can run that directly inside of WSL. This is now supported by default on Windows 11.
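The per-frame decision described above, look at the screen and choose an action such as "move left", is typically implemented as an epsilon-greedy policy over the network's predicted action values. A minimal stdlib-only sketch; the action names and function are illustrative, not the project's actual code:

```python
import random

# Breakout's discrete action set (illustrative labels)
ACTIONS = ["noop", "fire", "left", "right"]

def choose_action(q_values, epsilon):
    """With probability epsilon explore (random action index);
    otherwise exploit the action with the highest predicted Q-value."""
    if random.random() < epsilon:
        return random.randrange(len(q_values))
    return max(range(len(q_values)), key=lambda a: q_values[a])
```

Early in training epsilon is near 1.0 (almost pure exploration) and is decayed toward a small value as the agent improves.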
And what I've done is I've actually piped that information into this container, so I can access that same server. Okay, now the point is: how did I do this? What did I do to set this all up? Well, there's a .devcontainer folder here that has all the magic. If I go into it and look at devcontainer.json, you can see the setup I use for this development container. I have my Dockerfile, which is referenced right here, and I have my settings: what arguments I want, what my workspace folder is. I even specify the extensions, so the Python extension is installed by default; let's take a look, you can see Python is installed here by default, and my Vim extension is still installed, so all my themes are the same, which is awesome, even when I'm using different debugging tools. Then there are also mounts: we've actually mounted the exact same folder from my Linux instance, which I'm used to using, into this instance here. For example, I'm just going to go in, type "hello world", and save it as some file.txt. That just made a file inside of my Linux instance, and I get access to it here right away; and of course, any changes I make here will automatically change my source folder inside of my Linux distribution. Then the last one here, which I put in as well, is a bind mount; you'll see that it says X11-unix. There's actually a Unix socket that we use to forward graphical information, to facilitate GUI apps, and I forwarded that socket into the container too. So it's sharing the same socket as my Linux instance, and any time this container says "give me a graphics app," it forwards that to my actual WSL instance, letting me run graphical apps from here. And the last part of this: if I take a look at the Dockerfile for this container, you can see that I used the automatic setup, which is really cool.
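A minimal devcontainer.json along the lines described here might look like the following; the exact image, run arguments, and mount paths are assumptions for illustration, not the stream's actual file:

```jsonc
{
  "name": "deep-q-learning",
  "build": { "dockerfile": "Dockerfile" },
  // GPU access for the container (assumes a CUDA-capable Docker setup)
  "runArgs": ["--gpus", "all"],
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python", "vscodevim.vim"]
    }
  },
  // Bind-mount the X11 socket so GUI apps in the container
  // are forwarded to the WSL instance's Wayland/X server
  "mounts": [
    "source=/tmp/.X11-unix,target=/tmp/.X11-unix,type=bind"
  ]
}
```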
You can basically hit Ctrl+Shift+P and, if you search "remote", find all of the really cool dev container commands there; I used that setup to automatically create this, and then I install some specific things at the bottom here to set up my environment. And the last... oh, one thing that I forgot that's important: this project is based on someone else's code, from the user boyuanf on GitHub, the deep Q-learning project. I basically took that, cloned it, and added some changes. For the machine learning fans out there, I changed it to a dueling Q-network. If you're not a machine learning fan, that's totally fine; basically, all you have to know is that this code is interpreting that image, taking a look at it, and then determining the advantage: what's the advantage of me taking an action right now, and what's the output I should produce? But that's not fun, let's actually debug this. So I'm going to go here, uncomment this, comment this, and then let's set a breakpoint somewhere, why not here, and we'll see what happens. If I hit F5, I'm going to debug my Python file, and we're off to the races. Again, this is going to take a little while to start up because we're passing through the GPU... there we go, it went quite quick. You can see my GPU devices here, and what's cool is that I'm in a remote container, so I'm not technically on my machine, but I get the exact same experience as on my machine. I can go in, I can step through anything, I can see full debug information here: my score, the model, everything. I can debug this exactly as if it were running on my machine, which is really powerful, because this isn't something I want to run on my machine all the time; it might be something I want to run somewhere else for heavy training. So I can go ahead, debug this, and step through the whole process.
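The dueling Q-network change mentioned above splits the model into a state-value head and a per-action advantage head, then recombines them; subtracting the mean advantage keeps the two heads identifiable. A framework-free sketch of just that combining step, with illustrative numbers rather than the repository's code:

```python
def dueling_q(state_value, advantages):
    """Combine a scalar state value V(s) with per-action advantages A(s, a):
    Q(s, a) = V(s) + (A(s, a) - mean over actions of A)."""
    mean_adv = sum(advantages) / len(advantages)
    return [state_value + adv - mean_adv for adv in advantages]
```

In the real network V and A are the outputs of two small fully connected branches; this function is just the final merge they feed into.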
And I can run everything that I need to. Where it gets exciting for the last part is that I can actually package this entire project into a Dockerfile, because it's already using containers, and then go ahead and toss it up into Azure. So I put this into the Azure cloud; you can see it training here, running on a beefy computer that's using about 13 gigabytes of memory right now. I can go ahead and take a look at the containers (the UI is going to be a little small, but that's okay) and connect to this container running in Azure, and it's doing the exact same workflow. It looks super similar because it is. If I do `tail out.log`, that command just shows me its output, and you can see the exact same output we had before on my VS Code instance here: my step, my episode number, the score, the loss I'm seeing on my container, the time it's been running, et cetera. This is actually how I trained this instance: I didn't run it entirely on my computer. I ran it locally to verify it, and I ran some other smaller models on it, but when I was ready to do this on a really beefy computer, I was able to easily push it to Azure and run it there entirely in a container, incredibly easily. And the last part here that's really cool is that there are extensions I don't even have installed on my normal VS Code. For example, because I don't do machine learning and AI all the time, I don't have TensorBoard installed on my normal VS Code instance; however, in this remote container I do, so I can install other extensions that I wouldn't normally have. We're going to open up TensorBoard. TensorBoard is just an application that shows you information about your running sessions or your past sessions for machine learning.
It does things like: this list here is my list of sessions that I've run and trained locally, and I can do things like change the smoothing of the curves (I'm going to make it really smooth), change the horizontal axis to see when I did everything, et cetera. This lets me go in and take a look at this information really quickly. It's a really cool extension, and I don't have it installed normally in VS Code, but because I'm running in a remote container, VS Code is smart enough to say, "Hey, I see that you're running this awesome workflow, let me make this recommendation to you," and run TensorBoard just like it would normally. So at the end of the day, it runs exactly like it would on my actual machine, very similar in terms of performance, being able to run things, and even running Linux GUI applications, which is super cool, because this machine learned how to play Atari Breakout on its own. We have some time, so I just want to give a quick overview of what's actually happening, because I think it's super cool. This is a model that was developed by Google's DeepMind; it's part of a research paper. You can see that the first layer takes in a picture of the screen; it's actually taking four pictures stacked on top of each other, so it can get an idea of what's moving where, and then we're doing all this crazy math inside of this "brain" to output our actions. What's really cool about this is that most machine learning is done where you have a training data set and a model: you show it, "this is a dog," and it goes, "okay, I got it, that's a dog"; "this is a cat," "this is a car," and over time it learns based on you telling it. The cool thing about this algorithm is that it does everything based on reward.
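The four-frame input just described, stacking consecutive screenshots so the network can infer motion from a single input, is commonly kept in a fixed-length deque. A small stdlib-only sketch, with strings standing in for actual frames; the class and names are illustrative:

```python
from collections import deque

class FrameStack:
    """Keep the last k frames; the stacked tuple is the network's input."""
    def __init__(self, k=4):
        self.k = k
        self.frames = deque(maxlen=k)

    def reset(self, first_frame):
        # At episode start, repeat the first frame k times.
        self.frames.clear()
        for _ in range(self.k):
            self.frames.append(first_frame)
        return tuple(self.frames)

    def step(self, frame):
        # Each new frame pushes out the oldest one.
        self.frames.append(frame)
        return tuple(self.frames)
```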
So what it's actually doing when it steps through this code is: we get what action we should do. At the beginning it starts off totally random, and then over time, as it learns, it learns how to play the game better. Then we get a reward from our environment: based on our actions, we get points, and the reward in this case is just the score. It's literally as if you were playing brick breaker, moving around with no idea what was going on, and then you saw the ball hit a brick at the top and it gave you a score of 10; you'd go, "oh, I get it." That's what we're simulating here with this AI: no one is teaching it how to play this game; it's learning entirely on its own, just to maximize that score value at the top, which is really cool. And while we're on the train of explaining how things work, I'll really quickly go into how this all works behind the scenes for WSL, because it's also super interesting. Basically, what we're doing here is this diagram. This is the architecture of the Windows Subsystem for Linux, specifically for WSL 2 distros. Our hypervisor is at the bottom, so we actually run in a virtual machine; everything we're doing right now is using virtualization, and this hypervisor runs below the OS on Windows. Here's our Windows OS, the Windows kernel and the Windows user-mode space, and here's our Linux instance: this is everything that's running in Linux. The magic here is this green arrow. In WSL we basically have a lot of different connections, so you can access files, you can do that thing I showed of running powershell.exe directly from WSL, or Docker, which attaches to both; it makes it really easy to move in between. And VS Code is part of this family too: you saw how easy it was for me to move between Windows user mode (the actual VS Code window is on Windows) and the back end, where everything that was working was all inside of Linux.
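The reward-driven loop described above, act, observe the score change, and nudge your estimate of that action's value toward it, is the core Q-learning update. A tiny tabular sketch with made-up numbers; deep Q-learning replaces the table with the neural network, but the target it learns toward has the same shape:

```python
def q_update(q, state, action, reward, next_q_values, alpha=0.1, gamma=0.99):
    """One Q-learning step: move Q(s, a) toward
    reward + gamma * max over a' of Q(s', a')."""
    old = q.get((state, action), 0.0)
    target = reward + gamma * max(next_q_values)
    q[(state, action)] = old + alpha * (target - old)
    return q[(state, action)]
```

With alpha=0.1 and a reward of 10 from a fresh table, one update moves the estimate a tenth of the way to the target, and repeated play keeps refining it.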
So here's the really complicated diagram; I'm going to put this up, and you can pause and take a look at it later. Basically, all you need to know is that we have this virtual machine, the exact same as before, and all these distributions are running inside of the same VM. Then, for GPU compute: normally you have data talking to a machine learning algorithm, which outputs a trained model, and that machine learning algorithm depends on a framework. An example you saw a lot is TensorFlow: TensorFlow is the framework; I called into TensorFlow to make my model and to train it. That framework, like TensorFlow, depends on a hardware acceleration API; I mentioned CUDA as one, or DirectML. These talk to your graphics card, which is the last step here. So we actually changed this middle layer: we changed how Linux talks to your graphics card, and we made it possible, if you're running in a WSL instance, for calls into CUDA or DirectML to go directly to your Windows GPU. And it's all shared: I didn't have to pass through my GPU (I only have one), and I was able to train fully off of it. The last one is Linux GUI apps. I showed GUI applications working; how does that actually work behind the scenes? What's the magic behind that? So, you remember this diagram, the really complicated one? Okay, let's make it easier; I got rid of a bunch of stuff. I'm going to run through all of this really quickly, by the way, and feel free to go back and take a look for more in-depth explanations and views of it. Basically, all you have to know here is that we have Linux on the right and Windows on the left. I'm going to simplify this a little more and get rid of all the other info. What we've done is we've made a system distro in here; this is kind of like a companion distro to your regular one. So when I
opened up Ubuntu and ran it, we had another mini distro running way behind the scenes. This mini distro runs a Wayland server in it; Wayland is just a way to have Linux GUI apps working on your machine, basically a server to facilitate GUI apps. When I go and run an application in Ubuntu, like the Atari brick breaker in this case, or Nautilus in this example, it goes and talks to this Wayland server. And that's really the main thing you need to know: that Wayland server then uses the power of WSL, like that green arrow we talked about before, to use an RDP connection over a Hyper-V socket, and it talks to mstsc.exe, the same program you use to remote into your other machines. So we remote the Linux GUI app onto your machine, and that's what controls the rendering and everything on the Windows side, and then bam, you get your Linux app rendering right alongside your Windows apps. So that's the simplified version. Oh, and the last part of this is that if you go ahead and run other WSL distros, like Debian here, or other containers and things like that, you will get more system distros; and when we ran in our container, we basically piped out to access the Wayland server inside of that distro. So here's the actual version (the other one was simplified); if you take away anything, just take away this, for people who are really interested. These are all the bits and pieces: we have a Weston server talking to an XWayland server to host X11 apps, we do audio with PulseAudio, and that talks over FreeRDP to communicate with mstsc.exe. So feel free to pause and really take that in after the talk. And that is the highlight of everything that I wanted to show you, the A to Z of how to run brick breaker. Awesome,
thank you so much, Craig. I should have paid more attention to my OS architecture class in college, but that was a really good overview of what's happening underneath: you're having this fairly native experience, and it feels very easy to connect to all these remote environments, but there's a lot happening behind the scenes. Yep, totally, and it's really cool; as you said, it's fairly native. Everything that I did would look exactly the same as if you were running on Linux, which is really cool. Awesome. We have a couple of questions, actually. William has a couple of questions: first, with what you showed on Azure, what would be the cost of that ML model you were showing? Yeah, so it depends on the resources and everything that you're using. What's cool is you only pay for what you use at the time, so you only pay while it's training. I think what it cost me specifically, when I ran this, was twenty dollars to run it for four days over the weekend, to actually train the model that I showed. All right, cool. William also asked if you recommend any resources, because he's been having some trouble getting his WSL instance to run GUI apps; do you recommend any Microsoft docs or tutorials? Yeah, I would recommend you take a look at aka.ms/wslg. This will take you to the actual web page; actually, you know what, let's do this... there we go. This takes you to the GitHub page for Windows Subsystem for Linux GUI apps. It has all of the install instructions for how to get set up and get everything running, and if you run into issues, please go here and file an issue; we take a look at these boards and we'll be able to help you out. Okay, awesome. And then Stephen asked a question: if he has multiple projects, like a Node.js project and a Java project, what would be the best practice, just one WSL image for all
of them, or a new one for every project? Yeah, it's kind of up to you; I think best practice in that case is kind of fuzzy. For example, let's say I go here and I have another Node.js project. For me personally, I love the container workflow; I'm so happy to use containers for all that stuff, especially if you have a lot of different tool sets. It's a blessing for me, because I do a lot of demos on different tools: I'll use Node.js one day and deep machine learning another day, so having everything containerized so I can pull it down instantly is awesome for me. But if you're using everything on the same machine consistently, just put everything on the same image. You could even, if you want, have multiple WSL distros, like this Debian distro, and use it as another staging ground: I do all my Node.js work in Debian, and maybe I do all my machine learning in Ubuntu. It's kind of up to you; I don't think there's a hard and fast best practice there. Okay, awesome. And then lastly, everything you showed us, the Linux GUI apps and the GPU acceleration: when is that available? When can people start building? Yeah, if you want it right now, please join the Windows Insiders program and head on to the Dev Channel; you can see that I'm on lovely Windows 11 here, given my taskbar. It's all available on Windows 11, so once Windows 11 is generally available, these features will be available; if you want them now, join the Insiders program and get on the Dev Channel. Okay, awesome. And sorry, one more last question, because we do have a couple more seconds left: there's a big hype around Codespaces as well, so can you tell us, is there a relationship between Codespaces and WSL, with what you showed us? Yeah, what's cool is that Codespaces, if you haven't heard
about it, is basically the exact same demo that I showed of running all of this in containers and remotely, but in your browser as well. So imagine if I had this project up on GitHub: you could go in, just click "open this in Codespaces," and in your browser you get that VS Code instance, and then you get to run through the container and all that workflow and have all the extensions and everything like that, the exact same user experience, but directly in your browser, without even having to download anything onto your machine, which is super cool. Yeah, that's really awesome; just the direction that development has gone in over the last couple of years, the fact that we can even do something like this, is pretty incredible. It's pretty awesome. All right, well, thank you so much, Craig, for joining us today. Folks, if you have questions, like Craig said, go over to the GitHub repo and open up issues; ask questions, I'm sure they'd love to hear from you. Yep, and if you have any specific WSL questions, take a look at microsoft/WSL for general WSL, and if you have any other questions, you can follow me on Twitter at @craigaloewen, on the screen. Awesome. All right, take care everyone, thank you for joining us. Thanks, bye!
Info
Channel: Visual Studio Code
Views: 20,535
Keywords: vscode, visual studio code, vs code, wsl
Id: t4eFjyJWmqQ
Length: 30min 26sec (1826 seconds)
Published: Thu Aug 26 2021