NVIDIA JetBot: Jetson Nano Vision-Controlled AI Robot

Video Statistics and Information

Reddit Comments

thank for share

👍 3 · u/YOUREABOT · Sep 01 2019
Captions
[Music] Welcome to another video from ExplainingComputers.com. This time we're going to take a look at the JetBot, which is a vision-controlled robot from NVIDIA based on their Jetson Nano single board computer — you can probably see it on top of the robot there. Now, the JetBot isn't a product you can buy; rather, it's a robot specification that NVIDIA have come up with and made freely available. You have to use a Jetson Nano, but you also need various 3D-printed parts — you can download the designs from the NVIDIA website — and you also need various motors and controllers and wheels and the camera and things like that to make it work. So the idea is you build your JetBot and then you use it to experiment with machine learning. Now, NVIDIA have very kindly pre-built a JetBot for us — here it is, they sent this to me all put together — so we can test it out and see what it can do. So let's go and take a closer look.

So let's take a closer look at the JetBot. On the top we have the Jetson Nano, which, as you might remember from some of my previous videos, is actually a two-part construction: it's a system-on-module which plugs into this carrier board. If I just take the module off for a second, you will see that underneath here — there we are — we have a Wi-Fi module, which obviously gives wireless connectivity to the board. So I'll just put the module back in — there we are, hopefully that's all properly back together. And I should point out you don't have to have that Wi-Fi module fitted under here; you can use a USB Wi-Fi dongle plugged into one of the USB ports. As this all suggests, we're going to access this robot — actually communicate with this robot — wirelessly over Wi-Fi, and because of that, at the back we have this PiOLED display, and the purpose of this is to show the JetBot's local IP address, so we know how to access it using a web browser. At the other end of the JetBot we find a camera, and this can be a Raspberry Pi camera with a wide-angle attachment or a Leopard Imaging camera — this is the Leopard model, and I must remember to take the lens cap off. In the middle of the JetBot there's a 10,000 milliamp-hour USB power bank offering two 5 volt outputs, which are used to power the Jetson Nano and the JetBot's motors. And talking of motors, if we turn the robot over you'll see that under here we've got two motors, each driving one wheel. We've got 60 millimetre wheels here, and these are TT motors from Adafruit, and there's also an Adafruit motor control board which they're connected to. There's also this caster wheel in its 3D-printed mount to keep the robot stable. And of course, in addition to what I've mentioned, we've got the 3D-printed parts for the robot, lots of wires, and various mounting hardware — nuts and bolts, things like that. So there we are: this is NVIDIA's JetBot vision-controlled robot platform, and if you're wondering, the cost for everything here is about $100 for the Jetson Nano computer itself and up to about $150 for the camera, motors, motor driver, OLED display and other JetBot components.

The JetBot is a sophisticated machine learning robot platform, but fortunately getting it up and running is not that difficult, because NVIDIA have documented everything very well in the JetBot wiki, which is available here on GitHub. The first thing to do once the JetBot has been assembled — once you've got all the hardware in one piece — is to download the SD card image, which is actually called JetBot; it's effectively a dedicated version of the JetPack software available for the Jetson Nano, and it needs to be downloaded and written to a microSD card at least 32 gigabytes in size.
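Before moving on to setup, a brief side note on how that Adafruit motor control board gets driven in software. The minimal Python sketch below is illustrative only: it assumes the legacy Adafruit_MotorHAT library that the JetBot software stack is built on, and the I2C address and motor channel number are my assumptions, which may differ on a particular build.

# Minimal sketch: drive one TT motor through the Adafruit motor HAT.
# Assumes the legacy Adafruit_MotorHAT Python package; the I2C address
# (0x60) and motor channel (1) are assumptions that may vary per build.
import time
from Adafruit_MotorHAT import Adafruit_MotorHAT

motor_hat = Adafruit_MotorHAT(addr=0x60)   # motor driver board on the I2C bus
left_motor = motor_hat.getMotor(1)         # channel 1 assumed to be the left wheel

left_motor.setSpeed(150)                   # duty cycle, 0-255
left_motor.run(Adafruit_MotorHAT.FORWARD)  # spin the wheel forward
time.sleep(1.0)                            # run for one second
left_motor.run(Adafruit_MotorHAT.RELEASE)  # stop and free the motor

In practice the JetBot notebooks hide this layer behind a higher-level Robot class, which is what the "basic motion" demo below uses.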
Then, for the first boot, the JetBot needs to be connected up to a monitor, a mouse, a keyboard and a power supply, and booted to its desktop for various setup activities. For example, you need to give it your Wi-Fi details to get online — I've done that, and obviously they're blanked out here because we're on the web — and it's also worth doing some software updating, which is all documented down here in the wiki, all nice and clear. You also need to make sure you're in the right power mode, which I'll just show you — I've just launched a terminal. It's worth pointing out that there are two different power modes on the Jetson Nano, and we need to be in the lower power mode, because we're powering the JetBot from a USB power bank which can only supply a maximum of 2.5 amps on each of its ports, and potentially the Jetson Nano could draw more than that, which wouldn't be a good thing when using the power bank. So what we need to do is make sure our Jetson Nano is in 5 watt mode rather than 10 watt mode, and to do that we simply enter sudo nvpmodel -m 1, like that, and that will do its thing. We need to put in our password — the username and password are both "jetbot" on this distro — and we can check that it's worked by typing sudo nvpmodel -q, and we should be in 5 watt mode. Yes, there we are, in 5 watt mode. So all the setup has now been done and we can try out the JetBot, so we'll close things down and try the robot out.

Right, with everything set up, I've now got the JetBot safely on the floor, because it's going to be moving around and I don't want it falling off the table. I'm going to turn it on by connecting the power for the Jetson Nano to the power bank, like that, and then we'll have to wait a minute for it to boot up. And there we are, it's booted — we can tell because we've got some text on the little OLED display at the back. So I'm now going to connect the power for its motors and motor controller, also to the power bank, like that. If we cut to a close-up shot of the OLED, we can see the JetBot's local IP address, which is 192.168.1.6, so if we go to a web browser, enter that with :8888 on the end and press Enter, like that, hopefully we'll enter something called JupyterLab, which is an interactive development environment for working with code and data in what they call notebooks. And it looks like it's going to work — hopefully, is it going to load, please load — yes, there it is, and we've got loaded up here the first notebook provided by NVIDIA to teach us all about the JetBot, which is called "basic motion", as you can see. So we'll just go through a little bit of this to give you the basic principle. What we basically do inside JupyterLab is select a cell with code in it and press Ctrl+Enter, and we can execute that code — this is Python code — and it's going to run, and hopefully it will finish; we can see down at the bottom here when it's busy, and now it's idle. So we'll now also just run that — I'm not going to take you through every piece of code, we'd be here all day — but that will run as well, and now, hopefully, if we do this, we can move the robot. And there we are, the robot is now spinning around because we've turned on its left motor. That's pretty cool, isn't it? And we can turn it off with that "stop" command there, like that.
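For readers who want to see roughly what those notebook cells boil down to, here is a condensed Python sketch of the "basic motion" idea, assuming the jetbot package that ships on the JetBot SD card image; the speed values and timings are illustrative choices of mine, not NVIDIA's exact numbers.

# Condensed sketch of the "basic motion" idea: spin, stop, timed movement.
# Assumes the jetbot package from the JetBot SD card image; speeds and
# durations here are illustrative values, not NVIDIA's exact numbers.
import time
from jetbot import Robot

robot = Robot()          # wraps the two TT motors on the Adafruit driver board

robot.left(speed=0.3)    # start the robot spinning on the spot
time.sleep(0.5)          # let it spin for half a second
robot.stop()             # the "stop" command shown in the video

robot.forward(0.3)       # drive straight ahead...
time.sleep(0.5)          # ...briefly
robot.stop()             # and stop again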
And if we bring in, for example, the time module — there we are, Ctrl+Enter; I quite like this system, it's rather good — we can then execute this piece of code here, and there we are, it will obviously move for short periods of time, and if I keep pressing Ctrl+Enter you can keep spinning the robot round. There it is — let's bring it back to somewhere you can probably see it a bit better. So we've proved we've got access to the JetBot and we can execute code on it. And the other thing we can do here, in case you're wondering, is to shut it down in a controlled way: we can open up a new tab there, go down to here and select a terminal, and in the terminal we can do a sudo shutdown now, enter the password, which is "jetbot", and the JetBot will shut down. So we've got all the basic principles here of accessing and remotely controlling the JetBot.

Right, we're now going to run another demo where the robot is going to control itself using vision recognition. More specifically, what it's going to try and do is avoid collisions by taking the feed from its camera, constantly analyzing what it can see in that feed, and working out whether its path ahead is either unblocked, in which case it can move forward, or blocked, in which case it has to turn. To do this there's quite a lot of code to execute, so I'll execute all of this code, and you can now see we've got an image from the robot — you can see what the robot can see, and whether it thinks things are blocked or unblocked. So we'll just continue with the rest of the code, and there we are, the robot is now navigating around and, hopefully, avoiding obstacles. It's moving forward, and if it senses a problem it's turning. That seems to be working, doesn't it? This is quite impressive — remember, this is a pre-trained neural network. Will it go into the bed? No, it's coming around again — oh, that's very impressive. It's like a little sort of insect moving around the floor. It goes up to the computer, moves back — will it tip over that edge? No, it comes back again. Oh, it got caught in a loop there, it was hesitating to go any further, going round. Oh, this is interesting, it's catching itself a bit, isn't it? Maybe I should move it to a more open area — let's move it back over there. There we are, it can do a longer run now. I'll bring up on the screen what it's seeing, so you can see that — let's go down to that, I forgot to do it, I was so interested in the robot. There's what it's seeing — you can see whether it thinks it's blocked or unblocked. Let's put it on the screen, so you can see that when it thinks it's blocked it will turn, and when it isn't blocked, like now, it won't turn. If I put my foot in, a little turn — there we are. And I'm impressed with this. Now, clearly this could be trained to be more accurate in this particular area, but it does actually work: as you can see, it's moving around, detecting what it can see. And remember, this is quite sophisticated. This isn't doing what we'd have happening if we had, say, a robot sensing a line on the floor, or a robot maybe sensing by ultrasonics whether there was something in front of it at a certain distance. This is actually processing an image, working out what it thinks it's seeing, and from that working out whether it's blocked or unblocked, and moving around. I think this is really fascinating — it shows us where we're heading with machine learning, that we can do this on a relatively simple system and it's moving around by itself.
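To make the mechanism concrete, here is a hedged Python sketch of what a collision-avoidance loop of this kind amounts to, based on the idea in NVIDIA's collision avoidance notebook. It assumes a PyTorch AlexNet fine-tuned to two classes (blocked/free) and saved as best_model.pth — the filename, the 0.5 threshold and the motor speeds are my assumptions, not necessarily the notebook's exact values.

# Sketch of a collision-avoidance loop: grab a camera frame, ask the network
# "blocked or free?", then turn or drive forward. The checkpoint name
# 'best_model.pth', the 0.5 threshold and the motor speeds are assumptions.
import time

import torch
import torch.nn.functional as F
import torchvision
from jetbot import Camera, Robot

device = torch.device('cuda')

# Two-output AlexNet classifier (blocked vs. free).
model = torchvision.models.alexnet(pretrained=False)
model.classifier[6] = torch.nn.Linear(model.classifier[6].in_features, 2)
model.load_state_dict(torch.load('best_model.pth'))
model = model.to(device).eval()

# ImageNet-style normalisation, matching how the model was trained.
mean = torch.tensor([0.485, 0.456, 0.406], device=device)
std = torch.tensor([0.229, 0.224, 0.225], device=device)

def preprocess(frame):
    """Turn a 224x224 BGR camera frame into a normalised 1x3x224x224 tensor."""
    rgb = frame[..., ::-1].copy()                       # BGR -> RGB
    x = torch.from_numpy(rgb).float().permute(2, 0, 1) / 255.0
    x = (x.to(device) - mean[:, None, None]) / std[:, None, None]
    return x.unsqueeze(0)

robot = Robot()
camera = Camera.instance(width=224, height=224)

try:
    while True:
        logits = model(preprocess(camera.value))
        prob_blocked = float(F.softmax(logits, dim=1)[0, 1])
        if prob_blocked < 0.5:
            robot.forward(0.3)       # path looks free: drive ahead
        else:
            robot.left(0.3)          # path looks blocked: rotate to find a gap
        time.sleep(0.001)
except KeyboardInterrupt:
    robot.stop()                     # make sure the motors are off when we quit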
If I take this little box here, of course I can make it turn — yeah, there we are, I can make it turn, and make it turn again, and make it turn again. It is detecting things and moving around, so I could play with this for hours and hours and hours. I don't think I will just now, but we've certainly seen the principle of using the JetBot to control itself using vision recognition.

Right, for my next trick, I want to get the JetBot moving around on the surface of this nice wooden table without falling off. So once again it needs to be able to work out, using its camera and the feed through to its neural net, which areas are blocked — which should be the edges of the table — and which areas are free, so it can move around. You'd think this would be very similar to what we were doing in my last demo, but if I run the same neural network in this context, this is what happens: the JetBot just rotates around and around and around, presumably because it thinks it's blocked in all directions — which it isn't, but that's what its neural net is seeing. So we need to train a new neural net to work on the surface of this table, maybe because this is a more visually complicated surface, maybe because the lighting's changed, something like that.

So how does training work? Well, basically we need to feed the neural network with lots of sample training data — in this case lots of images from the robot's camera — and we need to categorize those so the neural network can learn from them. To do that, I'll just move out of the way and we'll point the robot, say, in that direction. That's a position where clearly the area in front of it is clear, so if I move to JupyterLab — I'm running a notebook here from NVIDIA called "data collection" — this shows us what the robot can see and allows us to tag that image as either free or blocked. What this basically does is, over here in this folder called "dataset", it stores those images as JPEGs in two folders called "free" and "blocked". So here I'll click "add free", because that's clearly free. But then if I move the robot forward to, say, the edge of the table there, that, I think, should probably be blocked, so I'll do an "add blocked", and similarly if I turn it a little bit, like that, that should also be blocked, so I'll do "add blocked" again. So basically I need to take lots and lots of pictures — scores of pictures, maybe even a few hundred pictures — so I'll get on with gathering this training data and I'll come back to you when I've finished.

So here I am, back again, and as you can see it's working: the robot has learned how to move around across the whole surface of this table without falling off. I'm rather nervous — I'm staying out of shot as much as I can so I don't get into the robot's visual field and disturb its perception of the edges. I'm talking about it as if it's a being, aren't I? In a sort of way, it is. I took, in the end, about 190 pictures of blocked and non-blocked situations, and I used those to train a new neural network model to let the robot move around like this on the table. It's a rather involved process, all documented in the relevant NVIDIA notebook, but it's basically what I talked about in my Jetson Nano vision recognition video, where neural networks are shown lots of sample data — lots of training data — and from that training data they establish connections, so they can deal with seeing the world visually in the future. I thought it was going to fall off there — well, it didn't. I'm still rather nervous about this, but it works: this is the first neural network I've ever trained, and the result is, I think, pretty good.
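For anyone curious what that training step looks like in code, here is a compressed Python sketch of the idea behind NVIDIA's training notebook, assuming the collected snapshots sit under dataset/free and dataset/blocked and that an ImageNet-pretrained AlexNet is fine-tuned to two classes; the epoch count, batch size and output filename are illustrative assumptions rather than the notebook's exact settings.

# Compressed sketch of training the blocked/free classifier from the collected
# snapshots. Folder layout, epoch count, batch size and the output filename
# are illustrative assumptions, not NVIDIA's exact notebook settings.
import torch
import torch.nn.functional as F
import torchvision
from torchvision import datasets, transforms

# The data-collection step saved JPEGs into dataset/free and dataset/blocked;
# ImageFolder turns that layout directly into (image, label) pairs.
dataset = datasets.ImageFolder(
    'dataset',
    transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ]))
loader = torch.utils.data.DataLoader(dataset, batch_size=8, shuffle=True)

device = torch.device('cuda')
model = torchvision.models.alexnet(pretrained=True)              # start from ImageNet weights
model.classifier[6] = torch.nn.Linear(model.classifier[6].in_features, 2)
model = model.to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

model.train()
for epoch in range(30):                                          # a few hundred images, so epochs are cheap
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = F.cross_entropy(model(images), labels)
        loss.backward()
        optimizer.step()

torch.save(model.state_dict(), 'best_model.pth')                 # picked up by the collision-avoidance loop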
You can also train neural networks on the Jetson Nano and the JetBot here to do all kinds of other things: you can train it to follow edges, say on a road, to follow a road; you can train it to follow people; all kinds of things like that. This is really just the beginning, but hopefully this demonstration is a good example of what can be done with machine learning, with vision recognition and robotics. I'm certainly very pleased with the results of this test, and I hope that you've learned something from it as well. NVIDIA's JetBot is a great vision-controlled robot platform to experiment with, and it's been fantastic to check it out and see what it can do in this video. But now, that's it for this video. If you've enjoyed what you've seen here, please press that like button; if you haven't subscribed, please subscribe; and I hope to talk to you again very soon. [Music]
Info
Channel: ExplainingComputers
Views: 155,459
Keywords: NVIDIA JetBot, JetBot, Jet Bot, Jetson Nano, NVIDIA Jetson Nano, Jetson Nano robot, machine learning robot, vision recognition, vision recognition robot, machine learning, Christopher Barnatt, Barnatt, Jupyter Lab, Jupyter Lab notebook, collision detection, Jetson Nano SBC, SBC robot, SBC vision recognition, Python, neural network, neural network training, training data, JetBot Wiki, demo, JetBot demo, JetBot review, JetsonNano
Id: wKMWjIKaU68
Length: 16min 43sec (1003 seconds)
Published: Sun Sep 01 2019