TensorFlow and Keras GPU Support - CUDA GPU Setup

Captions
Welcome to deeplizard. My name is Mandy, and in this episode we'll be discussing GPU support for TensorFlow and the integrated Keras API, as well as how you can get your code running on the GPU.

Before jumping into the GPU specifics, I want to elaborate on a point we brought up in the previous episode. It's important to understand that Keras is now completely integrated within the TensorFlow API. It's no longer a standalone API of its own; when we talk about Keras now, we're not talking about it separately from TensorFlow, but as an API integrated within the TensorFlow library. In fact, the standalone version of Keras is no longer being maintained or receiving bug fixes from the Keras team. That said, because Keras integrates so deeply with the low-level TensorFlow functionality, we can use the high-level Keras API for most common deep learning and neural network tasks without ever having to drop down into the lower-level TensorFlow code. For more complex things we might want to go deeper, but for many common tasks the high-level Keras API is all we need. Hopefully that provides some clarification around the Keras integration within TensorFlow.

Now let's move on to the main topic of GPU support. TensorFlow, including Keras, runs transparently on a GPU with no additional code configuration required, and GPU support for TensorFlow is currently available for both Windows and Ubuntu systems with CUDA-enabled cards. In terms of getting your TensorFlow code to run on the GPU, note that operations capable of running on a GPU will default to doing so if both a GPU and a CPU are detected. So if your machine has both a GPU and a CPU and you call an operation that can run on a GPU, it will run on the GPU by default. If you don't want this default behavior, there is a way to explicitly control which device your code runs on, but we'll touch on that later in the course. For now, we're going to focus on how to get set up with a GPU so that you can make use of that option if you'd like.
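As a quick preview of that option, here is a minimal sketch (not an example from the course itself) of pinning operations to a specific device with TensorFlow's tf.device context manager:

import tensorflow as tf

# GPU-capable ops are placed on the GPU by default when one is detected.
# tf.device overrides that placement for everything created inside the block.
with tf.device('/CPU:0'):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.matmul(a, a)  # runs on the CPU even if a GPU is available

print(b.device)  # shows which device the result tensor lives on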
Starting with hardware requirements, the only thing you need is an NVIDIA GPU with CUDA compute capability, and that's for both Ubuntu and Windows systems. The steps from here differ a bit for Linux versus Windows. We're going to focus on the Windows side, because for Linux the recommended approach is to install a Docker image and run all of your TensorFlow code within it. Going that route, you only have to install the NVIDIA drivers, and it's a pretty straightforward process that TensorFlow has fully documented in its guide, which is linked in the corresponding blog for this episode. If you're interested in the Linux route, check that link in the blog. Otherwise, we're going to cover the Windows procedure, which is a bit more involved, and we'll go through all of those steps now.

All of the upcoming steps are specific to Windows users. First things first, we need to have TensorFlow installed, which, as we discussed in the last episode, is as simple as pip install tensorflow. But I told you to be sure to check the system requirements so that your environment is ready before installing. In particular, there is a specific Microsoft Visual C++ redistributable you need before you install TensorFlow, and if you don't have it installed, then whenever you try to import TensorFlow in your code you'll get an error about a DLL failing to load. I'll put that exact error, as well as a link to the redistributable you need to download, in the blog so that you can check it out. So be sure the system requirements are met, get TensorFlow installed with pip install tensorflow, and then we can move on to the steps that are specific to GPU support.
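Before moving on, a quick way to confirm the installation, and to see which release you're on when checking the supported CUDA and cuDNN versions, is simply to import TensorFlow and print its version. This is just a minimal sketch; if the C++ redistributable is missing, the import itself is where the DLL load error appears:

import tensorflow as tf  # fails here with a DLL load error if the redistributable is missing

# The installed version determines which CUDA Toolkit and cuDNN versions are supported.
print(tf.__version__)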
Now that we have TensorFlow installed, the next thing we need to do is install the NVIDIA drivers. To download them, we go to the NVIDIA website and specify our GPU, the product series, and so on. The download and installation process is pretty straightforward, but if you're unsure about the specs of your GPU, you can open Device Manager and look under display adapters; you should see your NVIDIA GPU there, and with that information you can fill in the download page to find the driver you need. After you download the driver that corresponds to your GPU, you just go through a simple installation wizard to get the NVIDIA drivers installed on your machine.

After that's done, we need to install the CUDA Toolkit, which is also available on NVIDIA's website. There is a CUDA Toolkit archive with all the available versions, and it's important, whenever you download and install this software, to look at the toolkit versions currently supported by TensorFlow. On TensorFlow's site you can see the CUDA Toolkit version that is currently supported; this will obviously change over time, so just be sure you're installing a version that TensorFlow currently supports. Once we find the supported version, we download it and start the installation, which again goes through an installation wizard.

If during the installation you see an error message like "No supported version of Visual Studio was found. Some components of CUDA Toolkit will not work properly. Please install Visual Studio first to get the full functionality," know that I ran into this myself when installing the CUDA Toolkit. It turns out that Microsoft Visual Studio is a prerequisite for the CUDA Toolkit itself, as stated in NVIDIA's documentation, even though, at least at the time of recording, that isn't mentioned on TensorFlow's website. So we need to download and install Visual Studio; the Community Edition is fine. When you begin the installation, you'll be asked about all the different workloads and options you want to install and configure, but you don't need anything except the base installation. Get that installed, then restart your installation of the CUDA Toolkit, and you should no longer get the warning about not having Visual Studio, since it's now on your machine.

Once that's finished, we need to download and install cuDNN, the CUDA Deep Neural Network library, which is again on NVIDIA's website. For this download in particular you have to register as a user on the site and go through an email verification process. The whole thing is free, but you have to do that first before you can actually download cuDNN. Once you've registered, you'll have the option to download the cuDNN SDK; you'll need to agree to the terms of the license agreement, and you'll see options for which version of cuDNN to download. Recall that with the CUDA Toolkit I told you to be aware of which version you're downloading so that you install one currently supported by TensorFlow; as you can see on the download page, cuDNN versions correspond to CUDA Toolkit versions, so be careful to choose the cuDNN download that matches the supported CUDA Toolkit version you installed.

After we have cuDNN downloaded, we have to go through the install process, and for this one there's no wizard or anything easy like that; it requires some manual work on our end. NVIDIA's installation page gives the explicit instructions, and we'll walk through them now. The zip file is what gets downloaded, and extracting it gives a folder containing bin, include, and lib directories. On our machine, the CUDA Toolkit is installed by default under C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\ followed by the version you installed, and there we can see directories corresponding to the bin, include, and lib directories from the download. First, in the bin folder of the download, there is a DLL (cudnn64_7.dll in my case); we copy it into the bin folder within the CUDA Toolkit install directory. Next, the include folder of the download contains cudnn.h, which we copy into the include folder of the install path. Lastly, the lib\x64 directory of the download contains cudnn.lib, which we copy into lib\x64 in the install directory. So it's just a matter of taking these three files from the download and pasting them into the corresponding locations where the CUDA Toolkit is installed on our Windows machine.
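If you prefer to script that copy step rather than doing it by hand (the video does it manually), here is a minimal sketch; the download location, the CUDA version folder, and the DLL file name are placeholders you'd adjust to match your own download, since the DLL name changes with the cuDNN version:

import shutil
from pathlib import Path

# Placeholder paths - adjust to your extracted cuDNN zip and installed CUDA Toolkit version.
cudnn_dir = Path(r"C:\Users\you\Downloads\cudnn\cuda")
cuda_dir = Path(r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1")

# The three files from the cuDNN download and the matching CUDA Toolkit directories.
files = [
    ("bin/cudnn64_7.dll", "bin"),
    ("include/cudnn.h", "include"),
    ("lib/x64/cudnn.lib", "lib/x64"),
]

# Note: writing into Program Files typically requires an elevated (administrator) prompt.
for src, dst in files:
    shutil.copy2(cudnn_dir / src, cuda_dir / dst)
    print("copied", src, "->", cuda_dir / dst)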
After copying those files into the relevant directories, we then go to our environment variables, which we can find by typing "environment variables" into the Windows search bar (assuming we know how to spell "environment" correctly). Go to the Advanced tab, which is pulled up by default, click Environment Variables, and look down at the system variables for the variable called CUDA_PATH; there will likely also be a copy of it with the version of CUDA you installed appended to the name. You want to make sure it points to the correct install path of the CUDA Toolkit, which in our case is exactly where we've been working, the same place we've been copying the cuDNN files to. If that's correct, just click OK; if it's not, change it to match your install path.

Then the final step is something I had to do that I did not see documented anywhere: a restart. After this procedure, when I tried to use the GPU in TensorFlow code, it was not identifying my GPU; it was telling me it could not find cudnn64_7.dll, which we had already moved into place. So the file existed on my machine, but TensorFlow wasn't finding this particular DLL, which is required to use the GPU. I tried a few things and it didn't resolve itself, so I restarted my machine, and after the restart TensorFlow was able to find the DLL. I recommend doing the same if you run into that issue, and I'll put the exact error message in the blog so you can see the details for yourself.

Our next step is to verify that TensorFlow is able to identify the GPU on our machine. To do that, I've opened a Jupyter notebook and imported TensorFlow; you can do this in a Jupyter notebook as well, or whatever IDE you prefer. After importing TensorFlow, we run a line of code that lists the physical devices TensorFlow can identify, filtering for devices listed as GPUs, and then print out the number of GPUs available. Running it, we have verification that there is one GPU available on my machine.
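That check looks along these lines (on older TensorFlow 2.x releases the same call lives under tf.config.experimental rather than tf.config):

import tensorflow as tf

# List the devices TensorFlow can see, keeping only those identified as GPUs.
physical_devices = tf.config.list_physical_devices('GPU')
print("Num GPUs Available:", len(physical_devices))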
So that is how you can verify that the steps we went through earlier were done correctly and that TensorFlow can now access your GPU. If you get the same kind of output, where your GPU is identified by TensorFlow, then that's it; that's all we have to do going forward to get TensorFlow to run code on the GPU, because as we mentioned at the start of this episode, TensorFlow code runs on the GPU by default when one is available.

Be sure to check out the corresponding blog for this episode on deeplizard.com, where you'll be able to access the written steps for everything we went through in this video, as well as all the links, documentation, and error messages we touched on. Also, did you know that deeplizard has a vlog channel? There we document our travels and show some things from our everyday lives; fun fact, we are actually filming this from Vietnam right now. So go check out deeplizard vlog on YouTube, and consider joining the deeplizard hivemind, where you'll have download access to the code files we'll be using in this course, as well as other rewards. I'll see you next time.
Info
Channel: deeplizard
Views: 104,094
Keywords: Keras, deep learning, machine learning, artificial neural network, neural network, neural net, AI, artificial intelligence, Tensorflow, tf.keras, Tensorflow 2.0, tutorial, get started, cuDNN, GPU, Python, Jupyter notebook, supervised learning, unsupervised learning, Sequential model, transfer learning, image classification, convolution, CNN, convolutional neural network, educational, education, activation function, relu, SGD, stochastic gradient descent, backpropagation, cuda, gpu, cudnn, nvidia
Id: IubEtS2JAiY
Length: 15min 54sec (954 seconds)
Published: Thu May 21 2020