Watch this BEFORE buying a LAPTOP for Machine Learning 🦾

Captions
When you read about the amazing accomplishments in machine learning, you hear about the huge machines that enable them, often TPU clusters rather than GPUs, with thousands of cards working together. So most people come to me and ask what kind of hardware they need to buy and whether they need to upgrade from their work laptop. In this video I want to dive into whether you can train on a laptop or not.

Most people don't use desktops like this one every day; most people use a laptop. There are pros and cons to both: desktop computers are clunky, while laptops are portable and you can just sit on the couch with them. These clunky machines are usually only used by scientists, by people doing video processing, and of course by people who like gaming. Work often gives you a laptop, and even if you're a PhD student, your university will usually give you some kind of laptop. A lot of people have approached me, especially early-career researchers and people doing their PhDs and master's, asking what kind of computer they need. Can they get the new Mac M1? It sounds really good, but it's new, so can you trust it? Others have asked about budget, and they were budgeting for $3,000 computers. So let's have a look, because that might not be necessary.

Historically, Apple was marketed towards creatives: people working with Photoshop, and the iPads with Procreate and the beautiful comics that come out of it, which I love. But now Apple has really stepped up its game. Everyone was a bit cautious when they moved away from Intel, because such a major change means the entire chipset is changing, which means all the software has to be made to work with the new hardware, and that makes everything complicated. However, with all the benchmarks and marketing that are coming out, the M1 looks really interesting, and even Google has worked on a TensorFlow implementation specifically for deep learning on the M1 architecture, and it's already out of beta. That is huge: seeing something like that for a chip with a dedicated neural engine means Apple is really investing in this, and it is being adopted on the TensorFlow side.

Personally, I've never owned a Mac, because they're quite expensive and I don't have that kind of money. The open-source side of Android is also just really nice in my opinion, and you can do a lot there that you used to have to jailbreak your Apple device for, so I never made the jump. All my Mac-owning friends also had to use external GPUs to do machine learning on their laptops, which seemed a bit silly to me when I can get a dedicated card for a smaller price and do some really heavy lifting. So yes, the M1 really changes the game, but this isn't just about Macs.

Machine learning isn't just deep learning; it isn't just neural networks that need GPUs to train. You can train a lot of machine learning models on a CPU. A very old Mac like the 2016 MacBook that a lot of people love, or my Windows computer, is adequate for a lot of machine learning modelling, and for most data science work you can do most of the visualizations as well. If that isn't enough anymore, you can always go to a cloud instance and pay maybe a couple of dollars, and in a future video I'll show you how to get some for free as well. A few dollars upgrades you to huge infrastructure hosted by Google or Amazon, and you don't need to carry a five-kilogram laptop with you.
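To make the "classic machine learning runs fine on a CPU" point concrete, here is a minimal sketch; the dataset and model are my own illustrative choices, not something from the video, but anything similar trains comfortably on an ordinary laptop CPU.

```python
# A minimal sketch: training a classic scikit-learn model on a plain CPU.
# The dataset and model are illustrative choices, not from the video.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A few hundred samples and a random forest: this trains in well under a
# second on a years-old laptop, no GPU involved.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```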
Let's have a quick chat about computers and computer hardware, because to do calculations you essentially need three things. First, there's the CPU, which is kind of the brain of the computer and is good for complex computation: everything you normally do, running your programs, is handled by the CPU. It closely interacts with a special kind of storage called memory, or RAM. RAM is like the short-term memory humans have in their brains, as opposed to hard drives, which are great for long-term storage: you install your programs on them and keep your pictures on them, but they're slower to access.

Then we get to the interesting bit: the graphics processing unit, or GPU for short. So we have the CPU for complex tasks and the GPU for graphics. The GPU developed over time; it was made to make games more realistic, to give you 3D renderings of whatever you're playing, whether it's a horror game, a shooter, a 3D simulation or, obviously, The Sims. To make those 3D models look crisp and realistic, and to show more of them at once, people developed GPUs. And 3D graphics are essentially just linear algebra: matrix calculations where everything lives in space. That means people slowly figured out that you can also do classic scientific calculations on your GPU, the kind where you have to invert a big matrix or do similar matrix operations. Then someone figured out that a neural network is essentially just one big linear algebra system, and you can solve that on a GPU too.

As an aside, people always ask why we don't do all calculations on a GPU. GPUs are good at this one thing: calculating matrices. They're super fast at that and highly specialized for it. The architecture of CPUs and GPUs is very different, because CPUs are built for complex tasks and GPUs are built for linear algebra; they're really good at multiplying a bunch of matrices together.

These GPUs also have a special memory. It's also RAM, but we call it video RAM, or VRAM. Because it has to be even faster, it's more expensive and usually baked onto the card, which also means there is less of it. The card in this machine has 8 gigabytes of VRAM, versus the 32 gigabytes of normal RAM in the computer (because I didn't want more); the clusters I compute on have 200 to 500 gigabytes of RAM, while the VRAM on their cards is 8 to 12 gigabytes.

Many work-issued laptops only have a CPU and something called integrated graphics. That's fine for watching Netflix and YouTube, but if you start up a 3D game that's maybe ten years old, they will buckle under the load. Tasks that involve graphics and matrix calculations are really not their strength; these laptops are made for Excel, for having a browser open and for writing Word documents, not for heavy calculations. The laptop I just threw over there, the bigger one, actually has a dedicated GPU, and I can do deep learning on it.
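If you're not sure whether your own laptop falls into the integrated-graphics camp or actually has an accelerator a deep learning framework can use, a quick check is to ask the framework itself. This is a minimal sketch assuming TensorFlow is installed (on an M1 Mac that would be the M1-specific builds mentioned earlier); it only lists devices, it doesn't train anything.

```python
# A minimal sketch: asking TensorFlow which compute devices it can see.
# Assumes TensorFlow is installed; on an M1 Mac that means the macOS-specific
# builds, on a PC the regular package plus the vendor's GPU drivers.
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
for device in tf.config.list_physical_devices():
    print(device.device_type, "-", device.name)
# A laptop with only integrated graphics will typically show just a CPU
# entry here; a machine with a usable dedicated GPU will also list a GPU.
```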
There's another thing that complicates buying hardware. I already talked about different vendors, like Intel and the ARM chips in Macs. If you want to do classic deep learning on a graphics card, you have to be careful that you get the right one, because right now that kind of training is really only practical on NVIDIA cards. The card I have in this laptop can't do it; there is work going on to change that, but it's still hard, and the easy way is to get an NVIDIA card. I hope you're still here, because this was a lot, and that's exactly the point: it's hard to make sure you have the right configuration, you have to get the right vendor if you want to do machine learning, and it's a mess. That's why I wouldn't really do deep learning on a laptop.

So do you need a GPU? Probably not. I don't think it's necessary, especially for people starting in machine learning, and even for many of the more advanced people: most people are building classical machine learning models, not neural networks, and you should probably start with some scikit-learn models as well. Those are mostly decided by the CPU. Interestingly, I can run a scikit-learn model on my phone and it works really well. The CPU decides how fast you train your model, but honestly, most of the time it doesn't matter, because classic machine learning models train relatively fast and CPUs are still really fast calculation machines.

You are, however, constrained by RAM, because most classic models cannot train iteratively: you can't feed them one piece of your data and then another. Most of them have to fit the entire model and the entire dataset into RAM, train once, and then give you the results. So making sure your laptop has a good amount of RAM means you can train larger models. In my experience, for classic machine learning you should prioritize RAM in your laptop. You don't necessarily need the gold-standard CPU, because you get diminishing returns; a mid-tier CPU with a couple of threads is perfect. Don't go out and buy a $10,000 CPU because you think it will make your model converge faster; if you need that kind of optimization, you usually have a problem at a different point in your machine learning journey. Buying your way out, especially with a CPU, is usually not that important. With office laptops, if you have trouble opening some Excel files because your RAM is maybe four gigabytes, you will have problems training machine learning models; that computer is a bit small for this kind of numerical computation, and you should work something out with your job to get a beefier machine.

However, and this is a really nice thing, you can validate your code on any old machine. If you plan to run something in the cloud but want to try it on your laptop first, you can run your code until it crashes because it runs out of memory, which happens, and your laptop will be fine. Then you know your code runs, and you can submit it as a cloud job. That's one way to stay within budget on your laptop while still making use of state-of-the-art machine learning infrastructure.
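As a hedged sketch of that validate-locally, train-in-the-cloud workflow: run a small version of the job on the laptop to catch bugs, then flip a switch for the full-size run on the cloud machine. The environment flag, sample sizes and model below are my own illustrative choices, not something prescribed in the video.

```python
# A hedged sketch of "validate on the laptop, train in the cloud".
# SMALL_RUN, the sample sizes and the model are illustrative choices only.
import os

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# On the laptop leave SMALL_RUN=1 (the default); on the cloud machine set
# SMALL_RUN=0 to train on the full-size data.
SMALL_RUN = os.environ.get("SMALL_RUN", "1") == "1"
n_samples = 2_000 if SMALL_RUN else 500_000

# Stand-in synthetic data; in a real project you would load your dataset here.
rng = np.random.default_rng(0)
X = rng.normal(size=(n_samples, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))
```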
If you do have a GPU and you want to train some of the classical models with a bit more horsepower, you can use something like NVIDIA RAPIDS cuML, which implements most of the classic machine learning models but optimizes them for training on GPUs. There are some trade-offs, but they can often run faster, which is really cool; if you have a GPU, cuML and the whole RAPIDS stack are definitely something to try. You do run into the same limitation, though: if you don't have enough RAM, or in this case VRAM, you may not be able to train as big a model as you want.
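As a rough sketch of what that looks like, cuML largely mirrors the scikit-learn interface; this assumes a CUDA-capable NVIDIA card and an installed RAPIDS/cuML environment, the data and estimator are illustrative only, and the exact API surface depends on your cuML version.

```python
# A hedged sketch of the RAPIDS cuML idea: scikit-learn-style estimators
# that train on an NVIDIA GPU. Assumes a CUDA-capable card and an installed
# RAPIDS/cuML environment; the data and model are illustrative only.
import numpy as np
from cuml.linear_model import LogisticRegression  # GPU-backed estimator
from cuml.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200_000, 32)).astype(np.float32)
y = (X[:, 0] > 0).astype(np.int32)

model = LogisticRegression()
model.fit(X, y)           # training happens on the GPU,
preds = model.predict(X)  # so X and y need to fit in VRAM
print("Training accuracy:", accuracy_score(y, preds))
```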
Is that all you should consider? No. I hate running deep learning models on my desktop. I've done it before, I've done it at work, and it's dreadful, because deep learning models usually push your computer to the edge of its performance, which means it's completely frozen for the duration of training, and they often train for hours to weeks. Submitting the job to a dedicated machine is so much better, and a lot of the time that's what I recommend. In fact, on one of my older PCs I was training a model, and afterwards I could hear the graphics card making a super high-pitched noise. I'm not a fan; I'm not saying I broke it, but something was definitely going on after I did some deep learning on a local machine. So yes, use the cloud instead; cloud instances are really cheap these days. You also have to consider that these kinds of computers have a significant power draw when they run at 100 percent. Power isn't cheap right now, which means you're paying a significant premium just to run the job on your local machine when you could submit it to the cloud. And where there's power, there's heat: a laptop with a dedicated GPU can run really hot, and gaming laptops are notorious for running hot while gaming; it's even worse if you run a deep learning job on one continuously for maybe a day. So even if you have an M1 machine or a nice GPU laptop, maybe just use it for prototyping, building a small-scale version of the thing you actually want to train, but I wouldn't run it continuously. I don't think it's good for the hardware, it's not good for your electricity bill, and it might not be great for you trying to work on anything else.

So can you train machine learning models on a laptop? Yes. It might not be state-of-the-art deep neural networks, but you can definitely run scikit-learn on anything from a phone to your office laptop and try out how it works. Deep learning is a bit more tricky, but we have TensorFlow for the M1, and you can definitely try it on the new MacBooks. I know there are M1s in the new iMacs as well, which makes this really interesting; I know that's not a laptop, but it means you will be able to train state-of-the-art deep learning models on the M1 iMacs. On the smaller MacBooks, one of the limitations is that they have very little RAM, and especially for image processing that's not enough, so you should probably stay away from them unless you really, really want a Mac and want to try it. An iMac with a higher configuration, though, might be a valid choice.

So for machine learning, for science, I recommend getting a mid-tier laptop, maybe with a GPU if you really want one, but definitely with a nice CPU and enough RAM. Have a look at whether you can upgrade it; sometimes you can add another RAM module, and that should give you a beefy machine you can do a lot of experimentation on. And eventually, if you want to do deep learning, like I said, there are options: you can go to the cloud, and it's not as expensive as you would think. I've run a lot of cloud jobs for hackathons, and the most expensive one cost me about 20 quid, so it's not that bad, especially if you win.

So where do you train your models? Do you have a laptop, do you have a desktop, do you run your jobs in the cloud? If you have hardware questions, leave them in the comments; I'll be able to jump in, and some others might as well. It's a really nice space. Thanks for listening, I hope this helps.
Info
Channel: Jesper Dramsch – Real-world Machine Learning
Views: 7,026
Keywords: ai, deep learning, machine learning, data science, python, machine learning on a laptop, deep learning on a laptop, do you need a gpu
Id: LHiM0WnRaD8
Length: 18min 8sec (1088 seconds)
Published: Tue Jun 29 2021