Algorithms Explained for Beginners - How I Wish I Was Taught

Captions
Hello and welcome back. Today's video is a particularly important one, because today's video is about algorithms. Maybe I don't need to convince you of the importance of algorithms: if you are aware of how companies' coding interviews are generally formatted, you probably know that the sort of problems they ask in interviews are about data structures and algorithms. In my previous video in this series we went through the first part of this equation, which is data structures, and today, by popular demand, we will be covering the second part, which is algorithms. As in my previous video, the goal of this video is to give you an intuitive understanding of what algorithms are, why we need to study them, and why companies care so much about algorithms that literally their entire recruiting process is structured around testing these topics. And of course, as you know, on this channel we always finish with practical steps that you can take beyond this video to master these topics. I think this could be one of the most important videos that you watch about algorithms because, as I always say, the most important aspect of learning anything, to me, is being excited about it, and algorithms and data structures are the kind of thing that most people don't quite understand the importance of. That is probably the thing that is holding you back the most. In fact, I think algorithms and data structures are some of the most beautiful things about coding and computer science, and I hope that with this video I can give you a glimpse of this beauty and why I appreciate them so much, so that you can actually start enjoying the process of learning them, and therefore want to learn them more.

So what is an algorithm? An algorithm is a series of steps to perform some action or reach some goal. We have some input, some data that we want to do something to, and we need to figure out the logical steps we need to tell the computer to do to reach the output that we desire. You can think of designing a series of steps to complete almost anything; think about a series of steps to get from your house to the gym, for example. The idea of algorithms is thousands of years old. So where do computers come in? The study of computer algorithms is about applying computer resources to solve mathematical operations or problems instead of using the human brain to solve them, and the advantage that computers have is that a computer can perform these operations blindingly fast. That is really where computers come in.
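To make that input-to-output idea concrete, here is a tiny toy example of my own (not one from the video): an algorithm that takes a list of numbers as input and produces the largest one as output, by spelling out each step for the computer.

def find_largest(numbers):
    # Input: a non-empty list of numbers. Output: the largest of them.
    largest = numbers[0]            # step 1: treat the first number as the largest seen so far
    for n in numbers[1:]:           # step 2: look at every remaining number in turn
        if n > largest:             # step 3: if it beats the current best, remember it instead
            largest = n
    return largest                  # step 4: whatever survived the comparisons is the output

print(find_largest([3, 41, 7, 12]))   # prints 41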
Before we go deeper into computer algorithms specifically, there's a great book about applying algorithmic thinking to your everyday life called Algorithms to Live By. I recommend at least reading a summary of it, and an excellent way to get a really good summary is today's video sponsor. Shortform is the absolute best way I've found to get really high-quality book summaries. Their summaries are not just book summaries, they're more like study guides, with in-depth analysis of the book. They include a short version of the summary if you just want the main ideas, plus chapter-by-chapter summaries and interactive exercises to actually apply the ideas of the book. When you look at a book, it can be difficult to tell whether it is actually worth reading, so what I often do before reading a book is read the Shortform guide of that book first, and then based on that I decide whether the ideas of the book are something I want to go deeper into by reading or listening to the entire book. Some of my favorite categories of books that they have are technology, self-improvement, business, entrepreneurship and productivity, and if you are a subscriber to Shortform you can also vote on the next book you would like them to summarize. Because Shortform is sponsoring this video, you are in luck: you can go to the link in the description and get a five-day unlimited-access free trial as well as an additional 20% discount on the annual subscription. Thank you to Shortform for sponsoring this video.

Now let's get into the meat of algorithms. The big question I had in the beginning, and just couldn't really understand, is this: if computers are so incredibly fast, why does it even matter what kind of algorithms we design? Why do we care so much about the speed of algorithms and about studying the theory of optimizing them, when our computers are already so fast? When you think about sorting algorithms, for example, you might imagine sorting 100 numbers or so, but when it comes to computer science and programming, those are actually not the interesting problems. Any interesting problem at the cutting edge of the field has a ridiculously big input size. When you think about companies like Google and Facebook, the data sets they run these algorithms on are not a hundred or a thousand items; they can be billions and billions of items. At these scales, even with the computing power we have today, how your algorithm is designed, as in how fast it is and how much memory it consumes, really starts to matter a lot. To simplify, by designing a faster algorithm we can, number one, solve much bigger problems, and number two, achieve roughly the same performance gains in our technology as a decade's worth of improvements in processor speed.

So in general, the things we care about in an algorithm are the speed at which it runs as well as the memory that it uses. At a very low level, we can think of the speed of an algorithm as simply the number of instructions it has to complete. If we can design an algorithm that does the same thing using 10 instructions instead of 100 instructions, the 10-instruction version is going to be faster. At the very lowest level, that is what we really mean, but as you'll see in a moment, there's actually a more practical way of thinking about it.

Okay, so let's think about a very basic algorithm. Let's say you have a book and you want to find a specific page in it; say you want to tell someone to find page 100. How might you go about doing that? Well, one very naive approach might be to just start opening the pages one after the other. The logical instructions would be: open a page and check whether it is page 100. If it is, you have found it and you are done; if it's not, you open the next page. Is this page 100? No, move on to the next page. Is this page 100? No, move on to the next page. This might sound a bit ridiculous, but if you do this 100 times, eventually you will reach page 100 and you can say, yes, this is page 100. Even though intuitively, to you as a human, it seems like a very stupid idea, it does actually work.
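As a rough sketch of what that naive page-by-page approach looks like in code (my own illustration, with the book modelled simply as a range of page numbers):

def linear_page_search(total_pages, target_page):
    # Open page 1, then page 2, then page 3, ... until we hit the target.
    # Returns how many pages we had to open to get there.
    pages_opened = 0
    for page in range(1, total_pages + 1):
        pages_opened += 1
        if page == target_page:       # "is this page 100?" -- yes, we're done
            return pages_opened
    return pages_opened               # the page wasn't in the book at all

print(linear_page_search(696, 100))   # 100: we had to open 100 pages one by one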
So what is actually wrong with this algorithm? It's very slow, and there's probably a much better way to do this. We can use our idea from before of thinking about the number of instructions we need to complete to reach our goal. You might say that in this particular case the running time of our algorithm was 100 iterations, or 100 instructions. The thing is, saying that this algorithm has a running time of 100 instructions is not very helpful, because that depends on this very specific case of us wanting to find page 100. What if I had told you to find page two? In that case you would open page one and ask, is this page two? No, move on to the next page. Is this page two? Yes, we have found the solution, and the running time would only be two. But you might also think of the worst case, where I tell you to find the very last page of the book. There are 696 pages in this copy of Cracking the Coding Interview (it's a great book, by the way), so in that case we would have to complete 696 iterations to find the page.

So how can we generalize this idea of the running time of this particular algorithm? Computer scientists are actually quite pessimistic: whenever we analyze algorithms, we always think first of all about the worst possible case. We assume that we are the unluckiest people and always get the absolute worst case possible. That is why, in computer science, when we analyze this algorithm we would say that the big O running time, a.k.a. the worst-case running time, is 696. But what if we had a different book? What if instead we had this Leonardo da Vinci biography, which has 599 pages? In the case of this book, the same algorithm would have a running time of big O of 599. This is also not very helpful, because even though we've generalized in terms of thinking about the worst case, our running time still depends on the actual book. So how can we generalize the running time of this particular search algorithm, which is called linear search by the way, to any problem, so any book? The way we think about this in computer science is in terms of the input size: we think about how the running time of the algorithm scales as we increase the size of the book. Say we had a book that was a thousand pages long: how does the running time of the algorithm scale relative to the size of the problem? In this case it scales linearly. For every page we add to the book, the worst-case running time of our algorithm increases by one, because remember, we're always thinking about the absolute worst case. So in computer science lingo, our algorithm has a linear running time, or in other words a running time of O of n, where n is the size of the problem. And the reason, again, why we care about the input size so much is because the Googles and the Facebooks of the world have billions and billions of data points.

So for this book page-finding problem, how could we design an algorithm that's much faster than just opening one page at a time? You might already intuitively feel that it doesn't make sense to even start looking at the first pages, because you know page 100 is not going to be there. So you might just estimate and open the book somewhere further along; I actually landed on page 149, which is a pretty good guess. Then, because page 100 is less than 149, you open it somewhere slightly before that, and eventually you find page 100. If I told you to do this, that's probably how you would do it, and it turns out that we can instruct a computer to do almost exactly the same thing. In general, here is what we can tell the computer to do. In the first iteration, we tell it to always try the middle page, to open the book at the very middle, which in this case is page 336. What can we then do? Well, we can check very quickly, using one instruction, whether the page we're looking for is before or after the one we opened. 100 is less than 336, so in the next iteration we tell the computer to open the book in the middle of the left section, which this time lands on roughly page 150. Again we check: is page 100 less than 150 or more than 150? It's less, so we can forget about the entire rest of the book and just focus on everything to the left of 150, because we know that what we're looking for has to be in there. We then tell the computer to open the middle of this portion of the book, which will be around page 75. Is 100 more than 75 or less than 75? It's more, so at this point we open the middle of the section between 75 and 150, and through this iterative process we get to our solution much, much faster. Essentially, with every instruction we are cutting the problem size in half and getting closer and closer to the solution.
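Here is roughly what that looks like in code: a minimal sketch of binary search over page numbers (my own illustration again, relying on the fact that the pages of a book are already in order).

def binary_page_search(total_pages, target_page):
    # Keep opening the middle of the section of the book that can still contain the target.
    low, high = 1, total_pages
    steps = 0
    while low <= high:
        steps += 1
        middle = (low + high) // 2    # open the book in the middle of the remaining section
        if middle == target_page:
            return steps              # found it
        elif target_page < middle:
            high = middle - 1         # forget the right half, keep searching the left
        else:
            low = middle + 1          # forget the left half, keep searching the right
    return steps

print(binary_page_search(696, 100))   # 9 steps, versus the 100 that linear search needed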
It turns out that the worst-case running time of this algorithm, which is called binary search, is something called log n. The log can be a bit confusing, but essentially, log n means that as we double the size of the book, the number of instructions we have to do, so the running time of the algorithm, only goes up by one. If you think about it, when we double the size of the input in our binary search algorithm, where we're always opening the middle page, it only adds one more step: we open the doubled book in the middle, ask whether our page is to the left or to the right, and then we can forget about the other half of the problem entirely, which puts us back in exactly the same situation we started from, with just one half of the book. You can see that even in this very small case, this algorithm is much, much faster; you would never use the first one because it's so much slower. And when you imagine a similar problem where we need to find a particular data point in a list of billions and billions of items, the difference that the better algorithm makes can literally shorten the running time by weeks or months.

That is really why we care so much about designing fast algorithms. Another very common problem that a lot of algorithms courses start with is sorting, and I've actually coded up a visualization that really shows the difference between different kinds of sorting algorithms. I'm not going to show the implementation in this video, but you can go to the description and find it if you want. Here we have a very slow algorithm that scales at O of n squared, so as the input size doubles, the running time goes up by a factor of two squared, in other words it roughly quadruples; it's actually even worse than the linear-time approach we tried before. Two thousand years later, you can see it's still taking quite a while. Whereas if you use a more optimized algorithm called merge sort, whose details you'll end up studying later in whatever course you decide to follow, the algorithm runs in n times log n, which is much, much faster than n squared.
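To give a feel for that gap without the visualization, here is a small self-contained comparison of my own (not the code from the video's description): a simple O(n squared) insertion sort next to an O(n log n) merge sort, both run on the same random list. The exact timings will vary by machine, but the gap is dramatic.

import random
import time

def insertion_sort(items):
    # O(n^2): shift each item to the left until it sits in the right place.
    items = list(items)
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items

def merge_sort(items):
    # O(n log n): split the list in half, sort each half, then merge the sorted halves.
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # repeatedly take the smaller front item
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]      # append whatever remains of either half

data = [random.randint(0, 1_000_000) for _ in range(10_000)]
for sort_function in (insertion_sort, merge_sort):
    start = time.perf_counter()
    sort_function(data)
    print(sort_function.__name__, round(time.perf_counter() - start, 3), "seconds")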
So I hope you're now starting to see why companies put so much emphasis on hiring people who understand algorithms and who know how to write good algorithms for the very, very big problems those companies solve.

In terms of practical next steps if you want to understand this topic: if you enjoyed this video, first of all leave a like, because I actually find teaching algorithms quite exciting and I really enjoy talking about them, but I will only keep doing it if you guys really want me to. So if you want to see more, leave a like down below and leave a comment on the specific algorithms and data structures topic you would like to see next. Beyond that, if you haven't done CS50 yet, watch their third lecture specifically; I'm going to leave the third lecture of CS50 in the description below as a YouTube video. That lecture is specifically on algorithms, and it essentially teaches what I've taught you in this video, except it's the two-hour version and probably much better explained than I ever could. But obviously even that is not going to be enough to pass interviews, so what I would do is pick any popular course on data structures and algorithms that teaches you the theory of the most important algorithms and data structures you'll need. One that I did personally was Zero To Mastery's Master the Coding Interview bootcamp. Zero To Mastery is a coding course platform with a subscription model, so instead of paying for a Netflix subscription you pay for access to Zero To Mastery, where there are around 60 different coding courses across a lot of different areas. I've done a bunch of them, they're really good, and the teaching style is really engaging. If you are interested in a more academic and probably more rigorous approach, there are two specific specializations you can look at, and I've actually done both of them. The first is the Stanford Algorithms specialization; that's really the most theoretical approach you'll probably find online, quite math-heavy, but it's going to give you a very rigorous understanding. The other is the Princeton Algorithms Part I and Part II pair of courses, which follow their textbook; the Princeton courses are Java-based, while the Stanford specialization isn't tied to any specific programming language and instead goes through things at a conceptual and mathematical level. After that, something like Cracking the Coding Interview is a very good book to get a review and an overview of the topics you need for interviews, plus a lot of practice problems. But I would not choose it first; I would choose one of the three other courses to give you the theory and the understanding, and only then go for something like this, because it's more of a practice book, something you want to use to prepare right before your interviews. And obviously, beyond that, LeetCode: you don't need to pay for premium, I've never paid for premium, just keep doing a lot of LeetCode problems.

So that is that; this is probably a very long video already. If you haven't seen the first part of this series, on data structures, absolutely go watch it now.
Other than that, I have another video where I talk about my step-by-step plan for how I personally learned all of these topics, so go watch one of those videos next, and I'll see you in the next one.
Info
Channel: Internet Made Coder
Views: 225,818
Id: JJkWemM03Lg
Length: 17min 38sec (1058 seconds)
Published: Sat Oct 15 2022