Big O Notation | Time Complexity of Algorithms

Captions
Ladies and gentlemen, it's been a while since I made a video for YouTube. I was extremely busy at work, but unfortunately, about a month ago I lost my job, and I've been preparing for interviews. It is a tough time. I am applying for senior, lead, and architect roles, which require quite a bit of preparation: nowadays they ask you about algorithms, data structures, system design, and all kinds of things. I've had many interesting experiences, including an interview at Google, which I will share in one of my videos. What I want to do is take all the knowledge I acquired while preparing for and giving interviews and put it into these videos, so that it helps you grow and pass. So I'm starting a brand new series on algorithms and data structures in JavaScript.

To start the series, the very first thing we want to learn is something called big O notation. If we are going to learn algorithms and data structures, we need to know how our algorithms perform, and there is a standard way of measuring that: big O notation. If you try to learn this topic from a book, you may get lost in mathematical terms, but here at techsith we like to simplify things, and that's what we're going to do in this video with simple examples. Welcome to techsith tutorials.

Let's start with this line, just in case your interviewer asks you to put big O notation in a sentence: big O notation allows us to compare the worst-case performance of our algorithms in a standardized way. There are two key phrases here: "worst-case performance" and "standardized way". Standardized, so that you and your interviewer can argue about which algorithm is faster using a notation that is universally accepted. And worst-case performance: why do we need to look at the worst case? Well, with people, if you say somebody is good, they have to be good in an extreme situation; if they're still good in the worst case, then you know they're really good. Similarly, the way an algorithm performs in the worst case is what we take as the performance of that algorithm.

There are two different ways of comparing algorithms: one is speed, meaning time complexity, and the second is space complexity, meaning how much memory the algorithm uses. Say you want to move house: you live in place A and want to move to place B, and there are multiple ways to do it. You can hire a truck, load all your household items into it, and move from A to B in one trip. Or, if you don't want to spend money, you can load one tiny box at a time and make many round trips. Assuming you live close by, you would save space, but it would take much longer and you'd be tired. So one way is slower but requires less space; the other is faster but requires more space. That is the difference between time complexity and space complexity.

But in terms of a computer, how do you define speed? Every computer runs at a different speed, so we want to remove that variable and look at each algorithm in a mathematical sense. For speed, we look at how many operations the algorithm needs to solve the problem: how many reads, how many writes, how many comparisons of variables, how many additions and subtractions. Adding up those unit operations defines how slow or fast the algorithm is going to be. Think of those operations as units. For example, to find the maximum number in an array of numbers, we have to read each value from the array and compare it with the current maximum, so for a given set of numbers, the number of comparisons defines the speed.

So let's look at a simple example: finding the maximum number in a number array. Here's an array of six numbers, and we want to find the maximum; obviously you can see that 100 is the maximum here, so I'm not going to explain too much. There is a for loop that goes through every single number. We take the first number and consider it the max, and whenever we find something larger, we replace max with it; ultimately the largest number ends up in the max variable. I've written this in JavaScript, and it correctly prints 100. The most important thing we want to look at, though, is the complexity, and here is the trick: for six numbers we loop through every single item and do one comparison each, which means six comparisons for six items. If we define this as a function of n elements, the number of comparisons is about n, give or take a constant. For six items it's six; if I extend the array to a hundred items, the number of comparisons also increases to a hundred; for a million items, a million comparisons. So for n elements I need about n comparisons, and that defines the speed of this algorithm.
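The exact code shown on screen isn't in the transcript, so here is a reconstruction of the algorithm as described (the names are my own):

```javascript
// Find the maximum number in an array by scanning every element,
// keeping the largest value seen so far. One comparison per element,
// so for n elements we do about n comparisons: O(n), linear time.
function findMax(numbers) {
  let max = numbers[0]; // start by assuming the first number is the max
  for (let i = 1; i < numbers.length; i++) {
    if (numbers[i] > max) { // one comparison per element
      max = numbers[i];
    }
  }
  return max;
}

console.log(findMax([4, 12, 100, 7, 55, 3])); // 100
```

Doubling the array doubles the number of comparisons, which is exactly the linear growth described above.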
Now, what if you have five more or five fewer comparisons? Does it really matter? Remember, we're looking at the worst-case scenario, at the big picture, and compared to that, five is nothing. If n is a million, five is nothing, right? So even if it's f(n) = n + 5, we still call it f(n) = n. That is how the formula is simplified using big O notation: we call it O(n), "order of n". This is also called linear time, because as the number of items grows, the number of operations grows with it at the same rate.

Now let's look at a different problem. Instead of finding the maximum, what if you want to add consecutive numbers, say 1 through 5? I could do the same thing as before: instead of max, I keep a total and add each number. For n numbers I would make about n additions, so it comes out to a similar complexity, linear time. But because the numbers I've been given are consecutive, there is a simple mathematical formula: the sum is n multiplied by (n + 1), divided by 2. For n = 5, that is 5 × 6 / 2 = 15. To verify: 1 + 2 = 3, 3 + 3 = 6, 6 + 4 = 10, 10 + 5 = 15, so we get the same sum. Even if I had a million consecutive numbers, I wouldn't need an algorithm that runs through all the additions; I could use this formula and get the answer in three operations: one multiplication, one addition, and one division. So for f(n), even when n is a million, I'm doing only three operations. If I want to bucket it, I call it O(1). I know it's three operations, but because 3 doesn't change with n, we call it constant time; instead of O(3) we write O(1). Even if it took, say, ten operations, we would still call it O(1), because we want to bucket complexities.

So let's look at all the buckets we can create. Starting with constant time, O(1); the next bucket is O(log n); then O(√n), which is a little higher; then O(n), which is linear time; then O(n log n); then the even more complex O(n²) — a lot of algorithms are n²; and higher still, O(n!). O(n²) is called quadratic complexity, and beyond quadratic things become exponential, but we don't have to worry too much about those; you just need to remember these buckets. If an algorithm takes about five operations regardless of n, it goes into the O(1) bucket; if it's around log n, it goes into the O(log n) bucket; and so forth. Now we have a universal lingo: if I say I have an algorithm with time complexity O(n), you know that for n items it has to do about n operations.
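The two approaches to the sum can be sketched like this (a minimal sketch; the function names are my own, and I assume the sequence starts at 1):

```javascript
// Linear time: one addition per number, so n additions in total — O(n).
function sumLoop(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

// Constant time: n * (n + 1) / 2 always takes one multiplication,
// one addition, and one division, no matter how large n is —
// three operations, so O(1).
function sumFormula(n) {
  return (n * (n + 1)) / 2;
}

console.log(sumLoop(5));          // 15
console.log(sumFormula(5));       // 15
console.log(sumFormula(1000000)); // 500000500000
```

Both return the same answer, but for a million numbers the loop performs a million additions while the formula still performs three operations.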
With that lingo, I can compare something that runs in O(n) against something that runs in O(log n), and I would know which one is faster. Now let's look at these complexities graphically so we can see how they actually differ from each other, and for that I found a very cool tool called desmos.com. Let's plot all these complexities, starting with O(1): I write f(n) = 1, which is constant; it doesn't matter whether I zoom in or out, it always stays at one. Now the next formula, f(n) = log₂(n): as you can see, it's a different curve, one that grows at a slower and slower rate over time. The next one is the square root, f(n) = √n; as you can see, this green line grows a little faster than log n. Now let's look at linear time, the complexity of our first example, finding the maximum number. This is a day-and-night difference: all the previous curves grow at a much lower rate, while f(n) = n grows steadily as the number of items grows. It's still not a bad complexity, but compared to log n and √n it's much bigger. The next one is f(n) = n·log₂(n), and you can see that it grows much faster than the linear complexity. One thing you'll notice: near the origin, for a very short stretch, some curves may dip below others — n log n may briefly be smaller than another curve — but once they separate, they never cross each other again. Each curve goes its own way after that, and they never meet at any later point. That is an important thing to notice, and it's another reason we look at the worst case.
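As a quick numerical sketch (not from the video, just an illustration of the same point), here are rough operation counts in each bucket at increasing n; once n is large, the ordering of the buckets never changes again:

```javascript
// Each entry is [bucket name, function giving a rough operation count].
const buckets = [
  ["O(1)",       n => 1],
  ["O(log n)",   n => Math.log2(n)],
  ["O(sqrt n)",  n => Math.sqrt(n)],
  ["O(n)",       n => n],
  ["O(n log n)", n => n * Math.log2(n)],
  ["O(n^2)",     n => n * n],
];

// Print a row per size; powers of two keep log2 values exact.
for (const n of [16, 1024, 1048576]) {
  const row = buckets
    .map(([name, f]) => `${name}=${Math.round(f(n))}`)
    .join("  ");
  console.log(`n=${n}: ${row}`);
}
```

At n = 16, log n and √n are both 4 (the curves can touch early on), but by n = 1,048,576 they are 20 versus 1,024, and n² has already reached about 10¹² — the curves have gone their separate ways for good.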
The last one: f(n) = n². As you can see, it grows much faster than even n log n. I could even draw a factorial; it would look almost like a straight vertical line, pretty close to the y-axis.

Okay, now that you know what big O notation is, it's time to practice. If you are preparing for an interview, then for any problem you solve with a specific algorithm, try to bucket it: is it O(n), or is it O(n²)? And see if you can do better than the current algorithm you're trying; that's how you improve. That's all, folks. This is the first video in the algorithms and data structures series, and there will be more videos coming, so if you haven't subscribed to my channel, please do so, and please like and comment on the video. You can follow me on Facebook, where I have interesting groups, one on algorithms and data structures and one on React; you can follow me for some interesting topics and ask questions there if you get stuck. You can follow me on Twitter, and I have a Medium page as well — I like to write sometimes — so you can check that out. You can also purchase my courses: I have a couple, one on React and one on JavaScript, and I'm planning to make a couple more, one on JavaScript interview preparation and one on algorithms and data structures. Thank you for watching.
Info
Channel: techsith
Views: 4,823
Rating: 4.9283581 out of 5
Id: 388nF8g8pvo
Length: 16min 44sec (1004 seconds)
Published: Wed Aug 12 2020