Support Vector Machine (SVM) Basic Intuition - Part 1 | Machine Learning

Video Statistics and Information

Captions
[Music] Hello, my name is Krish Naik, and welcome to my YouTube channel. In today's video we'll be discussing support vector machines. This was one of the most requested topics from the subscribers, and it will be divided into multiple parts because there is a lot to understand here, so make sure you watch this particular video till the end.

First of all, we need to understand what kind of problem statement a support vector machine actually solves. With respect to supervised machine learning, it is useful for solving both classification and regression problem statements. In this particular part we are going to understand what support vectors are, what a hyperplane is, what marginal distance is, and what linearly separable and non-linearly separable points are, along with some examples that I have drawn over here.

Now, the main aim of the support vector machine. Suppose I consider a classification problem (I'll discuss the regression problem in the upcoming parts) and just focus on the geometrical intuition. Since this is a classification problem, you can see, guys, that we can easily separate these two classes of points. Suppose I consider these as my positive points and these as my negative points; I can classify them with a hyperplane, the center line here. We have also seen in logistic regression and linear regression how we can create a hyperplane. But that is not the end: a support vector machine makes sure that when you create this hyperplane, it also creates two margin lines, and these two margin lines have some distance between them so that the two classes are easily linearly separable.

Each margin line is a plane parallel to the hyperplane, and it passes through the nearest positive or negative point. So after creating the hyperplane, we also create two parallel planes, shown here as dotted lines. The dotted plane on the positive side passes through one of the nearest positive points (this red one), and the dotted plane on the negative side passes through the nearest negative point. That is the intuition behind the support vector machine: it does not just create one hyperplane, it also creates two parallel hyperplanes in such a way that one passes through the nearest positive point and the other passes through the nearest negative point. When we compute the distance d+ + d-, this whole distance is called the margin, or marginal distance. And the significance of this margin is that, remember guys, whenever we solve a classification problem we always have to create a generalized model.
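The geometry described here (the distances d+ and d- from the hyperplane to the nearest point of each class, summing to the marginal distance) can be sketched in a few lines of NumPy. The points and the hyperplane x + y - 3 = 0 below are hypothetical toy values, not from the video:

```python
import numpy as np

def point_plane_distance(w, b, x):
    # Perpendicular distance from point x to the hyperplane w.x + b = 0.
    return abs(np.dot(w, x) + b) / np.linalg.norm(w)

# Hypothetical toy data: positives above the line x + y = 3, negatives below it.
positives = np.array([[3.0, 2.0], [4.0, 3.0], [3.5, 3.5]])
negatives = np.array([[0.0, 0.5], [1.0, 0.0], [0.5, 1.5]])
w, b = np.array([1.0, 1.0]), -3.0  # hyperplane x + y - 3 = 0

d_plus = min(point_plane_distance(w, b, x) for x in positives)   # nearest positive
d_minus = min(point_plane_distance(w, b, x) for x in negatives)  # nearest negative
margin = d_plus + d_minus  # the marginal distance d+ + d-
```

Note that a real SVM solver finds w and b by optimization; this sketch only measures the margin of a hyperplane that is already given.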
In a generalized model we usually get better accuracy on any kind of new data. When we do this separation, any point that falls above the hyperplane will be classified as positive, and any point below it will be classified as negative. The margin gives us a cushion for dividing the positives and negatives in a better way: if some test points fall near the boundary, this cushion helps us classify them as belonging to the positive class on one side and the negative class on the other.

But still another question arises: apart from this hyperplane, we can also create multiple other hyperplanes, for example a line somewhere over here. Such a line is also able to divide these points, but when I create a hyperplane I also have to focus on the margin. In this example I have taken the same points and used a different hyperplane to separate the positive and negative classes, but when I create its marginal planes, the marginal distance turns out to be very small. Our main aim should be to maximize the marginal distance, so we select the best hyperplane as the one with the maximum margin.

Remember, guys, all of these techniques apply to linearly separable points. Over here we can linearly separate the points, which basically means we can draw a straight line that classifies the two classes. (What if the points are not linearly separable? We will discuss that shortly.) So the focus is that whenever we create a hyperplane, we should select the one whose marginal distance is maximum. If I take these two examples, one hyperplane separating the points with marginal distance d1 and the other with marginal distance d2, which one will get selected? Here you can see that d2 is much greater than d1, so we use the second hyperplane to divide the points, because it gives us a more generalized model. The first one will also work, but it will not be as generalized, which basically means that for new test data it will give more errors compared to the maximum-margin hyperplane.

One note on terminology: in two dimensions we don't really say hyperplane, we just say a straight line; in three or four dimensions we say hyperplane, because there we use planes to divide the points. So we have discussed the hyperplane and the marginal distance. The marginal distance is nothing but the distance between the two parallel planes created through the nearest positive point and the nearest negative point.
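The selection rule just described, comparing candidate hyperplanes by their marginal distance and keeping the largest, can be sketched as below. Both candidate hyperplanes and the points are invented for illustration; an actual SVM finds the maximum-margin hyperplane by solving an optimization problem rather than comparing a fixed list:

```python
import numpy as np

def margin(w, b, positives, negatives):
    # Marginal distance: nearest-point distance on each side, summed (d+ + d-).
    dist = lambda x: abs(np.dot(w, x) + b) / np.linalg.norm(w)
    return min(map(dist, positives)) + min(map(dist, negatives))

# Hypothetical toy points; both candidate hyperplanes below separate them.
positives = np.array([[3.0, 3.0], [4.0, 4.0]])
negatives = np.array([[0.0, 0.0], [1.0, 0.0]])
candidates = [
    (np.array([1.0, 1.0]), -4.0),  # diagonal separator between the classes
    (np.array([1.0, 0.0]), -2.0),  # vertical separator x = 2
]

# Keep the hyperplane whose marginal distance is maximum.
best_w, best_b = max(candidates,
                     key=lambda wb: margin(wb[0], wb[1], positives, negatives))
```

Here the diagonal separator wins because its d1 + d2 is larger, matching the d2 > d1 comparison in the drawing.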
I can call these the marginal positive plane and the marginal negative plane, because they are nearest to the positive and negative points respectively. So those two topics are covered. We have also discussed linearly separable points: linearly separable basically means we can easily separate the points just by drawing a straight line. Non-linearly separable points are best shown by this example: you can see that we cannot just draw a straight line to divide these points. However we try, our accuracy will be less than or equal to 50%, because some points of each class fall on both sides and they are intermixed. We will also try to understand how to solve that problem.

Now, one important thing: support vectors. What are support vectors? Understand, guys, we have created the hyperplane and selected the maximum marginal distance. The nearest positive point passing through marginal plane M1 and the nearest negative point passing through marginal plane M2 are called support vectors. So support vectors are nothing but the points that pass through the marginal planes we created parallel to the chosen hyperplane. There may be any number of them, not just one or two but three, four or more, and we consider every point passing through a marginal plane to be a support vector. So this is my support vector, this is my support vector, and these support vectors are also pretty important.
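The definition just given, that the support vectors are whichever points lie on a marginal plane (possibly several of them), can be sketched as a minimum-distance test. The points and hyperplane are again hypothetical; libraries such as scikit-learn expose the same idea through a fitted model's `support_vectors_` attribute:

```python
import numpy as np

def support_vectors(w, b, points):
    # Support vectors on one side of the hyperplane: the points lying on the
    # marginal plane, i.e. at the minimum distance from w.x + b = 0.
    d = np.abs(points @ w + b) / np.linalg.norm(w)
    return points[np.isclose(d, d.min())]

# Hypothetical positive points; the first two are equally close to the plane,
# so both lie on the marginal plane and both count as support vectors.
positives = np.array([[3.0, 2.0], [2.0, 3.0], [4.0, 4.0]])
w, b = np.array([1.0, 1.0]), -3.0
sv = support_vectors(w, b, positives)
```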
They help us determine the maximum marginal distance, and remember, finally we have to create a generalized model. How this hyperplane itself actually gets created, I will discuss in the upcoming parts.

But one question still remains: how do we solve the non-linearly separable case? Remember, this is a two-dimensional graph with respect to the positive and negative points. To solve it, the support vector machine uses a technique called SVM kernels. The main aim of an SVM kernel is to convert a low dimension into a higher dimension. What do I mean by higher dimensions? Take this example, guys; I am just going to erase this so you'll be able to understand. Suppose I consider a three-dimensional graph and I try to convert this two-dimensional data into three dimensions. Some of my black points are over here, and when I convert the data into three dimensions, it gets transformed like this, and in between I can create a hyperplane. So what the SVM kernel is doing is converting a lower dimension into a higher dimension, so that we can easily classify the points with a hyperplane. And with respect to this hyperplane I will also get my marginal planes, and the classes become easily separable.

Still, understand, guys, this is just the basic understanding of support vector machines. In the future classes we'll
also discuss a lot of the math behind SVM kernels and the different types of SVM kernels. I hope you have understood; please do remember these terminologies: support vectors, hyperplane, marginal distance, linearly separable, non-linearly separable. Again, understand that our main aim is to create a generalized model, not just by creating a hyperplane but by maximizing the marginal distance with respect to the marginal planes created through the positive and negative points. The higher the marginal distance, the more generalized our model is. But this will not be possible for non-linearly separable points; for those we apply SVM kernels, where we convert a low dimension into a higher dimension, for example two dimensions into three dimensions, by creating new features. Here you can see that once the two-dimensional data is converted into three dimensions, we can easily create a hyperplane.

So this was the basic understanding of support vector machines. In the upcoming classes I will be discussing a lot of the math, how to apply SVM kernels and what the different SVM kernels are, and I'll also discuss how you can solve regression problem statements. Just go through this, read a lot of blogs and try to understand. This is just part one; in the upcoming session we'll discuss SVM kernels and understand some of the math of how the marginal plane is actually created. So this is all about this particular video. I hope you liked it; please do subscribe, and I'll see you in the next video. Have a great day. Thank you, one and all. Bye-bye.
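The low-dimension-to-high-dimension idea described in the video can be sketched with an explicit feature map. The concentric-ring data and the map (x, y) -> (x, y, x² + y²) are illustrative choices of mine; real kernels (RBF, polynomial) achieve the same effect implicitly, without materializing the extra dimension:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two concentric rings: not separable by any straight line in 2-D.
theta = rng.uniform(0.0, 2.0 * np.pi, 50)
inner = np.c_[np.cos(theta), np.sin(theta)]              # class 1, radius 1
outer = np.c_[3.0 * np.cos(theta), 3.0 * np.sin(theta)]  # class 2, radius 3

def lift(points):
    # Explicit feature map (x, y) -> (x, y, x^2 + y^2): in the third
    # dimension the rings sit at different heights, so a flat plane
    # can now separate them.
    return np.c_[points, (points ** 2).sum(axis=1)]

inner3d, outer3d = lift(inner), lift(outer)
# A plane such as z = 5 now separates the classes (inner z = 1, outer z = 9).
```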
Info
Channel: Krish Naik
Views: 137,235
Rating: 4.9611549 out of 5
Keywords: data science tutorial online free, python data science tutorial pdf, python data science tutorial point pdf, what is data science, data science tutorial tutorials point, data science course, support vector machine introduction, support vector machine pdf, support vector machine in data mining, support vector machine tutorial, support vector machine ppt, support vector machine geeks for geeks, support vector machine regression, support vector machine classifier
Id: H9yACitf-KM
Length: 12min 50sec (770 seconds)
Published: Tue Apr 28 2020