62 - Diagonalization

Video Statistics and Information

Captions
Okay, so on this new board: diagonalization. This is a topic that we're going to study at some length, it's going to take us a while, but let me start with the overall story.

We start with a linear map T from V to V. T is a linear map, V is a finite-dimensional vector space, and we'll assume its dimension is n. Note that we're looking at a linear operator, a map between a vector space and itself. If we want to find a matrix representation of T, we know that there are many. How do we get a matrix representation? We choose a basis. Once we choose a basis we know the procedure: take the basis elements, apply T to them, write the results as linear combinations of the basis elements, take the coefficients, transpose, and we get the representing matrix of T with respect to that basis. And there are many of them, because there are many bases. So let's write that: T has many matrix representations. Given a basis B, we know how to construct the representing matrix of T with respect to that basis; take different bases and you get different matrix representations.

So our goal is, among all these possible matrix representations, to find the one that is the nicest. What "nicest" means is yet to be defined, but clearly, if we can find a matrix representation that is just a diagonal matrix, that's probably as nice as you can get. Do you agree? All zeros except the diagonal entries: calculating the determinant is trivial, it's the product of the diagonal entries; calculating the trace is trivial, it's their sum; raising the matrix to powers, multiplying it by itself, is very easy. Working with diagonal matrices is extremely easy.

So, and here comes a big "if" (let me start a new board, because I'm going to need some space): suppose there exists a basis B whose elements are v_1, v_2, ..., v_n, a basis for the vector space V, such that the following property holds. When you apply T to v_1, the first basis element, what you get is some vector, so in general it's some linear combination of the v_i's, something like α_1 v_1 + α_2 v_2 + and so on. But suppose it is just a scalar multiple of v_1 itself. That's an assumption; such a basis doesn't always have to exist (in fact it doesn't always exist), but suppose it does. Suppose we have a basis such that when we apply T to each of the basis elements, what we get is just a scalar multiple of that same basis element: T(v_1) = α_1 v_1, T(v_2) = α_2 v_2, and so on, up to T(v_n) = α_n v_n. I'm writing it this way for a reason: I'm already thinking about constructing the representing matrix with respect to that basis. How do we construct the matrix? As I said, we need to write each of these as a linear combination of the basis elements, and what is the linear combination here? T(v_1) = α_1 v_1 + 0·v_2 + 0·v_3 + ... + 0·v_n. Do you agree? And what is T(v_2)? If T(v_2) is just α_2 v_2 for some scalar α_2, then it is 0·v_1 + α_2 v_2 + 0·v_3 + ... + 0·v_n, and so on. So suppose such a basis exists; then the representing matrix of T with respect to this B is the matrix of the transposed coefficients, and since all the off-diagonal coefficients are zeros, the transpose changes nothing: it is the matrix with α_1, α_2, α_3, ..., α_n on the main diagonal and zeros everywhere else. Is it clear what I mean by these giant zeros above and below the main diagonal? Okay.
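A small computational sketch may help make this construction concrete (this is an illustration of mine, not something from the lecture; the matrices A and B below are made-up values). The map acts in standard coordinates as T(v) = Av, the chosen basis vectors are the columns of B, and the representing matrix is built exactly as described: apply T to each basis element and record its coordinates with respect to the basis. With this particular choice the basis happens to satisfy the special property above, so the result comes out diagonal.

```python
import numpy as np

# Sketch with illustrative values (not from the lecture): the representing
# matrix of T with respect to a basis B, where T acts in standard coordinates
# as v -> A v and the basis vectors are the columns of B.
def representing_matrix(A, B):
    # Column i of the result holds the coordinates of T(v_i) in the basis,
    # i.e. the solution x of B x = A v_i.
    return np.linalg.solve(B, A @ B)

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # T in standard coordinates
B = np.array([[1.0,  1.0],
              [1.0, -1.0]])     # basis vectors (1, 1) and (1, -1) as columns

# Prints a diagonal matrix, diag(3, 1), because T(1, 1) = 3*(1, 1)
# and T(1, -1) = 1*(1, -1) for this particular A and B.
print(representing_matrix(A, B))
```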
Okay, so we're going to call this matrix D, D for diagonal; it's a diagonal matrix. Good. So if there exists a basis which behaves in a very special way with respect to T, just this specific T, namely that T, when acting on a basis element, really doesn't do much to it, it almost fixes it (almost, because it does multiply it by some scalar), then the representing matrix with respect to that basis is going to be diagonal. Clear? Okay.

So this is our first goal: to find a procedure which produces such a basis. How do we seek v_1, ..., v_n that satisfy this property? It will turn out that such a basis doesn't always exist, so we're going to understand the procedure for finding it, or for determining that it doesn't exist. Later on we're going to say a bit less, but we will say some things, about what we do when it doesn't exist and we can't diagonalize T (this is what it means to diagonalize T): sometimes we can still get something that is not diagonal but has only a bit of off-diagonal stuff, just a bit. That's going to come later. So, to continue writing: this B is the basis we want to find, a basis B such that T has a diagonal representation with respect to it. If such a basis B exists, we say that T is diagonalizable. (I feel like putting in 17 more letters, but that's it, that's the end of the word.) Good, clear? Okay.

Now, before we move on, I have to throw in a bit more terminology, so we can speak the correct language and understand what we're doing. So, some more terminology. First of all, we know (and this is going to be a brief summary of stuff that we discussed at length) that there is a tight connection between matrices and linear maps. It's the tight connection which we defined in higher language, with the Hom space being isomorphic to the space of matrices and so on, but in simple words: given a T, we know how to construct a matrix A, where A is the representing matrix of T with respect to some basis. Let's not call that basis B, because B is now our specific diagonalizing basis, and E is for the standard basis, so let's just call it F. So, starting with T and choosing an F, we can get a matrix. And starting with a matrix, we know how to construct the map: you just define T(v) to be Av. So we know how to pass from maps to matrices and from matrices to maps; there's a tight relation between the two, and we made this relation even more precise (in terms of what I mentioned before, the Hom space; I don't want to say it again and again). The point is that we're intentionally going to blur, a bit, the distinction between maps and matrices.

So, for example, we're going to say things like "the kernel of a matrix". What is the kernel of a matrix? The kernel of a matrix is the set of all vectors that give zero when you multiply them by the matrix, just like the kernel of T. The kernel was defined for a linear map, but we're going to say "the kernel of a matrix", and that's what we're going to mean. I'm going to write some of these definitions a bit more precisely later. For example, we're going to have the determinant of a map. Determinants were defined for matrices: the determinant is a certain calculation you do on the entries.
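To make this blurring concrete, here is a minimal sketch (my own illustration, with a made-up matrix): a matrix A becomes a map by defining T(v) = Av, and the "kernel of the matrix" is simply the set of vectors the matrix sends to zero, which can be read off numerically from an SVD.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # illustrative singular matrix

def T(v):
    return A @ v                  # from a matrix to a map: T(v) = A v

# "Kernel of a matrix": all v with A v = 0.  Numerically, the right singular
# vectors whose singular values are (close to) zero span the null space.
U, s, Vh = np.linalg.svd(A)
kernel_basis = Vh[s < 1e-12]      # one row here, proportional to (2, -1)

print(kernel_basis)
print(T(kernel_basis[0]))         # approximately [0. 0.]
```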
So what is the determinant of a map? The determinant of a map is: take a matrix representing that map and calculate its determinant. Okay, clear? Now you may ask: wait, suppose you take the matrix A which represents T with respect to the basis F, calculate the determinant of A, get, let's say, eight, and say "okay, that's the determinant of T". But then you take a different basis, maybe E, calculate the representing matrix, take its determinant, and say "wait, that's the determinant of T". You may have gotten a matrix A in one case and a different matrix C in the other, but they're similar, and similar matrices have the same determinant. So the determinant of T is well defined. Clear? So we're going to blur the distinction between maps and matrices when it makes perfect sense to do so, and say things like the kernel of a matrix and the determinant of a map and so on. We're going to define these things a bit more formally later, but I want to start with an example, and in the example I'm already going to use this; I want you to feel that it makes perfect sense to do so. Good? Everybody okay?

So, in particular, we're going to say that a matrix is called diagonalizable. Oh boy, this word is too long; "diag", how's that? Diagonalizable. Now, this is the definition. T is called diagonalizable if it has a representing matrix which is diagonal. And a matrix A is going to be called diagonalizable (not diagonal; if A is diagonal, it's diagonal) if the T that it represents has another representation which is diagonal, which is the same thing as saying that A is similar to a diagonal matrix. So A is called diagonalizable if A is similar to a diagonal matrix D. And we know what that means: it means that D = P⁻¹AP for some invertible matrix P, which we even know how to find, and it means that they both represent the same linear map. That's what I mean when I say a matrix is diagonalizable. It doesn't mean, for example, that I can do row operations and get a diagonal matrix; all it means is this. Good.

Okay, so here are the two questions we want to ask: how do we know if a map or a matrix is diagonalizable, and if it is, how do we find the B that does the job? So, the two questions. One: how do we determine if T is diagonalizable (diagonalizable, not diagonal)? Two: if it is, how do we find B, the basis that gives the diagonal form?

Okay, so here are some more definitions. Everybody with me? Is the overall story clear, the motivation and the setting? Good. Some more definitions. A vector v in V, a nonzero vector... wait, a question: how do we find the basis? Once we have B we can find D, right, so the question is how we find B. We're going to find D, the diagonal form, by finding B; in fact, we find both of them simultaneously. So if you want: how do we find B, the basis, and the diagonal form? Okay. A vector v which is not 0, satisfying that when you apply T to it you just get some scalar multiple of it, is called an eigenvector of T.
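Here is a small numerical check of these two points (again my own illustration with made-up numbers, not the lecture's procedure; the lecture develops its own way of finding P in the following clips): similar matrices share determinant and trace, and a diagonalizable A satisfies D = P⁻¹AP, where for this sketch the columns of P are taken from numpy's eigendecomposition.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])             # illustrative matrix, eigenvalues 5 and 2

# numpy's eig is used here only as a quick way to produce a P; the lecture
# shows later how to find eigenvectors by hand.
eigenvalues, P = np.linalg.eig(A)      # columns of P are eigenvectors of A
D = np.linalg.inv(P) @ A @ P           # D = P^-1 A P

print(np.round(D, 10))                 # diagonal, with 5 and 2 on the diagonal

# Similar matrices have the same determinant and trace, which is why det(T)
# and trace(T) are well defined no matter which representing matrix we use.
print(np.isclose(np.linalg.det(A), np.linalg.det(D)))    # True
print(np.isclose(np.trace(A), np.trace(D)))              # True
```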
"Eigen", this weird word, is German, and it means "self" or "own". So an eigenvector is a "self vector" of T; it kind of belongs to T, in a sense. Note, of course, that the zero vector always satisfies this: T(0) = α·0 for α = 0, or in fact for whatever α you want, because T(0) = 0 for any linear map. But 0 is not interesting, because we want vectors like this in order to construct a basis from them, and 0 cannot be a basis element; we want linearly independent vectors. So we throw zero out of the definition. Good. So a vector satisfying this property, which is exactly what we're seeking, is called an eigenvector, and this α that does it for this v (α is a number, a scalar) is called an eigenvalue. An eigenvalue of v? That's not accurate; let's be accurate: an eigenvalue of T corresponding to this v. That's a better way of saying it. Okay, so that's the notion of an eigenvector and an eigenvalue.

And if we go back a couple of boards, what we're looking for, now said a bit more formally, is a basis of eigenvectors. We're looking for a basis of eigenvectors; these are precisely vectors that are eigenvectors by definition. So using this terminology, we can now say what diagonalizable means: T is diagonalizable if and only if it has a basis of eigenvectors. Do you agree? And moreover, the diagonal matrix D is the matrix which has the eigenvalues along its main diagonal. Remember what D was: look here for a second, D was precisely the diagonal matrix with those α's on its main diagonal. So let's write that: with this terminology, T is diagonalizable if and only if it has a basis of eigenvectors, and then D is a diagonal matrix with the eigenvalues on its main diagonal. Good.

Okay, so this is the framework; this is the entire story. Now we want to prove some theorems, in particular theorems that answer these two questions, and I want to start with an example that really goes through the entire process, starting from a matrix. The first example I'm going to do is going to be very simple, a two-by-two matrix, and I want to go through the entire process and figure out what we're doing. We're going to do it without the theorems, just to see what we have to do, and the theorem is going to stem from the example: the theorem is going to say, hey, let's do what we did in the example. Okay, so we'll do that in a separate clip.
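As a small sanity check of the definitions (my own illustrative numbers, not the two-by-two example the lecture works out in the next clip): for a concrete matrix we can verify directly that a given nonzero vector is an eigenvector, that two eigenvectors form a basis, and that the representing matrix in that basis is diagonal with the eigenvalues on the diagonal.

```python
import numpy as np

A = np.array([[3.0,  0.0],
              [8.0, -1.0]])       # illustrative matrix

v1 = np.array([1.0, 2.0])         # candidate eigenvector for eigenvalue 3
v2 = np.array([0.0, 1.0])         # candidate eigenvector for eigenvalue -1

# Eigenvector check straight from the definition: v is nonzero and A v = alpha v.
print(np.allclose(A @ v1, 3 * v1))      # True
print(np.allclose(A @ v2, -1 * v2))     # True

# v1 and v2 form a basis of R^2 (the matrix with them as columns is invertible),
# so A is diagonalizable, and in that basis the representing matrix is diagonal
# with the eigenvalues 3 and -1 on the main diagonal.
B = np.column_stack([v1, v2])
print(np.linalg.det(B) != 0)            # True
print(np.linalg.solve(B, A @ B))        # diag(3, -1)
```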
Info
Channel: Technion
Views: 16,513
Rating: 4.958549 out of 5
Keywords: Technion, Algebra 1M, Dr. Aviv Censor, International school of engineering
Id: ABNp7qpio-c
Length: 25min 32sec (1532 seconds)
Published: Mon Nov 30 2015