Similar matrices have similar properties

Video Statistics and Information

Captions
What does it mean for two matrices to be equal? We defined equality of two matrices to mean that they have the same size, say both n by n, and that every entry of one matrix equals the corresponding entry of the other; for example, the a_ij of one matrix equals the b_ij of the other for all relevant i and j. Now I want to give you a different notion, not equality of matrices but similarity of matrices. I have two matrices; I'm not claiming they're identical, they could have different entries, but I am going to claim that they're alike in some important respects. The definition is going to seem a little strange at first, but I'll explain why it is the way it is. Here it is: a matrix A and a matrix B are similar matrices, and I use a little tilde sign for similarity, if you can find an invertible matrix P with the relationship that A can be written as P times B times P inverse, that is, A = P B P⁻¹.

By the way, I can also write this in a slightly different but completely equivalent way. Suppose I multiply both sides on the left by P inverse: on the left-hand side I get P inverse times A, and on the right-hand side the P inverse times P is just the identity matrix, so it disappears and leaves B times P inverse. Then I do the same trick on the right, multiplying both sides on the right by P; the left-hand side picks up a P on its right, and on the right-hand side the P inverse and the P again cancel to the identity matrix. So the definition can be rearranged to P⁻¹ A P = B; it doesn't really matter which form you use.

Now, I'm using a somewhat pretentious word here. I'm calling this property similarity, so I should really only call these similar matrices if they genuinely share a bunch of properties that matter to us, and indeed that's the case: similar matrices really are equivalent with respect to a long list of properties. One of them, for example, is that the rank of the matrix A is equal to the rank of the matrix B.
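As a quick numerical illustration of the rank claim, here is a minimal NumPy sketch; the particular matrices B and P below are made-up examples for the demonstration, not taken from the video.

```python
import numpy as np

# Made-up example matrices: B is 3x3 and P is invertible (det(P) = 7).
B = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

# A is similar to B: A = P B P^{-1}.
A = P @ B @ np.linalg.inv(P)

# Similar matrices have the same rank (both 3 here).
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))
```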
The next fact I'm going to show is really important for our computation of eigenvalues and eigenvectors. Let me take the determinant of A minus lambda I. Why am I interested in this? The determinant of A − λI was part of how we computed eigenvalues: we set that determinant equal to zero and asked which lambdas satisfied the equation; that was how we found eigenvalues. So figuring out the determinant of A − λI really matters, and that's what I'm going to study now.

I can substitute the similarity relation in for A: this is the same thing as the determinant of P B P⁻¹ − λI. Now let me pause for a moment, because P times P inverse is the identity, and I can always insert an identity wherever I like; it's like multiplying by a number and one over that number, they cancel, so nothing changes. So I can replace the λI with λ P P⁻¹, giving the determinant of P B P⁻¹ − λ P P⁻¹.

Now some further algebra. I notice there's a P at the front of both of those terms, so I factor it out to the left, and there's a P inverse on the right of both terms, so I factor that out to the right. That leaves the determinant of P (B − λI) P⁻¹; I've just factored out the common P and P inverse. Next I use a convenient fact about determinants: the determinant of a product is the product of the individual determinants. This is really the determinant of three things, the P, the B − λI, and the P inverse, so I can write it as det(P) times det(B − λI) times det(P⁻¹). One final thing, digging into the memory banks of determinant properties: the determinant of an inverse is 1 over the determinant of the matrix. The determinant of P is just some number, and the determinant of P inverse is 1 over that number, so the two of them cancel, and that just leaves the determinant of B − λI.

The takeaway is this: the two quantities det(A − λI) and det(B − λI), the crucial ingredients in figuring out the eigenvalues of A and of B, are actually equal. The equation you solve for lambda, setting the determinant equal to zero, is identical regardless of whether you start from the matrix A or the matrix B, so they have the same eigenvalues, and this even holds counting multiplicity: if the eigenvalue 2 appears three times for A, then the eigenvalue 2 will appear in the computation for B, and it will appear three times as well. There are many properties like this, where similar matrices result in shared properties, but we have to be a little careful; for example, the eigenvectors do change, even though the eigenvalues remain the same.
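A short NumPy check of that conclusion, using made-up matrices of the same shape as the sketch above (the specific numbers are illustrative assumptions, not from the video); B is chosen with a repeated eigenvalue so the multiplicity claim is visible.

```python
import numpy as np

# Made-up example: B has eigenvalue 2 with multiplicity 2 and eigenvalue 5.
B = np.diag([2.0, 2.0, 5.0])
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
A = P @ B @ np.linalg.inv(P)

# det(A - lambda I) = det(B - lambda I), so the eigenvalues agree,
# counted with multiplicity.
print(np.sort(np.linalg.eigvals(A)))  # approximately [2. 2. 5.]
print(np.sort(np.linalg.eigvals(B)))  # [2. 2. 5.]

# The eigenvectors, by contrast, generally differ between A and B.
print(np.linalg.eig(A)[1])
print(np.linalg.eig(B)[1])
```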
Now, expressions like P⁻¹ A P = B can seem a little strange at first glance, so I want to give you one example of where expressions like that might occur, and I want you to think about change of basis. Imagine you've got some basis B, and you've got vectors x written in terms of that basis B, and imagine that the matrix P we're specifying is the change-of-basis matrix that takes your vector written in the B basis and puts it into the standard basis. Then take one of these expressions, say P⁻¹ A P = B, and apply that matrix to some vector x_B written in the B basis. On the left-hand side there are really three different things happening. First, you take the vector written in the B basis and multiply by P; that puts it into the standard basis, so P times x_B is the vector in the standard basis. Then you multiply by A; A is some linear transformation that tells you how to manipulate things in the standard basis, so you've taken your vector, now expressed in the standard basis, and manipulated it by the matrix A, whatever A does. Finally you apply P inverse, and P inverse puts the result back into the B basis. So whenever you have a linear transformation like A that is thought of as manipulating something in the standard basis, and you have some vector written in some other basis, you can apply this P⁻¹ A P: it transforms the vector into the standard basis, applies the linear transformation A, and then puts the result back into your basis B. And that's just one example of where similar matrices might arise.
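To illustrate that change-of-basis reading of P⁻¹ A P, here is a small NumPy sketch; the basis, the matrix A, and the coordinate vector x_B are all made-up assumptions for the example.

```python
import numpy as np

# Columns of P are the basis vectors of a basis "B" in standard coordinates,
# so P converts B-coordinates into standard coordinates, and P^{-1} converts back.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# A linear transformation described in the standard basis (made-up example).
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# The same transformation described in the B basis.
B_matrix = np.linalg.inv(P) @ A @ P

# A vector given by its coordinates in the B basis.
x_B = np.array([2.0, -1.0])

# Route 1: convert to standard coordinates, apply A, convert back to B-coordinates.
route1 = np.linalg.inv(P) @ (A @ (P @ x_B))

# Route 2: apply the similar matrix P^{-1} A P directly to the B-coordinates.
route2 = B_matrix @ x_B

print(route1, route2)  # the two agree: [ 4. -2.] [ 4. -2.]
```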
Info
Channel: Dr. Trefor Bazett
Views: 23,755
Rating: 4.8526912 out of 5
Keywords: Similar Matrices, Ax=b, Linear Algebra, Math, Education
Id: jNtiENbAcFM
Length: 8min 47sec (527 seconds)
Published: Mon Jul 03 2017