60 - Similarity of matrices

In the previous lesson we climbed the Olympus of abstractness, at least for some, but I think the people sitting here were able to connect with the ideas and get the general picture. The bottom line of that picture is: matrices and linear maps (linear transformations) are two ways to look at the same thing. So now we're back down to earth, and we want to ask a very practical, very important question. We have a linear map; we know we can represent it by a matrix, and we know how to do it. But we can represent it with several different matrices, because the representation involves a choice of basis: choose a basis, get a matrix representation. Which of those representations is best, and in what sense? That's the question we're going to address in the following several lessons, and first we're going to define a notion of similarity of matrices and see how these things relate.

I want to start with an example. There are going to be some details, some things to calculate, but that's good: that's the back-down-to-earth part. Here's a map T from R^3 to R^3. Throughout this discussion, and in this course from here on, we're only going to consider operators: maps from V to itself, not from V to some different W. So T is an operator, and I have to tell you what it does to a general vector (a, b, c) in R^3: it sends it to (-a, b, 0). We know the geometric meaning of this, a composition of a projection and a reflection, but that's not important right now; we're only thinking of it algebraically. This is what the linear map T does.

Here are two different bases for R^3. E is the set of three vectors (1, 0, 0), (0, 1, 0), (0, 0, 1), which we'll call e1, e2, e3. Do you agree that this is a basis for R^3? Good. Here's another one, which we have encountered before in previous examples: F consists of (1, 1, 0), (1, 0, 1), and (0, 1, 1), and we'll give these names as well: surprise, surprise, f1, f2, f3. We met this before, so I'm not going to go into the details again, but we did show that these are linearly independent and span R^3. Therefore E and F are both bases for R^3, two different bases.

So now we can ask: what is the matrix representation of T with respect to E, and what is it with respect to F? I want to find both. In both cases I'm taking the same basis for the domain and the range, so I'm going to find T_E and T_F. How do we find a matrix representation? We apply T to each basis element and write the result as a linear combination of the basis elements.

So what is T(e1)? e1 is (1, 0, 0), and T(1, 0, 0) = (-1, 0, 0): T flips the sign of the first entry, leaves the second entry intact, and annihilates the third. Now I write it as a linear combination of the e_i's: (-1, 0, 0) = -1*e1 + 0*e2 + 0*e3. Do you agree? Nobody said it's hard. What is T(e2)? It's T(0, 1, 0) = (0, 1, 0), which is 0*e1 + 1*e2 + 0*e3. And finally, T(e3) = T(0, 0, 1) = (0, 0, 0): T sends it to zero.
That's just 0*e1 + 0*e2 + 0*e3. So now I take the coefficients of each expansion and write them as columns. From this I can conclude that the matrix representation of T with respect to the basis E (when the same basis is used for both the domain and the range we usually don't write T_{E,E}, just T_E) is the matrix whose first column is the transposed coefficients of the first expansion, (-1, 0, 0), whose second column is (0, 1, 0), and whose third column is (0, 0, 0):

T_E =
[ -1  0  0 ]
[  0  1  0 ]
[  0  0  0 ]

This is the matrix representation of this linear operator with respect to this basis.

Now I want to do the same thing for the f's: compute T(f1), T(f2), T(f3), write them as linear combinations of the f's, and assemble the matrix representation of T with respect to the basis F. Same procedure, except that everything is easy until I get a result and have to write it as a linear combination of the f's; that means solving a system of equations. When I have, say, (4, 6, 17), it's very easy to write it as a combination of e's: it's 4, 6, and 17. But to write it as a combination of the f's I'd have to solve a system to figure it out, or I can use something I already know: we already found how to write a general vector (x, y, z) as a linear combination of these guys. You can look back at when we first presented this and see that it indeed works; we proved it. Recall that a general vector (x, y, z) can be written as

(x, y, z) = ((x + y - z)/2)*(1, 1, 0) + ((x - y + z)/2)*(1, 0, 1) + ((-x + y + z)/2)*(0, 1, 1).

Does this ring a bell? We did it before, and it's easy to verify: check the first coordinate of the sum, for instance: you get (x + y - z)/2 + (x - y + z)/2 = x, and similarly for the others. The way we found it originally was by solving a system of equations. Once we know this, it's easy to do the same thing for F.

T(f1) = T(1, 1, 0) = (-1, 1, 0): it's the same T, flipping the sign of the first coordinate, leaving the second intact, annihilating the third. Now I write this as a combination of f1, f2, f3, using the formula above with x = -1, y = 1, z = 0. Tell me if I'm correct: you get 0*f1 - 1*f2 + 1*f3. (Check: (x + y - z)/2 = 0, (x - y + z)/2 = -2/2 = -1, (-x + y + z)/2 = 1.) Likewise for the other guys; I'm just going to write it, and you can verify that I didn't make any mistakes. T(f2) = T(1, 0, 1) = (-1, 0, 0), which as a linear combination is -1/2 f1 - 1/2 f2 + 1/2 f3. And finally T(f3) = T(0, 1, 1) = (0, 1, 0), which is 1/2 f1 - 1/2 f2 + 1/2 f3. Take the coefficient matrix, transpose it, and that's the matrix representation of T with respect to the basis F: the first column is (0, -1, 1), the second column is (-1/2, -1/2, 1/2), and the third column is (1/2, -1/2, 1/2):

T_F =
[  0  -1/2   1/2 ]
[ -1  -1/2  -1/2 ]
[  1   1/2   1/2 ]

So starting with two different bases, E and F, we got two different matrix representations, T_E and T_F, and they are different matrices. T_E was, in this case, very nice and
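All of this hand computation can be checked mechanically. Here is a minimal sketch in Python with NumPy (the names `T` and `matrix_of` are mine, not from the lecture): to build the representing matrix in a basis, apply T to each basis vector and solve a linear system for its coordinates in that basis; those coordinate vectors are the columns.

```python
import numpy as np

def T(v):
    # the operator from the lecture: (a, b, c) -> (-a, b, 0)
    a, b, c = v
    return np.array([-a, b, 0.0])

def matrix_of(T, basis):
    # Put the basis vectors in the columns of B; solving B x = T(b_j)
    # gives the coordinates of T(b_j) in that basis, which form
    # column j of the representing matrix.
    B = np.column_stack(basis)
    return np.column_stack([np.linalg.solve(B, T(b)) for b in basis])

E = [np.array([1., 0, 0]), np.array([0., 1, 0]), np.array([0., 0, 1])]
F = [np.array([1., 1, 0]), np.array([1., 0, 1]), np.array([0., 1, 1])]

T_E = matrix_of(T, E)
T_F = matrix_of(T, F)
```

Running this should reproduce the two matrices worked out on the board: T_E with -1, 1, 0 on the diagonal, and T_F full of halves.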
friendly-looking, with a lot of zeros. T_F was somewhat uglier: only a single zero and a lot of halves. Things can get much nastier than that, but still, this one looks friendlier, more user-friendly. So the question is: what's the relation between T_E and T_F as matrices? Somebody gives us this matrix and that matrix; how are they related? And suppose you don't know that they started off as matrix representations of some T with respect to some bases. How are they related then? That's the notion, that's the word: similarity. They're going to be called similar matrices, and that word turns out to hide a lot of information about them. So let's write that down, and then we'll go back to the example some more.

Question: how are T_E and T_F related? Here's the straightforward answer: they are representations of the same linear transformation with respect to different bases. That's a non-intelligent answer, at least the way we presented it, but that's going to be the definition. Definition: two matrices A and B are called similar if they represent the same linear map T with respect to different bases. That's how we define similar matrices; it's a property of matrices. The notation for similarity is a squiggly tilde: A ~ B, read "A is similar to B". So in particular, if you start out with T, it is always true that T_E ~ T_F; that was the way we defined it.

Before we move on, remember that we discussed the notion of an equivalence relation. An equivalence relation is something weaker than equality: similar matrices are obviously not necessarily the same matrix (in the example they were very different matrices), but they're equivalent in some sense. Here they're equivalent in the sense that they represent the same T. Note that this is an equivalence relation. Let me remind you what that means: it has to satisfy three properties, which in this case are completely trivial. We had names for them. The first we called reflexivity: A is always similar to A. Indeed, A and A represent the same linear map with respect to, well, here it's not different bases, it's the same basis; a matrix is always similar to itself. The second is being symmetric: if A is similar to B, then B is similar to A. If A and B represent the same linear map, then B and A represent the same linear map; it's really trivial with this definition, do you agree? And finally, the third axiom is called transitivity: if A is similar to B, meaning they represent the same T, and B is similar to C, meaning they represent the same T, then obviously A and C are similar: they represent the same T. So this is indeed an equivalence relation. Again, something weaker than equality; no claim that they're the same matrix, but they're equivalent in this sense.

Remark, and I'm postponing the details for a bit later: we will see that similar matrices share many properties. The fact that two matrices are similar doesn't imply that they're equal; they're usually not. But many matrix properties are the same for them. For example, they always have the same rank, and they have the same determinant. The rank and the determinant of a matrix are things we've already seen, and you can go
back to that example and verify that T_E and T_F indeed have the same rank and the same determinant. Don't do it now; we're going to do it, I'm just postponing it. We will see it in an example, and in a proof. And there's more. These are just two shared properties; they have many more, but the others are words we haven't defined yet. They have the same trace, but we haven't said what the trace of a matrix is; once we do, similar matrices will have the same trace. They have the same eigenvalues; we're going to define a notion called an eigenvalue, and similar matrices will have the same eigenvalues. They have the same characteristic polynomial; see, the course isn't over yet. "Characteristic polynomial": scary. And more. So similar matrices share many properties. Being equivalent in the sense of being similar is not just that they represent the same T; in fact, as matrices, they have a lot in common.

One caution about all these properties: none of them is an if-and-only-if statement. It is not true that if two matrices have the same rank, the same determinant, the same trace, the same eigenvalues, and the same characteristic polynomial, then they're similar. These are all shared properties, necessary conditions, but they don't determine similarity: you can have two matrices that have all of these in common and yet are not similar.

But here is a property of matrices that IS equivalent to being similar. When I write "if and only if", it means this is an equivalent definition, and you'll see in many books that similarity is actually defined by what I'm about to write, and then the theorem is that matrices are similar if and only if they represent the same transformation. Theorem: A is similar to B if and only if there exists an invertible matrix, let's call it P, that's what it's usually called, such that

B = P^{-1} A P.

This looks very mysterious, huh? And if you're looking at the example and you want to somehow see P: well, I'm going to tell you how to find P, but it's not trivial; it's not that you look at the example and P pops out at you. So: two matrices are similar (somewhere out there there's a linear transformation which they both represent, in different bases) if and only if somewhere out there there's this mysterious matrix P such that B is obtained from A by what is sometimes called conjugation by P: you multiply on one side by P^{-1}, on the other by P, stick A in between, and you get B. Notice that this statement lives entirely in the world of matrices; it has nothing to do with linear transformations. Do you agree? So this is an equivalent definition. We're not going to prove it; we're going to exhibit it on the example, which is what we'll do next. It's not that the proof is too technical, but it's technical in a way that I don't want to scare you off with.

Before we exhibit this on the example, I want to give you the flavor of this thing and tell you what P is. It seems like something very abstract, "there exists an invertible matrix P", but in fact it's not; it's very explicit, and I can tell you precisely what P is. The problem is that I don't want to erase the example, so I'm running out of space, but I can erase here for now and write this as the next remark. By definition, similar matrices are matrices that represent the same linear transformation with respect to different bases; the theorem is that they're similar if and only if there is some invertible P such that B = P^{-1} A P. Remark: P can be found explicitly,
and I'm going to tell you what it is. Take the identity map I from V to V, and take its matrix representation with respect to the two bases F and E, where you take F for V as the domain and E for V as the range:

P = I_{F,E}.

So the I here is the identity map from V to V, and P is its matrix representation when we take two different bases for V. Good. Now it looks even more mysterious: I told you explicitly what it is, but, well, why? The answer is that this is an important matrix, and it has a name: P is called the change of basis matrix between E and F. Why? Because I is a linear map that does nothing: it sends each vector to itself. But when I present V once with the basis F and once with the basis E, all this matrix does is take a vector, send it to itself, and represent it in a different basis. That's what it does, so this matrix is called the change of basis matrix with respect to E and F.

And here is what it satisfies; three things. First: if you take a vector v and its coefficient vector [v]_F in the basis F, then multiplying by P gives you [v]_E, its coefficient vector in the basis E:

P [v]_F = [v]_E.

That's why P is called a change of basis matrix: it takes a vector written with respect to F and sends it to the coefficient vector with respect to E. Second: it also goes the other way. If you take the coordinate vector [v]_E and you want the coordinate vector [v]_F, you have to multiply not by P but by I_{E,F}, and it turns out that this is not just any matrix: it is precisely the inverse,

I_{E,F} = P^{-1},   so   P^{-1} [v]_E = [v]_F.

It's not obvious that I_{E,F} is the inverse of I_{F,E}; that requires proof, and that's part two of the theorem. And part three is about changing representing matrices; it is precisely the previous theorem, or at least one direction of it. Part three: if you take T_E, the representation of T with respect to the basis E, and hit it from the left by P^{-1} and from the right by P, you get precisely T_F, the matrix representation with respect to the basis F:

P^{-1} T_E P = T_F.

So part three is precisely the statement (an "if", not an "if and only if") that if A and B represent the same linear map, and the word for that is similar, then there is an invertible P such that P^{-1} A P = B. Do you see that? It's one direction of the previous theorem, but now stated more concretely: it says precisely what that mysterious P has to be and how it implements the similarity. Again, I'm not proving this theorem; this is just the idea. But I want to show all these details on our example.

All we did so far in the example was find T_E and T_F. So what I want to do now is find P, find P^{-1}, and check that all three statements are indeed satisfied: take a vector, find its coefficient vector with respect to F, multiply by P, and see that we indeed get [v]_E; likewise the other way; and check the conjugation statement. So, are the ideas clear? Okay, let's continue our example.
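Before grinding through the example by hand, the recipe for P in the remark can be written down generically. A sketch with NumPy (the name `change_of_basis` is mine): column j of P = I_{F,E} is f_j expressed in E-coordinates, that is, the solution of a linear system.

```python
import numpy as np

def change_of_basis(F, E):
    # P = I_{F,E}: column j holds the E-coordinates of f_j,
    # obtained by solving (E as columns) x = f_j.
    E_cols = np.column_stack(E)
    return np.column_stack([np.linalg.solve(E_cols, f) for f in F])

E = [np.array([1., 0, 0]), np.array([0., 1, 0]), np.array([0., 0, 1])]
F = [np.array([1., 1, 0]), np.array([1., 0, 1]), np.array([0., 1, 1])]

# Because E happens to be the standard basis here, P comes out
# as just the f_j's written in the columns.
P = change_of_basis(F, E)
```

The same function works for any pair of bases, which is the point: the shortcut "P is the F vectors in the columns" only happens because E is standard.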
Let's recall what we already had; it's going to take two more calculations, so we'll have to do some work. Please follow me on these boards. We had a specific transformation T that sent (a, b, c) to (-a, b, 0). We found T_E, the representing matrix with respect to the given basis E, and we found T_F exactly the same way. Do you agree? Good. So now what we want to do is find P, find P^{-1}, and show that the three statements of that theorem indeed hold in this example.

Example, continued. The first thing I want to do is find P, and remember that P is I_{F,E}: the representing matrix of the identity transformation with respect to the bases F and E, with F for the domain and E for the range. How do I do it? Just like I find any representing matrix. I take I(f1): f1 was (1, 1, 0), and what does I do to (1, 1, 0)? Nothing, I is the identity. But now I have to write it as a linear combination of the e_i's, so it's 1*e1 + 1*e2 + 0*e3. Do you agree? Then the same thing for f2: I(f2) = I(1, 0, 1) = (1, 0, 1) = 1*e1 + 0*e2 + 1*e3. Good. And finally I(f3) = I(0, 1, 1) = (0, 1, 1) = 0*e1 + 1*e2 + 1*e3. Everybody with me? You have to nod once in a while so I know you're all with me. We take the coefficients and transpose them into columns (in this case the transpose happens to make no difference), and nevertheless:

P =
[ 1  1  0 ]
[ 1  0  1 ]
[ 0  1  1 ]

In this specific example it turned out that P is just the vectors of the basis F written in the columns, and there's a reason for it: the other basis is the standard basis. That's what happened. But when E is replaced by another basis rather than the standard one, P is going to look completely different; this is a very specific example. So this is P.

Next we want to find P^{-1}. There are two ways to do it that are directly relevant to what we're doing. But first: we know that P is invertible, and we know it before we wrote anything down. Why? How do you know P is going to be invertible before doing all the calculations? You can't cite things that follow from being invertible; you have to say what we know about P beforehand. The reason is that we're taking an invertible map and finding its matrix representation. I is obviously an invertible map, and when we discussed the relation between maps and their matrix representations, we saw that if the map is invertible, its matrix representation is invertible; moreover, the inverse of the matrix representation is the matrix representation of the inverse map. Remember that? That's exactly why P^{-1} is precisely I_{E,F}: because that represents the inverse map. So we know P is invertible, and for finding its inverse we have two choices.
One: do the same procedure but compute I_{E,F}; that's going to be P^{-1}. Or two: write the augmented matrix [P | I] and do row operations until the left block becomes I; what you have on the right is then P^{-1}. Does everybody understand this notation? It's a method we've been using for a while to find inverses of matrices in general. Either way, and I'm leaving this for you as homework since there's nothing new there (do it both ways and check that you indeed get the same thing), here is what you get:

P^{-1} =
[  1/2   1/2  -1/2 ]
[  1/2  -1/2   1/2 ]
[ -1/2   1/2   1/2 ]

You can easily verify it by multiplying P by P^{-1} and seeing that you get I. So verify. Good, everybody with me? We found P, and we found P^{-1}.
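Option two, row-reducing [P | I] until the left block becomes I, can be sketched as follows (a bare-bones Gauss-Jordan elimination with row swaps; the function name is mine, and a production routine would handle singular input more carefully):

```python
import numpy as np

def inverse_by_row_reduction(P):
    # Row-reduce the augmented matrix [P | I]; once the left block
    # becomes I, the right block is P^{-1}.
    n = P.shape[0]
    M = np.hstack([P.astype(float), np.eye(n)])
    for i in range(n):
        # swap in the row with the largest pivot in column i
        pivot = i + np.argmax(np.abs(M[i:, i]))
        M[[i, pivot]] = M[[pivot, i]]
        M[i] /= M[i, i]                 # scale the pivot row to get a 1
        for j in range(n):
            if j != i:
                M[j] -= M[j, i] * M[i]  # clear the rest of column i
    return M[:, n:]

P = np.array([[1., 1, 0], [1, 0, 1], [0, 1, 1]])
P_inv = inverse_by_row_reduction(P)
# all entries come out as +-1/2, with the minuses on the anti-diagonal
assert np.allclose(P @ P_inv, np.eye(3))
```

In practice you would just call `np.linalg.inv(P)`, but the hand method and the library call should agree here.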
Now let's start deciphering the statements of the theorem and seeing that all of them hold. First, let's collect our data. Note that the first two statements have nothing to do with T: there's no transformation in the picture for them. They are statements merely about P itself, about changing bases. T is going to show up in the third thing we check.

So let's take the vector (1, 2, 3) and call it v. What is [v]_E, its coefficient vector with respect to the basis E? E is the standard basis, so it's just (1, 2, 3), because this vector is 1*e1 + 2*e2 + 3*e3. Good. What is [v]_F, its coefficient vector with respect to the basis F? The basis F is those three guys, so I have to write v as something times f1 plus something times f2 plus something times f3, and then the three somethings are the components of the coefficient vector with respect to F. Since I already have the general formula from before, I can do it right away: plug x = 1, y = 2, z = 3 into the formula and read off the coefficients. (1 + 2 - 3)/2 = 0, so I get a 0 for the first one; (1 - 2 + 3)/2 = 2/2 = 1, so I get a 1; and (-1 + 2 + 3)/2 = 4/2 = 2, so I get a 2. So

[v]_F = (0, 1, 2).

This is the coefficient vector of this v with respect to the basis F. Everybody remember coefficient vectors? I agree that everything can get a bit confusing if you haven't practiced this stuff, because we're using many, many little things that you should feel comfortable with by now; if you're not, it's just confusion, and you have to go back and check everything. It's very straightforward as long as you understand what we're doing.

So now let's check the claims of the theorem. Claim one said: if you take P and multiply it by [v]_F, you get [v]_E. Let's do that:

[ 1  1  0 ]   [ 0 ]   [ 1 ]
[ 1  0  1 ] * [ 1 ] = [ 2 ]
[ 0  1  1 ]   [ 2 ]   [ 3 ]

Indeed: 0 + 1 + 0 = 1, then 0 + 0 + 2 = 2, then 0 + 1 + 2 = 3. So P [v]_F = v: P implements a change of basis; that's precisely what P does. Good.

Part two of the theorem said: take P^{-1}, multiply it by [v]_E, and you get [v]_F. Let's do that too. Copying P^{-1} (all halves, with the minuses on the anti-diagonal):

[  1/2   1/2  -1/2 ]   [ 1 ]   [ 0 ]
[  1/2  -1/2   1/2 ] * [ 2 ] = [ 1 ]
[ -1/2   1/2   1/2 ]   [ 3 ]   [ 2 ]

Isn't it fun to multiply matrices, just like in the very beginning? Let's verify at least some of it: 1/2 + 1 - 3/2 = 0 for the first entry; 1/2 - 1 + 3/2 = 1 for the second; -1/2 + 1 + 3/2 = 2 for the third. So this is indeed [v]_F: P^{-1} gives the change of basis going the other way. Good. So those are the first two properties, and they involved only vectors written in different bases; no linear transformation yet.
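Claims one and two can be replayed in a few lines (assuming NumPy, and hardcoding the P found above):

```python
import numpy as np

P = np.array([[1., 1, 0], [1, 0, 1], [0, 1, 1]])
P_inv = np.linalg.inv(P)

v_E = np.array([1., 2., 3.])      # (1,2,3) in the standard basis E
v_F = P_inv @ v_E                 # claim two: P^{-1} converts E- to F-coordinates
assert np.allclose(P @ v_F, v_E)  # claim one: P converts F- back to E-coordinates
```

The same two lines work for any vector, not just (1, 2, 3); that is the content of the first two parts of the theorem.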
no longer on the board I'm gonna write it in a second so what do we know what do we know let's collect some more details so first of all I can erase P inverse here we're gonna write it again again so the first statement is that if you take T I want you to verify these things if you take T with respect to the basis e and multiply it by V with respected basis e what are you gonna get right you're gonna get T of V with respect to the basis do you agree okay that's what a linear a lit a matrix representation does do you agree okay so let's verify that let's verify this so what is T with respect to the basis II it was negative 1 1 0 and 0 is elsewhere this is this do you agree what is V with respect to the vet to the basis e in general a general vector V with respect to the basis e is ABC right that's a general vector with respect to the basis e what do we get we get negative a be 0 we get TFE with respect to the basis he that's what T does do you agree ok so this is the general statement and this is in the example do you agree okay what happens if you take T with respect to the basis F it's matrix representation with respect to the basis F and multiply it by a vector a coordinate vector with respect to F you're gonna get T of e written in the basis F do you agree that's what T does okay this is not part of this theorem this doesn't have anything to do with P yet this is the previous theorem that you promised me that it doesn't freak you out right okay so if you check what that does so T F is 0 negative 1/2 1/2 negative 1 negative 1/2 negative 1/2 and 1 1/2 1/2 that's what T sub F was that was the matrix representation of T with respect to F okay and now if I take a general vector represented in the base F it's gonna look like instead of ABC I'm gonna have to write its components with respect to the vector F so it's gonna be somewhat nastier so it's gonna be a minus c plus B over 2 instead of XYZ I'm writing ABC and taking the coefficients right so that's going to be 
the first component; then a minus b plus c over 2, and minus a plus b plus c over 2. That's a b c written in the basis F. Do you agree? Follow me to this board a second: that's a b c written in the basis F — this is its first component, this is its second component, and this is its third component. Good. Okay, and what you get is what would happen to this if you wrote it in the basis F, right? It would be T of this guy, written in the basis F. Okay, and you can calculate it either by carrying out this multiplication, which is a bit more tedious but not that much, or you can apply T to this and write the result in the basis F — sorry, this already has T applied; just write it in the basis F. Okay, and in any event, what you're gonna get — I'm gonna squeeze it in here — is, and you have to verify this: minus a plus b over 2, minus a minus b over 2, and a plus b over 2. Okay, so obviously c is gonna die out here. It's not obvious that c dies out in the multiplication, right? But we know it: just plug this in as x y z there — there's gonna be no c there, right? So verify that this is indeed the case. Okay, verify this. Good. And finally, the last statement involving T, or the representations of T, is this statement — the similarity statement. Okay: there exists a P, an invertible matrix — and we know what it is, we found it, it's this matrix — such that if we take T's matrix representation with respect to E, hit it from the right by P and from the left by P inverse, we get the matrix representation with respect to F. Okay, so let's write that. Part three of the theorem translates to the following thing, which you have to verify: we want to write that P inverse T sub E P equals T sub F, right? And in terms of this example: P inverse we already know is all these one-halves all over the place, with minuses on the anti-diagonal — so this is P inverse. T sub E was negative 1 0 0, 0 1 0, 0 0 0 — this was T sub E. P was 1 1 0, 1 0 1, and 0 1 1, right? And this should equal T sub F.
T sub F was this matrix: 0, negative one-half, one-half; negative 1, negative one-half, negative one-half; and 1, one-half, one-half. It looks correct. Okay, so this is what part three of the theorem amounts to: we found the two matrix representations of T, with respect to E and F, we found P and P inverse, and they have to satisfy this. Okay, and how do you check that they indeed satisfy it? Well, you have to carry out the multiplications, and there are two of them, right? You first multiply these two, then you multiply by this — this one is easy, there are plenty of zeros and all the rest are ones — then you have to multiply by this, which is not that difficult, and you should get this. Okay, so verify. Good? Everybody? Okay. But I think it's pretty clear what's going on here. You start with a linear map; once you choose a basis — and it's a matter of choice, right, you can choose a basis — you get a matrix representation of the map with respect to that basis. That's something like this, or something like this. If you take two different bases, you're gonna get two different matrices: here's one, here's the other. When you look at them, you really don't see that they have anything to do with each other, right? If you look at this matrix here — look here a second — and you look at this matrix here, they don't seem related in any way. Okay, you can't know just by looking at them that they're what we're gonna call similar. Okay. Moreover, as we said previously, if you start studying them — you calculate ranks, you're gonna see, hey, they have the same rank; you calculate determinants, hey, they have the same determinant — they're starting to look similar, right? But that still doesn't suffice to prove that they're similar. Those are just necessary conditions, not sufficient ones. Okay, they're similar in the sense that this holds, and that's not something you can see; you need to calculate P and P inverse and calculate the product. Okay — I dropped this, I picked it up, here
I am again. Okay, so it requires some technical calculations when you're working on examples. Okay, so all of this discussion was only to throw in the notion of similarity — that's all we did so far. Okay: when are two matrices similar? We still didn't answer the important question that we mentioned, namely: how do we find the best one? Somebody gives you a transformation — this is where we started in this example, right? Somebody gives you a transformation, over here on this board. How do you find the best matrix representation? Which amounts to: how do you find the best basis to work with? Okay, one of the bases is gonna give the best matrix. In this case it turned out to be the standard basis, so this example really is not convincing that it's ever good to look at other bases, right? Here the standard basis is the best — you get almost the identity matrix, or almost the zero matrix, just a pretty neat one — whereas the one for F was not as nice. Okay, but in general that's not going to be the case, so we're gonna have to develop some machinery that looks at the data and somehow constructs the basis with respect to which we're gonna have the nicest matrix. Okay, and that's where we're heading. Okay, so let's just summarize the two questions that we still haven't answered — and that's gonna be the last statement for today. I still didn't show some of the properties that similar matrices share; that I'm gonna do next, in the sense of the next clip. But where we're heading in general is answering the following two questions. So, question one: given T, how do we find the best matrix representation? That's the big question that we're gonna have to answer — the practical, very important question. And another question, which is of a more theoretic nature: given A and B, just two matrices, how do we determine whether they're similar or not? How do we know if they're similar?
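To get a feel for why that question is hard, here is a minimal numeric sketch of my own (the lecture itself uses no software; this is just NumPy illustrating the point): two matrices can agree on every easy invariant — trace, determinant, rank, even the characteristic polynomial — and still fail to be similar.

```python
import numpy as np

# My own aside, not from the lecture: shared invariants are necessary
# for similarity, but not sufficient.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
I = np.eye(2)

# All the easy invariants agree...
print(np.trace(A) == np.trace(I))                              # True
print(np.isclose(np.linalg.det(A), np.linalg.det(I)))          # True
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(I))    # True

# ...yet A is not similar to I: for any invertible P we have
# P^{-1} I P = I, so the only matrix similar to I is I itself,
# and A is clearly not I.
print(np.allclose(A, I))                                       # False
```

Both matrices even share the characteristic polynomial (lambda minus 1) squared, so no list of such invariants can certify similarity on its own.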
Everything we did so far doesn't answer that, okay? Because finding P in terms of I sub E,F — what we did — depended on the fact that we already knew they were representations of the same T. So we knew what the domain space was, what the range space was, what the bases were, and then we could say: okay, I sub E,F is going to be the P, and therefore they're similar. Great — but that's cheating, right? We knew that they're similar, so we can answer, hey, they're similar. That's not a real answer. Somebody gives you two matrices, okay? Here they are: somebody gives you this matrix here and this matrix here — are they similar? There's no way you can come up with this P and this P inverse if you don't know the T that they arose from. Okay, so how do you know that these two matrices are similar? Okay, and that's gonna be a harder question, because all the properties that we're gonna define for matrices — we mentioned several of them, right: the trace, the characteristic polynomial, the eigenvalues, and so on — are all going to be just necessary conditions and not sufficient. Okay, so answering that question in fact goes beyond this course; it's a harder question to answer, and it's of less practical importance, at least in the realm of this course. Okay, so the question we're really interested in — and that's most of what we're gonna do in the next couple of weeks — is answering this question: how do we find the best one? And as we mentioned, this is of crucial importance in applications: we want to do fewer calculations, we want to have a lot of zeros there. Okay, so that's where we're heading next.
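For reference, all three claims of the theorem for the worked example in this lesson can be checked in a few lines — a sketch of my own assuming NumPy, with the names T_E, T_F, P following the blackboard notation:

```python
import numpy as np

# The lecture's worked example: T(a,b,c) = (-a, b, 0) on R^3.
T_E = np.diag([-1.0, 1.0, 0.0])      # representation in the standard basis E
P = np.array([[1.0, 1.0, 0.0],       # columns are the F-basis vectors
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
P_inv = np.linalg.inv(P)             # equals 1/2 * [[1,1,-1],[1,-1,1],[-1,1,1]]

v_E = np.array([1.0, 2.0, 3.0])      # the vector v in E-coordinates
v_F = np.array([0.0, 1.0, 2.0])      # the same vector in F-coordinates

# Part 1: P carries F-coordinates to E-coordinates.
assert np.allclose(P @ v_F, v_E)
# Part 2: P inverse goes the other way.
assert np.allclose(P_inv @ v_E, v_F)
# Part 3: similarity — the two representations of T satisfy T_F = P^{-1} T_E P.
T_F = P_inv @ T_E @ P
expected = np.array([[ 0.0, -0.5,  0.5],
                     [-1.0, -0.5, -0.5],
                     [ 1.0,  0.5,  0.5]])
assert np.allclose(T_F, expected)
print("all three claims check out")
```

Running it prints the confirmation line, which is exactly the hand verification the lecture asks for.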
Info
Channel: Technion
Views: 18,586
Rating: 4.8863635 out of 5
Keywords: Technion, Algebra 1M, Dr. Aviv Censor, Technion -
Id: MJic8o5ph5M
Length: 70min 35sec (4235 seconds)
Published: Mon Nov 30 2015