Similarity Transformation and Diagonalization

Captions
Hello everyone, and welcome to another video. Today I'd like to talk about similarity transformations. Within the context of linear algebra, we're going to see that similarity transformations involve two matrices that are related to one another; they share some properties and therefore, as the name suggests, they're similar. What I'd like to do today is go over the definition of a similarity transformation and then go over some of the properties these matrices share. We're going to see later that one of the most useful applications of similarity transformations is to perform what's known as diagonalization, and we'll talk about that before going into some examples. We'll do two examples, one using what's known as a non-defective matrix and the other using a defective matrix, just so we can see the difference; what these two terms mean will make sense when we get there. So if that sounds like fun, let's jump right in.

What's the definition of a similarity transformation? It's actually really simple. Suppose you have two square matrices, square meaning they're of dimension n by n; call these matrices A and A-tilde. These two matrices are called similar if there exists an invertible n-by-n matrix, call it T for "transformation" (a similarity transformation matrix), such that you can take A, left-multiply by T⁻¹, and right-multiply by T to get A-tilde. In other words, A-tilde = T⁻¹ A T. If you're able to find such a matrix T, then A and A-tilde are similar to one another, and we're going to see what that means: A and A-tilde share some properties. In fact, let's start talking about that right now. In this discussion I'd like to look at five different properties of the matrices A and A-tilde and see how they relate to one another.

The first property I'd like to investigate is the determinant of A versus the determinant of A-tilde, and we're going to see that both matrices have exactly the same determinant. To show that, let's make use of a couple of standard determinant facts. In this video I'd like to focus on the similarity transformation itself, not on proving these individual matrix properties, so I'll just state them; feel free to verify them on your own. First (call it P1), the determinant of a product is the product of the determinants: det(AB) = det(A) det(B). To avoid writing "det" all over the place, I'll use the bar notation |A| for a determinant. The second property I'd like everyone to remember (call it P2) is that the determinant of a matrix inverse is the reciprocal of the determinant: |A⁻¹| = 1/|A|. Again, these are just properties of determinants; they have nothing to do with similarity transformations, it's just how determinants of matrices work.

Now let's apply these two facts to the definition of our similarity transformation and see what pops out. What is the determinant of A-tilde? Well, A-tilde is just T⁻¹ A T. By P1, this is a product of three matrices, so I can break it up into individual determinants: |A-tilde| = |T⁻¹| |A| |T|. Now use P2 on the first factor: |T⁻¹| is just 1/|T|. So we have |A-tilde| = (1/|T|) |A| |T|, and remember that a determinant is just a scalar value, so the |T| in the numerator cancels the |T| in the denominator. We end up with |A-tilde| = |A|. This is fascinating: under a similarity transformation the determinant remains unchanged. As long as T is invertible, you can use absolutely any T and it is not going to change the determinant; A-tilde will have exactly the same determinant you started with. That's the first property; let's go ahead and look at another.
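As a quick numerical sanity check, here is a minimal MATLAB sketch of property one. The matrices here are assumptions for illustration (a random T is invertible with probability one), not the example matrices used later in the video:

```matlab
% Minimal sketch (assumed example matrices; not the ones from the video).
rng(0);                       % make the example repeatable
A = randn(4);                 % an arbitrary 4x4 matrix
T = randn(4);                 % candidate similarity transformation matrix
assert(abs(det(T)) > 1e-10);  % confirm T is actually invertible
Atilde = T \ (A * T);         % Atilde = inv(T)*A*T, via backslash for stability
fprintf('det(A)      = %g\n', det(A));
fprintf('det(Atilde) = %g\n', det(Atilde));  % same, up to round-off
```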
The second property I'd like to investigate concerns the characteristic equations of A and A-tilde: we're going to see that they're the same. Remember that you solve the characteristic equation of a matrix in order to find its eigenvalues, so if the characteristic equations are the same, the eigenvalues are the same. Let's prove that right now. Recall the definition of an eigenvalue and eigenvector: starting with a matrix A, if A v = λ v, then λ is an eigenvalue of A and v is the associated eigenvector. I can rewrite this as (A − λI) v = 0, where I is the identity matrix; that's the exact same thing, just moving everything to one side and factoring out v. The only way this holds for a non-trivial v is if A − λI is singular, which implies det(A − λI) = 0. That is the matrix characteristic equation: you solve this thing for λ.

Now let's rewrite this using the fact that A is related to A-tilde. Since T is invertible, I can take A-tilde = T⁻¹ A T, left-multiply by T and right-multiply by T⁻¹, and rewrite it as A = T A-tilde T⁻¹; that's the exact same relationship. Substituting this in for A, the characteristic equation becomes det(T A-tilde T⁻¹ − λI) = 0. For the identity matrix I can write I = T T⁻¹, so this is det(T A-tilde T⁻¹ − λ T T⁻¹) = 0, which lets me factor a T out on the left and a T⁻¹ out on the right: det(T (A-tilde − λI) T⁻¹) = 0. This looks a whole lot like what we did in property one: it's the determinant of a product of three matrices, so let's break it up into three individual determinants, |T| |A-tilde − λI| |T⁻¹| = 0. Again |T⁻¹| is 1/|T|, a scalar which cancels with the scalar |T|, so we end up with det(A-tilde − λI) = 0. But that is exactly the characteristic equation for A-tilde, and it's the same equation as the characteristic equation for A. Since we solve the characteristic equation to get eigenvalues, this implies that A and A-tilde have the same eigenvalues. The eigenvalues remain unchanged under any similarity transformation: as long as T is invertible, you are not going to change the eigenvalues, and A-tilde will have the same eigenvalues as A. That's pretty interesting. I should also mention, before we move on to the third property, that this is true even with multiplicity: even if there are repeated eigenvalues in A, it doesn't matter, the same repeated eigenvalues will show up in A-tilde.
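Continuing the same assumed example from before, here is a sketch of the eigenvalue check. Note that eig may return the eigenvalues in different orders for the two matrices, so we sort before comparing:

```matlab
% Sketch: eigenvalues (with multiplicity) are preserved.
lamA      = sort(eig(A));
lamAtilde = sort(eig(Atilde));
fprintf('max eigenvalue mismatch: %g\n', max(abs(lamA - lamAtilde)));  % ~0
```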
So, since we're talking about eigenvalues, I guess your next question is: what about the eigenvectors? We just showed that the eigenvalues of A-tilde are the same as the eigenvalues of A. Property number three looks at the eigenvectors, and this is where they're not the same, but they are related, so let's take a look at that right now.

Consider a single eigenvalue/eigenvector pair, which I'll denote λᵢ and vᵢ, of the original matrix A. Again, we saw the λ's don't change: the λ's for A are the same as the λ's for A-tilde. But I want to make the distinction that the notation vᵢ denotes an eigenvector associated with eigenvalue λᵢ of the original matrix A.

Now let's see how this relates to the eigenvectors of A-tilde. Take A-tilde and multiply it by some arbitrary vector uᵢ. Since A-tilde = T⁻¹ A T, we have A-tilde uᵢ = T⁻¹ A T uᵢ. Since uᵢ is arbitrary, why don't we make a particular choice: let uᵢ = T⁻¹ vᵢ, the eigenvector of A pre-multiplied by T⁻¹. Substituting that in, we get A-tilde uᵢ = T⁻¹ A T T⁻¹ vᵢ. Luckily, the T T⁻¹ in the middle becomes the identity matrix, which helps us, so we're left with A-tilde uᵢ = T⁻¹ A vᵢ. Now remember the whole deal with an eigenvector: if I multiply A by this magical vector, I get the same vector back, just scaled by the eigenvalue, A vᵢ = λᵢ vᵢ. Substituting that in, we get A-tilde uᵢ = T⁻¹ λᵢ vᵢ, and since λᵢ is just a scalar, not a vector or a matrix, I can move it to the front with no hassle: A-tilde uᵢ = λᵢ T⁻¹ vᵢ. And now recall what uᵢ was: it's exactly T⁻¹ vᵢ, so this whole thing reads A-tilde uᵢ = λᵢ uᵢ.

This is perfect, because if you stare at it long enough, this is exactly the definition of an eigenvalue/eigenvector pair for the matrix A-tilde: a matrix times some vector equals the same vector scaled by the scalar λᵢ. So uᵢ has to be an eigenvector of A-tilde associated with λᵢ. And what is uᵢ? It's T⁻¹ vᵢ. So the takeaway for property number three is that the eigenvectors of A-tilde are the eigenvectors of A pre-multiplied by T⁻¹. They're related, or similar, to the eigenvectors of A, but not exactly the same: you need this T⁻¹, which maybe highlights why we needed T to be invertible. Again, as long as T is invertible this is true, and it holds even with multiplicity of eigenvalues, so this is always how you can find the eigenvectors of A-tilde if you know the eigenvectors of A. Great, that's interesting. Give me a second to erase the board and let's move on.
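Here is a sketch of property three on the same assumed example, checking that T⁻¹vᵢ really is an eigenvector of A-tilde:

```matlab
% Sketch: for each eigenpair (lambda_i, v_i) of A, the vector
% u_i = inv(T)*v_i should satisfy Atilde*u_i = lambda_i*u_i.
[V, D] = eig(A);              % columns of V: eigenvectors of A
for i = 1:size(A, 1)
    u = T \ V(:, i);          % u_i = inv(T) * v_i
    residual = norm(Atilde*u - D(i, i)*u);
    fprintf('eigenpair %d residual: %g\n', i, residual);  % ~0
end
```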
Property number four of similar matrices is that they have exactly the same trace. Remember the definition of the trace of a square matrix A: it's just the sum of all the diagonal elements. You take the (1,1) element, add the (2,2), add the (3,3), and so on up to the (n,n) element; sum up all the entries along the diagonal and that's the trace. Again, we're going to use a property of the trace that I'm not interested in deriving or proving here; we'll just apply it to our similarity transformation. The property is this: if A is the product of two matrices B and C, then trace(BC) = trace(CB). The two factors don't even have to be square, they just need compatible dimensions: B could be n rows by m columns, in which case C had better be m by n, so that at the end of the day the product is an n-by-n matrix.

So let's compute the trace of A-tilde. Again, A-tilde is just T⁻¹ A T. Let's break this up into the product of two matrices: call B = T⁻¹ and call C = A T. Then, using our trace property where we can flip the order of the factors, trace(T⁻¹ · AT) = trace(AT · T⁻¹). And this is super easy, because T T⁻¹ becomes the identity matrix and we end up with just trace(A). So trace(A-tilde) = trace(A). Again, pretty interesting: the trace of the matrix is not changed under a similarity transformation.
Property number five concerns the matrix rank, and we're going to see that A and A-tilde have the same rank. Just to refresh your memory, the rank of a matrix A is the dimension of the space spanned by the columns of A; the other way to think about it is that it's the maximum number of linearly independent columns of A. If we take that perspective, you can see that if you have a matrix A and you transform it by some invertible T, you're not changing the number of linearly independent columns, because T is invertible, meaning it's full rank, not singular. So rank(TA) = rank(AT) = rank(A) for T invertible, and the invertible factor can be on either side. In fact, the same idea applies with T⁻¹: because T is invertible, T⁻¹ is also invertible, so multiplying by it on either side also leaves you with rank(A). Let's put these in the bank.

If you want to convince yourself more carefully that this is true, we can use the property of matrix rank that says the rank of a product of any two matrices is bounded by the ranks of the factors: rank(AT) ≤ min(rank(A), rank(T)). Now note that T is invertible if and only if T is full rank, that is, rank(T) = n for our square n-by-n matrices. So if T is invertible, rank(T) has to be greater than or equal to rank(A), the minimum is rank(A), and we can write: if T is invertible, rank(AT) ≤ rank(A). In my notes I have this as equation 6b, so let's use the same numbering here so we don't get mixed up.

Now let's use 6b again, but change the matrices: instead of A, use the product AT, and instead of T, use T⁻¹. That gives rank(AT · T⁻¹) ≤ min(rank(AT), rank(T⁻¹)). The left-hand side is just rank(A). And T⁻¹ is full rank: because T is invertible, T⁻¹ is invertible, so rank(T⁻¹) = n, which has to be at least as big as rank(AT). So we get rank(A) ≤ rank(AT); call this equation 6c. If you look at equations 6b and 6c, the only way both are satisfied simultaneously is if rank(AT) = rank(A), because the same quantity sits on opposite sides of the inequality. You can do the exact same thing for TA, and again with T⁻¹ in place of T. So this discussion, if you glossed over it, is just to convince ourselves that those facts are true, and it gets us in the mood for applying the result to our similarity transformation.

So what is the rank of A-tilde? Again, A-tilde is just T⁻¹ A T. Let's break this up into the product of two matrices: group T⁻¹A as one matrix, times T. Pursuant to the result above, where the rank of some matrix times an invertible, full-rank T is the rank of that original matrix, this is just rank(T⁻¹A). And that is the exact same situation: a product of two matrices, an invertible matrix times A, so it's just rank(A). So there you go: rank(A-tilde) = rank(A). We see again that the rank of the matrix is unchanged under a similarity transformation.
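Properties four and five are one-liners to check numerically, continuing the same assumed example:

```matlab
% Sketch: trace and rank are also unchanged by the transformation.
fprintf('trace: %g vs %g\n', trace(A), trace(Atilde));  % equal up to round-off
fprintf('rank:  %d vs %d\n', rank(A), rank(Atilde));    % identical
```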
All right, that was great: we saw that a similarity transformation preserves a lot of properties between the matrix A and the transformed, or similar, matrix A-tilde, with the exception of the eigenvectors, which we saw were related but not identical. Remember, the properties we just talked about are true for any similarity transformation, as long as T is invertible. What I want to talk about now is one of the most useful types of similarity transformation: I'd like to look at a particular T that is going to diagonalize your matrix. What we're going to see is that the matrix A-tilde you end up with is going to be diagonal: values only along the diagonal and zeros everywhere else. This process is known as diagonalization, because we're going to take a matrix A, which might have entries all over the place, and, through a similarity transformation with a very specific T, turn that A-tilde matrix completely diagonal.

To go about that, we're going to see this has everything to do with the eigenvalues and eigenvectors of the matrix. Again, remember that for your matrix A, the idea with an eigenvector vᵢ is that if you multiply it by A, you get that same vector back scaled by the eigenvalue: A vᵢ = λᵢ vᵢ, for i = 1, 2, up to n. So let's write this out as n separate equations: A v₁ = λ₁ v₁, A v₂ = λ₂ v₂, and so on, all the way down to A vₙ = λₙ vₙ. These are n individual vector equations: each one says one vector equals another vector, and each is n elements long; the left-hand side A v₁, for instance, is an n-by-1 vector.

What I'd like to do now is stack all of these up together. Start building a matrix whose first column is A v₁; the corresponding first column on the right-hand side is λ₁ v₁. Would everyone agree that's the same thing? I haven't rewritten anything, I've just stacked it up: the first equation is the first column of this matrix equation. Do the same thing for the second column, A v₂ on the left equals λ₂ v₂ on the right, and so on, all the way down to A vₙ = λₙ vₙ in the last column. Think about the dimensionality: each column is n by 1, and you stack n of them side by side, so both sides are n-by-n square matrices. We have now taken all of the eigenvalue/eigenvector pairs and written them as one matrix equation.

Now, what's great about this is the following. Examine the left-hand side: A pre-multiplies every column, so let's pull it out on the left, giving A times the matrix [v₁ v₂ … vₙ]. And on the right-hand side, remember all these eigenvalues are just scalar numbers, so if you stare at that n-by-n matrix long enough, you'll see you can write the whole thing as [v₁ v₂ … vₙ] times a diagonal matrix with λ₁, λ₂, …, λₙ on the diagonal and zeros elsewhere. Take a little while to convince yourself that the product of those two n-by-n matrices is exactly the matrix we stacked up.
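To summarize the derivation so far, the stacked system can be written compactly as:

```latex
A \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}
= \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}
  \begin{bmatrix}
    \lambda_1 &           &        &           \\
              & \lambda_2 &        &           \\
              &           & \ddots &           \\
              &           &        & \lambda_n
  \end{bmatrix}
```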
So we end up with A times the matrix [v₁ v₂ … vₙ] equal to that same matrix [v₁ v₂ … vₙ] times the diagonal matrix of λ₁ through λₙ, with zeros elsewhere. Now this is fascinating, because the matrix [v₁ v₂ … vₙ] shows up on both sides. Let's call it T; this is my similarity transformation matrix. And let's call the diagonal matrix A-tilde. Then we can write this as A T = T A-tilde. Here's the rub: look at what we did. We chose our transformation matrix T to be the eigenvectors of A stacked up in column format. I call out "column format" because, depending on what software package you're using, you might get rows instead; Mathematica, for example, gives you the eigenvectors of a matrix in row format, and that's not what we're doing here. This matrix T has the first eigenvector as its first column, the second eigenvector as the next column, et cetera, all the way up to the nth eigenvector as the last column. It's a very special transformation matrix, and the equation A T = T A-tilde is true no matter what.

But notice we haven't actually done a full similarity transformation yet, because we haven't isolated A-tilde. Here's where we have to pay a little attention. If T is invertible, that's great, because then I can isolate A-tilde and we end up with our usual A-tilde = T⁻¹ A T. And what's special about that is the form A-tilde took: A could have been anything, but if we use the matrix of eigenvectors as our similarity transformation matrix, we end up with an A-tilde that has the eigenvalues falling along the diagonal and zeros elsewhere. This A-tilde is a purely diagonal matrix, which is very, very nice for a lot of different reasons.

The thing I really want to hammer home is that this works only if T is invertible. If T is invertible, we can diagonalize the matrix. If T is not invertible, A T = T A-tilde is still true (A times the eigenvectors still equals the eigenvectors times the diagonal eigenvalue matrix), but you're not going to be able to get all the way down to A-tilde = T⁻¹ A T. Maybe I'm talking about this too much, but I want to stress that this is very, very useful. We're going to see it show up over and over, especially in control systems and linear system theory, where the result is sometimes referred to as modal form, and this T is sometimes known as the modal matrix of A. Again, T is just the eigenvectors of A stacked up in column form. I think it's beyond the scope of today's lecture to talk about what the word "modal" means; we'll cover that in other discussions when we look at linear system theory and dynamic systems, where it will be much clearer why this is called a modal matrix.
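Here is a minimal MATLAB sketch of the diagonalization step, again on an assumed matrix A rather than the example from the video (a random matrix is almost surely non-defective):

```matlab
% Sketch: diagonalization via the modal matrix (eigenvectors as columns).
% Works when the eigenvectors span the space, i.e. A is non-defective.
[M, ~] = eig(A);              % modal matrix: eigenvectors in column format
if rank(M) == size(A, 1)      % eigenvectors linearly independent?
    Adiag = M \ (A * M);      % should be diagonal: eigenvalues on the diagonal
    disp(Adiag)
else
    disp('A is defective: modal matrix is singular, cannot diagonalize.')
end
```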
For now, we can see that the power of this is that it allows us to diagonalize a matrix, and there are some caveats: we have to ensure that the eigenvectors span the space. Let me get that straight: if the eigenvectors are all linearly independent, meaning they span the space, then T will be invertible and you can diagonalize the matrix. It might help to look at the Wikipedia page on diagonalization; in fact, I'll flash up a screenshot of the text there. It says a square matrix is diagonalizable if and only if the sum of the dimensions of its eigenspaces is equal to n, the dimension of the full space. That's exactly what we're talking about: if the eigenvectors span the space, the matrix is diagonalizable and you can perform this diagonalization procedure. In that case, A is sometimes referred to as a non-defective matrix, because it can be diagonalized; if it cannot be diagonalized, it's sometimes referred to as a defective matrix.

And that's what I want to look at right now: let's take a look at an example of all of this in those two situations, with a non-defective matrix and a defective matrix. We'll run through all the properties we talked about and show they hold regardless of whether the matrix is defective, but we'll see that diagonalization is only possible for the non-defective matrix.

Here are the two example matrices I'd like to look at. The first one is non-defective: its eigenvectors span the space, and again, that only matters for our discussion of diagonalization. We'll use a similarity transformation matrix T to compute our similar matrix and see that properties one through five all hold. Then, since this matrix is non-defective, we'll use another transformation matrix, which we'll call M instead of T to denote the modal transformation (it's just the eigenvectors), and we'll diagonalize A. For the second example I'd like to look at a slightly different matrix: it's the same as the first, and the only entry that's different is the one in the (1,1) position, where instead of a 3 we'll put a 5. This matrix is actually defective: its eigenvectors do not span R⁴ in this case. We'll use the exact same transformation matrix T to perform the similarity transformation and see that, even though this matrix is defective, properties 1 through 5 still hold; the only issue is that we won't be able to diagonalize it using the modal transformation of its eigenvectors. So why don't we jump over to MATLAB and actually do this, because it'll be a lot quicker and easier to have MATLAB do these operations.
All right, here we are. I've written a little bit of code ahead of time to save ourselves some headache. As you can see, I'm going to start by grabbing the non-defective matrix. Let's run this, and here is the similarity transformation matrix T we talked about on the board; let's run to the next breakpoint to get T into the workspace. Now let's make sure T is invertible by computing its determinant, and we see: yep, that does not look anywhere near zero, so T is definitely invertible and we should be able to perform our similarity transformation as T⁻¹ A T. Let's do that, and as you can see, we get something totally different from what we started with: here was A, and here's A-tilde, which looks completely different. But as we discussed on the board, four properties should be exactly the same, and the fifth, the eigenvectors, should be closely related.

So let's take a look. Property one: the determinants of A and A-tilde should be the same. Executing that, we see yes, the determinant of both matrices is 20 in this case. How about the eigenvalues? Remember MATLAB has a function called eig, which returns the eigenvalues in a diagonal matrix D and the eigenvectors in column format as a matrix V. That's what I'm doing here: I look at the eigenvectors and eigenvalues of A, and then the eigenvectors and eigenvalues of A-tilde. Let me put a breakpoint right here and run to this point. For property two I only care about the eigenvalues, so let's look at D. As you can see, D is a diagonal matrix: if you open it up, all the off-diagonal entries are 0 + 0i, and the diagonal entries are the eigenvalues. It looks like this matrix A has eigenvalues at 3 + 1i, 3 − 1i, 1, and 2. I'm going to rip those off using the diag function to extract just the diagonal entries, so if I hit F10 and run that line, you'll see that lambda is a list of all the eigenvalues of A: again, 3 + 1i, 3 − 1i, 1, and 2. Doing the same thing for A-tilde, you see it's the same but a little bit different, in the sense that the ordering is different: the first two eigenvalues have the exact same value and order, but then, for whatever reason, MATLAB's eig function decided to return the remaining eigenvalues in a different order. Regardless, the eigenvalues are the same; we just need to be a little careful, because the matrices of eigenvectors will be ordered differently too. This is enough to show property 2: the eigenvalues of A and A-tilde are exactly the same.

Now let's look at the eigenvectors. We discussed on the board that the eigenvectors of A-tilde should be the eigenvectors of A pre-multiplied by T⁻¹, so that's what I compute right here: what the eigenvectors of A-tilde should be, using the relationship we derived, T⁻¹ times the eigenvectors of A. Then I use a small function I wrote called areMatricesSame; don't try to search for it in the MATLAB help, it's my own custom function, which just checks whether two matrices are the same to within some tolerance, in this case 1e-10, some very small value.
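That helper isn't shown on screen; a minimal sketch of what such a function might look like (the name, signature, and tolerance logic are assumptions) is:

```matlab
function same = areMatricesSame(M1, M2, tol)
% Sketch of a helper like the one described in the video (the actual
% function is not shown; name, signature, and logic are assumptions).
% Returns true if M1 and M2 agree element-wise to within tol.
    same = isequal(size(M1), size(M2)) && all(abs(M1(:) - M2(:)) < tol);
end
```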
So what I'm checking here is that A-tilde times what I claim is eigenvector 1 equals the corresponding eigenvalue (which we just showed is shared by A and A-tilde) times that same vector. In other words, I'm verifying that this quantity I'm calling eigVecTilde really does contain the eigenvectors of A-tilde, so all of these checks should come back true. Let me step through: yep, the first one matches, so the first eigenvector is correct; the second matches, the third matches, and the fourth matches. So eigVecTilde is in fact the four eigenvectors of A-tilde. And that's great, because notice I did not use the eigenvectors MATLAB computed for A-tilde at all: I used the eigenvectors of the original A matrix and our similarity transformation relationship to get them. So property 3 works for this case: the eigenvectors of A-tilde are related to the eigenvectors of the original A through this transformation.

The last two properties are almost trivial. Looking at the trace, we see that both A and A-tilde have a trace of 9, and they both have the same rank, namely rank 4.

Now let's talk about diagonalization. We just saw that those five properties hold regardless of the similarity transformation; now let's switch gears. In this case I do want to use a specific transformation matrix: the modal transformation matrix M, which is the eigenvectors of the original A matrix. Let's be very clear on this: the matrix V here holds the eigenvectors of A, not A-tilde. If I use the eigenvectors of A, I can perform my similarity transformation using the modal transformation matrix, and I should end up with a purely diagonal matrix with the eigenvalues along the diagonal. So let me step through and look at the result: this totally works. Remember the variable lambda held the eigenvalues, 3 ± 1i, 1, and 2, and look at this: here's 3 + 1i, here's 3 − 1i, here's 1, and here's 2, all along the diagonal. So indeed this modal similarity transformation has diagonalized my A matrix: the A-tilde I'm calling A_diagonal is indeed diagonal. And when I say it totally worked, I mean the diagonalization process worked for this non-defective matrix.
Okay, now let's change this: instead of the non-defective matrix, let's use the defective matrix and go through the exact same exercise. I can take all the breakpoints off, because none of this is going to change. The transformation matrix T is still invertible, and I can still compute A-tilde, which again looks a little different. But we should be able to see that all the properties still hold. Checking the determinants: yep, in both cases the determinant is now 32; I think it was 20 earlier, but that doesn't matter, because they're both the same. We can also see that the eigenvalues are the same, and now, in this case, aha, look at this: we've got multiplicity in the eigenvalues, a 4 and a 4, which is going to potentially cause us problems when we start looking at the eigenvectors. What's important to note here is that both A and A-tilde have the same eigenvalues, namely 4, 4, 1, 2; again, for whatever reason, the ordering comes back different, but it's still 4, 4, 1, 2. And again we can check the eigenvectors: I'll just hit F5 and run through all the checks, and we see that yep, one, two, three, four, all four of these are indeed eigenvectors of the A-tilde matrix.

Now, what's interesting, and maybe we should interject here, is what happens if we look at the rank of this eigenvector matrix: aha, look at that, the rank is only 3. So in this case the eigenvectors of A-tilde do not span the space. The same is true of the rank of the eigenvectors of A, which is probably what we wanted to look at in the first place, because that's what we're going to use in a moment for diagonalization; it's also rank 3. So keep that in the back of your head; we'll revisit it in about 60 seconds. In the meantime, let's convince ourselves that even with this defective A matrix, the traces are the same (in this case they're both 11) and the ranks of A and A-tilde are the same; the matrices themselves are still full rank, since there were no zero eigenvalues.

But the eigenvectors are where we run into trouble when we start trying to diagonalize. If I try to use a transformation matrix M, which is now the eigenvectors of A, the problem, as we saw earlier, is that the rank of this thing is only 3, so I can't actually do this. You can no longer diagonalize: trying to invert M is going to fail, and MATLAB warns that the matrix is singular or badly scaled, so it's not going to work. And when I say it's not going to work, I mean we can't diagonalize the A matrix in this case, because it's a defective matrix and its eigenvectors do not span the entire space.
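To see the same failure in a self-contained way, here is a sketch with an assumed toy matrix rather than the 4-by-4 from the video; the 2-by-2 Jordan block is the classic defective example:

```matlab
% Sketch (assumed toy matrix, not the example from the video): a 2x2
% Jordan block has eigenvalue 2 with multiplicity 2 but only one
% linearly independent eigenvector, so its modal matrix is singular.
J = [2 1;
     0 2];
[M, D] = eig(J);
fprintf('eigenvalues: %g, %g\n', D(1,1), D(2,2));      % 2 and 2
fprintf('rank of eigenvector matrix: %d\n', rank(M));  % 1, not 2
% inv(M) here triggers MATLAB's "singular or badly scaled" warning.
```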
Now, this could be ameliorated to some degree by investigating what's known as Jordan normal form, but I think that's outside the scope of this discussion; we'll try to have another discussion in a later video about the Jordan form of matrices. For now, I think this was a good discussion showing examples of all of these different properties under the similarity transformation.

So there you have it: a pretty mathematical, sterile approach to and investigation of similarity transformations. In our immediately following video we'll take a look at applying this to an engineering problem, namely changing the state of a linear system using a similarity transformation, and we'll see how this might actually be applied to some control or dynamic systems applications. With that being said, I think this is probably a great spot to leave it. I hope you enjoyed the video, and if so, I also hope you'll consider subscribing to the channel: if you scroll a little way down and click on that subscribe button, it really does help me continue making these videos. The other way you can contribute and help the channel is via Patreon, and I want to take a moment to thank some of our recent patrons. The nice thing about this method is that 100% of the proceeds the channel receives via Patreon are directed to K-12 science and engineering adventures for kids and young adults. Also remember that new videos should come out every single Monday, so I hope I'll be able to catch you at one of these future discussions and we can all learn something new together. Until then, I'll sign off. Talk to you later. Bye.
Info
Channel: Christopher Lum
Views: 130
Keywords: Matrix similarity transformation, similar matrix, similar matrices, diagonalization, diagonal matrix
Id: wvRlvDYDIgw
Length: 59min 26sec (3566 seconds)
Published: Mon Dec 13 2021