54 - Matrix representation of linear maps

Captions
We're finally ready to show that a linear map can be represented by a matrix: for any linear map T from V to W, there is a certain matrix A that does precisely what T does, namely T(v) = Av. We're going to break this into two parts. The first part is a theorem saying that this can be done: we'll state the theorem, give an example, and prove it. That's this clip — showing it can be done. The second part is how to do it. Somebody gives you a linear map; you're not going to say "hey, this can be done" — you want to actually produce that matrix: you get T, you want to produce A. The idea of how to do it is in fact hidden in the theorem, and that's the next clip.

So, part one: the theorem. In fact, the actual statement we're after is going to appear at the end, as a corollary of the theorem. Let V and W be vector spaces. The only thing we need to require in order for everything to compile is that they share the same field F of scalars. That's true throughout any statement on linear transformations — we talked about it when we defined a linear map. One of the conditions is T(αv) = αT(v), and if α doesn't live in the same field for both spaces, you can't pull it out; it doesn't make sense at all.

Let B = {v_1, v_2, ..., v_n} be a basis for V, so V is a space of dimension n and B is a specific basis for it. And let A = {w_1, w_2, ..., w_n} be any subset of W. There are no assumptions on A: not that it's a basis, not even that it's spanning or linearly independent — nothing, just n vectors in W. The only thing is that I'm taking the same number as the basis of V; you'll see the reason in a moment.

Now define T. I want to define a linear transformation from V to W. I have a basis B for V, so every element of V is a linear combination of the basis elements. In order to define T, I have to tell you what it does to every element of V, so I write a general element of V as α_1 v_1 + ... + α_n v_n, and I define

T(α_1 v_1 + ... + α_n v_n) = α_1 w_1 + ... + α_n w_n,

that exact linear combination of the w's. Do you agree that T is now defined for every element of V? Then there are three statements:

1. T is a linear map. This requires proof — I just told you what it does; why is it linear? You have to show that T(u + v) = T(u) + T(v) and that T(αv) = αT(v).
2. If the set A was not just any random collection of n elements of W but happened to be a basis for W, then T is not just a linear map: it is one-to-one and onto.
3. This specific T — the transformation defined above — is the unique linear map such that T(v_i) = w_i for every i. Obviously it's not the unique linear map from V to W; there are many of those. But if you want a map that sends v_1 to w_1, v_2 to w_2, ..., v_n to w_n, there's exactly one map that does that, and it's given by this formula.

These are the three statements of the theorem. What I want to do first is give you an example to show how these ideas build up, then make a few remarks, and then prove the theorem.

Example. Let's find a map T from R^2 to R^3 — so this is V, this is W; they're not of the same dimension, and there was no claim that they are. I want to prescribe what T does to the standard basis of R^2: I declare that T(1, 0) = (√2, −1, e) and T(0, 1) = (1, π, 2). Suppose I want to find the map that does that to the standard basis of R^2. How do I do it? Here's the idea. I take a general element of R^2 — a general element of R^2 looks like (a, b), do you agree? To prescribe T on all of R^2, I have to tell you what it does to (a, b), so I write (a, b) = a·(1, 0) + b·(0, 1): this is a basis, so I can always write any element as a linear combination of the basis elements. And throughout this example, think R^17, or matrices, or whatever — you can see that this works in general: any general element can be written as a linear combination of the basis elements. There's nothing special about R^2 here; that's going to be the idea of the proof in a minute.

Now, I want T to be a linear map, so T of a sum is the sum of the T's, and I can pull out scalars:

T(a, b) = a·T(1, 0) + b·T(0, 1).

If T is going to be linear, it must satisfy this. And since I just declared what T does to the basis elements, this equals

a·(√2, −1, e) + b·(1, π, 2).

What we've shown so far is that if you take a linear transformation and force it to do something to the basis elements, that forces what it does to every general element: defining T on a basis fixes T. T doesn't have any choice — for any vector (a, b), that's what it has to do. Furthermore, taking an intermediate step, I can rewrite this as

(√2·a + b, −a + π·b, e·a + 2b),

which is precisely T(a, b), do you agree? And furthermore — tell me if you agree — I can rewrite this as a matrix product:

              [ √2   1 ]
T(a, b)  =    [ −1   π ]  (a, b)
              [  e   2 ]

since √2·a + 1·b is the first component, −a + π·b the second, and e·a + 2·b the third. Matrix multiplication — do you see that? So when I tell you what T does to the basis elements, I in fact fix T: there's only one T that will do it on the entire space, and it's in fact given by a matrix — the matrix whose columns are the T's of the basis elements. There we have it: we started with a transformation prescribed on basis elements, with no matrices involved, but suddenly a matrix showed up, and T(v) is in fact Av. The columns of the matrix are the T(e_i)'s: if (1, 0) is e_1 and (0, 1) is e_2, then the columns are T(e_1) and T(e_2). We're going to keep this example in mind as we go through the theorem and get to its conclusion, the corollary: given a linear map, you can always find a matrix like this. Questions about this example? Did everything that happened make sense?
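The example above can be checked numerically. Here is a minimal sketch (assuming NumPy; the prescribed vectors are the ones from the example) that builds the matrix whose columns are T(e_1) and T(e_2) and verifies that multiplying by it agrees with the linear-combination definition of T:

```python
import numpy as np

# Images of the standard basis vectors, as prescribed in the example.
T_e1 = np.array([np.sqrt(2), -1.0, np.e])   # T(1, 0) = (sqrt(2), -1, e)
T_e2 = np.array([1.0, np.pi, 2.0])          # T(0, 1) = (1, pi, 2)

# The matrix representing T has these images as its columns.
A = np.column_stack([T_e1, T_e2])           # shape (3, 2)

def T(v):
    """Apply T via linearity: T(a, b) = a*T(e1) + b*T(e2)."""
    a, b = v
    return a * T_e1 + b * T_e2

# For any (a, b), multiplying by A does exactly what T does.
v = np.array([3.0, -5.0])
assert np.allclose(T(v), A @ v)
```

The assertion holds for every choice of v, which is the point of the example: fixing T on the basis fixes T everywhere, and the resulting map is multiplication by A.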
So let's go back to the theorem and its statements. Part one — that defining T on a linear combination as the same linear combination gives a linear map — will require just working with the definition: three lines of completely basic technical calculation. There's no issue in part one. Part two tells you that if you send a basis to a basis, then the map is one-to-one and onto. That's already hinting at something we're going to say later: a linear transformation can be one-to-one and onto if and only if the dimensions are the same. We already hinted at that previously, so this is really laying the ground for things to come. What's important for us is part three. Part three says that this T — precisely what we did in the example — is the unique map prescribed by what it does to the basis elements. You pick the images of the basis elements, whatever you want: T(v_1) = w_1, T(v_2) = w_2, and so on. A priori there's no claim on the w's — whatever you want. That fixes T.

Let's write that as a remark first, and then prove the theorem. Remark: the meaning of part 3 is that if we take a basis B = {v_1, ..., v_n} for V and define — I'm going to write it informally — what T does to the basis elements, that is, fix T(v_1), ..., T(v_n) to be the w_i's, then this determines T; there's only one T that does that. How? Take a linear combination α_1 v_1 + ... + α_n v_n — a general element, just like (a, b) in the example. Then T(α_1 v_1 + ... + α_n v_n) is precisely α_1 T(v_1) + ... + α_n T(v_n), the same linear combination of the images. If you take a basis and prescribe what T does to the basis elements, that fixes T.

Now let's prove the theorem. What did the theorem say? We have two spaces V and W, a basis for V, and some n elements of W — any n elements. We define T of an element of V (any element of V is a linear combination of the basis elements) to be the same linear combination with the v's replaced by the w's.

Part 1 says this is a linear map; let's show it, by definition. I need to take two elements of V, call them u_1 and u_2, and show that T(u_1 + u_2) = T(u_1) + T(u_2) — that's the first half of showing a map is linear. So what is T(u_1 + u_2)? u_1 is some linear combination of the v's — it's an element of V, and B is a basis — so u_1 = α_1 v_1 + ... + α_n v_n, and u_2 is some other combination, β_1 v_1 + ... + β_n v_n. Do you agree? Those are two general elements. Now, can I immediately break T(u_1 + u_2) into α_1 T(v_1) + ..., like we've been doing when working with linear maps? No — that's exactly what we're trying to prove; we can't use it. Is the logic clear? What we can do is rewrite

T(u_1 + u_2) = T((α_1 + β_1) v_1 + (α_2 + β_2) v_2 + ... + (α_n + β_n) v_n),

which is just collecting terms in the order of the v's: there's α_1 v_1 here and β_1 v_1 there, then α_2 v_2 and β_2 v_2, and so on. And now I use what I do know: how T is defined. T sends each linear combination like this to the exact same linear combination of the w's, so by definition of T this equals

(α_1 + β_1) w_1 + (α_2 + β_2) w_2 + ... + (α_n + β_n) w_n,

which I can now break up as

(α_1 w_1 + ... + α_n w_n) + (β_1 w_1 + ... + β_n w_n) = T(α_1 v_1 + ... + α_n v_n) + T(β_1 v_1 + ... + β_n v_n) = T(u_1) + T(u_2).

So T was really constructed to be linear, and it is linear — that's basically the statement. To finish, show that T(αu) = αT(u), the second half of being linear; very similar manipulations show it. That proves part 1: if you define a map by sending a linear combination to the same linear combination of some elements of the range, what you get is a linear map.

Part 2. The claim is that if A is not just a random collection of n vectors of W but a basis for W, and T is defined as above, then T is a one-to-one and onto linear transformation. Let's show one-to-one first. By this point you should already feel that the way to show a map is one-to-one is to show that its kernel is zero. So how do we show that the kernel is zero? We take somebody in the kernel and show that it can only be 0. Let u be some element of the kernel. What does it mean that u is in the kernel? It means that when we apply T to it, we get zero: T(u) = 0. On the other hand, just like before, u is some linear combination of the basis elements, u = α_1 v_1 + ... + α_n v_n, and by definition T(u) = α_1 w_1 + ... + α_n w_n — that's what T does: it takes a linear combination and gives the exact same linear combination with w's instead of v's. So we have a linear combination of the w's that equals zero. But now we're assuming that the w's are a basis — that's the assumption of part two. If a linear combination of basis vectors equals zero, what are all the α's? Zero: α_1 = α_2 = ... = α_n = 0. But those same α's are the coefficients of u, so u = 0. Therefore the kernel is trivial, and T is one-to-one.

Now we need to show that T is onto. This can be argued directly — in fact it's a one-liner — but can you tell me how to show it not in a one-liner but in a half-liner? The corollary of the rank–nullity theorem. We're assuming in part two that A is a basis: B is a basis for V with n elements, A is a basis for W with n elements, so the two spaces have the same dimension. And a linear map between two spaces of the same dimension is one-to-one if and only if it's onto — that was a corollary from the previous clip, remember? So let's write that: since dim V = dim W — our assumption in part two — and T is one-to-one, T is also onto, by the previous theorem. Good, everybody? By the way, it's worth giving some thought to how to prove this without pulling that from your sleeve — that's heavy artillery. You can prove that T is onto directly from the definition of T, and in this case it's very easy; try to think of how to do it — not now, later.
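Part 2 can also be seen concretely. Here is a small numerical sketch (our own example, not from the lecture): when the w_i happen to form a basis of W, the matrix with those columns has full rank, its kernel is trivial (one-to-one), and every target vector has a preimage (onto):

```python
import numpy as np

# A hypothetical example with n = 3: the w_i below happen to form a basis of R^3.
w1 = np.array([1.0, 0.0, 1.0])
w2 = np.array([1.0, 1.0, 0.0])
w3 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([w1, w2, w3])  # matrix of T with respect to the standard basis

# The w_i form a basis iff the columns are linearly independent (full rank).
assert np.linalg.matrix_rank(A) == 3

# One-to-one: trivial kernel -- for a full-rank square matrix, the only
# solution of A x = 0 is x = 0.
# Onto: for any y there is a preimage x, found by solving A x = y.
y = np.array([2.0, -1.0, 0.5])
x = np.linalg.solve(A, y)
assert np.allclose(A @ x, y)  # a preimage exists, so T is onto
```

This mirrors the half-liner argument: same dimension plus trivial kernel gives onto, and numerically that is exactly invertibility of A.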
Let's prove part three. Part three is the claim that T, defined by sending a linear combination of a given basis to the same linear combination of the chosen elements of the range, is the unique map sending the v_i's to the w_i's. Note that this is not under the assumptions of part two — part two started with the word "if". We are again not assuming that A is a basis for W; we're back in the general situation. So suppose T(v_i) = w_i for every i, where T is a linear map; we want to show that T can only be the map from the theorem. What is T(α_1 v_1 + ... + α_n v_n)? We already know: since T is linear, it's α_1 T(v_1) + ... + α_n T(v_n) — now we're using the fact that T is linear — and therefore it's precisely α_1 w_1 + ... + α_n w_n. So a linear map is determined by what it does to basis elements.

(Doesn't it suffice to say it's one-to-one and onto — isn't this implicit? No: the third part is not related to the second part. The second part starts with the word "if"; the third part is not under that assumption — it's general.)

So finally, here's the corollary — what we really wanted. Any linear map T from F^n to F^m — we're thinking of the elements as column vectors for now; later we'll see how to generalize this to any T from V to W, but at first just F^n to F^m — is of the form T(v) = Av for some matrix A, and A is of size m × n. Why is that true? Proof idea: do precisely what we did in the example. Look at the example again: we had a linear transformation from R^2 to R^3, determined by what it does to the basis elements. So apply T to the basis elements, set the results as columns — that's your A. That's going to be the proof, and the details of why this works in general, not just in this example, are in the theorem we just proved.

So let's do that. Let e_1, e_2, ..., e_n be the standard basis of the domain space F^n: (1, 0, ..., 0), (0, 1, 0, ..., 0), all the way up to (0, ..., 0, 1) — remember what the e_i's are. Denote T(e_i) = w_i. T is now given; I don't know anything about it except that it's a linear map. T does something to e_1 — call that something w_1; T does something to e_2 — call that something w_2; T does something to e_i — call that something w_i. I don't know anything about the w's; they're just the T's of the basis. So {e_1, ..., e_n} is going to be the basis B, and {w_1, ..., w_n} is going to be the set A, in F^m. By the previous theorem: what is T of any element v of F^n? I can write v as a linear combination of the basis of F^n, v = α_1 e_1 + ... + α_n e_n, so

T(v) = T(α_1 e_1 + ... + α_n e_n) = α_1 T(e_1) + ... + α_n T(e_n) = α_1 w_1 + ... + α_n w_n,

because we know that once T is fixed on the basis, it's fixed on every element. And this — just like in the example, where w_1 = T(e_1), w_2 = T(e_2), and T(v) = α_1 w_1 + α_2 w_2 — equals Av, where A is the matrix whose columns are the w's. Let's write it explicitly. w_1 is an element of F^m, so it has m components; call them w_11, w_21, ..., w_m1. Call the components of w_2: w_12, w_22, ..., w_m2, and so on, all the way up to w_n, with components w_1n, w_2n, ..., w_mn. So

    [ w_11  w_12  ...  w_1n ]
A = [ w_21  w_22  ...  w_2n ]
    [  ...   ...  ...   ... ]
    [ w_m1  w_m2  ...  w_mn ]

and I multiply A by v. What is v? v = α_1 e_1 + ... + α_n e_n is precisely the column vector (α_1, ..., α_n). Multiply row by column and spell it out: you get precisely the linear combination α_1 w_1 + ... + α_n w_n. So — let's write that in blue — the columns of A are the w_i's: any map from F^n to F^m can be represented by a matrix A whose columns are the images of a basis of F^n, just like in the example. First verify that you understand the example, which is the simplest case, and then see that you can write this in general. Good, everybody? I can see in your eyes that some of you are good and some of you are a bit lost — whenever you see something like this it makes you go "whoa". But go back to the example and see what happened there: it's precisely the same thing, letter for letter, except that there we only had a and b rather than α_1 through α_n, so A had only two columns, T(e_1) and T(e_2), with e_1 = (1, 0) and e_2 = (0, 1). Everything was a lot simpler in the example, but the idea is precisely the same.
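The corollary's construction can be written as a short program. A sketch (the helper name `matrix_of` and the sample map are our own, not from the lecture): given any linear T from R^n to R^m as a black-box function, apply it to the standard basis vectors and use the images as columns:

```python
import numpy as np

def matrix_of(T, n):
    """Build the matrix of a linear map T: R^n -> R^m by applying T to each
    standard basis vector and using the images as columns.
    (The helper name `matrix_of` is ours, not the lecture's.)"""
    eye = np.eye(n)
    return np.column_stack([T(eye[:, i]) for i in range(n)])

# A sample linear map R^3 -> R^2, given only as a black-box function.
def T(v):
    x, y, z = v
    return np.array([x + 2*y, 3*z - x])

A = matrix_of(T, 3)               # a 2-by-3 matrix
v = np.array([1.0, -2.0, 4.0])
assert np.allclose(A @ v, T(v))   # A represents T on all of R^3
```

By part 3 of the theorem, agreement on the basis vectors forces agreement everywhere, which is why checking the columns is enough.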
What I want to do next — we're going to separate it into its own clip — is show this in even further generality. I'm going to take a vector space V here and a vector space W here, assume nothing about them, and show how to construct the matrix that performs just this. But the matrix is not going to act on V itself; it will act on the coordinate vectors. Why? Because if V is a space of polynomials, I don't know how to multiply a matrix by a polynomial — Av, where A is a matrix and v is a polynomial, doesn't compile. But if v is the coordinate vector of the polynomial, it lives in F^n, and A knows what to do with it: A sends it precisely to a coordinate vector in F^m which, when I translate back to polynomials, is exactly what I want. That's coming up next. Good.
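As a small preview of that next step — under our own choice of example, not the lecture's — here is how a matrix can represent a map on polynomials by acting on coordinate vectors: the derivative on polynomials of degree at most 2, with coordinates taken with respect to the basis {1, x, x^2}:

```python
import numpy as np

# p(x) = c0 + c1*x + c2*x^2   <->  coordinate vector (c0, c1, c2)
# p'(x) = c1 + 2*c2*x         <->  coordinate vector (c1, 2*c2, 0)
#
# Columns of A are the coordinates of the derivative of each basis polynomial:
# d/dx(1) = 0,  d/dx(x) = 1,  d/dx(x^2) = 2x.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])

coords = np.array([5.0, -3.0, 7.0])   # p(x) = 5 - 3x + 7x^2
# A acts on the coordinate vector, not on the polynomial itself:
assert np.allclose(A @ coords, [-3.0, 14.0, 0.0])   # p'(x) = -3 + 14x
```

The matrix never touches a polynomial; it only sees coordinates, which is exactly the translation step the next clip is about.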
Info
Channel: Technion
Views: 39,401
Rating: 4.8445597 out of 5
Keywords: Technion, Algebra 1M, Dr. Aviv Censor, International school of engineering
Id: tRbXrnoVJI8
Length: 42min 7sec (2527 seconds)
Published: Sun Nov 29 2015