Tensor Calculus For Physics Majors 005 | Diagonalizing 2nd Rank Tensors

Captions
What's going on, smart people? Bringing you another episode of Tensor Calculus for Physics Majors. In the last video we talked about transformations of two-index tensors. A lot of the time it's convenient to project a two-index tensor onto some coordinate system and work with its matrix representation, but that representation may or may not be nice to work with, and by "nice to work with" I mean the matrix may or may not be diagonal. A diagonal matrix is one whose off-diagonal components are all zero. If the matrix representation is not diagonal, the question is: is it that way because of some intrinsically complicated nature of the tensor itself, or did you just choose the wrong basis to represent it in? As it turns out, a real, symmetric second-rank tensor can always be diagonalized. So in this video we're going to pick on the inertia tensor, which is a real symmetric tensor, I'll show you how to diagonalize it, and then we'll make some final comments before getting into the metric tensor next video.

Let's kick things off by considering angular momentum. We'll end up projecting everything onto some xyz basis using Dirac notation. The angular momentum equals the inertia tensor (which I'll write with a double arrow) acting on the angular velocity:

    L = I · ω.

Consider a tire rotating about an axle, with the angular velocity pointing in the z direction. In the ideal situation, meaning your tires are well balanced, the angular momentum points in the same direction as the angular velocity, but this is not always the case: if you've ever driven a car whose tires aren't balanced, you'll feel it shaking quite a bit, because the angular momentum is not parallel to the angular velocity. Let's expand out this representation: the column (L_x, L_y, L_z) equals the inertia tensor's matrix representation times ω.
Since ω is in the z direction, write it as the column (0, 0, ω_z). Carrying out the matrix multiplication gives

    L_x = I_xz ω_z,    L_y = I_yz ω_z,    L_z = I_zz ω_z.

(Right now we're not worried about which indices are contravariant or covariant.) We want the angular momentum to point strictly along ω, that is, along z, but it can't point purely in the z direction if those other components survive. We're assuming ω_z ≠ 0, which means we need a way of proving that I_xz and I_yz vanish, and we can do that by considering the geometry of the object. Tires, if we assume a uniform, continuous mass distribution, are basically cylinders, so they have a cylindrical symmetry we can exploit. Writing down cylindrical coordinates real quick: x = r cos φ, y = r sin φ, z = z, and the measure is dV = r dr dφ dz. We defined the components of the inertia tensor earlier as

    I_ij = ∫ (r² δ_ij − x_i x_j) dm,

integrated over all the mass. Now take a look at I_xz (and, since the inertia tensor is symmetric, I_zx as well). The Kronecker delta δ_ij vanishes for i ≠ j, so that term drops out, leaving

    I_xz = −∫∫∫ x z dm = −∫∫∫ ρ r² cos φ · z dr dφ dz,

where dm = ρ dV and ρ is the mass density.
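As a quick numerical sanity check of the expansion above, here's a minimal sketch (NumPy is my choice of tool here, and the matrix entries are made up for illustration): any nonzero I_xz or I_yz drags the angular momentum off the z axis.

```python
import numpy as np

# Hypothetical unbalanced-tire inertia tensor, in units of M*a^2.
# The off-diagonal entries I_xz and I_yz are the troublemakers.
I = np.array([[2.0, 0.0, 0.5],
              [0.0, 2.0, 0.3],
              [0.5, 0.3, 4.0]])

omega = np.array([0.0, 0.0, 1.0])  # rotation purely about z
L = I @ omega                      # L_i = I_ij * omega_j

print(L)  # -> [0.5 0.3 4. ] : L is not parallel to omega
```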
What simplifies all of these components is that we're integrating over the angle of the tire, with φ running from 0 to 2π. The only difference between I_xz and I_yz is a cosine versus a sine, but since we're integrating over a full period either way, both integrals vanish: I_xz = 0, and similarly I_yz = 0 because ∫₀^{2π} sin φ dφ = 0. By symmetry, I_zy = 0 and I_zx = 0 as well. That still leaves I_yx and I_xy. They don't show up explicitly in the angular momentum here, but we can show quickly that they also vanish:

    I_xy = −∫ x y dm = −∫∫∫ ρ r³ cos φ sin φ dr dφ dz = 0,

because cos φ sin φ = ½ sin 2φ also integrates to zero over a full period. So at the end of the day our representation of the inertia tensor simplifies to

    I = diag(I_xx, I_yy, I_zz).

What this means is that we conveniently chose a basis in which one of the basis vectors points along the axis of rotation, a principal axis, and everything lines up perfectly: we happened to choose the set of basis vectors that diagonalizes our matrix. We won't always be so lucky, so next we'll do an example where we don't start with a diagonal matrix, and I'll show you how to diagonalize it. If the mass distribution isn't uniform, there's no reason to think we can exploit this symmetry or that the tensor will be diagonal. The mechanic diagonalizes the tensor by changing the mass distribution itself, adding wheel weights; we, on the other hand, have the luxury of choosing a different basis.
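Those angular integrals are easy to spot-check numerically. A small sketch (NumPy and a plain trapezoid sum are my choices, not the video's):

```python
import numpy as np

phi = np.linspace(0.0, 2.0 * np.pi, 100001)
dphi = phi[1] - phi[0]

# Angular factors appearing in I_xz, I_yz, and I_xy respectively;
# each should integrate to zero over the full period 0..2*pi.
for f in (np.cos(phi), np.sin(phi), np.sin(phi) * np.cos(phi)):
    integral = np.sum(0.5 * (f[1:] + f[:-1])) * dphi  # trapezoid rule
    print(round(abs(integral), 9))  # -> 0.0 each time
```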
We were kind of lucky in that last example: we already had a set of basis vectors that diagonalized our inertia tensor for us. Now let's purposefully screw things up a little bit. We'll start with an inertia tensor that is not diagonal and change to a basis that does diagonalize it. It turns out that the natural basis for representing a tensor, the one that diagonalizes it, is the basis whose basis vectors are the eigenvectors of the tensor itself. So assume our inertia tensor, represented in some basis |i⟩ with coordinates x_i, has matrix elements

    I_ij = ⟨i| I |j⟩,

and assume that in this basis the tensor is not diagonal. The goal is to find new coordinates x_α, where the |α⟩ are a new set of basis vectors that are also eigenvectors of I, such that the matrix element

    I_αβ = ⟨α| I |β⟩ = λ_β δ_αβ

is diagonal. This isn't a single eigenvalue multiplying every row: the Kronecker delta is just diag(1, 1, 1), so we get λ₁ times the first diagonal entry, λ₂ times the second, λ₃ times the third. The λ here is the whole set of eigenvalues, not one number times the Kronecker delta. Great, so we've established that finding the eigenvectors of the tensor will diagonalize it for us. The next step is how to find the eigenvectors, and it's the same as ever. Write the matrix version of the eigenvalue equation, I acting on an eigenvector giving its eigenvalue times that eigenvector,

    I |v⟩ = λ |v⟩,

then subtract the right side over and factor out the common eigenvector:

    (I − λ𝟙) |v⟩ = 0.

Why write the identity matrix 𝟙? Because we have a rank-two tensor here, and we can't subtract
a scalar from a tensor; that doesn't really make sense, so we have to keep everything the same rank. Now we're in good shape to start finding these eigenvectors. The one condition we have to impose is that the determinant of this quantity vanish:

    det(I − λ𝟙) = 0.

The reason is that if the determinant were nonzero, the matrix could be inverted, and multiplying both sides by the inverse would just give the zero vector: |v⟩ = 0 no matter what, which is not very interesting. So we force the matrix to be non-invertible by setting its determinant to zero. Solving that equation gives us the eigenvalues of the matrix, and after that we'll be in good shape to actually find the eigenvectors. With the determinant equal to zero we no longer get only the trivial solution, but we also lose uniqueness; we'll recover uniqueness by imposing that our eigenvectors be orthonormal, so we'll end up normalizing them and getting a unique solution again.

Let's actually do an example, picking on the inertia tensor one more time. The components of the inertia tensor have units of mass times distance squared, and to keep from writing Ma² a bunch of times I'll introduce the parameter η = Ma². So let's introduce an inertia tensor that won't be diagonal, and then we'll diagonalize it:

    I = η [  2  -1   0
            -1   2   0
             0   0   4 ].

This is different from the one given in the tensor calculus book I've been using. I'm using this one because in that book they essentially just hand you the eigenvalues and say "we calculated them," and they do that because if you actually go through
the calculation like I did, you end up with ugly numerical factors, which is extremely disgusting, so I changed the problem a little bit to make the algebra nicer. I still made it so that the eigenvalues come out real and positive, which is what you should expect for an inertia tensor to begin with. Now, to find the eigenvalues we need the determinant of I minus λ times the identity (I'll drop the double arrow; you get the idea):

    det [ 2η−λ    −η      0
           −η    2η−λ     0
            0      0    4η−λ ] = 0.

This gives us a characteristic equation that will let us solve for the eigenvalues, and then we'll use those eigenvalues to get the eigenvectors. If you're already familiar with finding the determinant of a 3×3 matrix, feel free to skip ahead, but I want to be careful just in case some people forgot. The first step is the term associated with the first entry: (2η−λ) times the minor you get by blocking off its row and column,

    (2η−λ) [ (2η−λ)(4η−λ) − 0·0 ].

Moving over a column, the next term is −(−η) times its minor,

    −(−η) [ (−η)(4η−λ) − 0·0 ],

and the third term is multiplied by the zero already sitting in the top row, so it just zeroes out. So the characteristic equation is

    (2η−λ)²(4η−λ) − η²(4η−λ) = 0.
Both terms share a common factor of (4η−λ), so let's factor it out. Expanding (2η−λ)² = 4η² − 4ηλ + λ² and subtracting the η², we get

    (4η−λ)(λ² − 4ηλ + 3η²) = 0.

The first factor gives our first eigenvalue: 4η − λ = 0, so λ₁ = 4η. Cool, one eigenvalue down. Now the quadratic, written in decreasing powers of λ: λ² − 4ηλ + 3η² = 0, which factors as

    (λ − 3η)(λ − η) = 0.

(Multiplying it back out to check: λ² − ηλ − 3ηλ + 3η² = λ² − 4ηλ + 3η². It checks out.) So our other eigenvalues are λ₂ = 3η and λ₃ = η. Perfect. Now that we have our eigenvalues, let's go find the eigenvectors. (This took me so much longer when I tried to go through the algebra of the example actually given in the book.)
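Before grinding through the eigenvectors by hand, the eigenvalues 4η, 3η, η can be cross-checked in a couple of lines. A sketch (NumPy is my choice of tool, not the video's):

```python
import numpy as np

eta = 1.0  # stands in for M*a^2; the eigenvalues just scale with it
I = eta * np.array([[ 2.0, -1.0, 0.0],
                    [-1.0,  2.0, 0.0],
                    [ 0.0,  0.0, 4.0]])

# eigvalsh is meant for real symmetric matrices and returns
# the eigenvalues in ascending order.
print(np.round(np.linalg.eigvalsh(I), 12))  # -> [1. 3. 4.] i.e. eta, 3*eta, 4*eta
```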
We have three eigenvalues, so we'll have three associated eigenvectors, because remember, we're looking for I acting on an eigenvector to give the associated eigenvalue times that same eigenvector. Let's give them distinct names: ρ, σ, and γ. Start with ρ, associated with λ₁ = 4η:

    η [  2  -1   0     [ ρ₁        [ ρ₁
        -1   2   0       ρ₂   = 4η   ρ₂
         0   0   4 ]     ρ₃ ]        ρ₃ ].

Right off the bat the η's cancel (assuming the mass and length aren't zero). Multiplying out the first component,

    2ρ₁ − ρ₂ = 4ρ₁,

and subtracting the 2ρ₁ over, −ρ₂ = 2ρ₁, i.e. ρ₁ = −½ρ₂. Now the second component:

    −ρ₁ + 2ρ₂ + 0·ρ₃ = 4ρ₂,

so −ρ₁ = 2ρ₂, i.e. ρ₁ = −2ρ₂. These two conditions conflict: ρ₁ is supposed to equal both −½ρ₂ and −2ρ₂, and the only way to satisfy both is ρ₂ = 0, which in turn forces ρ₁ = 0. The final component gives 4ρ₃ = 4ρ₃, i.e. ρ₃ = ρ₃, which holds for any ρ₃ and isn't very helpful on its own, but we can fix that. (You can see why I said this might not give a unique solution at first.) We make it unique by imposing normalization, ρ·ρ = 1:

    (0, 0, ρ₃)·(0, 0, ρ₃) = ρ₃² = 1   ⟹   ρ₃ = 1.

So now we have a unique eigenvector:

    ρ = (0, 0, 1).

One down, two to go. You'll see the pattern here; we're basically doing the same thing three times.
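A one-liner confirms that ρ = (0, 0, 1) does what we claimed (again a sketch, with NumPy as my tool of choice):

```python
import numpy as np

eta = 1.0  # stands in for M*a^2
I = eta * np.array([[ 2.0, -1.0, 0.0],
                    [-1.0,  2.0, 0.0],
                    [ 0.0,  0.0, 4.0]])
rho = np.array([0.0, 0.0, 1.0])

# I acting on rho should return 4*eta times rho.
print(np.allclose(I @ rho, 4.0 * eta * rho))  # -> True
```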
Next is σ, associated with λ₂ = 3η, with components (σ₁, σ₂, σ₃). Again the η's cancel, and we multiply out component by component. The first component gives

    2σ₁ − σ₂ = 3σ₁   ⟹   −σ₂ = σ₁.

The second gives −σ₁ + 2σ₂ = 3σ₂, so −σ₁ = σ₂, the same condition again. The final component gives 4σ₃ = 3σ₃, which can only be satisfied if σ₃ = 0. Since σ₁ is just −σ₂, we fix the remaining freedom by again imposing orthogonality — orthonormality, I should say. Writing everything in terms of σ₁ and normalizing:

    σ·σ = (σ₁, −σ₁, 0)·(σ₁, −σ₁, 0) = 2σ₁² = 1   ⟹   σ₁ = 1/√2,

which also fixes σ₂ = −1/√2. Collecting these terms, our second eigenvector is

    σ = (1/√2) (1, −1, 0).

The last step is the third eigenvector, γ, associated with λ₃ = η. Going through the exact same motions: the first component gives 2γ₁ − γ₂ = γ₁, so γ₁ = γ₂; the second gives −γ₁ + 2γ₂ = γ₂, so γ₁ = γ₂ again (super helpful, second equation); and the last, 4γ₃ = γ₃, can only hold if γ₃ = 0.
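If you'd rather not chase these component conditions by hand, `np.linalg.eigh` hands back an orthonormal eigenbasis directly, its columns unit-normalized just like the normalization we're imposing. A sketch (NumPy is my choice of tool):

```python
import numpy as np

eta = 1.0
I = eta * np.array([[ 2.0, -1.0, 0.0],
                    [-1.0,  2.0, 0.0],
                    [ 0.0,  0.0, 4.0]])

vals, vecs = np.linalg.eigh(I)  # eigenvalues ascending: eta, 3*eta, 4*eta
for lam, v in zip(vals, vecs.T):
    # each column v of vecs satisfies the eigenvalue equation
    print(np.allclose(I @ v, lam * v))  # -> True three times
```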
Then we fix the remaining constant by normalizing the eigenvector:

    γ·γ = (γ₁, γ₁, 0)·(γ₁, γ₁, 0) = 2γ₁² = 1   ⟹   γ₁ = 1/√2,

which also tells us γ₂ = 1/√2. So our final eigenvector is

    γ = (1/√2) (1, 1, 0).

The last step is to build a matrix out of these eigenvectors. Call it U, with each row an eigenvector:

    U = [   0      0    1          [ 0   0  √2
          1/√2  -1/√2   0   = 1/√2   1  -1   0
          1/√2   1/√2   0 ]          1   1   0 ],

where I've factored out the 1/√2 because I'd rather deal with √2's than 1/√2's. Now we have to recall how these matrices transform. Any matrix transforms as

    T′ = U T Uᵀ,

with U the unitary (here, orthogonal) matrix representing the transformation. I have an entire video proving this transformation property, which I'll link in the description; I know I've already covered a lot of this in other videos, but I think it's nice to have it all in one place, except for that proof, which would add another 15–20 minutes. So we transform the inertia tensor by sandwiching it between the matrix formed from its eigenvectors and that matrix's transpose. Taking the transpose, rows become columns:

    Uᵀ = 1/√2 [ 0   1   1
                0  -1   1
               √2   0   0 ].

Now let's find out once and for all whether acting on our inertia tensor with these matrices gives a diagonal matrix.
Multiplying the two factors of 1/√2 contributes an overall ½, and the η pops out front:

    I′ = (η/2) [ 0   0  √2     [  2  -1   0     [ 0   1   1
                 1  -1   0       -1   2   0       0  -1   1
                 1   1   0 ]      0   0   4 ]    √2   0   0 ].

Doing the right-hand multiplication first:

    [  2  -1   0     [ 0   1   1      [  0    3   1
      -1   2   0       0  -1   1   =     0   -3   1
       0   0   4 ]    √2   0   0 ]      4√2   0   0 ],

since, row by row, 2·1 + (−1)(−1) = 3 and 2·1 + (−1)·1 = 1; then −1·1 + 2·(−1) = −3 and −1·1 + 2·1 = 1; and finally 4·√2 = 4√2. The final multiplication:

    (η/2) [ 0   0  √2     [  0    3   1            [ 8  0  0
            1  -1   0        0   -3   1   = (η/2)    0  6  0
            1   1   0 ]     4√2   0   0 ]            0  0  2 ],

where, for instance, √2·4√2 = 8, then 3 − (−3) = 6 and 1 − 1 = 0, then 3 + (−3) = 0 and 1 + 1 = 2. This gives our final representation — let's just call it I′, since we're representing the same tensor with new elements:

    I′ = [ 4η   0   0
            0  3η   0
            0   0   η ].

So not only does the matrix built from the eigenvectors diagonalize the tensor, but the fact that we normalized them means the diagonal elements are exactly the eigenvalues of the associated eigenvectors. Let's recap what we just did: we started with a non-diagonal inertia tensor, and this is its diagonalized representation.
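The whole sandwich U I Uᵀ is quick to verify numerically; here's a sketch (NumPy again, my choice of tool):

```python
import numpy as np

eta = 1.0
I = eta * np.array([[ 2.0, -1.0, 0.0],
                    [-1.0,  2.0, 0.0],
                    [ 0.0,  0.0, 4.0]])

s = 1.0 / np.sqrt(2.0)
# Rows are the normalized eigenvectors rho, sigma, gamma, in that order.
U = np.array([[0.0, 0.0, 1.0],
              [  s,  -s, 0.0],
              [  s,   s, 0.0]])

I_prime = U @ I @ U.T
print(np.round(I_prime, 12))  # diagonal: diag(4, 3, 1) in units of eta
```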
But what does this mean physically? This is the inertia tensor of some object rotating about some set of axes, and the fact that we diagonalized it means, going back to L = I·ω from the beginning, that if the rotation is about one of these new basis vectors, one of the principal axes, the angular momentum is guaranteed to point in that same direction. If the angular velocity is exclusively along ρ, so is the angular momentum. That's what makes being able to diagonalize our tensors this way so powerful. Diagonalized tensors are going to play a bigger role later on, which is why I thought the topic deserved its own video even though I've covered it in bits and pieces in other videos.

The last thing I want to talk about, changing gears a little bit, is to take the definition of the transformation of two-index tensors and show an alternative way of thinking about it. A tensor in the primed frame transforms as

    T′_ij = (∂x′_i/∂x_k)(∂x′_j/∂x_n) T_kn,

so we have our tensor transformation coefficients and the components in the untransformed frame. But think for a second about the way plain vectors transform. The i-th component of a transformed vector is

    A′_i = (∂x′_i/∂x_k) A_k,

and similarly, for another vector B (using a fresh index n so we don't reuse k),

    B′_j = (∂x′_j/∂x_n) B_n.

If we multiply these two vector components together, we get

    A′_i B′_j = (∂x′_i/∂x_k)(∂x′_j/∂x_n) A_k B_n,

and that looks an awful lot like the definition of how a two-index tensor transforms. In fact it's exactly the same, and we can define T_kn to
be the product of the vector components, T_kn = A_k B_n. At this point it shouldn't look too surprising to express it this way, because from the inertia tensor,

    I_ij = ∫ (r² δ_ij − x_i x_j) dm,

we already knew that the Kronecker delta is a two-index tensor, but we didn't know explicitly that the product of two vector components makes one as well. Now we do, just by taking the definition of how vectors transform and multiplying them together. More specifically, what we're doing is creating a tensor by taking what's known as the outer product of two vectors, and this is also going to play a bigger role in future videos. In the next video we're finally going to delve into the metric tensor; I'm looking forward to that. Hope you guys found this video helpful. Let me know in the comments section if you'd like to see some exercises from this book worked out in a future video, and I'll see you guys there.
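The outer-product claim is easy to spot-check numerically: transform two vectors, take their outer product, and compare against transforming the outer product as a two-index tensor. A sketch (the rotation and the random vectors are my choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.normal(size=3), rng.normal(size=3)

# An orthogonal change of basis: rotation by 0.7 rad about z.
th = 0.7
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])

T = np.outer(A, B)            # T_kn = A_k B_n
lhs = R @ T @ R.T             # transform T as a two-index tensor
rhs = np.outer(R @ A, R @ B)  # outer product of the transformed vectors

print(np.allclose(lhs, rhs))  # -> True
```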
Info
Channel: Andrew Dotson
Views: 14,340
Rating: 4.9811764 out of 5
Keywords: andrew, dotson, physics, tensors, tensor calculus, diagonalization, inertia tensor, inertia, moment of inertia, symmetric tensor, diagonal, tensor calculus for physics, eigenvalue, eigenvector, unitary, matrix, transformation, angular momentum, angular velocity
Id: DUuGBuh57PY
Length: 37min 51sec (2271 seconds)
Published: Sat Nov 24 2018