Tensors Explained Intuitively: Covariant, Contravariant, Rank

Video Statistics and Information

Reddit Comments

What was intuitive about this? After it stated the difference between co- and contravariant, the video literally said the same thing over and over again. What an awful video.

👍︎︎ 57 👤︎︎ u/suuuuuu 📅︎︎ Jul 20 2017 🗫︎ replies

I am still not sure if I understood the whole thing.

Maybe someone smarter than me can answer this: are contravariant and covariant vectors the same when the basis is an orthonormal basis? In the video it is said that the contravariant components are the amounts of each basis vector I have to add together to get the vector (each basis vector multiplied by a certain amount). Then the covariant components are what I get when I project the vector onto each basis vector. I always thought that's the same thing. Or is it just the same thing for an orthonormal basis? I am right now not able to adequately see the 3D picture in my mind. Or it is too late.

This video actually comes at the right time, since I was finally trying to 'get' what tensors are. I thought before that I knew what they are (arrays of a certain dimension), but someone made me realize that I actually didn't understand them (since their behaviour is also important). So I am dumbfounded. I tried to understand the Wikipedia article, but my mind just shuts down. I somehow seem to have everything wrong in some profound way. ...

edit: Ahh, is the covariant vector actually the vector consisting of the projections onto the surfaces spanned by pairs of basis vectors (as opposed to the projection of the vector onto the single basis vectors)?

edit2: OK, I think I somehow got something (I'll have to sort that out and clean everything up in my neural computer tomorrow...): covariant components are the projections onto the basis vectors; contravariant components are the projections onto the normals of the surfaces spanned by each pair of basis vectors. For an orthonormal basis, co- and contravariant components are the same. I think that last point would be important to include in the video, since we are almost always dealing with orthonormal bases. From that perspective, contra- and covariant vectors don't seem to make sense at all: why make a distinction if they're (seemingly) the same?

👍︎︎ 6 👤︎︎ u/exocortex 📅︎︎ Jul 20 2017 🗫︎ replies
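The commenter's conclusion is easy to check numerically. Below is a minimal numpy sketch (the skewed basis is made up for illustration): the contravariant components are the expansion coefficients, the covariant components are the dot products, and the two coincide exactly when the basis is orthonormal.

```python
import numpy as np

def components(v, basis):
    """Contravariant and covariant components of v.

    basis has the basis vectors e_1..e_n as its columns.
    Contravariant components c satisfy v = basis @ c;
    covariant components are the dot products v . e_i.
    """
    contra = np.linalg.solve(basis, v)  # expansion coefficients
    co = basis.T @ v                    # projections onto basis vectors
    return contra, co

v = np.array([4.0, 2.0, 6.0])

# Orthonormal basis: both descriptions give the same numbers.
print(components(v, np.eye(3)))        # [4. 2. 6.] twice

# Skewed (non-orthogonal) basis: the two descriptions differ.
skewed = np.array([[1.0, 1.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])   # columns are e1, e2, e3
contra, co = components(v, skewed)
print(contra)   # [2. 2. 6.] -- v = 2*e1 + 2*e2 + 6*e3
print(co)       # [4. 6. 6.] -- v.e1, v.e2, v.e3
```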

Tensors are mathematical objects that transform in a special way when the basis vectors change.

Don't be put off by this start. The video actually makes it clear that it is about how the components transform.

(The music is the overture to Rossini's opera William Tell.)

👍︎︎ 20 👤︎︎ u/Bromskloss 📅︎︎ Jul 20 2017 🗫︎ replies

I do enjoy these videos from Eugene. If this one on tensors isn't engaging enough, my favourites are the ones on waves, light and sound, thermodynamics, GR, and Schrödinger's equation.

👍︎︎ 5 👤︎︎ u/Aerothermal 📅︎︎ Jul 21 2017 🗫︎ replies

The timing with the William Tell Overture is great. The covariant/contravariant distinction comes with the Storm, and the Finale plays for the rank-3 tensor basis.

👍︎︎ 2 👤︎︎ u/Spirko 📅︎︎ Jul 20 2017 🗫︎ replies

Wish I had this video for my GR class.

👍︎︎ 6 👤︎︎ u/Cogito_ErgoSum 📅︎︎ Jul 20 2017 🗫︎ replies

ohhhhhhhh

👍︎︎ 1 👤︎︎ u/b0n_ 📅︎︎ Jul 21 2017 🗫︎ replies

I thought it was a pretty good explanation.

👍︎︎ 1 👤︎︎ u/usmcram 📅︎︎ Jul 21 2017 🗫︎ replies
Captions
Tensors are mathematical objects that transform in a special way when the basis vectors change. Understanding tensors is fundamental to understanding the curvature of spacetime in Einstein’s General Relativity. A tensor of rank 1 is a vector.

Let’s pick new basis vectors. We typically describe a vector by how many of each of the basis vectors we have to add together to produce it. But there is also another way to describe a vector in terms of the basis vectors: by taking the dot product of the vector with each of the basis vectors.

For now, let us return to the typical way of describing a vector. Here, the vector has the components 4, 2, and 6. Suppose we double the length of each of the basis vectors. The components of this vector are now 2, 1, and 3, which means that the components have decreased. When we increase the lengths of the basis vectors, the components of our vector decrease; if we decrease the lengths of the basis vectors, the components of our vector increase. Because these two quantities change “contrary” to one another, we say that these are the “contra-variant” components of the vector. Describing a vector in terms of its contravariant components is the way we usually describe a vector.

Suppose we instead describe the vector in terms of its dot product with each of the basis vectors. If the basis vectors are of length 1, then the results of these dot products will be these three lengths. If we double the length of a basis vector, then its associated dot product will also double. When we increase the lengths of the basis vectors, the dot products also increase; when we decrease the lengths of the basis vectors, the dot products also decrease. Because both quantities vary in the same way, we say that these are the “co-variant” components of the vector. We distinguish the variables for the covariant components from the variables for the contravariant components by using subscripts instead of superscripts for the index values. Since these are just two different ways to describe the same vector, the same variable name is used, which in this case is the variable “V”.

Suppose we have two different vectors. Let’s call the first vector “V” and the second vector “P”, and let’s keep these names even if the basis vectors change. Suppose we multiply one of the contravariant components of V with one of the contravariant components of P. If we consider every possible way in which we can do this, we get a matrix, as shown. This is an example of a tensor of rank 2 with two contravariant index values. Suppose we instead multiply the covariant components of V with the contravariant components of P. This gives a different description of the same rank-2 tensor, with one covariant index value and one contravariant index value. Multiplying two covariant components, one from each of the two vectors, gives yet another description of the same rank-2 tensor, with two covariant index values. What makes a tensor a tensor is that when the basis vectors change, its components change in the same manner as the components of one of these objects. A tensor does not necessarily have to be created from vector components, as in these examples.
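The doubling example above translates directly into a few lines of numpy. This is just an illustrative sketch, not code from the video; the starting components 4, 2, 6 match the transcript, and everything else is assumed for demonstration.

```python
import numpy as np

v = np.array([4.0, 2.0, 6.0])   # the vector itself, fixed in space
basis = np.eye(3)               # columns are the basis vectors, length 1

for scale in (1.0, 2.0, 0.5):
    b = scale * basis
    # Contravariant components: coefficients c such that v = b @ c.
    contra = np.linalg.solve(b, v)
    # Covariant components: dot products of v with each basis vector.
    co = b.T @ v
    print(f"scale {scale}: contravariant {contra}, covariant {co}")

# Doubling the basis vectors (scale 2.0) halves the contravariant
# components to [2. 1. 3.] and doubles the covariant ones to [8. 4. 12.]
```

The vector v itself never changes; only its two descriptions do, in opposite ways.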
A tensor of rank 1 is a vector: it has a number associated with each of the basis vectors. In a tensor of rank 2, instead of associating a number with each basis vector, we associate a number with every possible combination of two basis vectors. In a tensor of rank 3, we associate a number with every possible combination of three basis vectors. A tensor of rank 3 can be composed from combinations of the components of three vectors, and we can create different descriptions of this tensor by using different combinations of contravariant and covariant index values for the components of the three vectors.
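In three dimensions, a rank-r tensor of this kind carries one number for every combination of r basis vectors, i.e. 3**r components. Here is a minimal sketch of building rank-2 and rank-3 tensors from vector components; V reuses the transcript’s components, while P and Q are made up for illustration.

```python
import numpy as np

V = np.array([4.0, 2.0, 6.0])   # components from the transcript
P = np.array([1.0, 3.0, 5.0])   # hypothetical second vector
Q = np.array([2.0, 0.0, 1.0])   # hypothetical third vector

# Rank 2: one number per pair of basis vectors, T2[i, j] = V[i] * P[j].
T2 = np.einsum('i,j->ij', V, P)
print(T2.shape)       # (3, 3) -> 9 components

# Rank 3: one number per triple of basis vectors.
T3 = np.einsum('i,j,k->ijk', V, P, Q)
print(T3.shape)       # (3, 3, 3) -> 27 components
print(T3[0, 1, 2])    # V[0] * P[1] * Q[2] = 4 * 3 * 1 = 12
```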
Info
Channel: Physics Videos by Eugene Khutoryansky
Views: 869,363
Rating: 4.8203559 out of 5
Keywords: Tensors, Covariant, Contravariant
Id: CliW7kSxxWU
Length: 11min 44sec (704 seconds)
Published: Thu Jul 20 2017