Hey folks, my name is Nathan Johnston and
welcome to Lecture 14 of Advanced Linear Algebra.
Today we're going to be learning about isomorphisms -- this is one of
the rare topics in this course that doesn't have a direct analogue
in the previous linear algebra course that you took.
For example, we learned about vector spaces right at the start of this
course, and they were just a generalization of R^n.
Last lecture, we learned about invertibility of linear transformations,
which is just a direct generalization of invertibility of a matrix.
Well, isomorphisms are really a brand-new idea to help us get a better grasp
on the fact that there are so many different vector spaces out there.
The idea is that even though there are lots of different vector spaces out
there, a lot of them look the same in some
sense -- doing linear-algebraic calculations in them
works the same way. We've already seen this idea a little bit -- remember that
what we've been doing recently is constructing coordinate vectors of
vectors. You start off with some vector in a vector space and you
construct a coordinate vector out of it
using some basis. Then we had this idea that you could solve
a linear algebra problem in V itself by instead
doing something involving the coordinate vector, so in a sense the coordinate
vectors were the same as the original vectors -- they were just maybe
easier to work with, because we like working with lists of numbers.
The idea behind an isomorphism is very much the same -- an isomorphism
is just a function that takes you from one vector space to another
equivalent vector space that's maybe easier to work with.
So here's the definition for today's lecture: what an isomorphism
is and what it means for two vector spaces
to be isomorphic. Here's the setup: you've got two vector
spaces -- they don't have to be finite-dimensional, they can be any old vector spaces --
as long as they're over the same field, so you have to be working with the same scalars
in both of these vector spaces. Then we say that
these two vector spaces are isomorphic
if there's some invertible linear transformation going between them.
It doesn't matter what the
invertible linear transformation is -- there just has to be
at least one invertible linear transformation from one of the vector
spaces to the other. If this happens we say those
vector spaces are isomorphic, and we denote it like this: V ≅ W -- V "sort of"
equals W -- because we think of it like V and W are kind of the same.
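Written out symbolically, the definition says:

```latex
% Definition: V and W are vector spaces over the same field F.
% They are isomorphic, written V \cong W, when an invertible linear map exists between them.
V \cong W
  \quad\Longleftrightarrow\quad
  \text{there exists an invertible linear transformation } T : V \to W,
```

and any such T is called an isomorphism from V to W.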
They're not exactly the same, they're not necessarily the same vector
space, but they're very very similar to each
other. The invertible linear
transformation that we found that showed that these
vector spaces are isomorphic, we call that an isomorphism
between those vector spaces. This idea is best illustrated by an
example of why being isomorphic corresponds to the vector spaces being
the same in some sense. A standard example
is the vector spaces M_1,n (in other words,
the matrices with 1 row and n columns) and M_n,1 (the matrices with n rows
and 1 column), so in other words the row
vectors and column vectors, respectively. Let's
show that these vector spaces are isomorphic.
Okay, so how do you show that two vector spaces are isomorphic to each other? You
just find some invertible linear transformation
between them, and in this case there's a very obvious linear transformation
that's invertible between them: it's one that we work with all the
time -- it's the transpose map. If you take the transpose of a row vector
you get a column vector, and that's invertible because, what's the inverse?
It's also the transpose map. If you take the transpose of the column vector
you get back to the row vector. Okay, we already talked in previous lectures
about how the transpose map is a linear transformation,
so yeah, this is what we need. This T -- this transpose map -- is an
isomorphism between these two vector spaces, so
they're isomorphic to each other, so we can write
M_1,n ≅ M_n,1. Certainly these two vector
spaces *feel* the same as each other as far as linear
algebraic properties are concerned -- as far as
vector addition and scalar multiplication are concerned. All
they are is vectors oriented in different ways:
the column vectors are oriented up and down and the row vectors are oriented
side to side. But as far as scalar
multiplication and vector addition are concerned, it doesn't matter how
you arrange the numbers, so the way you do linear algebra in
these vector spaces is the same, so they're isomorphic. Alright, let's do
another example, or rather let's talk about a sort of ramped-up version of
this example. Both of these vector spaces -- the row
vectors and the column vectors -- they're actually both also isomorphic
to just F^n. Remember, F^n is just the set of
vectors, as in lists of n numbers. We usually use slightly
different notation for these three vector spaces.
M_1,n -- the row vectors -- we usually denote them like this with square
brackets around them. M_n,1 -- again, column vectors -- also
square brackets around them because really they're matrices. Okay, and
whenever we care about shape we use square brackets,
whereas F^n -- vectors as in lists of numbers -- we don't care about the
shape of these, and whenever we don't care about shape we use round
brackets. These really are vectors, they're not matrices,
but all of these vector spaces are isomorphic and they're
isomorphic in sort of an obvious way -- you can define linear
transformations between these just by writing the vectors in a
different orientation or maybe using different
brackets. An isomorphism from M_1,n to F^n, for example,
is just the function that changes the type of brackets that you have on the
left and the right. And of course that is a linear
transformation and it is invertible -- to invert it you just
change the type of brackets that you use. So that's an invertible linear
transformation, so it's an isomorphism, so these
spaces are all isomorphic to each other. They're all
essentially the same even though technically they're not quite the same,
but they're the same enough that we call them isomorphic.
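To make this concrete, here's a minimal sketch in Python with NumPy (not something from the lecture itself, just an illustration): the transpose plays the role of the isomorphism between row and column vectors, and reshape/ravel play the role of the bracket-changing maps to and from F^n.

```python
import numpy as np

# The same three numbers viewed as an element of F^3, of M_{1,3}, and of M_{3,1}.
v = np.array([1.0, 2.0, 3.0])      # "round brackets": a vector in F^3
row = v.reshape(1, 3)              # "square brackets": a 1x3 row vector
col = v.reshape(3, 1)              # "square brackets": a 3x1 column vector

# The transpose map sends row vectors to column vectors, and it is its own
# inverse: transposing twice gets you back where you started.
print(np.array_equal(row.T, col))      # True
print(np.array_equal(row.T.T, row))    # True

# The "change the brackets" maps between M_{1,3} and F^3 also undo each other.
print(np.array_equal(row.ravel(), v))          # True
print(np.array_equal(v.reshape(1, 3), row))    # True
```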
So in a sense this is the idea here --
vector spaces are isomorphic if they contain the same
vectors, just maybe written in a different way.
There are slightly more exotic examples than these
ones, but that's the idea: they have the same vectors,
just with different labels on them, and the isomorphism -- that invertible
linear transformation between them -- just changes the labels
used in one space into the labels used in the other space.
Alright, so let's maybe look at a slightly more complicated example.
Alright, let's show that P^3 -- the space of degree less than or equal to
three polynomials -- let's show that that's isomorphic to
R^4 -- you know, lists of four real numbers. Again, to show that two vector
spaces are isomorphic you just have to find some invertible linear
transformation between them. Alright, so I'm just going to come up
with one, and it's just sort of staring us in the face when we write down what the vectors in these two
spaces look like. The members of P^3, what those look like,
well, they look like a + bx + cx^2 + dx^3.
Well, how can I map that to R^4 in a reasonably natural way that is
invertible? Well, just list out those four
coefficients -- the a, b, c, and d -- and throw them into a vector
with four entries. In other words, throw them into a vector that lives in R^4.
Well, it's straightforward to show that that's a linear transformation,
and furthermore it's invertible because you can map back the other way.
The inverse function -- the inverse linear transformation --
just sends (a, b, c, d) to that polynomial that we started with: a + bx + cx^2 + dx^3. Again, that's a
linear transformation. Alright, so yeah there's an invertible
linear transformation between these two spaces
so they're isomorphic to each other. That's it! That's all you have to check.
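Written out, the isomorphism just described is:

```latex
T : P^3 \to \mathbb{R}^4, \qquad
T\bigl(a + bx + cx^2 + dx^3\bigr) = (a, b, c, d),
\qquad
T^{-1}(a, b, c, d) = a + bx + cx^2 + dx^3.
```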
We found some invertible linear transformation
between them so we're done. One really nice thing is that,
back when we introduced coordinate vectors, really what we did
was show that every finite-dimensional vector space just
looks like F^n. This is really what we're doing in this example.
Here we're taking a polynomial of degree less than or equal to 3 and we're
just representing it in the standard basis.
That's all we're doing, right? We're taking the coefficients of a linear
combination in the standard basis. Alright, the same thing works in more
generality so that's what our next theorem is --
suppose that you've got some n-dimensional vector space over
some field F. Okay, well then that vector space is isomorphic to F^n.
Okay, we've been implicitly doing this for a little while,
but finally we've built up the terminology and the technology that we
need to actually pin this down as a theorem.
Every n-dimensional vector space -- every finite-dimensional vector space --
is isomorphic to the vector space F^n that just consists of lists of n numbers.
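Stated precisely, the theorem is:

```latex
\textbf{Theorem.}\ \text{If } V \text{ is an } n\text{-dimensional vector space over a field } \mathbb{F},
\text{ then } V \cong \mathbb{F}^n.
```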
Alright, so this is really really nice, but how do we prove this? Alright, well
again to show that two vector spaces are isomorphic to each other what you've got
to do is you've got to find some isomorphism
between them -- you've gotta find some invertible linear transformation from
one to the other. Alright, so the way that we do that is
just sort of what we've hinted at here -- let's use coordinate vectors.
Okay, let's consider the function T that sends V into F^n,
defined by sending v
to its coordinate vector with respect to whatever basis B you like. It doesn't
matter which basis you pick -- V is n-dimensional, so some basis exists --
just use that one. Okay, so that's our candidate linear transformation.
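In symbols, with B denoting whichever basis of V we picked, the map is:

```latex
T : V \to \mathbb{F}^n, \qquad T(\mathbf{v}) = [\mathbf{v}]_B .
```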
Now to show that these two vector spaces actually are isomorphic to each other,
we have to show that this function here, this T, is a linear transformation,
and we have to show that it's invertible. If we get those two things then, great, these
two vector spaces are isomorphic to each other.
Alright, so how do we do that? Let's start off with the fact that it's a
linear transformation. Alright, so we want to show that T(v + w) is
equal to T(v) + T(w), and similarly for scalar multiplication, we
want to show that T(cv) = cT(v).
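In coordinate-vector notation, these are the two identities we're after:

```latex
[\mathbf{v} + \mathbf{w}]_B = [\mathbf{v}]_B + [\mathbf{w}]_B,
\qquad
[c\,\mathbf{v}]_B = c\,[\mathbf{v}]_B .
```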
Alright, well just plug in what T does and so what we want to show is that the
coordinate vector of v plus w equals the coordinate vector of v plus
the coordinate vector of w. We talked about that a couple of weeks ago -- it's a
property of coordinate vectors that we established just as soon
as we introduced them. So that's true, we're happy with that
property, and similarly for scalar multiplication
-- T(cv) = cT(v) -- well, that just means the coordinate vector of
cv equals c times the coordinate vector of v, and
again we talked about that property back when we first introduced coordinate
vectors. That's true as well, so yes this function
here that sends a vector to its coordinate vector, that's a linear
transformation. Alright, the second thing that we've got
to show, we've got to show that this T, this function that sends a vector
to its coordinate vector is invertible. Okay, the way that I'm
going to show this is, well, let's go back to the end of last
lecture. Let's go back just to the very very end
when we talked about invertibility of linear transformations
and we mentioned this property here that if you have a linear transformation
between two finite-dimensional vector spaces with the same dimension,
well then that linear transformation is invertible if and only if T(v) = 0
implies v = 0. Okay, so that's the property
that we're going to use here... that's how we're going to show
invertibility. So let's come back down here now.
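For reference, the property from the end of last lecture is:

```latex
\text{If } T : V \to W \text{ is linear and } \dim(V) = \dim(W) \text{ is finite, then }
T \text{ is invertible} \iff \bigl(T(\mathbf{v}) = \mathbf{0} \implies \mathbf{v} = \mathbf{0}\bigr).
```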
Alright, so first off, do these vector spaces have the same dimension?
V and F^n? Well, yeah because we said V is an n-dimensional vector space and F^n,
well yeah that's n-dimensional right? It's just got the standard basis
which has n vectors so that's n-dimensional.
Alright, so they have the same dimension. We're happy with that part, now
we've just got to show that T(v) = 0 implies v = 0.
Alright, well what is T(v)? That's [v]_B -- the coordinate vector of v
with respect to B. Okay, so if this coordinate vector equals
0, why does that imply v = 0? How do I
show that? Well, you just do it straight from the definition.
If [v]_B = 0, well, that just means every coefficient equals 0 in the linear
combination where you represent v in terms of the members
of B. Okay, so in other words the vector v
equals 0 times the first basis vector plus 0 times the second basis vector
and so on, up to 0 times the n-th basis vector. But if you just have 0
times something plus 0 times something and so on, all the
way to the end of the sum, then that entire sum just equals the 0
vector. Alright, so yeah, if the coordinate
vector equals 0 then the vector itself
must equal 0, so that linear transformation is invertible.
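Spelled out, with B = {v_1, v_2, ..., v_n}, the argument is the one-line chain:

```latex
[\mathbf{v}]_B = \mathbf{0}
\quad\Longrightarrow\quad
\mathbf{v} = 0\,\mathbf{v}_1 + 0\,\mathbf{v}_2 + \cdots + 0\,\mathbf{v}_n = \mathbf{0}.
```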
Therefore it's an isomorphism and therefore the spaces that it acts on
really are isomorphic to each other, so we're done. That's the end of the proof.
Okay, great. Just one sort of final corollary of this:
it's not too hard to show that if you have vector spaces V and W that
are isomorphic to each other, and then vector spaces W and X that are
isomorphic to each other, then actually V and X must be isomorphic to each other.
In other words, being isomorphic is transitive. Okay, if this thing's the
same as this and this thing's the same as this, well then
all three of them are the same as each other.
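The reason it's not too hard to show is composition: an isomorphism from V to W followed by an isomorphism from W to X (call them T and S) gives an isomorphism from V to X.

```latex
T : V \to W,\ S : W \to X \text{ invertible and linear}
\quad\Longrightarrow\quad
S \circ T : V \to X \text{ is invertible and linear, with } (S \circ T)^{-1} = T^{-1} \circ S^{-1}.
```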
So, one thing that you can do with this fact is,
when you combine it with this previous theorem, what it tells you
is that if you have any two vector spaces -- doesn't matter what they are --
if they have the same finite dimension and they're over the same field,
then they're isomorphic to each other. Okay, so that's our final fact, and
actually this is an if and only if: if you have two finite-dimensional vector
spaces over the same field, so using the same
scalars, then they're isomorphic if and only if they have the same dimension. So,
for example, R^2 and R^3 are not isomorphic because they're 2 and 3
dimensional, respectively, but R^2, well that's isomorphic to every
single plane in R^3, as long as it goes through the
origin so that it really is a vector space. It's also isomorphic to the degree less
than or equal to 1 polynomials, the 1-by-2 matrices, and so on. R^4 is
isomorphic to the 2-by-2 matrices. R^8 is isomorphic to the degree less
than or equal to 7 polynomials, and so on. You know these things
automatically now because you only have to check
dimensions -- as long as the dimension is the same,
they're isomorphic. If the dimension's not the same they're not isomorphic,
and that's all there is to it. Okay, and it's because of coordinate vectors,
because if they have the same dimension we can represent
them by the same types of coordinate vectors.
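As a quick dimension count behind those examples:

```latex
\dim(\mathbb{R}^2) = \dim(P^1) = \dim(M_{1,2}) = 2,
\qquad
\dim(\mathbb{R}^4) = \dim(M_{2,2}) = 4,
\qquad
\dim(\mathbb{R}^8) = \dim(P^7) = 8,
```

so within each group the spaces are isomorphic, by the fact above.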
So in a sense, the idea here is that when vector spaces are isomorphic, even though they look different on the
surface -- the labels for their vectors look different: maybe they look like
a polynomial, maybe they look like a matrix, maybe they look like a list of
numbers -- really the way you do calculations
in those spaces is no different. It's no different than if you wrote
down those vectors using different fonts -- the choice of font doesn't matter,
it doesn't affect the calculation, just like none of those other fine
details affect the calculations. They don't affect the linear algebra at all,
so they're isomorphic to each other. Alright, so that'll do it for today's class.
I will see you all for Lecture 15, when we start talking about
other properties of linear transformations. We already talked about
invertibility -- we're going to go on to other properties now like range, rank,
eigenvalues, and stuff like that, so I'll see you then.