OK, this is linear
algebra lecture nine. And this is a key lecture, this
is where we get these ideas of linear independence,
when a bunch of vectors are independent -- or dependent,
that's the opposite. The space they span. A basis for a subspace
or a basis for a vector space, that's a central idea. And then the dimension
of that subspace. So this is the day
that those words get assigned clear meanings. And let me emphasize that we talk about a bunch of vectors being independent. We wouldn't talk about a matrix being independent. A bunch of vectors
being independent. A bunch of vectors
spanning a space. A bunch of vectors
being a basis. And the dimension
is some number. OK, so what are the definitions? Can I begin with a fact,
a highly important fact that I didn't call direct attention to earlier. Suppose I have a matrix and
I look at Ax equals zero. Suppose the matrix
has a lot of columns, so that n is bigger than m. So I'm looking at m equations, a small number of equations m,
and more unknowns. I have more unknowns
than equations. Let me write that down. More unknowns than equations. More unknown x-s than equations. Then the conclusion
is that there's something in the null
space of A, other than just the zero vector. The conclusion is there
are some non-zero x-s such that Ax is zero. There are some
special solutions. And why? We know why. I mean, it sort of seems like a reasonable thing: more unknowns than equations, so it seems reasonable that we can solve them. But we have a clear algorithm
which starts with a system and does elimination, gets
the thing into an echelon form with some pivots
and pivot columns, and possibly some free columns
that don't have pivots. And the point is here there
will be some free columns. The reason is there will be free variables, at least one. That's the reason. We now have a complete, systematic way to say, OK, we take the
system Ax equals zero, we row reduce, we identify
the free variables, and, since there are n
variables and at most m pivots, there will be some free
variables, at least one, at least n-m in fact, left over. And those variables I can
assign non-zero values to. I don't have to
set those to zero. I can take them to be
one or whatever I like, and then I can solve
for the pivot variables. So then it gives me a
solution to Ax equals zero. And it's a solution
that isn't all zeros. So that's an important point that we'll use now in this lecture.
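That key fact -- more unknowns than equations forces a nonzero solution of Ax equals zero -- is easy to check numerically. Here is a minimal sketch in Python with NumPy; the 2 by 3 matrix is a made-up example, and the SVD is one standard way to pull out a null space vector:

```python
import numpy as np

# 2 equations, 3 unknowns: n > m, so Ax = 0 has a nonzero solution.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 1.0]])

# Rows of Vt beyond the rank span the null space of A.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
x = Vt[rank]                  # a null space vector; at least n - m = 1 exists

print(np.allclose(A @ x, 0))  # True: Ax = 0
print(np.linalg.norm(x) > 0)  # True: and x is not the zero vector
```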
So now, what does it mean for a bunch of vectors to be independent? OK. So this is like the background that we know. Now I want to speak
about independence. OK. Let's see. I can give you the abstract
definition, and I will, but I would also like to
give you the direct meaning. So the question is, when are vectors x1, x2 up to xn independent -- or linearly independent? I'll often just say and write independent for short. OK. I'll give you the
full definition. These are just vectors
in some vector space. I can take combinations of them. The question is, do any
combinations give zero? If some combination
of those vectors gives the zero vector,
other than the combination of all zeros, then
they're dependent. They're independent if no
combination gives the zero vector -- and then I have, I'll have
to put in an except the zero combination. So what do I mean by that? No combination gives
the zero vector. Any combination
c1x1 + c2x2 + ... + cnxn is not zero except for the zero combination. This is when all the c-s,
all the c-s are zero. Then of course. That combination --
I know I'll get zero. But the question is, does any
other combination give zero? If not, then the
vectors are independent. If some other combination
does give zero, the vectors are dependent. OK. Let's just take examples. Suppose I'm in, say, two-dimensional space. OK. Let me first take an example where
I have a vector and twice that vector. So that's two vectors, V and 2V. Are those dependent
or independent? Those are dependent
for sure, right, because there's one
vector is twice the other. One vector is twice
as long as the other, so if the word dependent
means anything, these should be dependent. And they are. And in fact, I would
take two of the first -- so here's, here is a vector V
and the other guy is a vector 2V, that's my -- so there's a vector V1 and
my next vector V2 is 2V1. Of course those are
dependent, because two of these first vectors minus
the second vector is zero. That's a combination of these
two vectors that gives the zero vector. OK, that was clear. Here's another example, an easy example. Suppose I have a vector and the
other guy is the zero vector. Suppose I have a vector V1
and V2 is the zero vector. Then are those vectors
dependent or independent? They're dependent again. You could say, well, this
guy is zero times that one. This one is some
combination of those. But let me write
it the other way. Let me say -- what combination,
how many V1s and how many V2s shall I take to get the zero vector? If, if V1 is like the vector two
one and V2 is the zero vector, zero zero, then I
would like to show that some combination of
those gives the zero vector. What shall I take? How many V1s shall I take? Zero of them. Yeah, no, take no V1s. But how many V2s? Six. OK. Or five. Then -- in other words,
the point is if the zero vector's in there,
if the zero -- if one of these vectors
is the zero vector, independence is dead, right? If one of those vectors is the
zero vector then I could always take -- include that one and
none of the others, and I would get the zero answer,
and I would show dependence. OK. Now let me finally draw an example where they will be independent. Suppose that's V1 and that's V2. Those are surely
independent, right? Any combination of
V1 and V2, will not be zero except, the
zero combination. So those would be independent. But now let me, let me
stick in a third vector, V3. Independent or dependent
now, those three vectors? So now n is three here. I'm in two dimensional space,
whatever, I'm in the plane. I have three vectors that
I didn't draw so carefully. I didn't even tell you
what exactly they were. But what's this answer on
dependent or independent? Dependent. How do I know those
are dependent? How do I know that some
combination of V1, V2, and V3 gives me the zero vector? I know because of that. That's the key
fact that tells me that three vectors in the
plane have to be dependent. Why's that? What's the connection between
the dependence of these three vectors and that fact? OK. So here's the connection. I take the matrix A that has
V1 in its first column, V2 in its second column,
V3 in its third column. So it's got three columns. And V1 -- I don't know, that
looks like about two one to me. V2 looks like it
might be one two. V3 looks like it might be maybe
two, maybe two and a half, minus one. OK. Those are my three vectors, and
I put them in the columns of A. Now that matrix A
is two by three. It fits this pattern, that
where we know we've got extra variables, we know we
have some free variables, we know that there's
some combination -- and let me instead of x-s, let
me call them c1, c2, and c3 -- that gives the zero vector. Sorry that my little bit
of art got in the way. Do you see the point? When I have a matrix,
I'm interested in whether its columns are
dependent or independent. The columns are
dependent if there is something in the null space. The columns are
dependent because this, this thing in the
null space says that c1 of that plus c2 of
that plus c3 of this is zero. So in other words, I can go
out some V1, out some more V2, back on V3, and end up at zero. OK.
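Numerically, the same computation looks like this -- a sketch in Python with NumPy, where the columns 2 1, 1 2, and 2.5 -1 are the rough coordinates read off the picture:

```python
import numpy as np

# V1, V2, V3 as the columns of a 2 by 3 matrix: three vectors in the
# plane, so the null space is nontrivial and the columns are dependent.
A = np.array([[2.0, 1.0, 2.5],
              [1.0, 2.0, -1.0]])

# The last row of Vt from the SVD spans the one-dimensional null space.
_, _, Vt = np.linalg.svd(A)
c = Vt[-1]

print(np.allclose(A @ c, 0))   # True: c1 V1 + c2 V2 + c3 V3 = 0
print(np.linalg.norm(c) > 0)   # True: with a nonzero c
```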
So here I've given the general, abstract definition, but let me repeat that definition. Let me call them Vs now. V1 up to Vn are the
columns of a matrix A. In other words,
this is telling me that if I'm in m
dimensional space, like two dimensional
space in the example, I can answer the
dependence-independence question directly by
putting those vectors in the columns of a matrix. They are independent if the
null space of A, of A, is what? If I have a bunch of
columns in a matrix, I'm looking at
their combinations, but that's just A times
the vector of c-s. And these columns
will be independent if the null space of
A is the zero vector. They are dependent if there's
something else in there. If there's something else in
the null space, if A times c gives the zero vector
for some non-zero vector c in the null space. Then they're dependent,
because that's telling me a combination of the
columns gives the zero column. I think you're with
be, because we've seen, like, lecture after
lecture, we're looking at the combinations
of the columns and asking, do we get zero or don't we? And now we're giving
the official name, dependent if we do,
independent if we don't. So I could express this
in other words now. I could say the rank -- what's the rank in this independent case? The rank r of the matrix, in the case of independent columns, is? So the columns are independent. So how many pivot
columns have I got. All n. All the columns would
be pivot columns, because free columns
are telling me that they're a combination
of earlier columns. So this would be the
case where the rank is n. This would be the case where
the rank is smaller than n. So in this case the rank is
n and the null space of A is only the zero vector. And no free variables. No free variables.
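In code, the whole independence test collapses to one rank computation. Here is a sketch in Python with NumPy; `columns_independent` is a helper name invented just for this illustration, and the vectors are the ones from the plane example:

```python
import numpy as np

def columns_independent(vectors):
    """Put the vectors in the columns of A; they are independent
    exactly when N(A) = {0}, i.e. when the rank equals n."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

v1, v2, v3 = [2, 1], [1, 2], [2.5, -1]

print(columns_independent([v1, v2]))      # True: rank 2, no free variables
print(columns_independent([v1, v2, v3]))  # False: 3 vectors in R^2, rank < n
```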
And this is the case of yes, free variables, if you'll allow me to stretch the English language that far. That's the case where we have a combination that gives the zero column. I'm often interested in the
case when my vectors are popped into a matrix. So the, the definition
over there of independence didn't talk about any matrix. The vectors didn't have to be
vectors in N dimensional space. And I want to give
you some examples of vectors that
aren't what you think of immediately as vectors. But most of the time, this is
-- the vectors we think of are columns. And we can put them in a matrix. And then independence
or dependence comes back to the null space. OK. So that's the idea
of independence. Can I just, yeah, let me go on to spanning a space. What does it mean for a bunch of vectors to span a space? Well, actually, we've
seen it already. You remember, if we had columns in a matrix, we took all their
combinations and that gave us the column space. Those vectors that we started
with span that column space. So spanning a space means -- so let me move that
important stuff right up. OK. So vectors -- let me call them, say, V1 up to some different letter, say Vl -- span a space, a subspace, or just a vector space I could say. Spanning a space means the space consists of all
combinations of those vectors. That's exactly what we
did with the column space. So now I could say in shorthand
the columns of a matrix span the column space. So you remember it's a bunch of
vectors that have this property that they span a space, and
actually if I give you a bunch of vectors and say -- OK, let S be the
space that they span, in other words let S contain
all their combinations, that space S will
be the smallest space with those
vectors in it, right? Because any space with
those vectors in it must have all the combinations
of those vectors in it. And if I stop there, then
I've got the smallest space, and that's the space
that they span. OK. So rather than needing to say, take all linear combinations and put them in a space, I'm compressing that into the word span. Straightforward. OK. So if I think of the column space of a matrix -- so I
start with the columns. I take all their combinations. That gives me the column space. They span the column space. Now are they independent? Maybe yes, maybe no. It depends on the particular
columns that went into that matrix. But obviously I'm highly
interested in a set of vectors that spans a
space and is independent. That means I've got the right number of vectors. If I didn't have all of them, I wouldn't have my whole space. If I had more than that, they wouldn't be independent. So, like, basis -- and that's
the word that's coming -- is just right. So here let me put
what that word means. A basis for a vector space is a sequence of vectors -- shall I call them V1, V2, up to, let me say, Vd now; I'll stop with that letter -- that has two properties. I've got enough vectors and not too many. It's a natural idea of a basis. So a basis is a bunch of vectors in the space, a sequence of vectors with two properties. One, they are independent. And two -- you
know what's coming? -- they span the space. OK. Let me take -- so time for examples, of course. So I'm asking you now
to put definition one, the definition of independence,
together with definition two, and let's look at examples,
because this combination means the set of vectors I have is just right, so this idea of a basis will be central. I'll always be asking
you now for a basis. Whenever I look at a
subspace, if I ask you for -- if you give me a basis
for that subspace, you've told me what it is. You've told me everything I need
to know about that subspace. Those -- I take their
combinations and I know that I need all the combinations. OK. Examples. OK, so examples of a basis. Let me start with two
dimensional space. Say, for example, the space is, oh, let's make it R^3. Real three-dimensional space. Give me one basis. One basis is? So I want some vectors, because
if I ask you for a basis, I'm asking you for vectors,
a little list of vectors. And it should be just right. So what would be a basis
for three dimensional space? Well, the first basis that
comes to mind, why don't we write that down. The first basis
that comes to mind is this vector 1 0 0, this vector 0 1 0, and this vector 0 0 1. OK. That's one basis. Not the only basis, that's
going to be my point. But let's just see --
yes, that's a basis. Are, are those
vectors independent? So that's the like the x, y,
z axes, so if those are not independent, we're in trouble. Certainly, they are. Take a combination c1 of this
vector plus c2 of this vector plus c3 of that
vector and try to make it give the zero vector. What are the c-s? If c1 of that plus c2 of that
plus c3 of that gives me 0 0 0, then the c-s are all -- 0, right. So that's the test
for independence. In the language of matrices,
which was under that board, I could make those the
columns of a matrix. Well, it would be
the identity matrix. Then I would ask, what's the
null space of the identity matrix? And you would say it's
only the zero vector. And I would say, fine, then
the columns are independent. The only thing -- the identity
times a vector giving zero, the only vector that
does that is zero. OK. Now that's not the only basis. Far from it. Tell me another basis, a
second basis, another basis. So, give me -- well,
I'll just start it out. One one two. Two two five. Suppose I stopped there. Has that little bunch of
vectors got the properties that I'm asking for in
a basis for R^3? We're looking for
a basis for R^3. Are they independent,
those two column vectors? Yes. Do they span R^3? No. Our feeling is no. Our feeling is that there are some vectors in R^3 that are not combinations of those. OK. So I need another vector
then, because these two don't span the space. OK. Now it would be foolish for me
to put in three three seven, right, as the third vector. That would be a goof. Because that, if I put
in three three seven, those vectors would
be dependent, right? If I put in three
three seven, it would be the sum
of those two, it would lie in the
same plane as those. It wouldn't be independent. My attempt to create
a basis would be dead. But if I take -- so
what vector can I take? I can take any vector
that's not in that plane. Let me try -- I hope that 3 4 8 would do it. At least it's not the sum of those two vectors. But I believe that's a basis. And what's the test then,
for that to be a basis? Because I just picked those
numbers, and if I had picked, say, 5 7 -14, how would we know whether we have a basis or not? You would put them in
the columns of a matrix, and you would do
elimination, row reduction -- and you would see do you
get any free variables or are all the
columns pivot columns. Well now actually
we have a square -- the matrix would
be three by three. So, what's the test
on the matrix then? So in this case, when my space is R^3 and I have three vectors, my matrix is square; what am I asking about that matrix in order for those columns to be a basis? So for R^n: n vectors
give a basis if the n by n matrix with those columns is what? What's the requirement
on that matrix? Invertible, right. The matrix should be invertible. For a square matrix, that's the perfect answer: is invertible. So that's when the space is the whole space R^n.
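Here is that invertibility test as a sketch in Python with NumPy. One caution about the numbers: a third vector like 3 3 8, whose first two components are equal, would actually lie in the plane spanned by 1 1 2 and 2 2 5 (every combination of those two has equal first components), so this sketch uses 3 4 8 as the third column:

```python
import numpy as np

# n vectors are a basis for R^n exactly when the n by n matrix
# with those vectors as columns is invertible (full rank).
B = np.column_stack([[1, 1, 2], [2, 2, 5], [3, 4, 8]]).astype(float)

r_good = np.linalg.matrix_rank(B)
print(r_good)                      # 3: full rank, invertible, a basis

# Swap in 3 3 7, the sum of the first two columns: dependence, no basis.
B[:, 2] = [3, 3, 7]
r_bad = np.linalg.matrix_rank(B)
print(r_bad)                       # 2: a free column appears
```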
Let me be sure you're with me here. Let me remove that. Are those two vectors a
basis for any space at all? Is there a vector
space that those really are a basis for, that pair of vectors, this guy and this one, 1 1 2 and 2 2 5? Is there a space for
which that's a basis? Sure. They're independent, so they
satisfy the first requirement, so what space shall I take
for them to be a basis of? What spaces will
they be a basis for? The one they span. Their combinations. It's a plane, right? It'll be a plane inside R^3. So if I take this vector
1 1 2, say it goes there, and this vector 2 2
5, say it goes there, those are a basis for --
because they span a plane. And they're a basis for
the plane, because they're independent. If I stick in some
third guy, like 3 3 7, which is in the plane -- suppose
I put in, try to put in 3 3 7, then the three vectors
would still span the plane, but they wouldn't be a basis
anymore because they're not independent anymore. So, we're looking again at the case with independent columns, the case where the column
vectors span the column space. They're independent, so they're
a basis for the column space. OK. So now there's one
bit of intuition. Let me go back to all of R^n, where I put 3 4 8. OK. The first message is that the
basis is not unique, right. There's zillions of bases. I take any invertible
three by three matrix, its columns are a basis for R^3. The column space is
R^3, and if those, if that matrix is invertible,
those columns are independent, I've got a basis for R^3. So there're many, many bases. But there is something in
common for all those bases. There's something that this
basis shares with that basis and every other basis for R^3. And what's that? Well, you saw it coming, because
when I stopped here and asked if that was a basis
for R^3, you said no. And I know that you said
no because you knew there weren't enough vectors there. And the great fact is that
there're many, many bases, but -- let me put in somebody
else, just for variety. There are many, many
bases, but they all have the same number of vectors. If we're talking
about the space R^3, then that number of
vectors is three. If we're talking
about the space R^n, then that number
of vectors is n. If we're talking about
some other space, the column space of some matrix,
or the null space of some matrix, or some other space
that we haven't even thought of, then that still is
true that every basis -- that there're lots of bases but
every basis has the same number of vectors. Let me write that
great fact down. Every basis --
we're given a space. Given a space. R^3 or R^n or some other column
space of a matrix or the null space of a matrix or
some other vector space. Then the great fact
is that every basis for this, for the space has
the same number of vectors. If one basis has six vectors,
then every other basis has six vectors. So that number six is telling me how big the space is. It's telling me how many vectors I have to have to have a basis. And of course we're seeing it this way. If we had seven vectors, we've got too many. If we have five vectors, we haven't got enough. Six is just right for whatever space that is. And what do we call that number? That number is -- now I'm ready
for the last definition today. It's the dimension
of that space. So every basis for a space has
the same number of vectors in it. Not the same vectors -- all sorts of bases -- but the number of vectors is always the same, and that number is the dimension. That's the definition. This number is the
dimension of the space. OK. OK. Let's do some examples. Because now we've
got definitions. Let me repeat the four
things, the four words that have now got defined. Independence, that looks
at combinations not being zero. Spanning, that looks at
all the combinations. Basis, that's the
one that combines independence and spanning. And now we've got the idea
of the dimension of a space. It's the number of
vectors in any basis, because all bases
have the same number. OK. Let's take examples. Suppose I take, my space
is -- examples now -- space is the, say, the
column space of this matrix. Let me write down a matrix. Columns 1 1 1 and 2 1 2, and -- just to make it clear, I'll take the sum there -- 3 2 3, and let me put in one one one again. OK. So that's four vectors. OK, do they span the column
space of that matrix? Let me repeat, do they span the column space of that matrix? Yes. By definition, that's where the column space comes from. Are they a basis for
the column space? Are they independent? No, they're not independent. There's something
in that null space. Maybe we can -- so let's look
at the null space of the matrix. Tell me a vector that's in
the null space of that matrix. So I'm looking for some vector
that combines those columns and produces the zero column. Or in other words, I'm
looking for solutions to Ax equals zero. So tell me a vector in the null space. Maybe -- well, this was, this
column was that one plus that one, so maybe if I have one of
those and minus one of those that would be a
vector in the null space. So, you've already told me now: are those column vectors independent? The answer is no. Right? They're not independent. Because -- you knew they
weren't independent. Anyway, minus one
of this minus one of this plus one of this zero
of that is the zero vector. OK, so they're not independent. OK. They span, but they're
not independent. Tell me a basis for
that column space. What's a basis for
the column space? These are all the questions that
the homework asks, the quizzes ask, the final exam will ask. Find a basis for the column
space of this matrix. OK. Now there's many
answers, but give me the most natural answer. Columns one and two. Columns one and two. That's the natural answer. Those are the pivot
columns, because, I mean, we s- we begin systematically. We look at the first
column, it's OK. We can put that in the basis. We look at the second
column, it's OK. We can put that in the basis. The third column we
can't put in the basis. The fourth column
we can't, again. So the rank of the matrix is -- what's the rank of our matrix? Two. Two. And, and now that rank is also
-- we also have another word. We, we have a
great theorem here. The rank of A, that rank r,
is the number of pivot columns and it's also -- well, so now please
use my new word. It's the number two, of course: two is the rank of my matrix, it's the number of pivot columns, those pivot columns form a basis, of course,
so what's two? It's the dimension. The rank of A, the number of pivot columns, is the dimension of the column space. Of course, you say. It had to be. Right.
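Here is that pivot-column computation as a sketch in Python with NumPy; `pivot_columns` is a small elimination helper written just for this illustration:

```python
import numpy as np

def pivot_columns(A, tol=1e-10):
    """Row-reduce a copy of A and return the pivot column indices."""
    A = A.astype(float).copy()
    m, n = A.shape
    pivots, row = [], 0
    for col in range(n):
        if row >= m:
            break
        # find a row at or below `row` with the largest entry in this column
        piv = row + int(np.argmax(np.abs(A[row:, col])))
        if abs(A[piv, col]) < tol:
            continue                    # free column, no pivot here
        A[[row, piv]] = A[[piv, row]]   # swap the pivot row up
        A[row] /= A[row, col]
        for r in range(m):
            if r != row:
                A[r] -= A[r, col] * A[row]
        pivots.append(col)
        row += 1
    return pivots

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])

print(pivot_columns(A))          # [0, 1]: columns one and two
print(np.linalg.matrix_rank(A))  # 2 = dimension of the column space
```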
But just watch for one moment the way the English words get involved here. I take the rank of a matrix,
the rank of a matrix. It's the number of pivot columns
and it's the dimension of -- not the dimension of the matrix,
that's what I want to say. It's the dimension of a space,
a subspace, the column space. Do you see, I don't
take the dimension of A. That's not what I want. I'm looking for the dimension
of the column space of A. If you use those words right,
it shows you've got the idea right. Similarly here. I don't talk about the
rank of a subspace. It's a matrix that has a rank. I talk about the
rank of a matrix. And the beauty is that
these definitions just merge so that the
rank of a matrix is the dimension of
its column space. And in this example it's two. And then the further
question is, what's a basis? And the first two
columns are a basis. Tell me another basis. Another basis for
the columns space. You see I just keep
hammering away. I apologize, but it's,
I have to be sure you have the idea of basis. Tell me another basis
for the column space. Well, you could take
columns one and three. That would be a basis
for the column space. Or columns two and
three would be a basis. Or columns two and four. Or tell me another basis that's
not made out of those columns at all? So -- I guess I'm giving you
infinitely many possibilities, so I can't expect a
unanimous answer here. I'll tell you -- but let's
look at another basis, though. I'll just -- because it's
only one out of zillions, I'm going to put it down
and I'm going to erase it. Another basis for the
column space would be -- let's see. I'll put in some things
that are not there. Say, oh well, just to make my life easy, 2 2 2. That's in the column space. And that was sort of obvious. Let me take the sum
of those, say 6 4 6. Or the sum of all of the
columns, 7 5 7, why not. That's in the column space. Those are independent and
I've got the number right, I've got two. Actually, this is a key point. If you know the dimension of
the space you're working with, and we know that this column
-- we know that the dimension, DIM, the dimension of
the column space is two. If you know the
dimension, then -- and we have a couple of
vectors that are independent, they'll automatically be a basis. If we've got the number
of vectors right, two vectors in this case,
then if they're independent, they can't help
but span the space. Because if they
didn't span the space, there'd be a third guy
to help span the space, but it couldn't be independent. So, it just has
to be independent if we've got the numbers right. And they span. OK. Very good. So you got the
dimension of a space. So this was another basis
that I just invented. OK. Now, now I get to ask
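A quick numerical check of that invented basis -- a sketch in Python with NumPy. A vector lies in the column space exactly when appending it to the matrix does not raise the rank:

```python
import numpy as np

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])

w1 = np.array([2, 2, 2])   # the two invented basis vectors
w2 = np.array([7, 5, 7])

r = np.linalg.matrix_rank(A)            # 2 = dim of the column space

# Both vectors lie in the column space: the rank does not go up.
print(np.linalg.matrix_rank(np.column_stack([A, w1, w2])) == r)  # True

# And they are independent: two independent vectors in a
# two-dimensional space are automatically a basis.
print(np.linalg.matrix_rank(np.column_stack([w1, w2])))          # 2
```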
OK. Now I get to ask about the null space. What's the dimension
of the null space? So we, we got a
great fact there, the dimension of the
column space is the rank. Now I want to ask you
about the null space. That's the other
part of the lecture, and it'll go on to
the next lecture. OK. So we know the dimension of the
column space is two, the rank. What about the null space? This is a vector
in the null space. Are there other vectors
in the null space? Yes or no? Yes. So this isn't a basis because
it's doesn't span, right? There's more in the null
space than we've got so far. I need another vector at least. So tell me another
vector in the null space. Well, the natural choice, the
choice you naturally think of is I'm going on to
the fourth column, I'm letting that free
variable be a one, and that free variable
be a zero, and I'm asking is that fourth
column a combination of my pivot columns? Yes, it is. And it's -- that will do. So what I've written
there are actually the two special solutions, right? I took the two free
variables, free and free. I gave them the values 1 0 or 0 1, and I figured out the rest. So do you see, let me
just say it in words. These vectors in the null space are telling me the combinations of the columns that give zero. They're telling me in what way
the, the columns are dependent. That's what the
null space is doing. Have I got enough now? And what's the null space now? We have to think
about the null space. These are two vectors
in the null space. They're independent. Are they a basis
for the null space? What's the dimension
of the null space? You see that those questions
just keep coming up all the time. Are they a basis
for the null space? You can tell me the answer
even though we haven't written out a proof of that. Can you? Yes or no? Do these two special
solutions form a basis for the null space? In other words,
does the null space consist of all combinations
of those two guys? Yes or no? Yes. The null space is two-dimensional. The dimension of the null space is the number of free variables. And at the last second,
give me the formula. This is then the key
formula that we know. How many free variables are there in terms of r, the rank, m, the number of rows, and n, the number of columns? What do we get? We have n columns, r of
them are pivot columns, so n-r is the number of free
columns, free variables. And now it's the dimension
of the null space. OK. That's great. That's the key spaces, their
bases, and their dimensions. Thanks.
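As a closing check of that key formula, here is the lecture's example worked numerically -- a sketch in Python with NumPy:

```python
import numpy as np

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])

# The two special solutions: set one free variable (columns three
# and four) to 1 and the other to 0, then solve for the pivot variables.
c1 = np.array([-1, -1, 1, 0])   # column 3 = column 1 + column 2
c2 = np.array([-1, 0, 0, 1])    # column 4 = column 1

print(np.allclose(A @ c1, 0))   # True
print(np.allclose(A @ c2, 0))   # True

# Dimension of the null space = n - r.
n, r = A.shape[1], np.linalg.matrix_rank(A)
print(n - r)                    # 2: the two special solutions are a basis
```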