Gram-Schmidt Orthogonalization

Captions
ANA RITA PIRES: In lecture, you've learned about Gram-Schmidt orthogonalization, and that's what today's problem is about. We have a matrix A, and its columns are a, b, and c. I want you to find orthonormal vectors q_1, q_2, and q_3 from those three columns. Then I want you to write the QR decomposition of A, where Q is an orthogonal matrix and R is an upper triangular matrix. Remember, an orthogonal matrix is a matrix whose columns are orthonormal vectors. Work on it for a little while, hit pause, and when you're ready I'll come back and we'll do it together.

Did you manage to solve it all right? Let's start solving it together. Gram-Schmidt orthogonalization, as you should remember from lecture, consists of the following: at each step, you take the vector you started with (a, b, or c in this case), make it orthogonal to the previous ones, and normalize it. Let's actually do it.

We want to find q_1. To find q_1, start with a and make it orthogonal to the previous ones. There are no previous ones, so that's very easy: the direction of a is fine, and you just need to ensure that your vector has length 1. Well, a is the vector [1, 0, 0]. You should divide it by its length, but its length is already 1, so q_1 is simply [1, 0, 0]. q_1 is done.

Now let's do q_2. For q_2, I start with my vector b, and I want to make it orthogonal to what I already have, which is q_1. For that, I subtract from b the projection of b onto q_1: b minus b dot q_1 times q_1. Usually, when you project a vector onto another vector, you have to divide by the squared length of the vector you're projecting onto, in this case q_1. But because q_1 has length 1, you don't need to do that here. So what will it be? b is [2, 0, 3], and b dot q_1 is 2, so we get [2, 0, 3] minus 2 times [1, 0, 0], which is [0, 0, 3]. This vector is orthogonal to q_1, and you can check by taking their dot product.
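The step just described, subtracting from b its projection onto the unit vector q_1, can be sketched in plain Python. This is an illustrative sketch, not code from the video; the helper `dot` and the variable names are made up here.

```python
from math import sqrt

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Columns a and b of the matrix A from the problem.
a = [1, 0, 0]
b = [2, 0, 3]

# q_1: normalize a (its length is already 1, so nothing changes).
q1 = [x / sqrt(dot(a, a)) for x in a]

# Subtract from b its projection onto q_1. Because q_1 is a unit
# vector, the usual division by |q_1|^2 is unnecessary.
coeff = dot(b, q1)  # b . q_1 = 2
q2_prime = [bi - coeff * q1i for bi, q1i in zip(b, q1)]

print(q2_prime)           # [0.0, 0.0, 3.0]
print(dot(q2_prime, q1))  # 0.0 -- orthogonal to q_1, as expected
```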
It should be 0, and it is. We also need it to have length 1, because we want these vectors to be orthonormal. So this is not actually q_2 yet. Let's call it q_2 prime, and set q_2 equal to q_2 prime divided by its length, which in this case is 3. That gives q_2 = [0, 0, 1].

Let's go on to the third one, q_3. Again, I start with my third vector, c, and I subtract the projections of c onto q_1 and onto q_2; that gives me a vector orthogonal to both q_1 and q_2: c minus c dot q_1 times q_1 minus c dot q_2 times q_2. This is c, which is [4, 5, 6], minus c dot q_1, which is 4, times [1, 0, 0], minus c dot q_2, which is 6, times [0, 0, 1]. So this vector is [0, 5, 0]. Once again, it is orthogonal to q_1 and q_2, but it does not have norm 1 yet. So I'll call it q_3 prime, and set q_3 equal to q_3 prime divided by its length, which is 5. q_3 is the vector [0, 1, 0].

One thing I want you to note is that my vectors q_1, q_2, q_3 are very nice in this case. Usually, when you perform Gram-Schmidt orthogonalization, you end up with lots of square roots, because you divide by the lengths. In this case everything is an integer, which is very lucky.

The next part of the problem is to write the QR decomposition of the matrix A: A = QR. You already know the matrix A. It is the matrix [1, 2, 4; 0, 0, 5; 0, 3, 6]. And you want Q to be an orthogonal matrix. Like I said before, an orthogonal matrix has orthonormal vectors for its columns, and we already have such a matrix: the one with q_1, q_2, and q_3 as its columns, [1, 0, 0; 0, 0, 1; 0, 1, 0]. Now we need an upper triangular matrix R that makes this equality true. Take a moment to look at the matrix Q. It is simply a permutation matrix, so it's very easy to come up with the matrix that fits here. What this permutation matrix does is exchange rows two and three of the matrix R to give you A.
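The whole procedure, orthogonalize each column against the previous q's and then normalize, can be sketched as a short Python function. This is classical Gram-Schmidt under the assumption that the input columns are linearly independent; the function name `gram_schmidt` and the helpers are illustrative, not from the video.

```python
from math import sqrt

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(columns):
    """Classical Gram-Schmidt: orthogonalize against earlier q's, then normalize.

    Assumes the input columns are linearly independent.
    """
    qs = []
    for v in columns:
        w = list(v)
        for q in qs:
            c = dot(w, q)  # projection coefficient (q is a unit vector)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = sqrt(dot(w, w))
        qs.append([wi / norm for wi in w])
    return qs

a, b, c = [1, 0, 0], [2, 0, 3], [4, 5, 6]
q1, q2, q3 = gram_schmidt([a, b, c])
print(q1, q2, q3)  # [1.0, 0.0, 0.0] [0.0, 0.0, 1.0] [0.0, 1.0, 0.0]
```

As the video notes, this example is unusually clean: the norms are 1, 3, and 5, so no square roots survive in the answer.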
So you know what R must be: [1, 2, 4; 0, 3, 6; 0, 0, 5]. Its second row, [0, 3, 6], is the third row of A, and its third row, [0, 0, 5], is the second row of A. And indeed, R is upper triangular. This is your QR decomposition of the matrix A: Q is orthogonal, R is upper triangular.

But let's see where the numbers in the matrix R come from. Let me label the columns: a, b, c for A, and q_1, q_2, q_3 for Q. You know from the way matrix multiplication works that the column a is Q times the first column of R. So you can regard the numbers in the first column of R as the coefficients of the linear combination of q_1, q_2, q_3 that adds up to a. Let me write that down: a is 1 times q_1 plus 0 times q_2 plus 0 times q_3. Let's do it for b. The second column of A is Q times the second column of R, so b is 2 times q_1 plus 3 times q_2 plus 0 times q_3. And finally, c is Q times the third column of R: c = 4*q_1 + 6*q_2 + 5*q_3.

Now let's go back and see where these numbers show up. The first equation says a equals 1 times q_1, and that's very easy: we found q_1 = a. Let's try the second one: b equals 2*q_1 plus 3*q_2. Well, q_2 prime is equal to 3 times q_2, and remember that b dot q_1 was equal to 2. So from the projection step, b = 2*q_1 + q_2 prime = 2*q_1 + 3*q_2, which is what we wanted. Let's check the third one. q_3 prime is equal to 5*q_3, and the projection coefficients were 4 and 6, so c = 4*q_1 + 6*q_2 + q_3 prime = 4*q_1 + 6*q_2 + 5*q_3, which indeed is what we wanted. So that is where the numbers in the matrix R come from. And that finishes this problem. I hope you have a better grasp of the QR decomposition now. Bye. See you next time.
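Since each column of A is a combination of the q's with coefficients from the matching column of R, every entry of R is a dot product: R[i][j] = q_i dot (column j of A). A small Python check of this claim for the problem's matrices (variable names are illustrative):

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# The orthonormal columns found above, and the original columns of A.
Q_cols = [[1, 0, 0], [0, 0, 1], [0, 1, 0]]   # q_1, q_2, q_3
A_cols = [[1, 0, 0], [2, 0, 3], [4, 5, 6]]   # a, b, c

# R[i][j] = q_i . (column j of A): the coefficient of q_i in that column.
R = [[dot(q, col) for col in A_cols] for q in Q_cols]
print(R)  # [[1, 2, 4], [0, 3, 6], [0, 0, 5]] -- upper triangular

# Rebuild each column of A as the combination sum_k R[k][j] * q_k.
for j, col in enumerate(A_cols):
    rebuilt = [sum(q[i] * R[k][j] for k, q in enumerate(Q_cols))
               for i in range(3)]
    assert rebuilt == col  # Q times R really does reproduce A
```

R comes out upper triangular precisely because Gram-Schmidt makes q_i orthogonal to the earlier columns, so q_i dot a_j = 0 whenever i > j.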
Info
Channel: MIT OpenCourseWare
Views: 14,871
Rating: 4.9692307 out of 5
Keywords: linear algebra, Gram-Schmidt orthogonalization
Id: HEQuN0QELSQ
Length: 10min 0sec (600 seconds)
Published: Wed Jul 25 2018