Audio channels fixed!
Thanks, my left ear is finally able to learn.
thank you!
Thank you so much Professor Strang and MITTTTT, my best summer friendddd! yahoo! Linear Algebra!
It's Christmas and I'm reviewing Linear Algebra.
@@wilhellmllw3608 Haha. Team over achievers!
It's the middle of the Corona Crisis and I'm reviewing Linear Algebra. Nothing is more important than making progress!
The biggest of sacrifices require the strongest of wills - Thanos
Labroidas 😂 ty 4 that
It's summer vacation, I'm learning linear algebra while others are smoking weed.
At 42:15, how would I find the projection matrix, if the inverse of A^T A doesn't exist, such as in this example?
In this question Mr. Strang wrote the general formula for the projection onto the column space of a matrix (here a rectangular matrix; for a square invertible matrix this formula would always give the identity). The question asks to project onto the column space, so you just take A to be the 3x2 matrix with columns (0,1,0) and (1,0,2). Then A^T A is a 2x2 matrix with rank 2, which is invertible.
@Chiao Chao The confusing part is the inverse…
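Not from the lecture, just a quick numpy sketch of the reply above, to check that the inverse exists and that the formula really gives a projection (the columns (0,1,0) and (1,0,2) are taken from that reply):

```python
import numpy as np

# A has the two independent columns (0,1,0) and (1,0,2) from the reply above.
A = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [0.0, 2.0]])

AtA = A.T @ A                       # 2x2 with rank 2, hence invertible
P = A @ np.linalg.inv(AtA) @ A.T    # projection onto the column space of A

assert np.allclose(P @ P, P)           # P^2 = P
assert np.allclose(P, P.T)             # P is symmetric
assert not np.allclose(P, np.eye(3))   # rectangular A, so P is not the identity
```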
30:55 Since the cofactor here happens to be 0, it would be better to use a general tridiagonal matrix (n by n, or at least 5 by 5) to show that the cofactor is D_(n-2).
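That recurrence is easy to check numerically. A minimal sketch, assuming the all-ones tridiagonal matrix from the lecture, where the cofactor expansion gives D_n = D_(n-1) - D_(n-2):

```python
import numpy as np

def tridiag_ones(n):
    # n x n matrix with 1s on the main, sub- and superdiagonals, 0 elsewhere
    return np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

D = [round(np.linalg.det(tridiag_ones(n))) for n in range(1, 9)]
print(D)  # [1, 0, -1, -1, 0, 1, 1, 0] -- the pattern repeats with period 6

# Cofactor expansion along the first row gives D_n = D_(n-1) - D_(n-2)
for n in range(3, 9):
    assert D[n - 1] == D[n - 2] - D[n - 3]
```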
31:54 Could someone please explain why both of them aren't minus? (which would make the outcome plus, not minus)
I have the same doubt
Both entries are in negative positions when you look at where they sit in the big matrix. However, when calculating the cofactor, the sign comes from the position in the matrix currently being expanded: the second one is in position a11 of the 3x3 matrix whose determinant we are calculating, so it gets a plus.
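A small recursive sketch (not from the lecture) that makes this sign rule concrete: at every level of the expansion the sign is (-1)^(i+j) relative to the submatrix currently being expanded, not the original big matrix:

```python
import numpy as np

def det_by_cofactors(A):
    """Determinant by cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # The sign (-1)**j is relative to the CURRENT matrix: an entry that
        # sat in a "minus" spot of the big matrix can still get a plus once
        # we are inside a smaller submatrix.
        minor = np.delete(A[1:], j, axis=1)  # strike first row and column j
        total += (-1) ** j * A[0, j] * det_by_cofactors(minor)
    return total

A = np.random.rand(4, 4)
assert np.isclose(det_by_cofactors(A), np.linalg.det(A))
```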
Thank you for sharing this invaluable content with us humans.
42:48 The matrix A^T A is not invertible here, so what about the projection matrix P?
@@weilinfu5736 tq😊
@@weilinfu5736 Is your argument based on the fact that linear combinations of the matrix's row vectors are the same as linear combinations of the rows of the compressed matrix? I mean, they both span the same space, hence there would be no loss of properties.
@@weilinfu5736 hey there, thanks from india
I was actually studying these topics for ML, to get a deep grasp of ML algorithms.
Can anyone explain how projection matrices and fitting lines automatically minimize the least-squares error?
Sure. First note that the least squares solution is the vector x which minimizes the distance between Ax and b. We know that the closest vector to b in the column space of A is exactly the projection of b onto the column space of A. This projection is given by A(A^TA)^-1A^Tb so Ax = A(A^TA)^-1A^Tb. Multiplying both sides by A^T on the left yields A^TAx = A^TA(A^TA)^-1A^Tb = A^Tb which is the equation you see him write. This will always yield a least squares solution even if the matrix A^TA is not itself invertible.
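A short numpy sketch of that argument, with illustrative data points (1,1), (2,2), (3,2): the normal equations and the library's least-squares routine agree, and the error is orthogonal to the column space:

```python
import numpy as np

# Fit a line C + Dt through three illustrative points (1,1), (2,2), (3,2).
t = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])
A = np.column_stack([np.ones_like(t), t])   # columns: all-ones and t

x_normal = np.linalg.solve(A.T @ A, A.T @ b)      # solve A^T A x = A^T b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)   # library least squares
assert np.allclose(x_normal, x_lstsq)

# Ax is the projection of b onto the column space of A, so the error
# e = b - Ax is orthogonal to every column of A:
e = b - A @ x_normal
assert np.allclose(A.T @ e, 0)
```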
15:02 I couldn't find c for the projection matrix.
I think it's done as
C^k = S Λ^k S^(-1)
Find Uk...
Easy, it's in Europe!
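Whatever c was meant at 15:02, the power formula in the reply above is easy to verify. A minimal sketch with a made-up diagonalizable 2x2 matrix (not the one from the lecture):

```python
import numpy as np

C = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # made-up symmetric (hence diagonalizable) matrix
k = 5

lam, S = np.linalg.eig(C)    # eigenvalues and eigenvector matrix S
Ck = S @ np.diag(lam ** k) @ np.linalg.inv(S)   # C^k = S Λ^k S^(-1)
assert np.allclose(Ck, np.linalg.matrix_power(C, k))
```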
Dr. Strang, from looking at your past quizzes at MIT, they seem very challenging to me. There are many twists and turns in your quizzes. I hope that the MIT students are learning linear algebra that stays with them forever. These students are supposed to be among the world's brightest.
I think the explanation of the first question was a little bit wrong, it seems, because he wrote the equation to diagonalize the matrix P even though it does not have 3 independent eigenvectors.
It has. P is symmetric, and every symmetric matrix has a full set of independent (in fact orthonormal) eigenvectors.
For lambda = 0, 0 we have x1 = …, x2 = …, and for lambda = 1 we have x3 = …
Something like (-1)^(n/2) * ((n-1)!!)^2 for the determinant of the even ones?
I'd like to have a grandpa like him.
from 44:51 LOL
never fails to crack me up randomly while I'm studying haha
@@berrycoolcat Yeah