So helpful! Thank you so much!
Good explanation 👊🏾
Very helpful!!! Thank you
I'm confused about how to apply this when W is spanned by more than one vector
I think if you use Gram-Schmidt that's how you solve it. Not sure though
I was wondering if the second method would work for a subspace of more than one dimension. How would we use the second method if the subspace were 2-dimensional in R^3?
It does, that is exactly what my homework assignment is asking about.
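It does generalize. For a 2-dimensional subspace W of R^3 spanned by two vectors, one standard approach is to put the spanning vectors as columns of a matrix A and use P = A(A^T A)^{-1} A^T. A minimal NumPy sketch, with made-up spanning vectors for illustration:

```python
import numpy as np

# Hypothetical example: W = span{v1, v2}, a 2-dimensional subspace of R^3.
# These spanning vectors are made up for illustration.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])

# Stack the spanning vectors as columns of A; then the standard matrix of
# the orthogonal projection onto W is P = A (A^T A)^{-1} A^T.
A = np.column_stack([v1, v2])
P = A @ np.linalg.inv(A.T @ A) @ A.T

# Sanity checks: P is symmetric, idempotent (P^2 = P),
# and it leaves vectors already in W unchanged.
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)
assert np.allclose(P @ v1, v1)
```

If you first run Gram-Schmidt to get an orthonormal basis, A^T A becomes the identity and the formula collapses to P = A A^T, i.e. a sum of outer products b_i b_i^T.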
Another simple method is to recognize that the orthogonal projection matrix, call it P, is symmetric. By the spectral theorem, every symmetric matrix is orthogonally diagonalizable, and you already have all the pieces of that diagonalization: the eigenvalues are (1, 0, 0), and the eigenvector for eigenvalue 1 is the given vector v_1. Don't forget to normalize it; call the normalized vector b_1. The other eigenvectors aren't needed, because in the spectral decomposition the terms corresponding to eigenvalue 0 vanish anyway.
P = 1(b_1 b_1^T), and that's it.
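The spectral shortcut above fits in a few lines of NumPy. The vector v_1 here is made up for illustration:

```python
import numpy as np

# Minimal sketch of the spectral-decomposition shortcut described above.
# v1 is a made-up vector for illustration: W = span{v1} in R^3.
v1 = np.array([1.0, 2.0, 2.0])

# Normalize the eigenvector for eigenvalue 1.
b1 = v1 / np.linalg.norm(v1)

# Spectral decomposition P = sum_i lambda_i (b_i b_i^T); the terms for
# eigenvalue 0 vanish, leaving P = 1 * (b1 b1^T).
P = np.outer(b1, b1)

# P is symmetric with eigenvalues {1, 0, 0}, and fixes vectors in W.
assert np.allclose(P, P.T)
assert np.allclose(np.sort(np.linalg.eigvalsh(P)), [0.0, 0.0, 1.0])
assert np.allclose(P @ v1, v1)
```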
Does this work in R^2 for projecting onto a line?
yes!
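A minimal sketch of projecting onto a line in R^2, with a made-up direction vector:

```python
import numpy as np

# Projecting onto a line through the origin in R^2 works the same way.
# The direction vector v is made up for illustration.
v = np.array([3.0, 4.0])

# For a line with direction v, P = (v v^T) / (v . v);
# dividing by v.v means we never have to normalize v separately.
P = np.outer(v, v) / (v @ v)

assert np.allclose(P @ v, v)      # points on the line are fixed
w = np.array([-4.0, 3.0])         # a vector orthogonal to v
assert np.allclose(P @ w, 0.0)    # orthogonal vectors project to zero
```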
Thank you!
Thanks!
🖐🤚