Nice explanation of eigenvectors and their corresponding eigenvalues; also the geometric picture of the spread of information from 2-D to 1-D!!
Thank you very much. It covers:
1. Column-standardized matrix
2. Covariance matrix
3. Eigenvectors
4. Eigenvalues
5. Orthogonal matrix of v1, v2, ..., vd
6. Projection
7. Projection at 100%, 75%, 60%, 50%
In general, for an arbitrary matrix, the eigenvectors are NOT always orthogonal. But for a special type of matrix, a symmetric matrix, the eigenvalues are always real and the corresponding eigenvectors are always orthogonal.
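To see this numerically, here is a minimal NumPy sketch (the matrix A below is just an assumed example, not from the video):

```python
import numpy as np

# A small symmetric matrix (covariance matrices are of this type).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh is intended for symmetric/Hermitian matrices: it returns real
# eigenvalues (in ascending order) and orthonormal eigenvectors.
eigenvalues, eigenvectors = np.linalg.eigh(A)

print(eigenvalues)                    # all real
print(eigenvectors.T @ eigenvectors)  # ~ identity => eigenvectors are orthogonal
```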
Thank you sir. Brilliant methods and explanations to help a near beginner trying to grasp the nuance of eigenvalues and eigenvectors as applied in PCA and unsupervised machine learning. I'm not there yet, but I sure feel like I can and will after listening to you. Who needs machines ... lol?
7:57 This is always true only if it is a symmetric matrix.
Applied AI Course
At 1:57 the covariance matrix is written as Cov of X: S = Transpose(X).X, but in the previous video we saw S = (1/n)(Transpose(X).X). Why this difference?
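For what it's worth, the 1/n factor only rescales the eigenvalues; the eigenvectors, and hence the PCA directions, come out the same either way. A minimal sketch, assuming a random, column-standardized data matrix X:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 3
X = rng.normal(size=(n, d))
X = (X - X.mean(axis=0)) / X.std(axis=0)  # column standardization

S1 = X.T @ X         # without the 1/n factor
S2 = (X.T @ X) / n   # with the 1/n factor (the usual covariance estimate)

w1, V1 = np.linalg.eigh(S1)
w2, V2 = np.linalg.eigh(S2)

print(np.allclose(w1 / n, w2))              # eigenvalues just scale by 1/n
print(np.allclose(np.abs(V1), np.abs(V2)))  # eigenvectors identical (up to sign)
```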
Sir, what is the purpose of using the covariance matrix?
The goal of PCA is to find projection directions along which the variance of the projected data points is maximum. If you write this goal in mathematical terms and simplify the resulting objective, you will find a matrix in the resulting eigenvalue equation. This matrix is nothing but the covariance matrix. Hence, the covariance matrix arises naturally from the goal/objective of PCA. Please go through the link:
www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/
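As a rough sketch of that simplification (notation assumed here: x_i are the column-standardized data points, u is the candidate direction, and S is the covariance matrix):

```latex
% Maximize the variance of the projections of the points x_i onto a unit vector u:
\max_{\|u\|=1}\ \frac{1}{n}\sum_{i=1}^{n}\bigl(u^{\top}x_i\bigr)^2
  \;=\; \max_{\|u\|=1}\ u^{\top}\Bigl(\frac{1}{n}\sum_{i=1}^{n} x_i x_i^{\top}\Bigr) u
  \;=\; \max_{\|u\|=1}\ u^{\top} S\, u
% The matrix that appears in the simplified objective is exactly the covariance matrix S.
```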
Sir, why can't we take v2 as our vector having maximum variance in a particular direction?
Because V1 has higher variance than V2. If you were to pick a single direction with maximum variance, you would pick V1 and not V2.
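A small NumPy illustration of this choice, using an assumed synthetic data set: the eigenvector of the covariance matrix with the largest eigenvalue is the maximum-variance direction (V1), and projecting onto it preserves more variance than projecting onto V2.

```python
import numpy as np

rng = np.random.default_rng(1)
# Assumed example: correlated 2-D data with one clearly dominant direction.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 0.5]])
X = X - X.mean(axis=0)            # center the data

S = (X.T @ X) / len(X)            # covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(S)  # eigenvalues in ascending order

v1 = eigenvectors[:, -1]  # largest eigenvalue -> maximum-variance direction (V1)
v2 = eigenvectors[:, -2]  # next eigenvector (V2), orthogonal to V1

print(np.var(X @ v1), np.var(X @ v2))  # variance along V1 exceeds variance along V2
```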
I didn't understand anything from this. Do I need to watch any prior videos related to PCA in order to understand this video's concepts?
Yes, these lectures are in a sequence. You need to first watch the videos on linear algebra and then watch PCA for a clear understanding. You can watch these videos in order on our website, as these are our sample/free videos. Please log in to AppliedAICourse.com and check out the sample videos, which are in a clean order starting from the basics of Python.
Can we get the answer for why u1 = v1?
Check the optimization videos, which are paid, on their website.
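For anyone who cannot access those videos, the standard argument is a short Lagrange-multiplier step (a sketch, using the same notation as the covariance-matrix reply above):

```latex
% Maximize u^T S u subject to u^T u = 1 via a Lagrange multiplier \lambda:
L(u,\lambda) = u^{\top} S u - \lambda\,(u^{\top}u - 1), \qquad
\nabla_u L = 2Su - 2\lambda u = 0 \;\Longrightarrow\; S u = \lambda u .
% Every stationary point is an eigenvector of S, and the variance attained there is
% u^{\top} S u = \lambda. It is largest for the biggest eigenvalue, so the optimal
% projection direction u_1 is the top eigenvector v_1 of the covariance matrix.
```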