This fills in so many gaps I had in my knowledge on PCA and SVD. Thank you
Very big pleasure! :)
One of the best explanations on PCA relationship with SVD!
The best explanation of PCA and how it relates to SVD and the eigenvalues and eigenvectors 🙌
Thank you very much for this video. I spent a lot of time on StackExchange trying to understand this relationship, but you explained it really well in a short video. Thank you so much.
Excellent explanation. I could join all the dots regarding my understanding! Thanks a lot :)
At last a good explanation... Thanks.
the best explanation I've heard on this subject!
A great video. I have learned a lot from your machine learning series. 😁
At 8:00 you say that "we can get all the eigenvectors and eigenvalues by just doing the SVD of X' * X (the sample covariance matrix)", but I think what you mean is simply doing the eigendecomposition of X' * X, or ultimately doing the SVD of X and taking the V values. Am I correct?
I assume so too?! I was confused a bit, but this is what I've got now:
X = Design matrix with the variables in the columns and the observations in the rows; the data must be centered and normalized: X(ij) = (X(ij) - avg(j))/sqrt(n-1), where n is the number of observations
A = Covariance matrix of X
D = Diagonal matrix containing the eigenvalues of A
S = Diagonal matrix containing the singular values of X
X = USV^T SVD of the design matrix
A = VDV^T Eigendecomposition of the covariance matrix
Now: X^T X = A = (V S^T U^T)(U S V^T) = V S^T S V^T = V D V^T, since U is orthogonal (U^T U = I) and S is diagonal, so S^T S = S^2 = D
So the square roots of the eigenvalues of the covariance matrix are the same as the singular values of the design matrix.
Or, the other way around: the squares of the singular values are the same as the eigenvalues of the covariance matrix.
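A quick numerical check of that identity (just a sketch with made-up toy data; the variable names are my own, not from the video):

import numpy as np

rng = np.random.default_rng(0)
X_raw = rng.normal(size=(100, 3))            # 100 observations, 3 variables
n = X_raw.shape[0]

# Center each column and scale by sqrt(n - 1) so that X^T X is the
# sample covariance matrix A, matching the setup above.
X = (X_raw - X_raw.mean(axis=0)) / np.sqrt(n - 1)
A = X.T @ X

# SVD of the design matrix vs. eigendecomposition of the covariance matrix
U, s, Vt = np.linalg.svd(X, full_matrices=False)
eigvals, eigvecs = np.linalg.eigh(A)                 # returned in ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # sort descending to match s

print(np.allclose(s**2, eigvals))                    # True: sigma^2 = lambda
print(np.allclose(np.abs(Vt), np.abs(eigvecs.T)))    # True: same directions, up to sign
print(np.allclose(A, np.cov(X_raw, rowvar=False)))   # sanity check against np.cov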
Oh man, your channel is amazing. I am from Brazil and I liked your explanation of this content. Congratulations!
Thanks a ton for the encouragement!! :D
Excellent video. I always had some doubt about why we take the covariance matrix when computing the eigendecomposition. Thank you; a lot of my doubts were clarified by this video.
Huge pleasure! :) I also struggled with this stuff the first (many) times I looked at it.
Great video! This clears up my doubts.
Really nice explanation, thanks for this! :)
Brilliant. You forgot to center the data, but that's a trivial oversight.
Nice video! Note that X has to be centered (I don't know the definition of a design matrix, though) so that we have the correct covariance matrix.
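A small sketch of why that centering matters (toy data and names are illustrative assumptions, not from the video): without subtracting the column means, X^T X / (n-1) is no longer the sample covariance matrix, so the SVD directions stop being the PCA axes.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2)) + np.array([5.0, -3.0])        # data with a nonzero mean

cov_uncentered = X.T @ X / (X.shape[0] - 1)                  # forgets to remove the mean
X_centered = X - X.mean(axis=0)
cov_centered = X_centered.T @ X_centered / (X.shape[0] - 1)

print(np.allclose(cov_centered, np.cov(X, rowvar=False)))    # True
print(np.allclose(cov_uncentered, np.cov(X, rowvar=False)))  # False: the mean leaks in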
The best explanation!
Cool video, thanks.
Thanks dude, it was a clear explanation.
Thanks! Great video.
Amazing video !
Thanks Prachi!! :)
awesomeee
Awesomeeeeeeee