PCA 6 - Relationship to SVD

  • Published: 19 Dec 2024

Comments • 27

  • @shuddR
    @shuddR 3 years ago +17

    This fills in so many gaps I had in my knowledge on PCA and SVD. Thank you

    • @kamperh
      @kamperh  3 years ago +2

      Very big pleasure! :)

  • @ozysjahputera7669
    @ozysjahputera7669 5 months ago

    One of the best explanations on PCA relationship with SVD!

  • @RaviChoudhary_iitkgp
    @RaviChoudhary_iitkgp 1 year ago

    the best explanation of PCA and how it relates to SVD and the eigenvalues and eigenvectors 🙌

  • @RaksohOap
    @RaksohOap 2 years ago +1

    Thank you very much for this video. I spent a lot of time on StackExchange trying to understand this relationship, but you explain it really well in a short video. Thank you so much.

  • @atrayeedasgupta2872
    @atrayeedasgupta2872 1 year ago

    Excellent explanation. I could join all the dots regarding my understanding! Thanks a lot :)

  • @edgar_benitez
    @edgar_benitez 3 years ago +3

    At last a good explanation... Thanks.

  • @johnjunhyukjung
    @johnjunhyukjung 1 year ago

    the best explanation I've heard on this subject!

  • @m.preacher2829
    @m.preacher2829 1 year ago

    a great video. I have learned a lot from your machine learning series. 😁

  • @xerocool2109
    @xerocool2109 1 year ago +2

    At 8:00 you say that "we can get all the eigenvectors and eigenvalues by just doing the SVD of X' * X (sample covariance matrix)", but I think what you mean is by simply doing the eigendecomposition of X' * X, or ultimately the SVD of X and taking the V values, am I correct?

    • @halihammer
      @halihammer 1 year ago

      I assume so too! I was confused a bit, but this is what I have now:
      X = design matrix with the variables in the columns and the observations in the rows; the data must be centred and scaled: X_ij = (X_ij - avg_j)/sqrt(n-1), where n is the number of observations (so that X^T X is the covariance matrix)
      A = covariance matrix of X
      D = diagonal matrix containing the eigenvalues of A
      S = diagonal matrix containing the singular values of X
      X = USV^T, the SVD of the design matrix
      A = VDV^T, the eigendecomposition of the covariance matrix
      Now: X^T X = A = (VS^T U^T)(USV^T) = VS^T SV^T = VDV^T, since S is diagonal (S^T S = S^2 = D) and U is orthogonal (U^T U = I)
      So the square roots of the eigenvalues of the covariance matrix are the same as the singular values of the design matrix
      Or the other way around: the squares of the singular values are the same as the eigenvalues of the covariance matrix
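
      The identity derived above can be checked numerically. A minimal NumPy sketch (the matrix size and random data are illustrative assumptions; this uses the unnormalised scatter matrix X^T X, so the same identity holds up to the 1/(n-1) factor):

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 5))
      X = X - X.mean(axis=0)            # centre the design matrix

      # SVD of the centred design matrix: X = U S V^T
      U, S, Vt = np.linalg.svd(X, full_matrices=False)

      # Eigendecomposition of the scatter matrix A = X^T X = V D V^T
      D, V = np.linalg.eigh(X.T @ X)
      D = D[::-1]                       # eigh returns eigenvalues in ascending order

      # Squared singular values of X equal the eigenvalues of X^T X
      assert np.allclose(S**2, D)
      ```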

  • @librarymy-n7b
    @librarymy-n7b 8 months ago

    Oh man, your channel is amazing. I am from Brazil and I liked your explanation of this content.. congratulations.

    • @kamperh
      @kamperh  8 months ago

      Thanks a ton for the encouragement!! :D

  • @mvijayvenkatesh
    @mvijayvenkatesh 3 years ago +4

    Excellent video.. always had some doubt about why we take the covariance matrix when computing the eigendecomposition. Thank you; a lot of my doubts were clarified by this video.

    • @kamperh
      @kamperh  3 years ago +1

      Huge pleasure! :) I also struggled with this stuff the first (many) times I looked at it.

  • @davidzhang4825
    @davidzhang4825 2 years ago

    Great video! This clears up my doubts

  • @janlukasr.2141
    @janlukasr.2141 3 years ago +2

    Really nice explanation, thanks for this! :)

  • @Trubripes
    @Trubripes 8 months ago +1

    Brilliant. You forgot to center the data, but that's a trivial oversight.

  • @zenchiassassin283
    @zenchiassassin283 1 year ago

    Nice video! Note that X has to be centered (I don't know the definition of a design matrix, though x) ) so that we have the correct covariance matrix

  • @ex-pwian1190
    @ex-pwian1190 5 months ago

    The best explanation!

  • @toprakbilici9030
    @toprakbilici9030 1 month ago

    Cool video, thanks.

  • @bilalbayrakdar7100
    @bilalbayrakdar7100 2 years ago

    thx dude, it was a clear explanation

  • @kobi981
    @kobi981 6 months ago

    Thanks! Great video

  • @prachimehta9839
    @prachimehta9839 2 years ago

    Amazing video!

    • @kamperh
      @kamperh  2 years ago +1

      Thanks Prachi!! :)

  • @RitikaSarkar-vn1mr
    @RitikaSarkar-vn1mr 8 months ago

    awesomeee

  • @RitikaSarkar-vn1mr
    @RitikaSarkar-vn1mr 8 months ago

    Awesomeeeeeeee