Visual Explanation of Principal Component Analysis, Covariance, SVD

  • Published: 30 Sep 2024
  • Linearity I, Olin College of Engineering, Spring 2018
    I will touch on eigenvalues, eigenvectors, covariance, variance, covariance matrices, the principal component analysis process and its interpretation, and singular value decomposition.

Comments • 87

  • @zacmac
    @zacmac 3 years ago +20

    Great clarity. You clearly understand your stuff at a deep level, so it's easy to teach.

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago

    Wow, that was quite a good explanation.

  • @ramielkady938
    @ramielkady938 9 months ago +1

    PS: The video is targeted at people who already have a deep knowledge of what it is trying to explain.

  • @riazali-vi8tu
    @riazali-vi8tu 4 years ago +12

    Well explained, you should do more videos

  • @MathematicsMadeSimple1
    @MathematicsMadeSimple1 4 years ago +8

    Clear explanation. Thank you for shedding more light, especially on the application of eigenvalues and eigenvectors.

  • @patyyyou
    @patyyyou 4 years ago +4

    Nicely done. It hit the right level for someone who understands the linear algebra behind Eigenvectors and Eigenvalues but still needed to make the leap of connecting a dot or two in the application of PCA to a problem. Again, thank you!

  • @anuraratnasiri5516
    @anuraratnasiri5516 4 years ago +4

    Beautifully explained! Thank you so much!

  • @Darkev77
    @Darkev77 3 years ago +2

    I do understand that eigenvalues represent the factor by which the eigenvectors are scaled, but how do they signify “the importance of certain behaviors in a system”, what other information do eigenvalues tell us other than a scaling factor? Also, why do eigenvectors point towards the spread of data?

    • @malstroemphi1096
      @malstroemphi1096 1 year ago

      If you consider a raw matrix or just geometric examples, eigenvalues are indeed just scaling factors, and you cannot say much more. But here we have additional context: we are doing statistics and putting data into a covariance matrix, so we can add more interpretation. The eigenvector is not just some eigenvector of some matrix; it is the eigenvector of a *covariance matrix*, whose elements measure all the possible spreads of the data. That is why we can now say an eigenvector points toward the spread of the data, and its eigenvalue relates to the importance of that spread.
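      A minimal numpy sketch of the idea in this reply (synthetic data, illustrative only): the top eigenvector of the covariance matrix lines up with the direction along which the points are most spread out.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic 2D data, stretched along the 45-degree direction.
      t = rng.normal(size=500)
      data = np.column_stack([t + 0.1 * rng.normal(size=500),
                              t + 0.1 * rng.normal(size=500)])

      # Covariance matrix of the data (np.cov centers it internally).
      cov = np.cov(data, rowvar=False)

      # eigh handles symmetric matrices; eigenvalues come back ascending.
      eigvals, eigvecs = np.linalg.eigh(cov)

      print(eigvecs[:, -1])  # top eigenvector: up to sign, roughly [0.707, 0.707]
      print(eigvals)         # top eigenvalue dominates: that's where the spread is
      ```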

  • @eturkoz
    @eturkoz 5 years ago +4

    Your explanations are awesome! Thank you!

  • @szilike_10
    @szilike_10 3 years ago +1

    Believe it or not, I've been wondering a lot about the concept of covariance, because every video seems to miss the reason behind the idea. But I think I kind of figured it out today before watching this video, and I drew the exact same thing that is in the thumbnail. So I guess I was thinking correctly :))

  • @apesnajnin
    @apesnajnin 4 years ago +3

    Really amazing lecture! It made the concepts of eigenvalues and eigenvectors clear to me. Thanks a lot!

  • @spyhunter0066
    @spyhunter0066 2 years ago

    Around 1:36 you said "we divide by n for covariance", but we divide by n-1 instead. Please do check on that. Thanks for the video. Maybe I should say the estimated covariance has the n-1 division.
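    For reference, a quick numpy sketch of the two conventions (made-up numbers): np.cov uses the n-1 (sample) divisor by default and exposes the n (population) divisor via ddof.

    ```python
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.0, 4.0, 5.0, 8.0])

    n = len(x)
    xc, yc = x - x.mean(), y - y.mean()

    cov_pop = (xc @ yc) / n           # population covariance: divide by n
    cov_sample = (xc @ yc) / (n - 1)  # sample (estimated) covariance: divide by n-1

    # np.cov defaults to the n-1 divisor; ddof=0 switches to the n divisor.
    assert np.isclose(np.cov(x, y)[0, 1], cov_sample)
    assert np.isclose(np.cov(x, y, ddof=0)[0, 1], cov_pop)
    ```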

  • @TheSyntaxerror1
    @TheSyntaxerror1 5 years ago +4

    Love this video, great work!

  • @roshinroy5129
    @roshinroy5129 2 years ago +2

    Awesome explanation!! Nobody did it better!

  • @bottom297
    @bottom297 1 month ago

    Extremely helpful. Thank you!

  • @softpeachhy8967
    @softpeachhy8967 3 years ago +1

    1:37 shouldn’t the covariance be divided by (n-1)?

  • @danielheckel2755
    @danielheckel2755 5 years ago +2

    Nice visual explanation of covariance!

  • @bootyhole
    @bootyhole 8 months ago +1

    Excellent video, thank you!

  • @davestaggers2981
    @davestaggers2981 2 years ago +1

    Graphical interpretation of covariance is very intuitive and useful for me. Thank you.

  • @nickweimer6126
    @nickweimer6126 5 years ago +2

    Great job explaining this

  • @vietdaoquoc7629
    @vietdaoquoc7629 10 months ago

    thank you for this amazing video

  • @simonala7090
    @simonala7090 8 months ago

    Would love to request an in-person version

  • @Muuip
    @Muuip 2 years ago +1

    Great concise presentation, much appreciated! 👍

  • @skshahid5565
    @skshahid5565 1 year ago

    Why did you stop making videos?

  • @123arskas
    @123arskas 2 years ago

    Thank you. It was beautiful

  • @EdeYOlorDSZs
    @EdeYOlorDSZs 3 years ago

    poggers explanation, thank you

  • @ivandda00
    @ivandda00 3 months ago

    ty

  • @Timbochop
    @Timbochop 2 years ago

    Good job, no wasted time

  • @Pedritox0953
    @Pedritox0953 3 years ago

    Good explanation

  • @Agastya007
    @Agastya007 3 years ago

    Plz do more videos

  • @arjunbemarkar7414
    @arjunbemarkar7414 4 years ago +1

    How do u find eigenvalues and eigenvectors from the covariance matrix?

    • @Eta_Carinae__
      @Eta_Carinae__ 4 years ago +1

      Same as usual, right? Find λ using det(Σ − λI) = 0: subtract λ from the main diagonal of the covariance matrix, take the determinant, and you're left with a polynomial in λ, which you then solve; each root is an eigenvalue.
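      A sketch of that recipe for the 2x2 case (made-up matrix, assuming numpy): the characteristic polynomial is λ² − tr(Σ)·λ + det(Σ), and its roots match what np.linalg.eigh returns.

      ```python
      import numpy as np

      sigma = np.array([[2.0, 0.8],
                        [0.8, 0.6]])      # made-up 2x2 covariance matrix

      # det(Sigma - lambda*I) = lambda^2 - trace*lambda + det = 0
      tr, det = np.trace(sigma), np.linalg.det(sigma)
      roots = np.roots([1.0, -tr, det])   # eigenvalues via the polynomial

      eigvals, _ = np.linalg.eigh(sigma)  # same values from the library routine
      print(np.sort(roots), eigvals)
      ```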

  • @sakkariyaibrahim2650
    @sakkariyaibrahim2650 2 years ago

    Good lecture

  • @Trubripes
    @Trubripes 6 months ago

    Thanks for concisely explaining that PCA is just SVD on the covariance matrix.
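    One way to see this (a sketch, synthetic data): a covariance matrix is symmetric positive semi-definite, so its SVD coincides with its eigendecomposition up to ordering and sign, and either routine yields the principal directions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.normal(size=(200, 3))
    cov = np.cov(data, rowvar=False)

    u, s, vt = np.linalg.svd(cov)           # singular values, descending
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues, ascending

    # For symmetric PSD matrices the two factorizations agree.
    assert np.allclose(s, eigvals[::-1])
    ```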

  • @latanezimbardo7129
    @latanezimbardo7129 3 years ago

    1:28 I personally visualise covariance like this. I always thought I was wrong, since I have never seen others doing this. How come??

  • @TechLord79
    @TechLord79 5 years ago +1

    Very well done!

  • @m.y.s4260
    @m.y.s4260 5 years ago +1

    awesome explanation! thx!

  • @prof.laurenzwiskott
    @prof.laurenzwiskott 2 years ago

    Very nice video. I plan to use it for my teaching. What puzzles me a bit is that the PCs you give as an example are not orthogonal to each other.
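    For what it's worth, eigenvectors of a symmetric matrix (and every covariance matrix is symmetric) are mutually orthogonal, so exact PCs must be; a quick check with a made-up matrix:

    ```python
    import numpy as np

    sigma = np.array([[2.0, 0.8],
                      [0.8, 0.6]])      # made-up symmetric covariance matrix
    _, eigvecs = np.linalg.eigh(sigma)

    # Columns are orthonormal: V^T V = I, so PC1 · PC2 = 0.
    assert np.allclose(eigvecs.T @ eigvecs, np.eye(2))
    ```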

  • @saDikus1
    @saDikus1 2 years ago

    Great video! Can anyone tell me how she decided that PC1 is spine length and PC2 is body mass? Should we guess (hypothesize) this in real-world scenarios?

  • @f0xn0v4
    @f0xn0v4 1 year ago

    I have always dreaded statistics, but this video made these concepts so simple while connecting them to linear algebra. Thank you so much ❤

  • @liviyabags
    @liviyabags 4 years ago

    I LOVE YOU !!!!! What an explanation... thank you so much

  • @tusharkush7
    @tusharkush7 4 years ago +1

    This video needs a golden buzzer.

  • @blackshadowofmysoul
    @blackshadowofmysoul 3 years ago

    Best PCA Visual Explanation! Thank You!!!

  • @sebgrootus
    @sebgrootus 7 months ago

    Incredible video. Genuinely exactly what I needed.

  • @matato2932
    @matato2932 1 year ago

    thank you for this amazing and simple explanation

  • @Lapelu9
    @Lapelu9 1 year ago

    I thought PCA was a hard concept. Your video is so great!

  • @zendanmoko5005
    @zendanmoko5005 2 years ago

    Thank you! Very nice video, well explained!

  • @1291jes
    @1291jes 4 years ago

    This is excellent, Emma... I will subscribe to your videos!

  • @jordigomeztorreguitart
    @jordigomeztorreguitart 3 years ago

    Great explanation. Thank you.

  • @VivekTR
    @VivekTR 3 years ago

    Hello Emma, Great job! Very nicely explained.

  • @skewbinge6157
    @skewbinge6157 3 years ago

    thanks for this simple yet very clear explanation

  • @getmotivated3619
    @getmotivated3619 5 years ago

    You are awesome... you make a mediocre out of a know-nothing.

  • @Agastya007
    @Agastya007 3 years ago

    I love the way you spelled "data" at [3:34]😁😁

  • @basavg1
    @basavg1 2 years ago

    Very nice... please keep posting

  • @thryce82
    @thryce82 4 years ago

    Nice job, I was always kinda confused by this.

  • @abdulrahmanmohamed8800
    @abdulrahmanmohamed8800 4 years ago

    A very good explanation.

  • @subinnair3835
    @subinnair3835 5 years ago +1

    Dear ma'am,
    How did you obtain the matrix at 5:30?

    • @emfreedman3905
      @emfreedman3905  5 years ago +2

      Find the covariance matrix of these variables, as at 2:15, and find its eigendecomposition (its two dominant eigenvectors). The matrix at 5:30 consists of those two dominant eigenvectors; each column is an eigenvector. (See the sketch after this thread.)

    • @subinnair3835
      @subinnair3835 5 years ago +1

      Emma Freedman, thank you!
      The video's explanation was great and covered all the fundamentals required to fully understand PCA!! 😃
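      A minimal sketch of the step described in the reply above (synthetic 4-feature data standing in for the video's variables): stack the two dominant eigenvectors as columns and project the centered data onto them.

      ```python
      import numpy as np

      rng = np.random.default_rng(2)
      data = rng.normal(size=(100, 4))        # stand-in for the video's dataset

      centered = data - data.mean(axis=0)
      cov = np.cov(centered, rowvar=False)

      eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues ascending
      W = eigvecs[:, [-1, -2]]                # two dominant eigenvectors as columns

      scores = centered @ W                   # row i: (PC1 score, PC2 score) of point i
      print(W.shape, scores.shape)            # (4, 2) (100, 2)
      ```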

  • @tusharpandey6584
    @tusharpandey6584 4 years ago

    awesome explanation! make more vids pls

  • @vitokonte
    @vitokonte 4 years ago

    Very nice explanation!

  • @stephenaloia6695
    @stephenaloia6695 3 years ago

    Thank you, Ma'am!

  • @tractatusviii7465
    @tractatusviii7465 4 years ago

    investigate hedge/hogs

  • @crispinfoli9448
    @crispinfoli9448 4 years ago

    Great video, thank you!

  • @DanielDa2
    @DanielDa2 3 years ago

    great explanation

  • @KimJennie-fl3sg
    @KimJennie-fl3sg 4 years ago

    I just love the voice🙄😸

  • @nbr2737
    @nbr2737 3 years ago

    beautiful, thanks a lot!

  • @Matt-bq9fi
    @Matt-bq9fi 3 years ago

    Great explanation!

  • @siliencea9362
    @siliencea9362 4 years ago

    thank you so much!! :)

  • @haroldsu
    @haroldsu 4 years ago

    Thank you for this great lecture.

  • @mahmoudreda1083
    @mahmoudreda1083 4 years ago

    thanks A LOT

  • @DoFlamingo_1P
    @DoFlamingo_1P 4 years ago

    AWESOMEEE 🤘🤘🤘

  • @adriantorresnunez
    @adriantorresnunez 5 years ago +2

    Best explanation I have heard of PCA. Thank you

  • @astiksachan8135
    @astiksachan8135 4 years ago

    5:12 was very good

  • @콘충이
    @콘충이 4 years ago

    Awesome!

  • @SpeakerSparkTalks
    @SpeakerSparkTalks 5 years ago

    nicely explained

  • @checkout8352
    @checkout8352 5 years ago

    Tnks

  • @raghav5520
    @raghav5520 5 years ago

    Well explained

  • @astiksachan8135
    @astiksachan8135 4 years ago

    4:35

  • @AEARArg
    @AEARArg 3 years ago

    Congratulations Emma, your work is excellent!

  • @2894031
    @2894031 3 years ago

    Babe, var(x,x) makes no sense. Either you say var(x) or cov(x,x).

  • @ABC-hi3fy
    @ABC-hi3fy 3 years ago

    No one explains why they use the covariance matrix. Why not use the actual data and find its eigenvectors/eigenvalues? I have been watching hundreds of videos and reading books, and no one explains that. It just doesn't make sense to me to use the covariance matrix. Covariance is a very useless parameter; it doesn't tell you much at all.

    • @malstroemphi1096
      @malstroemphi1096 1 year ago

      No, it does, especially with PCA. But you are right, you need actual data. Say the data are the 3D points of some 3D object: if you use this technique (build a covariance matrix from the 3D points and do the PCA of it), you will find a vector aligned with the overall direction of the shape; for instance, the main axis of a 3D cylinder. This is quite useful information.
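      A sketch of that cylinder example (synthetic points, assuming numpy): the dominant eigenvector of the point cloud's covariance recovers the cylinder's axis.

      ```python
      import numpy as np

      rng = np.random.default_rng(3)

      # Points on a cylinder of radius 1 and height 10 whose axis is the z-axis.
      theta = rng.uniform(0, 2 * np.pi, 1000)
      z = rng.uniform(-5, 5, 1000)
      points = np.column_stack([np.cos(theta), np.sin(theta), z])

      cov = np.cov(points, rowvar=False)
      eigvals, eigvecs = np.linalg.eigh(cov)

      axis = eigvecs[:, -1]   # dominant eigenvector: approximately [0, 0, ±1]
      print(axis)
      ```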