Principal Component Analysis (The Math) : Data Science Concepts

  • Published: 22 Sep 2019
  • Let's explore the math behind principal component analysis!
    ---
    Like, Subscribe, and Hit that Bell to get all the latest videos from ritvikmath ~
    ---
    Check out my Medium:
    / ritvikmathematics

Comments • 168

  • @loveena419
    @loveena419 3 years ago +14

    Finally, a video that explains the math behind PCA so clearly. Went through all the other videos and it helped a lot! Thank you!

  • @vinceb8041
    @vinceb8041 3 years ago +5

    I've been wrestling with getting all the intuitive and computational components of PCA together for a while, and seeing it all come together here helps tremendously! Great as always, 10/10 video :)

  • @qaarloshilaal2778
    @qaarloshilaal2778 3 years ago +21

    Thanks infinitely for all your videos, you're literally the best at explaining these concepts in a clear and excellent way so we can continue with what we have to study/do! Huge respect, man.

  • @kaeruuuu_
    @kaeruuuu_ 2 years ago +4

    Necessary videos:
    1. ruclips.net/video/X78tLBY3BMk/видео.html (Vector Projections)
    2. ruclips.net/video/glaiP222JWA/видео.html (Eigenvalues & Eigenvectors)
    3. ruclips.net/video/6oZT72-nnyI/видео.html (Lagrange Multipliers)
    4. ruclips.net/video/e73033jZTCI/видео.html (Derivative of a Matrix)
    5. ruclips.net/video/152tSYtiQbw/видео.html (Covariance Matrix)

  • @pigtowndanzee
    @pigtowndanzee 4 years ago +22

    Love your teaching style. Keep these videos coming!

  • @paulbrown5839
    @paulbrown5839 3 years ago +1

    This is a very strong video. It requires proper study. I hope you do more of this great stuff. Thank You!

  • @bhajman123
    @bhajman123 3 years ago

    By far the most accessible description of PCA... finally able to clearly connect the covariance matrix and the eigenvalues to variance maximization

  • @Mokielove
    @Mokielove 4 months ago

    Thank God I found your channel. I am studying for a master's degree in computer science at a prestigious university, which costs me a lot of money, but your channel is very useful for digging deeper and understanding many things. Keep up the good work!

  • @BleachWizz
    @BleachWizz 3 years ago

    I'm loving your content; you're showing a part of math that is not usually shown: the part where you actually use it, where you make your choices and why you choose them. It's nice to understand the equations and why they give you a 0 at the sweet spot, but it's also nice to be reminded that it not only works but was built to work with that intention.
    So in the end you still need to figure out how to get your problem to fit into one of these, and what you can choose in these big generic operations to fit them to your problem.

    • @ritvikmath
      @ritvikmath  3 years ago +2

      Thanks for the feedback! I do try to focus a lot more on the "why" questions rather than the "how" questions.

  • @joachimguth6226
    @joachimguth6226 4 years ago +88

    Very well presented. You are a great teacher. Hopefully you are going to cover the entire AI space.

    • @ritvikmath
      @ritvikmath  4 years ago +41

      That is the goal!

    • @warrenbaker4124
      @warrenbaker4124 3 years ago +4

      @@ritvikmath Oh wow!!! I'm so happy to see you're taking this on. I'm a huge fan and this is a real highlight for me. Thanks for all you do!!

    • @Moiez101
      @Moiez101 1 year ago

      @@ritvikmath I fully support that goal! I just started with data science, bro. Loving your videos, you're a great teacher.

  • @shivamkak7981
    @shivamkak7981 10 months ago +1

    Such a well curated explanation of PCA, thanks so much!

  • @martinw.9786
    @martinw.9786 2 years ago

    Thank you very much for the explanations - very, very well done. Your references to the mathematical background are key!

  • @robertbillette4671
    @robertbillette4671 2 years ago

    Like everyone else has mentioned, amazing clarity and style.

  • @nahidakhter8646
    @nahidakhter8646 3 years ago

    Beautifully explained! Thanks so much!

  • @mathematicality
    @mathematicality 2 years ago +1

    Simple and straight to the point. Absolutely well done!

  • @amaramar4969
    @amaramar4969 5 months ago

    I had to go through the prerequisite videos to clarify my concepts first, but after that this PCA explanation is amazing! I think you are equivalent to 10 college professors out there in terms of teaching skills. I hope you get paid in that proportion, and the college professors feel ashamed and work harder to catch up to your standards. Again, amazing!

  • @jaivratsingh9966
    @jaivratsingh9966 1 year ago +1

    Simply excellent!

  • @thinkingAutomata
    @thinkingAutomata 2 years ago +1

    Thanks Ritvik. Excellent explanation of PCA. Good job, well done!

  • @sandeepc2833
    @sandeepc2833 4 years ago

    Cleared most of my doubts. Thanks a lot.

  • @paulntalo1425
    @paulntalo1425 3 years ago

    You have made it clear. Thank you

  • @_arkadij
    @_arkadij 6 months ago

    Very appreciative of the explanation of why we end up using the vectors corresponding to the biggest eigenvalues. Thanks so much

  • @MaxDavidsonArgentina
    @MaxDavidsonArgentina 4 years ago

    Thanks for sharing your knowledge. It's great to have people like you helping out!

  • @alphar85
    @alphar85 3 years ago

    I stopped at 01:33 and I am going to watch the other 5 videos. You are such a blessing, mate.

  • @kakabudi
    @kakabudi 2 years ago

    Really great video! Thanks for explaining this concept wonderfully!

  • @christinejiang6386
    @christinejiang6386 4 months ago

    wow! thank you!
    I watched all the videos before watching this one; they really help a lot!

  • @yarenlerler67
    @yarenlerler67 1 year ago +1

    Ahh, such a clean explanation. I really appreciate it! I will have a practical statistics for astrophysics exam soon, and I was having some problems with the theory part. All your videos were very helpful! I hope I'm going to get a good grade on the exam. :)

  • @resoluation345
    @resoluation345 5 months ago

    The best series to explain the maths behind PCA

  • @TamNguyen-qi8di
    @TamNguyen-qi8di 3 years ago +6

    Dear ritvikmath,
    Thank you so much, sir, for your clear explanation. Even in my last year of college, I am still struggling with the basics of statistics. With your help, I have been thriving in class and am looking to graduate from college this semester. Your videos have been so so so helpful, and I wish you amazing health to continue with your content. I wish you could have been my professor in college. Thank you for putting out such high quality content. Words can't describe how much I appreciate you, sir. Thank you. You have changed my life.

    • @ritvikmath
      @ritvikmath  3 years ago +2

      Thanks for the kind words. Wishing you much success!

  • @shashanksundi5669
    @shashanksundi5669 3 years ago

    Just perfect !! Thank you :)

  • @cameronbaird5658
    @cameronbaird5658 1 year ago

    Phenomenal video, thank you for the hard work 👏

  • @133839297
    @133839297 1 year ago

    You have a gift for teaching.

  • @Chill_Magma
    @Chill_Magma 11 months ago

    Straight to the point and thorough; you deserve to be subscribed to from my 3 accounts

  • @ShubhamYadav-ut9ho
    @ShubhamYadav-ut9ho 2 months ago

    Amazing explanation as always

  • @DeRocks1607
    @DeRocks1607 2 months ago

    You are a great teacher... I finally understood

  • @vinceb8041
    @vinceb8041 3 years ago +4

    12:20 Quick note on why going down the list of eigenvalues is legit: the covariance matrix is a symmetric matrix, and it can be shown that the eigenvectors corresponding to distinct eigenvalues of such a matrix are orthogonal.
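
    A quick numerical check of this orthogonality claim (an editor's sketch in numpy; the matrix entries are arbitrary example values, not from the video):

    ```python
    import numpy as np

    # A small symmetric "covariance-like" matrix (arbitrary example values).
    S = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

    # eigh is meant for symmetric matrices; eigenvalues come back ascending.
    eigvals, eigvecs = np.linalg.eigh(S)

    # Columns of eigvecs are the eigenvectors; with distinct eigenvalues
    # they should be orthogonal, so their dot product is ~0.
    print(eigvecs[:, 0] @ eigvecs[:, 1])  # ~0.0
    ```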

  • @santiagolicea3814
    @santiagolicea3814 1 year ago

    This is a great explanation, thanks a lot. It'd be great if you could also make a video showing a practical example with some data set, showing how you use the eigenvector projection matrix to transform the initial data set.
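
    In the spirit of that request, a minimal end-to-end sketch (an editor's illustration with made-up data, not code from the video): build the covariance matrix, take its top eigenvector, and project the data onto it.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Made-up correlated 2-D data.
    X = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 1.0]], size=500)

    Xc = X - X.mean(axis=0)                # center the data
    S = (Xc.T @ Xc) / len(Xc)              # covariance matrix (dividing by N)

    eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order
    U = eigvecs[:, ::-1][:, :1]            # top-1 eigenvector as a column

    Z = Xc @ U                             # 1-D scores: the data in PCA space
    X_approx = Z @ U.T + X.mean(axis=0)    # rank-1 reconstruction of X
    print(Z.shape, X_approx.shape)         # (500, 1) (500, 2)
    ```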

  • @subhabhadra619
    @subhabhadra619 1 year ago

    Awesomely presented!

  • @riteshsaha6881
    @riteshsaha6881 24 days ago

    This is super helpful. Way better than my professor's explanation

  • @mashakozlovtseva4378
    @mashakozlovtseva4378 4 years ago +1

    Everything was clearly understandable from the math side! Thank you for the link to your Medium account!

  • @cll2598
    @cll2598 1 month ago

    Epic explanation

  • @proxyme3628
    @proxyme3628 1 year ago

    Brilliant explanation of why the eigenvector is the one that comes out of the maximization; I never saw such a great explanation before. I wish your course were on Coursera. I do not think any textbook explains the eigenvalue as a Lagrange multiplier and the eigenvector as maximizing variance. Thanks so much.

  • @arun_kanthali
    @arun_kanthali 2 years ago

    Great explanation. Thank you 👍

  • @aravindsaraswatula2561
    @aravindsaraswatula2561 1 month ago

    Awesome video

  • @bilalbayrakdar7100
    @bilalbayrakdar7100 21 days ago

    Bro, you are the best. Thanks for your effort!

  • @nuamaaniqbal6373
    @nuamaaniqbal6373 2 years ago

    Can't thank you enough!! You are truly the boss!

  • @Tankwell-cq5ky
    @Tankwell-cq5ky 2 years ago

    Very well presented - well done!😊😊

  • @muhammadghazy9941
    @muhammadghazy9941 2 years ago

    Thank you, man. Appreciate it.

  • @Sriram-kj6kl
    @Sriram-kj6kl 2 years ago

    Your videos help a lot, man. Thank you 👍

  • @ahmadawad4782
    @ahmadawad4782 4 years ago

    Watched many videos about linear algebra and PCA. You're the one who made it clear for me. Thanks!

  • @Chill_Magma
    @Chill_Magma 11 months ago

    Seeing your videos increases my confidence in math stuff :DDD

  • @erfanbayat3974
    @erfanbayat3974 2 months ago

    this video is amazing

  • @mmarva3597
    @mmarva3597 3 years ago

    Thank you very much!! Really helpful

  • @deplo
    @deplo 3 years ago +1

    Hi ritvikmath, thank you for your super informative videos! I took all the courses on this topic, but I was wondering if you could expand on it with factor analysis and correspondence analysis. It would be interesting to know how the different methods work and relate to each other, because it would provide a deeper perspective. Thanks

  • @543phi
    @543phi 4 years ago

    Thanks for this video! As a data science student, your lecture helped to clarify a lot... I appreciate your teaching style.

  • @fahimfaisal4660
    @fahimfaisal4660 2 years ago

    Excellent

  • @ernestanonde3218
    @ernestanonde3218 2 years ago

    great video

  • @pratik.patil87
    @pratik.patil87 7 months ago

    Thanks Ritvik, I went through multiple resources to figure out this exact question: why do the eigenvectors and eigenvalues of a covariance matrix represent the direction and strength of the biggest increase in variance? Your video clarifies it beautifully.
    One question still, though: I understand the equation we maximize, but why do we need the constraint (u^T u = 1)?
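
    A sketch of the standard answer to that question (an editorial addition, consistent with the Lagrange-multiplier setup the video uses): without the constraint, the objective is unbounded, because scaling u scales the variance. The problem is

    $$\max_{u}\; u^{\top} S u \quad \text{subject to} \quad u^{\top} u = 1,$$

    and replacing $u$ by $cu$ multiplies the objective by $c^2$, so without the constraint it could be made arbitrarily large; fixing $u^{\top}u = 1$ pins down the length so that only the direction matters. The Lagrangian then gives

    $$\mathcal{L}(u,\lambda) = u^{\top} S u - \lambda\,(u^{\top} u - 1), \qquad \frac{\partial \mathcal{L}}{\partial u} = 2Su - 2\lambda u = 0 \;\Rightarrow\; Su = \lambda u,$$

    so the candidate directions are exactly the eigenvectors of the covariance matrix $S$, and the variance achieved at an eigenvector is $u^{\top} S u = \lambda$.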

  • @gc6327
    @gc6327 4 years ago +3

    Hi Ritvik - can you do a video on factor analysis? That would be huge! Thanks buddy!

  • @Rockyzach88
    @Rockyzach88 1 year ago

    Just finished the LA section in the Deep Learning book, and I can tell this is going to help supplement and fill in the gaps in my understanding. Good vid.

  • @rabiizahir2885
    @rabiizahir2885 2 years ago

    Thanks a lot.

  • @berkoec
    @berkoec 3 years ago +2

    Such a well-explained video - keep up the great work!

  • @jhonportella5618
    @jhonportella5618 3 years ago +5

    Great, great video. I really appreciate your effort and good teaching methodology. I have a question on the projection math: in your projection video you obtained P=XUU, but here you used P=U*XU. Maybe this is a silly question, but I would really appreciate it if you could tell me why this equivalence is possible. Many thanks

  • @herberthubert6828
    @herberthubert6828 3 years ago +1

    you rock, thank you

  • @user-xw5cg7by6t
    @user-xw5cg7by6t 1 year ago

    This video is super great! I was wondering why the covariance matrix is used to compute PCA, and this video made my doubts clear!!

  • @volsurf1274
    @volsurf1274 3 years ago +1

    Concise, clear and superbly explained. Thanks!

  • @ajanasoufiane3903
    @ajanasoufiane3903 4 years ago +7

    Great video, it would be nice if you could show the big picture through the SVD decomposition :)

  • @nandhinin799
    @nandhinin799 4 years ago

    Clearly explained, helped me greatly in understanding the basis of PCA.

  • @rajathjain314
    @rajathjain314 4 years ago

    Very Intuitive, Great Job Ritvik!

  • @Cybrean1
    @Cybrean1 3 years ago +1

    Excellent presentation and delivery … wish you all the success!

  • @akrylic_
    @akrylic_ 4 years ago +3

    There's a property of transposes around 6:45 that you could have mentioned, and I got tripped up for a second. The reason why you can write u^T*(xi-xbar) as (xi-xbar)^T*u is because
    (AB)^T = (B^T)(A^T)
    It's a cool trick, but not obvious

    • @ritvikmath
      @ritvikmath  4 years ago +1

      Very true, thanks for filling in the missing step!

    • @zechengchang3444
      @zechengchang3444 3 years ago

      Can you explain more? How does (AB)^T =(B^T)(A^T) have anything to do with u^T*(xi-xbar)? Thanks.
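
      The missing step (an editorial note): u^T*(xi-xbar) is a 1x1 scalar, and a scalar equals its own transpose, so applying (AB)^T = (B^T)(A^T) with A = u^T and B = (xi-xbar) gives u^T*(xi-xbar) = (xi-xbar)^T*u. A quick numerical check with arbitrary example vectors:

      ```python
      import numpy as np

      u = np.array([0.6, 0.8])       # a unit vector
      xi = np.array([2.0, 1.0])      # arbitrary data point
      xbar = np.array([0.5, 0.5])    # arbitrary mean

      # u^T (xi - xbar) is a scalar; a scalar equals its own transpose,
      # so it must equal (xi - xbar)^T u.
      print(u @ (xi - xbar), (xi - xbar) @ u)  # same value twice
      ```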

  • @AshishKGor
    @AshishKGor 2 years ago

    Thanks sir.

  • @fabianwinkelmann3931
    @fabianwinkelmann3931 3 years ago

    Thank you:)

  • @simranjoharle4220
    @simranjoharle4220 1 year ago

    Your videos are extremely helpful! Thank you!

  • @alejandropalaciosgarcia2767
    @alejandropalaciosgarcia2767 3 years ago

    Bro, you are awesome

  • @yurongluo447
    @yurongluo447 5 months ago

    Your video is helpful for us. Can you create one video to explain Independent Component Analysis in detail? Thanks.

  • @PR-ud4fp
    @PR-ud4fp 1 year ago

    Thanks 😊

  • @user-kw6ib6ks1q
    @user-kw6ib6ks1q 4 months ago

    Great explanation. Really appreciate it, thanks.

    • @ritvikmath
      @ritvikmath  4 months ago

      Glad it was helpful!

  • @odysseashlap
    @odysseashlap 4 years ago +1

    Really appreciate this! Any good book suggestions for the mathematical framework of PCA in greater depth? Maybe another video (the hard math of PCA)?

  • @dr.kingschultz
    @dr.kingschultz 2 years ago

    Do you have a video about instrumental variables? Because in general it seems to be just regular manipulation, but in a more complex way.
    Also, do you have videos applying these concepts? Could be using R or Python. That would be very nice.

  • @knp4356
    @knp4356 4 years ago +4

    Hey Ritvik, it would be great if you could generate some problems for viewers to solve. Watching is great, but if you supplement with actual problems it will drive the points into viewers' heads. You could then post solutions on your Medium site. Hopefully at least 4-5 problems per video. I've watched many videos on DS subjects, but something in your teaching method makes it simpler to understand. Thanks.

    • @ritvikmath
      @ritvikmath  4 years ago +3

      I honestly really appreciate that you're trying to help me be more effective at what I do. I think it's a great idea and I'll look into it. Thanks :)

  • @mainakmukherjee3444
    @mainakmukherjee3444 1 year ago

    We find the expression for the variance of the data projected onto a vector, and then maximize it, because the vector for which the variance is highest (the max eigenvalue) retains most of the information in the data after dimensionality reduction.
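
    A small numerical illustration of that point (an editor's sketch with made-up data): the variance of the projections onto the top eigenvector comes out equal to the largest eigenvalue of the covariance matrix.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Made-up correlated 2-D data.
    X = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 1.0]], size=2000)

    S = np.cov(X, rowvar=False)            # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order
    u1 = eigvecs[:, -1]                    # direction of maximum variance

    proj = (X - X.mean(axis=0)) @ u1       # scalar projections onto u1
    print(np.var(proj, ddof=1), eigvals[-1])  # nearly identical numbers
    ```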

  • @RealLifeKyurem
    @RealLifeKyurem 3 years ago

    Since principal component analysis is used to reduce the dimensions, and thus lessen the curse of dimensionality, can you calculate the maximum number of dimensions you need for a given dataset to find patterns?

  • @chinaminer
    @chinaminer 3 years ago

    Hello, thanks for this video and also for the others, well done! On this video I have a doubt to ask: where can I submit the question so as not to mess up the comments here?

  • @suvikarhu4627
    @suvikarhu4627 2 years ago +2

    @ritvikmath 5:02 I don't understand where this projection formula (proj(xi) = u^T xi u) is coming from. The projection video does not say that. What the projection video says is that proj(xi) = (xi dot u)*u. No transpose there! Where did you get that transpose from? And the dot product is missing?
    Another question: at 5:50, why do you take only the magnitude of the vector?
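
    For what it's worth, the two forms agree: u^T xi is the matrix-notation way of writing the dot product xi · u, so (u^T xi)u = (xi · u)u. A quick check with arbitrary example vectors (an editorial sketch):

    ```python
    import numpy as np

    u = np.array([0.6, 0.8])   # unit vector: 0.6**2 + 0.8**2 == 1
    xi = np.array([3.0, 1.0])  # arbitrary data point

    p1 = (u @ xi) * u          # (u^T xi) u, the form used in this video
    p2 = np.dot(xi, u) * u     # (xi . u) u, the form from the projection video

    print(p1, p2)              # identical projection vectors
    ```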

  • @GeoffryGifari
    @GeoffryGifari 1 month ago +1

    Hmmm, I noticed that if two categories are strongly correlated, the plot will look close to a straight line.
    Going to multidimensional space, that "line" looks like the vector u1 in the video, onto which the data are projected.
    Does that mean PCA will perform better the more correlated two (or more) categories are?

  • @brianogrady37
    @brianogrady37 1 month ago

    I wish you had specified earlier on which values represent the principal components. But great video regardless.

  • @georgegkenios486
    @georgegkenios486 3 years ago +1

    Amazing work mate!

  • @MohamedMostafa-kg6gk
    @MohamedMostafa-kg6gk 3 years ago

    Thank you for this great explanation .

  • @nirjasmuhammed
    @nirjasmuhammed 3 years ago

    Thank you, sir

  • @diegolazareno8020
    @diegolazareno8020 4 years ago +2

    Never stop making these videos!!! One on logistic regression would be nice

    • @ritvikmath
      @ritvikmath  4 years ago +5

      Hey I appreciate the kind words! I do have a vid on logistic regression here: ruclips.net/video/9zw76PT3tzs/видео.html

  • @thirumurthym7980
    @thirumurthym7980 3 years ago

    @ 4.54 - you refer to the projection video for how you arrive at the projection formula. There is no mention of U transpose in that projection video.

  • @kisholoymukherjee
    @kisholoymukherjee 2 years ago +3

    Hi ritvik, thanks for the video. Can you please tell me how the vector projection formula is being used to calculate the projection of xi on u here? The formulas in the two videos seem to be quite different. Would really appreciate it if you could help me understand the underlying math

    • @ArpitAnand-yd7tr
      @ArpitAnand-yd7tr 1 year ago

      That's just a dot product between the potential u1 and Xi. It gives the magnitude of the projection in the direction of the unit vector u

  • @thirumurthym7980
    @thirumurthym7980 3 years ago

    Nice video, very useful to me. You also mention links to a couple of external resources @13.18 - could you please share them? Thanks.

  • @quark37
    @quark37 1 year ago

    Fun video, thank you. And thanks for all the prerequisite videos.
    Question: I've seen other videos that describe PCA vectors as orthogonal, but using eigenvectors they would not necessarily be orthogonal, right? What is the correct way to think about the orthogonality of PCA vectors? Thanks.
    *
    I think I answered my own question: the eigenvectors in question are those of the covariance matrix of the related variables. This matrix is symmetric, so the eigenvectors will be orthogonal. Correct?

  • @sidddddddddddddd
    @sidddddddddddddd 1 year ago

    What you've called the closed form of the covariance matrix is actually the biased estimator of the covariance matrix \Sigma. If you divide by (N-1) instead of N, you get the unbiased estimator of \Sigma. Awesome video! Thanks :D
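
    A quick illustration of the distinction (an editor's sketch with made-up data): numpy's cov divides by N-1 by default and by N when bias=True, and the two estimates differ exactly by the factor (N-1)/N.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(10, 3))  # small made-up sample, N = 10

    S_unbiased = np.cov(X, rowvar=False)           # divides by N - 1
    S_biased = np.cov(X, rowvar=False, bias=True)  # divides by N

    print(np.allclose(S_biased, S_unbiased * 9 / 10))  # True
    ```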

  • @seetaramdantu3190
    @seetaramdantu3190 3 years ago +1

    Excellent... well explained

  • @ahmad3823
    @ahmad3823 4 months ago

    Amazing

  • @404nohandlefound
    @404nohandlefound 1 year ago +2

    Could you please explain how this links to SVD

  • @mwave3388
    @mwave3388 2 years ago +1

    I'm preparing for a job interview. Thanks, the best PCA video I found.

  • @josephgan1262
    @josephgan1262 3 years ago +3

    Thanks for the amazing video! Can anyone please explain why the projection is u1^T · xi * u?
    In the projection video it is (xi · u) u. Are they equivalent?