PCA: the math - step-by-step with a simple example

  • Published: 23 Jul 2024
  • You can buy the corresponding PDF of this video at:
    www.tilestats.com/
    In this second video about PCA, we will have a look at its math (the eigendecomposition). We will compute the PCA based on the eigenvectors of the covariance matrix.
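
    As a companion to the video's steps, below is a minimal NumPy sketch of PCA via the eigendecomposition of the covariance matrix. The toy data is made up for illustration; it is not the data set used in the video.

    # PCA via eigendecomposition of the covariance matrix (a sketch, made-up data).
    import numpy as np

    X = np.array([[2.0, 4.0],
                  [4.0, 9.0],
                  [6.0, 5.0],
                  [8.0, 12.0]])

    Xc = X - X.mean(axis=0)               # step 1: center each variable
    C = np.cov(Xc, rowvar=False)          # step 2: covariance matrix (divides by n-1)
    eigvals, eigvecs = np.linalg.eigh(C)  # step 3: eigendecomposition (symmetric matrix)

    order = np.argsort(eigvals)[::-1]     # sort by decreasing eigenvalue (PC1 first)
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    scores = Xc @ eigvecs                 # step 4: PC scores = centered data times eigenvectors
    print(eigvals)                        # variances of PC1 and PC2
    print(scores)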

Comments • 123

  • @saifqawasmeh9664
    @saifqawasmeh9664 1 year ago +46

    You're probably the only one on the Internet who explained PCA mathematically! Thank you so much!

  • @RayRay-yt5pe
    @RayRay-yt5pe 11 days ago +2

    I can't believe the concept can be explained this simply! Nice one! You have a new subscriber. I honestly think it's criminal that something this simple is made overly convoluted by other individuals.

  • @abebawt1169
    @abebawt1169 4 months ago +4

    After watching this video, I feel like everyone else makes PCA complicated, deliberately. Thank you for making it easy!

  • @scottzeta3067
    @scottzeta3067 1 year ago +9

    This video is totally underrated. If my uni's lectures were even half as good as yours, I wouldn't have spent so much time.

  • @randbak1527
    @randbak1527 1 year ago +3

    Totally underrated video. I've been searching for a simple yet informative explanation of PCA, and you are the best; you should be at the top of the search results. Thank you!

  • @aindrilasaha1592
    @aindrilasaha1592 3 years ago +34

    Trust me, after having spent hours on Google and YouTube, this is the best thing that I found on PCA. Hats off to you and thanks a lot!!
    Wish you all the best for your channel.

    • @tilestats
      @tilestats  3 years ago +5

      Thank you!

    • @pipsch12
      @pipsch12 2 years ago +6

      I so agree. I don't understand why PCA is presented in such an overly complicated fashion by almost everybody. This video is so simple because it covers every step of the process and gives clear and easy explanations without unnecessary details and confusing language. THANK YOU.

    • @SanthoshKumar-dk8vs
      @SanthoshKumar-dk8vs 1 year ago

      True, great explanation 👏

  • @sefatergbashi
    @sefatergbashi 1 year ago +3

    Best lecture on PCA calculations so far!
    Thank you

  • @ramkumargorre2958
    @ramkumargorre2958 1 year ago +2

    This is one of the best videos explaining the PCA concept mathematically.

  • @sasakevin3263
    @sasakevin3263 2 years ago

    Your video gave me a 100% understanding of PCA; before that, I knew nothing about PCA. Thank you!

  • @mohdzoubi3819
    @mohdzoubi3819 1 year ago +3

    It is a great video. The corresponding PDF file for this video is also great. Thank you very much.

  • @shankars4384
    @shankars4384 9 months ago +1

    You are the best, TileStats. I love you a lot, man!

  • @notknown42
    @notknown42 1 year ago +1

    Best PCA video I have seen on this platform. Well done - greetings from Germany!

  • @divyab592
    @divyab592 1 year ago +1

    best explanation of PCA so far!!! thank you so much

  • @coffee-pot
    @coffee-pot 1 year ago

    Thank you so much. Your videos are the best and this particular video is beyond amazing.

  • @endritgooglekonto230
    @endritgooglekonto230 5 months ago +1

    Best tutorial on PCA I have ever found!

  • @michaeldouglas7641
    @michaeldouglas7641 2 years ago +15

    I would like to sincerely thank you for this video. Almost all YT maths videos only focus on the high-level concepts. Finding a linear, step-by-step explanation of the process is rare. Please do make more of these videos. Others I would love to see are: a step-by-step of one of the GLMs (logistic?), a step-by-step of Gaussian processes, and maybe a step-by-step of factor analysis. Thanks again!

    • @tilestats
      @tilestats  2 years ago

      Thank you! I think my videos about logistic regression will interest you. You can find all my videos at
      www.tilestats.com

  • @NatnichaSujarae
    @NatnichaSujarae 2 months ago +1

    You're a lifesaver! I've been trying to understand this for days, and this is the only video that nailed it! Thank you so much!

  • @andresromeroramos5410
    @andresromeroramos5410 2 years ago +1

    I simply love the way you teach. AWESOME video!

  • @danialb9894
    @danialb9894 1 year ago +1

    Best explanation for PCA. Thank you. Wish you the best ❤❤

  • @wlt6311
    @wlt6311 7 months ago +1

    Thanks for this nice video, the best explanation of PCA. Others just explain without showing the calculation.

  • @ConfusedRocketShip-fv7qy
    @ConfusedRocketShip-fv7qy 6 months ago

    Amazing! Best teacher for PCA

  • @firstkaransingh
    @firstkaransingh 2 years ago +6

    Excellent explanation of a very complex topic.
    Please do try to explain the SVD procedure if you can.
    Thanks 👍

  • @joaovictorf.r.s.1570
    @joaovictorf.r.s.1570 1 year ago +1

    Perfect presentation! Thanks!

  • @TranHoangNam_A-km3vj
    @TranHoangNam_A-km3vj 8 months ago

    You are the best teacher I have ever known.

  • @AbhishekVerma-kj9hd
    @AbhishekVerma-kj9hd 1 year ago +1

    God bless you, sir, what an amazing explanation! I'm really touched. Thank you for this video.

  • @wondwossengebretsadik3334
    @wondwossengebretsadik3334 2 years ago +1

    This is an excellent explanation. Thanks a lot.

  • @gacemamine5970
    @gacemamine5970 5 days ago +1

    Fantastic explanation👑👑👑, Thank you very much.

  • @user-dt8ei1wj2x
    @user-dt8ei1wj2x 1 year ago +1

    man, it's so helpful, thank you so much!!!

  • @mohamedkannou6142
    @mohamedkannou6142 6 months ago +1

    Good job man!!! Thank you so much

  • @bobrarity
    @bobrarity 7 months ago +1

    appreciate the video, helped a lot

  • @hammasmajeed3715
    @hammasmajeed3715 2 years ago +1

    Your videos are very helpful. Thanks!

  • @gabrielfrattini4090
    @gabrielfrattini4090 2 years ago +2

    This was amazing, so clear

  • @crickethighlight555
    @crickethighlight555 1 year ago +1

    Tomorrow is my quiz. I had not even attended the lecture, but after watching your tutorial I am ready for the quiz, so thanks 🙏

  • @SS-pn7ss
    @SS-pn7ss 8 months ago +1

    thank you so much for this great video

  • @dpi3
    @dpi3 1 year ago +1

    absolutely brilliant!

  • @cindywang8852
    @cindywang8852 2 years ago +1

    Very informative! Thank you!

  • @arunkumar0702
    @arunkumar0702 2 years ago +1

    Very well explained indeed!! Keep up the good work!!
    Many thanks for conceiving and producing this excellent series on PCA. I look forward to viewing your videos on other topics!!

  • @polarbear986
    @polarbear986 2 years ago +1

    This is so good, Thank you!

  • @betting55555
    @betting55555 1 year ago +1

    great video, thanks!

  • @kmowl1994
    @kmowl1994 2 years ago +1

    Very helpful, thank you!

  • @workcontact9726
    @workcontact9726 1 year ago +1

    great, thanks for this video

  • @AJ-fo3hp
    @AJ-fo3hp 3 years ago +1

    Thank you very much

  • @hayki_ds
    @hayki_ds 1 year ago +1

    Perfect
    Thanks

  • @rezafarrokhi9871
    @rezafarrokhi9871 3 years ago +1

    That is so helpful.

  • @alaghaderi9079
    @alaghaderi9079 1 year ago +1

    One of the best videos about PCA that I have seen. But where is SVD? :))

  • @shashanksadafule
    @shashanksadafule 2 years ago +1

    Amazing Explanation!

  • @casper8374
    @casper8374 1 year ago +1

    the best 🙏🏼

  • @nikeforo2612
    @nikeforo2612 2 years ago +3

    Your videos are a godsend, extremely helpful and clear. Thanks a lot. Is there any chance you will cover Correspondence Analysis any time soon? That would nicely complement the series of videos on data dimensionality reduction techniques. Just wondering....

    • @tilestats
      @tilestats  2 years ago

      Thank you! That method is not on my list but maybe in the future. However, there will soon be a video on principal component regression.

  • @md.shafaatjamilrokon8587
    @md.shafaatjamilrokon8587 1 year ago +1

    Thanks

  • @mahdi1594
    @mahdi1594 2 years ago

    Bro, great job, I love the way you explain things. You might see this comment copied and pasted across a few of your other videos; I am just doing this for the algorithm.

  • @areejhameed3923
    @areejhameed3923 2 years ago +1

    thank you so much

  • @ankhts
    @ankhts 1 year ago +1

    Able to understand the mathematics of PCA with your videos... Many thanks... If you are reading this comment, do watch the explanation of GLM, probably the best explanation available on YouTube.

  • @fredbatti
    @fredbatti 2 years ago +1

    Amazing video, very well explained. A question: does anybody know a way to make the eigenvector weights sum to 1, to see exactly how much the original values contribute to the component?

    • @tilestats
      @tilestats  2 years ago +1

      Thank you! To transform the weights so that they sum to one, simply divide each weight by the sum of the weights (given that the weights are positive). However, I usually like to think of the weights as correlation coefficients as I explain in the fourth video about PCA.

    • @fredbatti
      @fredbatti 2 years ago

      @@tilestats I have found out that if we raise all the weights to the power of 2, they end up summing to 1, regardless of the sign. Thanks for the contribution! Appreciate it.

    • @tilestats
      @tilestats  2 years ago

      Yes, but note that the weights are usually expressed as loadings (see the PCA 4 video) by most statistical software tools. The squares of these loadings do not then sum to one.
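
    A quick numeric check of the two conventions discussed in this thread (a minimal sketch; the vector is the approximately unit-length eigenvector quoted elsewhere in these comments):

    import numpy as np

    v = np.array([0.81, 0.59])   # ~unit-length eigenvector (the raw weights)
    print(np.sum(v**2))          # ~1: squared weights of a unit vector sum to one

    w = v / np.sum(v)            # alternative: rescale positive weights to sum to one
    print(w, np.sum(w))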

  • @eaintthu3488
    @eaintthu3488 1 year ago +1

    Please explain kernel PCA.

  • @nassersaed4993
    @nassersaed4993 6 months ago

    Hi, thanks for the very informative tutorial. Can you please explain, at 11:00, how you obtained the PC scores by multiplying the eigenvector matrix with the centered data?

    • @tilestats
      @tilestats  6 months ago +1

      Have a look at this video, starting at about 9 min, to see how to do matrix multiplication:
      ruclips.net/video/QtAZsWseIKk/видео.html

    • @nassersaed4993
      @nassersaed4993 6 months ago

      Okay, got it! Thank you so much🙏
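
    For readers with the same question: the step at 11:00 is an ordinary matrix product, where each PC score is the dot product of a centered data point with an eigenvector. A sketch with illustrative numbers (not the exact values from the video):

    import numpy as np

    E = np.array([[0.59, -0.81],        # columns are hypothetical unit eigenvectors
                  [0.81,  0.59]])
    x_centered = np.array([1.5, -0.5])  # one centered data point (made up)

    scores = E.T @ x_centered           # [PC1 score, PC2 score]
    print(scores)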

  • @rahuldebdas5608
    @rahuldebdas5608 7 months ago

    Sir, can you please upload a similar mathematical video on oblique rotation of principal components? It would be very helpful.

  • @muhammadusmanbutt3341
    @muhammadusmanbutt3341 2 years ago +1

    Can you please tell me where the previous video on eigenvalues and eigenvectors is?

    • @tilestats
      @tilestats  2 years ago +1

      If you go to
      www.tilestats.com
      you will find all my videos in a logical order.

  • @Fa94Ar
    @Fa94Ar 1 year ago +1

    Thanks for the video. What is the name of the book that you rely upon?

    • @tilestats
      @tilestats  1 year ago

      I mainly used the internet to learn ML.

  • @oscarernestocl9319
    @oscarernestocl9319 1 year ago

    For the example that starts @18:00, first you have the vector [-2, 3], then you multiply by the covariance matrix to get the vector [8, 12] (to transform the vector), and then you multiply [8, 12] by the covariance matrix again to get the direction of the eigenvector. However, in the second example you don't transform the vector and just multiply the initial one, [4, 1], by the covariance matrix. So my question is: why is it necessary to transform the vector in the first case? Thank you!!!

    • @tilestats
      @tilestats  1 year ago +1

      I just show one iteration in the second example, but the more iterations you do (multiplying the new vector by the covariance matrix), the closer you will get to the eigenvector.
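
    This iterative procedure is known as power iteration. A minimal sketch, assuming the 2x2 covariance matrix implied by the determinant calculation in a thread further down:

    import numpy as np

    C = np.array([[4.4, 5.6],
                  [5.6, 8.0]])

    v = np.array([-2.0, 3.0])      # arbitrary start vector
    for _ in range(20):
        v = C @ v                  # transform the vector
        v = v / np.linalg.norm(v)  # renormalize so the entries do not blow up
    print(v)                       # approaches +/-[0.59, 0.81], the first eigenvector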

  • @anmolpardeshi3138
    @anmolpardeshi3138 2 months ago

    I see that you centered the data. Is only centering required for "standardization", or is scaling also normally done so that the mean = 0 and standard deviation = 1? This would then change the covariance matrix, since the variance of each individual dimension will equal 1.

    • @tilestats
      @tilestats  2 months ago

      It is not a requirement, mathematically, to standardize your data (mu = 0, SD = 1), but it is highly recommended, especially if you have variables with a large difference in the variance. I discuss that in the next video about PCA:
      ruclips.net/video/dh8aTKXPKlU/видео.html
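
    For reference, standardizing (rather than only centering) is a one-line change; a minimal sketch on made-up data. After standardizing, the covariance matrix of the data equals the correlation matrix of the original variables:

    import numpy as np

    X = np.array([[2.0, 4.0],
                  [4.0, 9.0],
                  [6.0, 5.0],
                  [8.0, 12.0]])    # made-up data

    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # mean 0, SD 1 (ddof=1 uses n-1)
    print(np.cov(Z, rowvar=False))                    # diagonal = 1; off-diagonal = correlations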

  • @KS-df1cp
    @KS-df1cp 2 years ago +1

    Great, but I'm not sure how you got the normalized values of the eigenvectors. Can you please direct me to that video or the step you skipped? Thanks. Also, what are the eigenvectors that you get for the eigenvalue 0.32? My simplified value of y is -0.72x; I don't know why you got 0.81.

    • @tilestats
      @tilestats  2 years ago

      ruclips.net/video/9CT0jnem4vM/видео.html
      Starts at around 8 min.

    • @KS-df1cp
      @KS-df1cp 2 years ago

      @@tilestats Got it; I forgot to take the sqrt of the denominator :/ Thank you again!

    • @mwanganamubita9617
      @mwanganamubita9617 1 year ago

      @@tilestats Thanks for this very informative video. I have one question: for lambda = 0.32, I am getting y = -0.73 when x = 1, so the normalized vector with unit length 1 is [0.81, -0.59] instead of [-0.81, 0.59]. Please verify and advise.

    • @tilestats
      @tilestats  1 year ago +2

      If you set x = 1, you get [0.81, -0.59], but if you set y = 1, you will get [-0.81, 0.59]. Whether you set x to 1 or y to 1 is arbitrary, because both vectors are eigenvectors of the covariance matrix (they just point in opposite directions). Both vectors will give the same variance for PC2.

    • @mwanganamubita9617
      @mwanganamubita9617 1 year ago

      @@tilestats Many thanks for the explanation. Much appreciated!

  • @RuiLima1981
    @RuiLima1981 9 months ago

    Minute 6:17: how did you get the value 3.84? Should it not be 35.2?

    • @tilestats
      @tilestats  9 months ago

      4.4 x 8 - 5.6 x 5.6 = 35.2 - 31.36 = 3.84. The 35.2 is only the first product; the determinant also subtracts 5.6 x 5.6 = 31.36.

  • @tonyhuang9001
    @tonyhuang9001 5 months ago +1

    Love from China😘

  • @ramankaur5657
    @ramankaur5657 1 year ago

    Hi, instead of centering the data, is it also viable to standardise the data?

    • @tilestats
      @tilestats  1 year ago

      Sure, have a look at the next video:
      ruclips.net/video/dh8aTKXPKlU/видео.html

    • @ramankaur5657
      @ramankaur5657 1 year ago

      @@tilestats Thanks, I just watched it! Hoping you could help me with the following as well: if I am applying the eigenvectors to a new set of data with the same variables as the original data (i.e., not the original data I ran PCA on), I assume I should also standardise the new data before applying the eigenvector weightings to it?

  • @wanqin3396
    @wanqin3396 1 year ago

    Why, for the standardization of the data, did you not need to divide by the standard deviation?

    • @tilestats
      @tilestats  1 year ago

      Here I only center the data, but you can also standardize as I do in this video
      ruclips.net/video/dh8aTKXPKlU/видео.html

  • @sainivasgandham7982
    @sainivasgandham7982 4 months ago

    Why did you take n-1 while calculating the covariance matrix?

    • @tilestats
      @tilestats  4 months ago

      Because that is how you calculate the variance. Have a look at this video if you would like to know more:
      ruclips.net/video/pLH1QA4F9uE/видео.html
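
    In short, dividing by n-1 (Bessel's correction) gives an unbiased sample estimate of the variance; a quick check:

    import numpy as np

    x = np.array([2.0, 4.0, 6.0, 8.0])          # made-up sample
    n = len(x)
    print(np.sum((x - x.mean())**2) / (n - 1))  # sample variance, by hand
    print(np.var(x, ddof=1))                    # the same, via NumPy (ddof=1 means n-1)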

  • @yoonchaena3137
    @yoonchaena3137 2 years ago +2

    Thanks~! I want to buy this channel's stock~! It will be a big one.

  • @zero8wow342
    @zero8wow342 1 year ago

    Please, why don't others center the data first before using it to form the covariance matrix?

    • @tilestats
      @tilestats  1 year ago

      You do not need to center the data to compute a covariance matrix. You will get the same matrix with uncentered data because the spread of the data does not depend on the mean. The reason why I center the data in this video is because that is the first step in PCA.
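
    This is easy to verify numerically; a minimal sketch on made-up random data:

    import numpy as np

    X = np.random.default_rng(0).normal(size=(50, 2))  # made-up data
    Xc = X - X.mean(axis=0)                            # centered copy

    # The covariance matrix is identical before and after centering.
    print(np.allclose(np.cov(X, rowvar=False), np.cov(Xc, rowvar=False)))  # True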

  • @mrbilalkhan
    @mrbilalkhan 7 days ago

    The video lecture on eigenvectors and eigenvalues mentioned at 05:31 can be found at ruclips.net/video/9CT0jnem4vM/видео.html

  • @user-pw8ft3pu8w
    @user-pw8ft3pu8w 1 year ago

    9:17 How do you do the normalization?

    • @tilestats
      @tilestats  1 year ago

      Have a look at around 8 min in this video:
      ruclips.net/video/9CT0jnem4vM/видео.html

    • @user-pw8ft3pu8w
      @user-pw8ft3pu8w 1 year ago

      @@tilestats OKAY THANK YOU

  • @bhavyakalwar8131
    @bhavyakalwar8131 1 year ago

    TileStats is the best!

  • @preethiagarwal5355
    @preethiagarwal5355 1 year ago

    You could have explained how to calculate eigenvalues as part of this video itself... Making us watch other videos causes a loss of interest... Sorry, it's not a one-stop shop. Why don't you make it comprehensive?

    • @tilestats
      @tilestats  1 year ago

      Because I try to keep the videos below 20 min, I cannot include all the details that I have covered in previous videos. This video is just one, out of many, in my course:
      ruclips.net/p/PLLTSM0eKjC2fZqeVFWBBBr8KSqnBIPMQD

    • @preethiagarwal5355
      @preethiagarwal5355 1 year ago

      @@tilestats wow 👏 thanx

  • @arunkumar0702
    @arunkumar0702 2 years ago

    I executed the steps in Python. I notice that the matrix of eigenvectors returned by sklearn,
    pc = PCA(n_components=2)
    pc.components_, is as follows:
    [[-0.58906316, -0.80808699],
     [-0.80808699,  0.58906316]]
    whereas the one that you have calculated is:
    [[-0.80808699,  0.58906316],
     [ 0.58906316,  0.80808699]]
    It would help if you could help me understand this difference. What am I missing??

    • @tilestats
      @tilestats  2 years ago

      It seems like your function rotates the data counter-clockwise, which explains the difference. It does not matter for the results. You may try to switch the order of the input variables to see if that changes the output.
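
    A sketch (on made-up data) that makes the correspondence explicit: the rows of sklearn's components_ are the covariance eigenvectors sorted by decreasing eigenvalue, and each row may differ from a hand calculation by a sign flip, which does not change the variance along each PC:

    import numpy as np
    from sklearn.decomposition import PCA

    X = np.random.default_rng(1).normal(size=(50, 2))  # made-up data
    Xc = X - X.mean(axis=0)

    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    manual = eigvecs[:, ::-1].T      # columns -> rows, in descending eigenvalue order

    pc = PCA(n_components=2).fit(X)
    print(pc.components_)            # rows = principal axes
    print(manual)                    # same rows, up to a sign flip per row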

  • @karodada8005
    @karodada8005 2 years ago +1

    Great video, thanks!