Canonical correlation analysis - explained

  • Published: 1 Feb 2025

Comments • 46

  • @knightzhang8387
    @knightzhang8387 2 years ago +11

    Wow, this is by far the only tutorial demonstrating a clear description of the CCA, and how to compute it. Thanks!

  • @tsunghanhsieh9085
    @tsunghanhsieh9085 2 years ago +3

    Oh My! This is the best explanation about CCA I have ever seen.

  • @Tom-sp3gy
    @Tom-sp3gy 4 months ago +2

    Beautiful explanation … 3 min into the video and I understood the whole gist of CCA! Thank you so much!!! Whoever said that complicated things cannot be explained simply?

  • @joshuagervin2845
    @joshuagervin2845 1 year ago +1

    Thanks!

  • @golshanshakeebaee868
    @golshanshakeebaee868 2 years ago

    Thank you very much for your clear explanation. Just wanted to say your voice is very similar to Professor Schmidt's. Keep up the good work. Best regards :)

  • @khushpatelmd
    @khushpatelmd 2 years ago

    You are the best stats professor!! Thanks so much

    • @tilestats
      @tilestats  2 years ago

      Thank you!

  • @milrione8425
    @milrione8425 1 year ago +1

    So well explained!! Thank you!!

  • @Davide-yg5ny
    @Davide-yg5ny 2 years ago +1

    You're a life-saver

  • @mgpetrus
    @mgpetrus 7 months ago

    Thanks for your very didactic demonstration. I was wondering why you didn't mention data transformation and standardization before starting the analysis, mainly because blood pressure and body size have distinct scales.

    • @tilestats
      @tilestats  7 months ago

      Yes, you can standardize the data, but you will get the same correlations with unstandardized data, because you instead standardize the scores later on, as I explain at 10:56.
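
      A quick numeric check of this point (my own numpy sketch with made-up data, not the video's calculation): the canonical correlations come out identical with raw and standardized inputs, because standardization is just a per-variable rescaling, and rescaling leaves the eigenvalues of the CCA matrix unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical data on very different scales, e.g. blood pressure vs body size
X = rng.normal(size=(n, 2)) * np.array([10.0, 0.5]) + np.array([120.0, 1.7])
Y = X @ rng.normal(size=(2, 2)) + rng.normal(size=(n, 2))

def canonical_correlations(X, Y):
    # Center each block; any per-variable scaling cancels in the eigenproblem below
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx, Syy, Sxy = Xc.T @ Xc, Yc.T @ Yc, Xc.T @ Yc
    # Eigenvalues of Sxx^-1 Sxy Syy^-1 Syx are the squared canonical correlations
    M = np.linalg.solve(Sxx, Sxy) @ np.linalg.solve(Syy, Sxy.T)
    lam = np.sort(np.linalg.eigvals(M).real)[::-1]
    return np.sqrt(np.clip(lam, 0.0, None))

r_raw = canonical_correlations(X, Y)
r_std = canonical_correlations((X - X.mean(0)) / X.std(0),
                               (Y - Y.mean(0)) / Y.std(0))
print(np.allclose(r_raw, r_std))  # identical canonical correlations either way
```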

  • @aakashyadav1589
    @aakashyadav1589 2 years ago

    Your stats videos are great.

  • @杨佳祎-t3f
    @杨佳祎-t3f 1 year ago +1

    Thanks a lot! Very helpful!

  • @mdmahmudulhasanmiddya9632
    @mdmahmudulhasanmiddya9632 2 years ago

    You are a very knowledgeable person.

  • @dr024
    @dr024 9 months ago +1

    Very clear! Thank you =)

  • @EashwarMurali
    @EashwarMurali 1 year ago

    Is there further theory behind the equation introduced at 6:25? Can you suggest some reading material for concrete proofs?

    • @tilestats
      @tilestats  1 year ago

      Check wiki
      en.wikipedia.org/wiki/Canonical_correlation

  • @KS-df1cp
    @KS-df1cp 2 years ago

    What would happen if we did not take the inverse at the 6:46 timestamp? What if we multiplied the matrices as they are? Thank you.

  • @nadhilala
    @nadhilala 2 years ago

    Thank you so much for your explanation! It is very helpful

  • @ernestamoore4385
    @ernestamoore4385 2 years ago

    Excellent video. One question though: How to choose whether to use CCA or PLS? The difference is that PLS maximises the covariance between the datasets whereas CCA maximises the correlation.

    • @tilestats
      @tilestats  2 years ago

      I would use CCA for correlation and PLS for regression. I have a video about PLS as well:
      ruclips.net/video/Vf7doatc2rA/видео.html
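
      The covariance-vs-correlation distinction raised in the question can be made concrete with a small numpy sketch (my own illustration, not from the video): the first PLS weight pair comes from an SVD of the cross-covariance itself, while the first CCA pair comes from the same SVD after whitening each block, which turns covariance into correlation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))
# One Y column strongly related to X, one pure noise
Y = np.column_stack([X[:, 0] + 0.1 * rng.normal(size=n),
                     rng.normal(size=n)])
Xc, Yc = X - X.mean(0), Y - Y.mean(0)
Sxx, Syy, Sxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n

# PLS-style weights: maximize covariance -> SVD of the cross-covariance
u, s_pls, vt = np.linalg.svd(Sxy)
a_pls, b_pls = u[:, 0], vt[0]

def inv_sqrt(S):
    # Inverse symmetric square root via eigendecomposition
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# CCA weights: maximize correlation -> whiten each block, then take the SVD
K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
u, s_cca, vt = np.linalg.svd(K)
a_cca, b_cca = inv_sqrt(Sxx) @ u[:, 0], inv_sqrt(Syy) @ vt[0]

# The leading singular value of K equals the first canonical correlation,
# i.e. the correlation between the two score vectors
print(s_cca[0], np.corrcoef(Xc @ a_cca, Yc @ b_cca)[0, 1])
```

      Because the whitening step divides out each block's own variance, CCA is unaffected by the scale of the variables, while the PLS weights do change if you rescale a column.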

  • @yaweli2968
    @yaweli2968 8 months ago

    Can you share a link to a nice multivariate linear regression dataset with at least 4 predictor variables and at least 2 outcome variables, if possible?

  • @ebrahimfeghhi1777
    @ebrahimfeghhi1777 1 year ago

    Great lecture

  • @Bommi-oz7rs
    @Bommi-oz7rs 8 months ago +1

    Does anybody have step-by-step notes for this worked example? Please reply.

  • @JsoProductionChannel
    @JsoProductionChannel 2 years ago +1

    Thank you

  • @shaoneesaha6073
    @shaoneesaha6073 6 months ago

    The negative coefficient values are not clear to me: a taller person has lower blood pressure, while a heavier person has higher blood pressure. I got this type of result in CCA too but can't interpret it. Could anyone please explain?

    • @tilestats
      @tilestats  4 months ago

      This is just a small data set, so do not draw any biological conclusions from it.

  • @zk1560
    @zk1560 2 years ago

    Hi, I tried to reproduce what you are showing here in Python but got totally different results. Are the calculations you are showing based on the numbers shown in the video, or are you using something else as input?

    • @tilestats
      @tilestats  2 years ago

      Yes, I used the example data in R. What is your output?

  • @youssefsamernarouz8608
    @youssefsamernarouz8608 1 year ago

    Thank youuuu

  • @halilibrahimakgun7569
    @halilibrahimakgun7569 9 months ago

    The eigenvectors for Rx and Ry seem wrong; I calculated different results. Are you sure about the eigenvalue calculation for Rx and Ry? The first and second eigenvectors and eigenvalues are in swapped positions.

    • @tilestats
      @tilestats  9 months ago

      If you run the following code in R for, for example, Ry,
      mat=matrix(c(-0.164,0.430,
      -0.322,0.722),2,2)
      eigen(mat)
      you will get the following eigenvectors and eigenvalues:
      $values
      [1] 0.51939343 0.03860657
      $vectors
                 [,1]       [,2]
      [1,]  0.4262338 -0.8463918
      [2,] -0.9046130  0.5325607
      Please share your own calculations so that I can have a look.

    • @halilibrahimakgun7569
      @halilibrahimakgun7569 9 months ago

      Ry = [ -0.164 -0.322
      0.430 0.722 ]
      But the code you gave in R
      is the transpose of this matrix.
      You entered the input matrix incorrectly. Or should we take the transpose before computing the eigenvectors? @tilestats

    • @tilestats
      @tilestats  9 months ago

      No, in R you fill in the numbers by column. If you would like to fill in by row instead, you do it like this, which gives the exact same matrix and eigenvectors:
      mat=matrix(c(-0.164,-0.322,
      0.430,0.722),2,2,byrow = TRUE)
      eigen(mat)

    • @halilibrahimakgun7569
      @halilibrahimakgun7569 9 months ago

      @@tilestats A = np.array([[-0.164, -0.322], [0.430, 0.722]])
      # Calculate eigenvalues and eigenvectors
      eigenvalues, eigenvectors = np.linalg.eig(A)
      print("Eigenvalues:", eigenvalues)
      print("Eigenvectors:", eigenvectors)
      This code prints the reverse of it;
      I don't know why there is a difference in Python.

    • @tilestats
      @tilestats  9 months ago

      The way you rotate the data is arbitrary, so it does not matter if you get the reversed values. The eigenvalues are correct, right?
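
      For what it's worth, the discrepancy in this thread is only about ordering and sign conventions: eigensolvers return eigenvalues in no guaranteed order, and each eigenvector is only defined up to a sign flip. Sorting numpy's output reproduces the R result quoted above (my own check, using the Ry matrix from the thread):

```python
import numpy as np

# The matrix from the thread. numpy fills by row; R's matrix() fills by column,
# so this is the same matrix as mat=matrix(c(-0.164,0.430,-0.322,0.722),2,2) in R.
A = np.array([[-0.164, -0.322],
              [ 0.430,  0.722]])

eigenvalues, eigenvectors = np.linalg.eig(A)
order = np.argsort(eigenvalues)[::-1]   # sort largest first, as R's eigen() does
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]

print(eigenvalues)    # ~ [0.5194, 0.0386], matching the R output above
print(eigenvectors)   # columns match R's up to an arbitrary sign flip
```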