Canonical correlation analysis - explained

  • Published: 22 Aug 2024

Comments • 44

  • @knightzhang8387
    @knightzhang8387 2 years ago +8

    Wow, this is by far the only tutorial demonstrating a clear description of the CCA, and how to compute it. Thanks!

  • @tsunghanhsieh9085
    @tsunghanhsieh9085 1 year ago +2

    Oh My! This is the best explanation about CCA I have ever seen.

  • @golshanshakeebaee868
    @golshanshakeebaee868 2 years ago

    Thank you very much for your clear explanation. Just wanted to say your voice is very similar to Professor Schmidt. Keep up the good work. best regards :)

  • @mgpetrus
    @mgpetrus 1 month ago

    Thanks for your very didactic demonstration. I was wondering why you didn't mention data transformation and standardization before starting the analysis, especially since blood pressure and body size have such distinct scales.

    • @tilestats
      @tilestats  1 month ago

      Yes, you can standardize the data, but you will get the same correlations with unstandardized data because you instead standardize the scores later on, as I explain at 10:56.
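The invariance described in this reply can be sketched numerically: because the canonical correlations can be computed entirely from correlation matrices, which are unchanged by positive rescaling of the variables, raw and rescaled data give identical results. A minimal NumPy sketch with made-up toy data (not the video's blood-pressure example):

```python
import numpy as np

rng = np.random.default_rng(0)

def canonical_correlations(X, Y):
    """Canonical correlations via eigenvalues of inv(Rx) Rxy inv(Ry) Ryx."""
    p = X.shape[1]
    R = np.corrcoef(np.hstack([X, Y]), rowvar=False)
    Rx, Ry = R[:p, :p], R[p:, p:]
    Rxy, Ryx = R[:p, p:], R[p:, :p]
    M = np.linalg.inv(Rx) @ Rxy @ np.linalg.inv(Ry) @ Ryx
    lam = np.linalg.eigvals(M)
    # eigenvalues are the squared canonical correlations
    return np.sort(np.sqrt(np.abs(lam)))[::-1]

X = rng.normal(size=(100, 2))
Y = X @ rng.normal(size=(2, 2)) + rng.normal(size=(100, 2))

raw = canonical_correlations(X, Y)
# rescale the columns to completely different units
scaled = canonical_correlations(X * [1000.0, 0.001], Y * [2.54, 760.0])

print(np.allclose(raw, scaled))  # True: rescaling changes nothing
```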

  • @Davide-yg5ny
    @Davide-yg5ny 1 year ago +1

    you're a life-saver

  • @milrione8425
    @milrione8425 1 year ago +1

    So well explained!! Thank you!!

  • @khushpatelmd
    @khushpatelmd 2 years ago

    You are the best stats professor!! Thanks so much

    • @tilestats
      @tilestats  2 years ago

      Thank you!

  • @dr024
    @dr024 3 months ago +1

    very clear! Thank you =)

  • @user-vl4gn2ob3j
    @user-vl4gn2ob3j 1 year ago +1

    Thanks a lot! Very helpful!

  • @joshuagervin2845
    @joshuagervin2845 1 year ago +1

    Thanks!

  • @JsoProductionChannel
    @JsoProductionChannel 1 year ago +1

    Thank you

  • @mdmahmudulhasanmiddya9632
    @mdmahmudulhasanmiddya9632 2 years ago

    You are a very knowledgeable person.

  • @Bommi-oz7rs
    @Bommi-oz7rs 3 months ago +1

    Does anybody have step-by-step notes for this worked example? Please reply.

  • @nadhilala
    @nadhilala 2 years ago

    thank you so much for your explanation! it is very helpful

  • @aakashyadav1589
    @aakashyadav1589 2 years ago

    Your stats videos are great.

  • @yaweli2968
    @yaweli2968 3 months ago

    Can you share a link to a nice multivariate linear regression dataset with at least 4 dependent variables and at least 2 outcome variables, if possible?

  • @ebrahimfeghhi1777
    @ebrahimfeghhi1777 1 year ago

    Great lecture

  • @shaoneesaha6073
    @shaoneesaha6073 25 days ago

    Despite the negative coefficient value, does a taller person have lower blood pressure while a heavier person has higher blood pressure? This is not clear to me. I also got this type of result in CCA but can't interpret it. Would anyone please explain it to me?

  • @ernestamoore4385
    @ernestamoore4385 1 year ago

    Excellent video. One question though: how do you choose whether to use CCA or PLS? The difference is that PLS maximises the covariance between the datasets whereas CCA maximises the correlation.

    • @tilestats
      @tilestats  1 year ago

      I would use CCA for correlation and PLS for regression. I have a video about PLS as well:
      ruclips.net/video/Vf7doatc2rA/видео.html
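The distinction in this exchange can be made concrete with a small NumPy sketch (toy data, not from the video): the first PLS weight pair is the top singular-vector pair of the cross-covariance block, which maximises the covariance of the scores, while CCA whitens each block first and so maximises the correlation of the scores. Because CCA maximises correlation over all linear projections, its score correlation is always at least as large as PLS's.

```python
import numpy as np
from numpy.linalg import svd, inv, cholesky

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))
Y = X @ rng.normal(size=(3, 2)) + rng.normal(size=(300, 2))
Xc, Yc = X - X.mean(0), Y - Y.mean(0)
Sx, Sy, Sxy = Xc.T @ Xc, Yc.T @ Yc, Xc.T @ Yc

# PLS: top singular vectors of the raw cross-covariance block
U, _, Vt = svd(Sxy)
a_pls, b_pls = U[:, 0], Vt[0, :]

# CCA: whiten each block via its Cholesky factor, then the same SVD
Lx, Ly = cholesky(Sx), cholesky(Sy)
U2, _, Vt2 = svd(inv(Lx) @ Sxy @ inv(Ly).T)
a_cca, b_cca = inv(Lx).T @ U2[:, 0], inv(Ly).T @ Vt2[0, :]

def corr(u, v):
    return np.corrcoef(u, v)[0, 1]

r_cca = corr(Xc @ a_cca, Yc @ b_cca)
r_pls = corr(Xc @ a_pls, Yc @ b_pls)
print(r_cca >= r_pls)  # True: CCA's score correlation dominates
```

Consistent with the reply above: prefer CCA when the strength of association between the two sets is the quantity of interest, and PLS when the components are meant to be used for regression.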

  • @zk1560
    @zk1560 1 year ago

    Hi, I tried to reproduce what you are showing here in Python but got totally different results. Are the calculations based on the numbers shown in the video, or are you using something else as input?

    • @tilestats
      @tilestats  1 year ago

      Yes, I used the example data in R. What is your output?

  • @KS-df1cp
    @KS-df1cp 2 years ago

    What would happen if we did not take the inverse at the 6:46 timestamp? What if we multiplied all of them as they are? Thank you.

  • @EashwarMurali
    @EashwarMurali 1 year ago

    Is there further theory behind the equation introduced at 6:25? Can you suggest some reading material for concrete proofs?

    • @tilestats
      @tilestats  1 year ago

      Check wiki
      en.wikipedia.org/wiki/Canonical_correlation
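One concrete anchor to the theory: the eigenvalues of the product inv(Rx)·Rxy·inv(Ry)·Ryx (and of its Ry-side counterpart) are the squared canonical correlations. A sketch using the 2×2 matrix quoted further down in this thread (assuming, as there, that it is this product for the video's example):

```python
import numpy as np

# 2x2 product matrix quoted in the replies further down this thread
M = np.array([[-0.164, -0.322],
              [ 0.430,  0.722]])

lam, vecs = np.linalg.eig(M)
lam = np.sort(lam)[::-1]   # largest first, as R's eigen() reports them

# the eigenvalues are the squared canonical correlations
rho = np.sqrt(lam)
print(lam)  # ≈ [0.5194, 0.0386]
print(rho)  # ≈ [0.7207, 0.1965]
```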

  • @youssefsamernarouz8608
    @youssefsamernarouz8608 1 year ago

    Thank youuuu

  • @halilibrahimakgun7569
    @halilibrahimakgun7569 3 months ago

    The eigenvectors for Rx and Ry are wrong; I calculated different results. Are you sure about the eigenvalue calculation for Rx and Ry? The first and second eigenvectors and eigenvalues are in swapped positions.

    • @tilestats
      @tilestats  3 months ago

      If you run the following code in R for, for example, Ry,
      mat=matrix(c(-0.164,0.430,
      -0.322,0.722),2,2)
      eigen(mat)
      you will get the following eigenvectors and eigenvalues:
      $values
      [1] 0.51939343 0.03860657
      $vectors
      [,1] [,2]
      [1,] 0.4262338 -0.8463918
      [2,] -0.9046130 0.5325607
      Please share your own calculations so that I can have a look.

    • @halilibrahimakgun7569
      @halilibrahimakgun7569 3 months ago

      Ry = [ -0.164 -0.322
      0.430 0.722 ]
      But the code you gave in R
      is the transpose of this matrix.
      You gave the wrong input matrix. Or should we take the transpose before computing the eigenvectors? @tilestats

    • @tilestats
      @tilestats  3 months ago

      No, in R you fill in the numbers by column. If you would rather fill in by rows, you do it like this, which gives the exact same matrix and eigenvectors:
      mat=matrix(c(-0.164,-0.322,
      0.430,0.722),2,2,byrow = TRUE)
      eigen(mat)

    • @halilibrahimakgun7569
      @halilibrahimakgun7569 3 months ago

      @@tilestats
      import numpy as np
      A = np.array([[-0.164, -0.322], [0.430, 0.722]])
      # Calculate eigenvalues and eigenvectors
      eigenvalues, eigenvectors = np.linalg.eig(A)
      print("Eigenvalues:", eigenvalues)
      print("Eigenvectors:", eigenvectors)
      This code prints the reverse of it;
      I don't know why there is a difference in Python.

    • @tilestats
      @tilestats  3 months ago

      The way you rotate the data is arbitrary, so it does not matter if you get the reverse values. The eigenvalues are correct, right?
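To make the cross-language behaviour in this thread reproducible: NumPy's `eig` does not guarantee any ordering of eigenpairs, and each eigenvector is only defined up to a nonzero rescaling (including a sign flip), whereas R's `eigen()` returns eigenvalues in decreasing order. Sorting NumPy's output makes the two agree. A sketch with the matrix from this thread:

```python
import numpy as np

A = np.array([[-0.164, -0.322],
              [ 0.430,  0.722]])

w, V = np.linalg.eig(A)

# sort eigenpairs by decreasing eigenvalue to match R's eigen()
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]

# each column of V is an eigenvector; a sign-flipped column is equally valid
for i in range(len(w)):
    assert np.allclose(A @ V[:, i], w[i] * V[:, i])

print(w)  # ≈ [0.5194, 0.0386], the order R reports
```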