23: Mahalanobis distance

  • Published: 3 Jul 2024
  • Multivariate distance with the Mahalanobis distance. Using eigenvectors and eigenvalues of a matrix to rescale variables.
  • Science

Comments • 69

  • @Nobody-md5kt
    @Nobody-md5kt 1 year ago +5

    This is fantastic. I'm a software engineer currently learning about why our cosine similarity functions aren't doing so hot on our large embedding vectors for a large language model. This helps me understand what's happening behind the scenes much better. Thank you!

    • @zainshiraz5239
      @zainshiraz5239 8 days ago

      Can you share your observations regarding the research?
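
A minimal R sketch of the idea in this thread (my own illustration, not from the video): unlike cosine similarity, the Mahalanobis distance takes the covariance structure of the embedding set into account. The toy data and dimensions here are made up.

    set.seed(1)
    X <- matrix(rnorm(200 * 8), ncol = 8)   # 200 toy "embedding" vectors, 8 dims
    q <- rnorm(8)                           # a toy query vector

    # cosine similarity of each row of X with q (ignores covariance)
    cos_sim <- (X %*% q) / (sqrt(rowSums(X^2)) * sqrt(sum(q^2)))

    # squared Mahalanobis distance of each row to q (rescales by covariance)
    d2 <- mahalanobis(X, center = q, cov = cov(X))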

  • @tyronelagore1479
    @tyronelagore1479 2 years ago +4

    BEAUTIFULLY Explained. It would have been great to see the transformed plot to understand the effect it has, though you did explain it quite well verbally.

  • @qqq_Peace
    @qqq_Peace 4 years ago +1

    Excellent explanation of scaling covariance within the data. And linking it to PCA makes it easy to understand the ideas behind it!

  • @dom6002
    @dom6002 4 months ago +1

    It's remarkable how inept professors are at explaining the simplest of concepts. You have surpassed most of mine, thank you very much.

    • @yee6365
      @yee6365 4 months ago +1

      Well, this is an applied statistics course, so it's way more useful than most theoretical ones.

  • @alvarezg.adrian
    @alvarezg.adrian 8 years ago

    Great! Understanding concepts is better than copying formulas. Thank you for your conceptual explanation.

  • @cries3168
    @cries3168 2 years ago +2

    Great video, love your style of explanation, really easy to follow along! Much better than my stats lecturer!

  • @sheenanasim
    @sheenanasim 6 years ago

    Wonderful explanation!! Even a complete beginner can pick this up. Thanks!

  • @anthonykoedyk715
    @anthonykoedyk715 1 year ago +2

    Thank you for explaining the link between eigenvectors and Mahalanobis distance. I'd been learning both with no linkage between them!

  • @chelseyli7478
    @chelseyli7478 2 years ago

    Thank you! You made eigenvectors, eigenvalues, and Mahalanobis distance clear to me. Best video on these topics.

  • @s3d871
    @s3d871 3 years ago

    Great job, saved me a lot of time!

  • @seyedmahdihosseini6748
    @seyedmahdihosseini6748 3 years ago

    Perfect explanation, with a thorough understanding of the underlying mathematical concepts.

  • @sacman3001
    @sacman3001 8 years ago +1

    Awesome explanation! Thank you for posting!

  • @pavster3
    @pavster3 4 years ago

    Excellent video - very clear. Thanks very much for posting.

  • @monta7834
    @monta7834 7 years ago +3

    Great introduction to the problem and explanation of the basics. Wish I had found this earlier, before wasting so much time on videos/articles by people who could only explain complicated things in even more complicated ways.

  • @tinAbraham_Indy
    @tinAbraham_Indy 4 months ago

    I truly enjoyed watching this tutorial. Thank you.

  • @vangelis9911
    @vangelis9911 3 years ago

    Good job explaining a rather complicated concept, thank you.

  • @leonardocerliani3479
    @leonardocerliani3479 3 years ago

    Amazing video! Thank you so much!

  • @vishaljain4915
    @vishaljain4915 1 year ago

    Could not have gotten confused even if I tried to, really clear explanation.

  • @the_iurlix
    @the_iurlix 5 years ago

    So clear!! Thanks man!

  • @sanjaykrish8719
    @sanjaykrish8719 3 years ago

    Simply superb. You made my day.

  • @thinhphan5404
    @thinhphan5404 4 years ago

    Thank you. This video helped me a lot.

  • @ajeetis
    @ajeetis 8 years ago

    Nicely explained. Thank you!

  • @kamilazdybal
    @kamilazdybal 6 years ago

    Great video, thank you!

  • @thuongdinh5990
    @thuongdinh5990 8 years ago

    Awesome job, thank you!

  • @liuzeyuan
    @liuzeyuan 2 years ago

    Very well explained, thank you so much Matt.

  • @domenicodifraia7338
    @domenicodifraia7338 4 years ago

    Great video man! Thanks a lot! : )

  • @bautistabaiocchi-lora1339
    @bautistabaiocchi-lora1339 3 years ago

    Really well explained. Thank you.

  • @mojtabakhayatazad2944
    @mojtabakhayatazad2944 1 year ago

    A very good video for anyone who wants to feel math like physics

  • @LuisRIzquierdo
    @LuisRIzquierdo 2 years ago +3

    Great video, thank you so much!! Just a minor comment that you probably know, but I think it was not clear in the video at around 8:27: eigenvalues do not have to be integers, they can be any scalar (in general, they are complex numbers), and the set of eigenvalues is a property of the linear transformation (i.e. of the matrix). You can scale any eigenvector, and it will still have the same eigenvalue associated with it. In any case, thank you so much for your excellent video!
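
A quick check of the point above (an illustrative toy matrix, not one from the video): scaling an eigenvector by any nonzero scalar leaves its eigenvalue unchanged.

    A <- matrix(c(2, 1, 1, 2), nrow = 2)   # an arbitrary symmetric 2x2 matrix
    e <- eigen(A)
    v <- e$vectors[, 1]                    # first eigenvector (eigenvalue 3)
    A %*% (5 * v)                          # equals 3 * (5 * v): eigenvalue still 3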

  • @oldfairy
    @oldfairy 4 years ago

    Thank you, great explanation. Subscribed to your channel after this video.

  • @KayYesYouTuber
    @KayYesYouTuber 5 years ago

    Beautiful explanation. Thank you.

  • @muskduh
    @muskduh 1 year ago

    Thanks for the video

  • @aashishadhikari8144
    @aashishadhikari8144 3 years ago +1

    Came to learn the Mahalanobis distance, understood why the Mahalanobis distance is defined that way, and what PCA does. :D Thanks.

  • @jonaspoffyn
    @jonaspoffyn 7 years ago +3

    Small remark: on the slide where you do the matrix-by-vector multiplication (@6:42) the colours are definitely wrong. The results are correct, but the colours for both rows should be:
    black*red + grey*blue

  • @pockeystar
    @pockeystar 6 years ago +1

    How is the inverse of the covariance matrix linked to the shrinkage along the eigenvectors?
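
One way to see the link asked about above (an illustrative check, not from the video): the inverse covariance matrix has the same eigenvectors as the covariance matrix but reciprocal eigenvalues, so multiplying by it shrinks points the most along the highest-variance directions.

    S <- matrix(c(4, 2, 2, 3), nrow = 2)   # an arbitrary covariance matrix
    eigen(S)$values                        # approx. 5.56 and 1.44
    eigen(solve(S))$values                 # their reciprocals: approx. 0.69 and 0.18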

  • @ojussinghal2501
    @ojussinghal2501 2 years ago

    This video is such a gem 🤓

  • @HyunukHa
    @HyunukHa 3 years ago

    Clear explanation.

  • @souravde6116
    @souravde6116 3 years ago +1

    Lots of doubts:
    Q1) If x is a vector defined by x = [x1; x2; x3; ...; xn], what will be the size of the covariance matrix C?
    Q2) If x is an M-by-N matrix, where M is the number of state vectors and N is the total number of observations of each vector at different time instants, then how do we calculate the Mahalanobis norm, what is the final size of D, and what inference can we draw from this metric?
    Q3) If x is an N-by-N matrix, then again how do we calculate the Mahalanobis norm, what is the final size of D, and what inference can we draw from this metric?
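
For what they're worth, answers to Q1 and Q2 can be checked directly in R (a toy illustration of my own, not from the video): for n variables the covariance matrix is n-by-n, and for M observations of N variables, mahalanobis() returns one squared distance per observation.

    M <- 100; N <- 5
    x <- matrix(rnorm(M * N), nrow = M)    # M observations of N variables
    C <- cov(x)                            # covariance matrix is N-by-N
    d2 <- mahalanobis(x, colMeans(x), C)   # one squared distance per row
    dim(C); length(d2)                     # prints 5 5, then 100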

  • @linduchyable
    @linduchyable 8 years ago +1

    Hello, is the process of removing outliers from a variable more than one time considered manipulating or changing the data? I have loans for the public. Its mean is .17093, std. dev. .955838, skewness 7.571, kurtosis 61.436; most of the cases of this loan are outliers. After several rounds of ranking and replacing the missing values with the mean I reach this output: mean .2970, std. dev. .22582, skewness 2.301, kurtosis 3.885, and it ends up being positively skewed.
    I don't know what to do. Shall I keep it this way, or take the first one, or do I have to continue, knowing that the 5th, 10th, 25th, 50th and 75th percentiles end up with the same number? Please help :(

  • @user-ps8st5vo2h
    @user-ps8st5vo2h 8 years ago

    Thank you!

  • @1982Dibya
    @1982Dibya 8 years ago +14

    Great video. But could you please explain in detail how the inverse covariance matrix and eigenvectors relate to the Mahalanobis distance? That would be very helpful.

    • @PD-vt9fe
      @PD-vt9fe 3 years ago +1

      I have the same question. After doing some research, it turns out that eigenvectors can help with the multiplication step. More specifically, a symmetric S can be written as S = P * D * P_T; P consists of eigenvectors and is an orthogonal matrix, D is a diagonal matrix of eigenvalues, and P_T is the transpose of P. This can help speed up the calculation.
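
A small check of the decomposition described above (a toy covariance matrix, not from the video): S reconstructs from its eigenvectors and eigenvalues, and the same factorization gives a cheap inverse for the Mahalanobis calculation.

    S <- cov(matrix(rnorm(300), ncol = 3))     # toy 3x3 covariance matrix
    e <- eigen(S)
    P <- e$vectors; D <- diag(e$values)
    max(abs(S - P %*% D %*% t(P)))             # ~ 0: S = P D t(P)
    max(abs(solve(S) - P %*% diag(1 / e$values) %*% t(P)))   # ~ 0: easy inverse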

  • @linkeris7994
    @linkeris7994 5 years ago

    very useful!

  • @muratcan__22
    @muratcan__22 5 years ago +1

    Why do we need to remove the covariance in the data?

  • @deashehu2591
    @deashehu2591 8 years ago +4

    Thank you Sir! We need more intuition and fewer formulas. Please do more videos....

  • @deepakkumarshukla
    @deepakkumarshukla 4 years ago

    Best!

  • @deepakjain4481
    @deepakjain4481 1 month ago

    Thanks a lot.

  • @anindadatta164
    @anindadatta164 2 years ago

    A clear statement of the conclusion in the video would have been appreciated by beginners, e.g.: MD is the squared z-score of a multivariate sample, calculated after removing the collinearity among the variables.
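
That reading can be sanity-checked in one dimension (a toy check of my own, not from the video), where the squared Mahalanobis distance reduces to the squared z-score:

    x  <- rnorm(50)
    z2 <- ((x - mean(x)) / sd(x))^2               # squared z-scores
    d2 <- mahalanobis(cbind(x), mean(x), var(x))  # 1-D squared Mahalanobis distance
    max(abs(z2 - d2))                             # ~ 0: they agree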

  • @danspeed93
    @danspeed93 4 years ago

    Thanks, clear.

  • @shourabhpayal1198
    @shourabhpayal1198 2 years ago

    Amazing, sir.

  • @colinweaver2097
    @colinweaver2097 3 years ago +1

    Is there a good textbook that covers this?

  • @achillesarmstrong9639
    @achillesarmstrong9639 6 years ago

    Nice video.

  • @raditz2488
    @raditz2488 3 years ago

    @7:35 maybe there is a typo and the eigenvectors are put in wrongly. The eigenvectors as per my calculations are [-0.85623911, -0.5165797] and [0.5165797, -0.85623911]. Can anyone verify this?

  • @mithunLOL
    @mithunLOL 3 years ago

    The multiple of the eigenvector doesn't necessarily have to be an integer?! It could be any scalar?

  • @zaphbeeblebrox5333
    @zaphbeeblebrox5333 2 years ago +2

    "Square n-dim matrices have n eigenvectors". Not true. eg. a matrix that represents a rotation has no eigenvalues or eigenvectors.

  • @TheGerakas
    @TheGerakas 7 years ago +7

    Your voice sounds like Tom Hanks!

  • @bettys7298
    @bettys7298 5 years ago

    Hi Matthew, I have a problem when using R to compute it. Could you help me fix it? Thank you so much in advance! Here's the error, and how I tried to fix it but failed:
    1. the error:
    > mahal = mahalanobis(x,
    + colMeans(x)
    + cov(x, use="pairwise.complete.obs"))
    Error: unexpected symbol in:
    " colMeans(x)
    cov"
    2. the fix:
    is.array(nomiss[, -c(1,2)]) (----->result= False)
    x

    • @lydiakoutrouditsou8514
      @lydiakoutrouditsou8514 5 years ago

      you've created an object called temArray, and then tried to run the analysis on an object called temPArray?
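
For anyone hitting the same "unexpected symbol" error above: it is most likely caused by the missing comma after colMeans(x) in the original call. A corrected version (assuming x is a complete numeric matrix) would be:

    mahal <- mahalanobis(x,
                         colMeans(x),
                         cov(x, use = "pairwise.complete.obs"))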

  • @1982Dibya
    @1982Dibya 8 years ago

    Could you please explain how the Mahalanobis distance is related to eigenvectors? The video is very good and helpful, but it would help if you could explain how to use it starting from the eigenvectors.

    • @MatthewEClapham
      @MatthewEClapham 8 years ago +3

      The eigenvector is a direction. Essentially, the points are rescaled by compressing them in the eigenvector directions, but by different amounts along each eigenvector. This removes covariance in the data. That's basically what the Mahalanobis distance does.

    • @muratcan__22
      @muratcan__22 5 years ago

      @@MatthewEClapham Why do we need to remove the covariance in the data in the first place?

    • @bhupensinha3767
      @bhupensinha3767 5 years ago

      @muratcan__22: Hope you have the answer by now!!!

    • @cesarvillalobos1778
      @cesarvillalobos1778 5 years ago

      @@muratcan__22 The Euclidean distance problem.

    • @cesarvillalobos1778
      @cesarvillalobos1778 5 years ago

      @muratcan__22 Going a little deeper: covariance is a property of random variables, but to use the Euclidean distance you just have a set of points with their positions and the distances between them; you don't have random variables, so it doesn't make sense to talk about covariance. The trick is: random variables.
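
A compact way to see the rescaling Matthew describes above (an illustrative sketch, not code from the video): project the centered data onto the eigenvectors, divide each coordinate by the square root of its eigenvalue, and the plain Euclidean distance in the new coordinates matches the Mahalanobis distance.

    set.seed(42)
    x  <- matrix(rnorm(400), ncol = 2) %*% matrix(c(2, 1, 0, 1), 2)  # correlated toy data
    e  <- eigen(cov(x))
    xc <- sweep(x, 2, colMeans(x))                       # center the data
    w  <- xc %*% e$vectors %*% diag(1 / sqrt(e$values))  # compress along eigenvectors
    max(abs(rowSums(w^2) - mahalanobis(x, colMeans(x), cov(x))))   # ~ 0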

  • @XarOOraX
    @XarOOraX 9 months ago

    This story seems straightforward - yet after 8 minutes I am still clueless as to where it is going to lead. Maybe it is just me, but when I need to learn something, I don't want a long tension arc: oh, what is going to happen next... I want to start with a big picture of what is going to happen, and then fill in the details one after another, so I can sit and marvel at how the big initial problem step by step dissolves into smaller and understandable pieces. Inverting the story, starting from the conclusion and going to the basics, also lets you stop once you have understood enough.

  • @StefanReii
    @StefanReii 4 years ago

    Well explained, thank you!