Multivariate normal distributions

  • Published: 22 Aug 2024
  • The mathematical form of the multivariate normal (Gaussian) distribution, and five useful properties of this distribution

Comments • 20

  • @debasiskar4662
    @debasiskar4662 5 months ago +1

    The analogy with the bivariate case is shown very elegantly. Thanks.

  • @FauzaanSharieff
    @FauzaanSharieff 7 months ago +2

    Mr Baker's teaching is very good and clear! And am I the only one who thinks he looks and sounds like Woody from Toy Story?

  • @karina_tai5857
    @karina_tai5857 1 year ago +4

    Thank you! Concepts are very well-explained!

  • @esmaeilmorshedi3573
    @esmaeilmorshedi3573 2 years ago +3

    Perfect, Professor Baker!

  • @JianaMeng
    @JianaMeng 1 year ago +2

    very clear explanation! thanks

  • @ryanfox2478
    @ryanfox2478 1 year ago +1

    Clear explanation. Thank you.

  • @mlfacts7973
    @mlfacts7973 7 months ago

    Great video. Thank you!

  • @rahul_a22
    @rahul_a22 4 months ago +1

    After going through so many videos on the topic, my search ends here...

  • @sirkelvinmalunga
    @sirkelvinmalunga 10 months ago +1

    thank you!

  • @ks.4494
    @ks.4494 6 months ago

    Important concepts, clearly explained! Thanks

  • @klevisimeri607
    @klevisimeri607 8 months ago

    Thank you!

  • @abhinavdaggubelli991
    @abhinavdaggubelli991 1 month ago

    But we can't say anything about the independence between two random variables just from the covariance between them being zero, right? Then how does the 4th property work? Can you clarify, please?
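
    Assuming the 4th property referred to here is that uncorrelated components of a multivariate normal are independent, the resolution is that zero covariance alone is indeed not enough in general, but zero covariance together with joint normality is: when the covariance matrix is diagonal, the joint density factorises into the product of the marginal densities. A minimal numerical sketch of that factorisation (using SciPy; the means and variances below are made-up illustration values):

      import numpy as np
      from scipy.stats import multivariate_normal, norm

      # Bivariate normal whose two components have zero covariance.
      m = np.array([0.0, 1.0])
      S = np.diag([2.0, 0.5])          # off-diagonal (covariance) entries are zero

      joint = multivariate_normal(mean=m, cov=S)
      x1 = norm(loc=m[0], scale=np.sqrt(S[0, 0]))   # marginal of the first component
      x2 = norm(loc=m[1], scale=np.sqrt(S[1, 1]))   # marginal of the second component

      pts = np.array([[0.0, 0.0], [1.5, -0.3], [-2.0, 2.0]])
      print(joint.pdf(pts))                          # joint density at a few points
      print(x1.pdf(pts[:, 0]) * x2.pdf(pts[:, 1]))   # product of marginals -- matches

    For variables that are not jointly Gaussian, the product of marginals generally does not match the joint density even when the covariance is zero, which is why zero covariance does not imply independence in general.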

  • @ankursingh5555
    @ankursingh5555 1 year ago +1

    Thank you

  • @user-hr8uj4qw4k
    @user-hr8uj4qw4k 7 months ago

    Is there a way to derive the marginal pdf of each component X_i without resorting to the moment generating function?

  • @mohamedelmoghrabi5169
    @mohamedelmoghrabi5169 2 months ago

    Is it possible to get a PDF copy of the lecture?

  • @inaswulanramadhani4036
    @inaswulanramadhani4036 1 year ago +1

    Excuse me, sir. Thank you for the video, but I don't understand yet. Can you please give us an example of how to find the variance and covariance of a random vector, if the expected values are real numbers?

    • @berke-ozgen
      @berke-ozgen 9 months ago +1

      For the variance, you construct E[(X - E[X])(X - E[X])^T], which is the second central moment; this gives Var[X]. For Cov[X, Y], you calculate E[(X - m_x)(Y - m_y)^T] (be careful, this involves a transpose). You then get an n x n matrix (an (n x 1) times (1 x n) product gives an n x n matrix). The diagonal of that matrix holds the variance of each component, and the off-diagonal terms are the covariances between the corresponding x_i and y_i. (m stands for mean.) Hope that clears it up!
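
      A minimal numerical sketch of that recipe (NumPy; the mean vector and covariance matrix below are made-up illustration values), showing the averaged outer product of deviations and its diagonal/off-diagonal structure:

        import numpy as np

        rng = np.random.default_rng(0)

        # A toy bivariate normal: mean vector m and covariance matrix S (n = 2).
        m = np.array([1.0, -2.0])
        S = np.array([[2.0, 0.6],
                      [0.6, 1.0]])

        # Many samples of the random vector X (shape: samples x n).
        X = rng.multivariate_normal(m, S, size=100_000)

        # Var[X] = E[(X - E[X])(X - E[X])^T]: averaged outer product of deviations.
        dev = X - X.mean(axis=0)            # X - E[X], estimated from the sample
        cov = dev.T @ dev / (len(X) - 1)    # n x n matrix

        print(cov)            # diagonal ~ variances, off-diagonal ~ covariances
        print(np.cov(X.T))    # same estimate via NumPy's built-in

      The diagonal entries come out close to 2.0 and 1.0 (the variances) and the off-diagonal entries close to 0.6 (the covariance), matching the structure described above.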

  • @spyhunter0066
    @spyhunter0066 2 years ago

    How can we get n mean values for n random x values in one data set? I mean, if I have a data set with a Gaussian shape, shouldn't it have only one mean and one sigma? By the way, I have a histogram showing counts per channel, up to 1024 channels for instance.
    In the example you gave at 13:14, how would you construct it if you had one mean and one sigma but a vector of random variables, as in the example I tried to explain above (instead of having the mean vector and Sigma matrix)? Also, in the example at 15:00, you changed the x vector containing the variables x1 and x2 to an underlined x containing the vectors x1 and x2. That confused everything.

    • @berke-ozgen
      @berke-ozgen 9 months ago

      I hope this answers it: in the vector X we have x1, x2, ..., xn. Each of these xi is itself a vector of observed numbers, so X is really a collection of random variables stacked into a vector. You therefore get a different mean value for each of x1, x2, ..., xn.
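
      A small sketch of that structure (NumPy; the three components and their means/sigmas below are arbitrary illustration values): each component x1, x2, x3 is its own random variable with many observed values, so the stacked vector X gets a mean vector with one entry per component, while a single histogram of one quantity is just the n = 1 case, where everything collapses to one mean and one variance.

        import numpy as np

        rng = np.random.default_rng(1)

        # n = 3 random variables (e.g. three measured quantities), N observations each.
        N = 10_000
        x1 = rng.normal(5.0, 1.0, N)
        x2 = rng.normal(-1.0, 2.0, N)
        x3 = rng.normal(0.0, 0.5, N)

        X = np.column_stack([x1, x2, x3])   # each row is one observation of the vector X

        print(X.mean(axis=0))   # mean vector: one mean per component x1, x2, x3
        print(np.cov(X.T))      # 3 x 3 covariance matrix

        # A single histogram (one quantity) is the n = 1 case: the "mean vector"
        # and "covariance matrix" collapse to a single mean and a single variance.
        print(x1.mean(), x1.var(ddof=1))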

  • @user-je4xw6tx3k
    @user-je4xw6tx3k 11 months ago

    Why is my professor not you?