Machine Learning 3.2 - Linear Discriminant Analysis (LDA) and Quadratic Discriminant Analysis (QDA)

  • Published: 21 Sep 2024

Comments • 28

  • @lizzy1138
    @lizzy1138 3 years ago +4

    Thanks for this! I needed to clarify these methods in particular, was reading about them in ISLR

  • @Spiegeldondi
    @Spiegeldondi 1 year ago +1

    A very good and concise explanation, even starting with the explanation of likelihood. Very well done!

  • @JappieYow
    @JappieYow 3 years ago +7

    Interesting and clear explanation! Thank you very much, this will help me in writing my thesis!

  • @gingerderidder8665
    @gingerderidder8665 3 months ago

    This beats my MIT lecture. Will be coming back for more!

  • @neftalisalazar2352
    @neftalisalazar2352 7 months ago +1

    I enjoyed watching your video, thank you. I will watch more of your machine learning videos, thank you!

  • @huilinchang8027
    @huilinchang8027 4 years ago +8

    Awesome lecture, thank you professor!

  • @Sam1998Here
    @Sam1998Here 1 month ago

    Thank you for your explanation. I also think that at 8:15 the multivariate normal distribution's probability density function should have $\sqrt{|\Sigma|}$ in the denominator (rather than $|\Sigma|$ as it currently appears), and it may also be helpful to let viewers know that $p$ represents the dimension of the space being considered.
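
    For reference, the density in question is the multivariate normal pdf, which in its standard form does carry the square root of the determinant in the denominator, with $p$ the dimension of the feature space:

    $$
    f(\mathbf{x}) = \frac{1}{(2\pi)^{p/2}\sqrt{|\Sigma|}}
    \exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{\top}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right)
    $$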

  • @ofal4535
    @ofal4535 1 year ago +2

    I was trying to read it myself, but you made it so much simpler.

  • @Dhdhhhjjjssuxhe
    @Dhdhhhjjjssuxhe 1 year ago +2

    Good job. It is very easy to follow and understand

  • @vi5hnupradeep
    @vi5hnupradeep 3 years ago +4

    Thank you so much! Cleared a lot of my doubts.

  • @geo123473
    @geo123473 10 months ago +1

    Really great video! Thank you, professor!! :)

  • @spencerantoniomarlen-starr3069
    @spencerantoniomarlen-starr3069 1 year ago +1

    10:48 ohhhhh, I was just going back and forth between the sections on LDA and QDA in three different textbooks (An Introduction to Statistical Learning, Applied Predictive Analytics, and The Elements of Statistical Learning) for well over an hour, and that multivariate normal pdf was really throwing me off, mostly because of the capital sigma raised to the negative first power. I didn't realize it was literally a capital sigma; I kept thinking it was a summation of something!
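
    For context, the $\Sigma^{-1}$ in that pdf is the inverse of the covariance matrix $\Sigma$, not a summation. It is the same term that appears in ISLR's linear discriminant function for class $k$:

    $$
    \delta_k(\mathbf{x}) = \mathbf{x}^{\top}\Sigma^{-1}\boldsymbol{\mu}_k
    - \tfrac{1}{2}\boldsymbol{\mu}_k^{\top}\Sigma^{-1}\boldsymbol{\mu}_k + \log \pi_k
    $$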

  • @黃楷翔-h8j
    @黃楷翔-h8j 2 years ago +2

    Very useful information, thank you professor!

    • @billbasener8784
      @billbasener8784  2 years ago

      I am glad it's helpful! Thanks for the kind words.

  • @zhengcao6529
    @zhengcao6529 3 years ago +1

    You are so great. Please keep it up.

  • @jaafarelouakhchachi6170
    @jaafarelouakhchachi6170 5 months ago +1

    Can you share the slides from the videos with me?

  • @MrRynRules
    @MrRynRules 3 years ago +1

    Thank you sir, well explained.

  • @pol4624
    @pol4624 2 years ago

    Very good video, thank you professor.

    • @billbasener8784
      @billbasener8784  2 years ago

      I am glad it is helpful. Thank you for the kind words!

  • @kaym2332
    @kaym2332 3 years ago +1

    Hi! If the classes are assumed to be normally distributed, does that imply that the features making up an observation are normally distributed as well?

    • @billbasener8784
      @billbasener8784  3 years ago +1

      Yes. If each class has a multivariate normal distribution, then each individual feature variable has a univariate normal distribution.
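
      A minimal sketch of this fact in code (the mean and covariance below are hypothetical; any marginal of a multivariate normal is univariate normal with mean $\mu_i$ and variance $\Sigma_{ii}$):

      import numpy as np

      # Hypothetical mean and covariance for a 2-feature class.
      rng = np.random.default_rng(0)
      mu = np.array([1.0, -2.0])
      Sigma = np.array([[2.0, 0.6],
                        [0.6, 1.0]])

      # Sample from the multivariate normal distribution.
      X = rng.multivariate_normal(mu, Sigma, size=100_000)

      # Each feature (column) is approximately univariate normal
      # with mean mu[i] and variance Sigma[i, i].
      print(X.mean(axis=0))  # close to [ 1.0, -2.0]
      print(X.var(axis=0))   # close to [ 2.0,  1.0]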

  • @saunokchakrabarty8384
    @saunokchakrabarty8384 1 year ago

    How do you get the values of 0.15 and 0.02? I'm getting different values.

    • @rmharp
      @rmharp 11 months ago

      Agreed. I got approximately 0.18 and 0.003, respectively.
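
      The numbers can't be checked without the parameters used in the video, but a minimal sketch of the underlying Bayes'-rule computation looks like this (all values below are hypothetical placeholders, not the video's):

      from scipy.stats import norm

      x = 4.0                            # hypothetical observation
      priors = [0.5, 0.5]                # hypothetical class priors pi_k
      params = [(2.0, 1.0), (6.0, 1.5)]  # hypothetical (mean, sd) per class

      # Posterior P(class k | x) = pi_k * f_k(x) / sum_j pi_j * f_j(x)
      likelihoods = [norm.pdf(x, mu, sd) for mu, sd in params]
      evidence = sum(p * f for p, f in zip(priors, likelihoods))
      posteriors = [p * f / evidence for p, f in zip(priors, likelihoods)]
      print(posteriors)  # approximately [0.331, 0.669] for these placeholders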

  • @haitaoxu3468
    @haitaoxu3468 3 years ago

    Could you share the slides?