Introduction to Machine Learning - 06 - Linear discriminant analysis

  • Published: 4 Nov 2024

Comments • 21

  • @googlesong8679
    @googlesong8679 A month ago

    this is the best LDA video I have seen. thank you so much.

  • @saketdeshmukh6881
    @saketdeshmukh6881 3 years ago +11

    I wish I had found this before my masters. intuitive with right amount of mathematical rigor.

  • @YuchengLin
    @YuchengLin 2 years ago +3

    So wonderfully presented! Whenever I started to feel there was much math, some cute drawings appeared to give me simple and visceral intuition.

  • @jiajieli5138
    @jiajieli5138 3 years ago +1

    Highly recommended Machine Learning Instruction!

  • @AD-ox4ng
    @AD-ox4ng A year ago +1

    This is my guess for the number of parameters (in the covariance matrix alone) at 38:16:
    Full - p^2 (There are p*p distinct elements)
    Diagonal - p (There are only p distinct elements along diagonal, all else is 0)
    Spherical - 1 (Same as diagonal but equal variance in all dimensions, so only one number to compute)
    If the model is separate, multiply the number above by 2, otherwise 1.
    Add 2p to account for the mean vectors as well. (There are p distinct means to calculate for each of the two classes)
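The counting in the comment above can be sketched in code. One caveat worth hedging: a full covariance matrix is symmetric, so it has p(p+1)/2 distinct entries rather than p^2; whether that matches the answer given in the video at 38:16 is not shown here. The function name and the two-class default below are illustrative assumptions, not from the lecture.

```python
# Sketch: free-parameter counts for Gaussian class models (LDA/QDA style).
# Assumes two classes and p feature dimensions. Note that a symmetric
# covariance matrix has p*(p+1)//2 distinct entries; counting p^2 (as in
# the comment's guess) double-counts the off-diagonal pairs.

def gaussian_param_count(p, covariance="full", separate=False, n_classes=2):
    cov_counts = {
        "full": p * (p + 1) // 2,   # symmetric: diagonal + upper triangle
        "diagonal": p,              # one independent variance per dimension
        "spherical": 1,             # a single shared variance
    }
    # "separate" = one covariance per class; otherwise a shared (pooled) one.
    cov_params = cov_counts[covariance] * (n_classes if separate else 1)
    mean_params = n_classes * p     # one p-dimensional mean vector per class
    return cov_params + mean_params

print(gaussian_param_count(3, "full", separate=True))  # -> 18 (2*6 + 6)
```

With a shared covariance this reduces to the LDA setting; separate covariances per class correspond to QDA.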

  • @woodworkingaspirations1720
    @woodworkingaspirations1720 A year ago +1

    This solved my problem. Thank you sir. Needed a summarized view of the math. Perfect.

  • @TheCrmagic
    @TheCrmagic 3 years ago +3

    Sir, You are a great teacher.

  • @IamMoreno
    @IamMoreno 2 years ago +1

    simply beautifully explained, sir you have all my gratitude

  • @micahdelaurentis6551
    @micahdelaurentis6551 3 years ago +2

    These have been excellent videos so far

  • @severian6879
    @severian6879 A year ago

    Excellent explanation! Thank u very much!

  • @xiaochelsey880
    @xiaochelsey880 2 years ago

    Great video. Thank you so much for showing all the math!

  • @Jeremy-zs3nn
    @Jeremy-zs3nn 3 years ago

    Thanks for posting - very helpful video. I did get a bit confused with some of the notation. Looking at the slide titled "Estimating Gaussian parameters" (25:49): the covariance matrix we're estimating is indexed over C_k, which is the subset of the design matrix for which Y=k? Are X and mu_k both matrices, or is mu_k a vector?

    •  3 years ago +2

      Thanks. Let me see... x_i is a vector (sample number i). mu_k is a vector (average over all samples belonging to class k, so with Y=k). Sigma_k is a matrix (covariance matrix over all samples belonging to class k). I usually use lowercase bold for vectors and uppercase bold for matrices.

    • @Jeremy-zs3nn
      @Jeremy-zs3nn 3 years ago

      @ great, thank you for the quick reply!
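The estimators described in the reply above (x_i and mu_k as vectors, Sigma_k as a matrix computed over the samples with Y=k) can be sketched as follows. This is a minimal illustration assuming NumPy arrays and maximum-likelihood estimates; the function name is hypothetical, not from the lecture slides.

```python
import numpy as np

# Sketch: per-class Gaussian parameter estimation as in the reply.
# X has shape (n, p); y has shape (n,) with class labels.
# For each class k: mu_k is a p-vector, Sigma_k is a (p, p) matrix.

def estimate_gaussian_params(X, y):
    params = {}
    for k in np.unique(y):
        X_k = X[y == k]                      # samples x_i with y_i = k
        mu_k = X_k.mean(axis=0)              # class mean: a vector, shape (p,)
        diff = X_k - mu_k
        Sigma_k = diff.T @ diff / len(X_k)   # MLE covariance: a matrix (p, p)
        params[k] = (mu_k, Sigma_k)
    return params
```

Pooling the per-class covariances into one shared Sigma gives the LDA decision rule; keeping a separate Sigma_k per class gives QDA.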

  • @calcifer7776
    @calcifer7776 3 years ago

    this is gold, thank you

  • @vincentole
    @vincentole 3 years ago

    Great videos! Thank you for this.

  • @nauraizsubhan01
    @nauraizsubhan01 3 years ago

    Sir, can you please tell: does this program offer any course related to robotics and autonomous systems?

  • @CootiePruitt
    @CootiePruitt 3 years ago

    👍 Great video - thank you!

  • @indigod3323
    @indigod3323 3 years ago +1

    Very great teacher, I wish I could study in Tubingen

  • @hfz.arslan
    @hfz.arslan 3 years ago

    Sir, can you please share the slides or notes? Thanks.

  • @sunshinebabe6203
    @sunshinebabe6203 3 years ago

    Thank you! :)