This is the best LDA video I have seen. Thank you so much.
I wish I had found this before my master's. Intuitive, with the right amount of mathematical rigor.
So wonderfully presented! Whenever I started to feel there was much math, some cute drawings appeared to give me simple and visceral intuition.
Highly recommended Machine Learning Instruction!
This is my guess for the number of parameters (in the covariance matrix alone) at 38:16:
Full - p(p+1)/2 (The matrix has p*p elements, but it is symmetric, so only p(p+1)/2 of them are distinct)
Diagonal - p (There are only p distinct elements along the diagonal, all else is 0)
Spherical - 1 (Same as diagonal but with equal variance in all dimensions, so only one number to estimate)
If the model uses a separate covariance per class, multiply the number above by 2 (for two classes); if the covariance is shared, leave it as is.
Add 2p to account for the mean vectors as well. (There are p distinct means to estimate for each of the two classes)
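The counting rule above can be sketched as a small helper. This is just my own illustration of the arithmetic (the function names and the two-class default are mine, not from the video):

```python
# Sketch of the parameter count for a p-dimensional Gaussian classifier.
def covariance_params(p, structure):
    """Free parameters in one covariance matrix of the given structure.

    'full' is symmetric, so only p*(p+1)/2 of its p*p entries are distinct;
    'diagonal' has p variances; 'spherical' has a single shared variance.
    """
    if structure == "full":
        return p * (p + 1) // 2
    if structure == "diagonal":
        return p
    if structure == "spherical":
        return 1
    raise ValueError(f"unknown structure: {structure}")


def total_params(p, structure, shared, n_classes=2):
    """Covariance parameters (shared, or one matrix per class) plus the means."""
    cov = covariance_params(p, structure) * (1 if shared else n_classes)
    means = n_classes * p  # one p-dimensional mean vector per class
    return cov + means
```

For p = 3 with separate full covariances this gives 2*6 + 2*3 = 18 parameters, versus 1 + 6 = 7 for a shared spherical covariance.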
This solved my problem. Thank you sir. Needed a summarized view of the math. Perfect.
Sir, You are a great teacher.
Simply beautifully explained. Sir, you have all my gratitude.
These have been excellent videos so far
Excellent explanation! Thank you very much!
Great video. Thank you so much for showing all the math!
Thanks for posting - very helpful video. I did get a bit confused with some of the notation. Looking at the slide titled "estimating gaussian parameters" (25:49) - the covariance matrix we're estimating is indexed over C_k, which is the subset of the design matrix for which Y=k? Are X and mu_k both matrices, or is mu_k a vector?
Thanks. Let me see... x_i is a vector (sample number i). mu_k is a vector (average over all samples belonging to class k, so with Y=k). Sigma_k is a matrix (covariance matrix over all samples belonging to class k). I usually use lowercase bold for vectors and uppercase bold for matrices.
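The estimates described in that reply can be sketched in a few lines of NumPy. This is only my reading of the notation (the function name is mine; `C_k` denotes the rows of X with y = k, as in the slide):

```python
import numpy as np

def estimate_gaussian_params(X, y, k):
    """Return (mu_k, Sigma_k) for class k from the n x p design matrix X.

    mu_k is the p-vector averaging the samples x_i with y_i = k;
    Sigma_k is their p x p covariance matrix (maximum-likelihood
    estimate, dividing by n_k rather than n_k - 1).
    """
    C_k = X[y == k]                        # samples belonging to class k
    mu_k = C_k.mean(axis=0)                # class mean vector
    diffs = C_k - mu_k                     # centered samples
    Sigma_k = diffs.T @ diffs / len(C_k)   # class covariance matrix
    return mu_k, Sigma_k
```

So x_i and mu_k are vectors of length p, while X and Sigma_k are matrices, matching the lowercase-bold / uppercase-bold convention.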
Great, thank you for the quick reply!
this is gold, thank you
Great videos! Thank you for this.
Sir, can you please tell: does this program offer any course related to robotics and autonomous systems?
👍 Great video - thank you!
Truly a great teacher. I wish I could study in Tübingen.
Sir, can you please share the slides or notes? Thanks.
Thank you! :)