- Videos: 41
- Views: 1,469,525
Kilian Weinberger
United States
Joined Feb 1, 2018
The DBSCAN Clustering Algorithm Explained
DBSCAN has become one of my favorite clustering algorithms.
The original paper is here: www.dbs.ifi.lmu.de/Publikationen/Papers/KDD-96.final.frame.pdf
(This video is part of the CS4780 Machine Learning Class.)
Views: 3,340
Videos
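The density-based idea the video explains can be sketched in a few lines. This is an illustrative implementation, not the code from the paper; `eps` and `min_pts` are the usual DBSCAN hyperparameters (neighborhood radius and core-point density threshold):

```python
import math

def region_query(points, i, eps):
    # Indices of all points within distance eps of points[i] (including i).
    return [j for j, q in enumerate(points)
            if math.dist(points[i], q) <= eps]

def dbscan(points, eps, min_pts):
    # Returns one label per point: a cluster id (0, 1, ...) or -1 for noise.
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1          # provisionally noise
            continue
        cluster += 1                # i is a core point: start a new cluster
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:     # previously noise: becomes a border point
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neighbors = region_query(points, j, eps)
            if len(j_neighbors) >= min_pts:  # j is itself a core point: expand
                seeds.extend(j_neighbors)
    return labels

# Two dense blobs plus one distant noise point.
pts = [(0, 0), (0, 0.1), (0.1, 0), (5, 5), (5, 5.1), (5.1, 5), (20, 20)]
print(dbscan(pts, eps=0.5, min_pts=2))  # → [0, 0, 0, 1, 1, 1, -1]
```

Note the asymmetry: only core points (with at least `min_pts` neighbors) grow the cluster, while border points are absorbed but not expanded — this is what lets DBSCAN find arbitrarily shaped clusters and mark outliers as noise.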
CS4780 Transformers (additional lecture 2023)
7K views · 1 year ago
A brief explanation of the Transformer Architecture used in GPT-3, ChatGPT for language modelling. (Uploaded here, for those who missed class due to the unusually nice weather :-) )
On the Importance of Deconstruction in Machine Learning Research
6K views · 3 years ago
This is a talk I gave in December 2020 at the NeurIPS Retrospective Workshop. I explain why it is so important to carefully analyze your own research contributions, through the story of three recent publications from my research group at Cornell University. In all three cases we first invented something far more complicated, only to realize that the gains could be attributed to something far simpl...
Machine Learning Lecture 18 "Review Lecture II" -Cornell CS4780 SP17
11K views · 4 years ago
In-class Kaggle Competition in less than 5 Minutes
12K views · 5 years ago
The Fall 2018 version of CS4780 featured an in-class Kaggle competition. The students had 3 weeks to beat my submission, for which I only had 5 minutes. Some students challenged me to show a screencast of me actually training and uploading the model in time, so here you go. Happy Xmas.
Machine Learning Lecture 22 "More on Kernels" -Cornell CS4780 SP17
23K views · 5 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote13.html www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote14.html
Machine Learning Lecture 37 "Neural Networks / Deep Learning" -Cornell CS4780 SP17
15K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote20.pdf
Machine Learning Lecture 36 "Neural Networks / Deep Learning Continued" -Cornell CS4780 SP17
14K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote20.pdf
Machine Learning Lecture 35 "Neural Networks / Deep Learning" -Cornell CS4780 SP17
20K views · 6 years ago
Machine Learning Lecture 34 "Boosting / Adaboost" -Cornell CS4780 SP17
17K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html
Machine Learning Lecture 33 "Boosting Continued" -Cornell CS4780 SP17
17K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html
Machine Learning Lecture 31 "Random Forests / Bagging" -Cornell CS4780 SP17
46K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote18.html If you want to take the course for credit and obtain an official certificate, there is now a revamped version (with much higher quality videos) offered through eCornell ( tinyurl.com/eCornellML ). Note, however, that eCornell does charge tuition for this version.
Machine Learning Lecture 21 "Model Selection / Kernels" -Cornell CS4780 SP17
28K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote11.html www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote12.html
Machine Learning Lecture 32 "Boosting" -Cornell CS4780 SP17
35K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html
Machine Learning Lecture 30 "Bagging" -Cornell CS4780 SP17
25K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote18.html
Machine Learning Lecture 29 "Decision Trees / Regression Trees" -Cornell CS4780 SP17
44K views · 6 years ago
Machine Learning Lecture 28 "Ball Trees / Decision Trees" -Cornell CS4780 SP17
30K views · 6 years ago
Machine Learning Lecture 27 "Gaussian Processes II / KD-Trees / Ball-Trees" -Cornell CS4780 SP17
30K views · 6 years ago
Machine Learning Lecture 26 "Gaussian Processes" -Cornell CS4780 SP17
71K views · 6 years ago
Machine Learning Lecture 25 "Kernelized algorithms" -Cornell CS4780 SP17
14K views · 6 years ago
Machine Learning Lecture 24 "Kernel Support Vector Machine" -Cornell CS4780 SP17
19K views · 6 years ago
Machine Learning Lecture 23 "Kernels Continued Continued" -Cornell CS4780 SP17
14K views · 6 years ago
Machine Learning Lecture 20 "Model Selection / Regularization / Overfitting" -Cornell CS4780 SP17
21K views · 6 years ago
Machine Learning Lecture 19 "Bias Variance Decomposition" -Cornell CS4780 SP17
48K views · 6 years ago
Machine Learning Lecture 17 "Regularization / Review" -Cornell CS4780 SP17
16K views · 6 years ago
Machine Learning Lecture 16 "Empirical Risk Minimization" -Cornell CS4780 SP17
27K views · 6 years ago
Machine Learning Lecture 15 "(Linear) Support Vector Machines continued" -Cornell CS4780 SP17
26K views · 6 years ago
Machine Learning Lecture 14 "(Linear) Support Vector Machines" -Cornell CS4780 SP17
43K views · 6 years ago
Machine Learning Lecture 13 "Linear / Ridge Regression" -Cornell CS4780 SP17
34K views · 6 years ago
Machine Learning Lecture 12 "Gradient Descent / Newton's Method" -Cornell CS4780 SP17
47K views · 6 years ago
👍
I am about to start my machine learning journey, and I have knowledge of statistics, calculus, and linear algebra. Should I follow this series? From the comments it seems this is a very, very good lecture series. Or would this be too advanced for a newbie like me?
At 39:53 the example seems a little bit off to me. I would expect Newton's method to converge in a single step (the function looks like a quadratic).
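The commenter's intuition is right: on an exactly quadratic objective, Newton's method reaches the minimizer in a single step, because the second-order Taylor model it minimizes is the function itself. A quick 1-D check (illustrative example, not the function from the lecture):

```python
def newton_step(x, grad, hess):
    # One Newton update in 1-D: x - f'(x)/f''(x).
    return x - grad(x) / hess(x)

# f(x) = 3(x - 2)^2 + 1, a quadratic with its minimizer at x = 2.
grad = lambda x: 6 * (x - 2)   # f'(x)
hess = lambda x: 6.0           # f''(x), constant for a quadratic

x0 = -7.5                      # arbitrary starting point
x1 = newton_step(x0, grad, hess)
print(x1)  # → 2.0, the exact minimizer after one step
```

For non-quadratic functions the Hessian changes with x, so Newton's method generally needs several iterations; one-step convergence is special to quadratics.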
👍
👍
compression comparison for cross entropy is just damn good....
Thanks a lot. Wish I could have attended 'ML for Data Science' as well.
Thank you Professor !!!
"If you can't read my handwriting, you are not alone" :)
Thank you Sir !!!
Thanks a lot !!
Thank you !!
Thanks a lot !!
Thanks a lot Sir !!
Thanks !!!
Thanks !!!
Thank you Professor !!!
The time watching these lectures passes in no time !! Thanks, Professor!
Thank you for the amazing Lecture !!!
Thank you so much Sir !!!
He is a fantastic teacher. And all the energetic teaching has given him a sore throat.
I like your way of teaching more than Andrew's or anyone else's, because they assume students or anyone watching is at least a tiny bit intelligent, which just isn't the case. After all, we are all just stupid, except for a few people, or that one kid in class.
And here I am rewatching this for the 3rd time, because I didn't take notes. This time I won't make that mistake; I have already filled a whole notebook with this. Thanks Kilian, sad there isn't anything more though.
JULIA
Super cool! Thanks!
best lecture on naive bayes.
in lecture notes there is a spelling error in the concluding summary "week classifier"
lectures so good, kilian on my poster wall. :)
Are the slides for the first part of the class available?
I have been watching your lectures for years now. I must say, the style of teaching is the best ! Every-time I need a refresher on some topic, your ML series is the first I think of. Thank you for the amazing content! 😃
wonderful lecture
A Great Beginning..excited for the next
I always tried to help, each time i did the cult took advantage now its their problem, and i dont know if i can help anybody everything has changed since last night
Raise your hand if that makes sense… crickets… ok moving on!
dead mouse got me
Prof, record small videos on SVM, SVR and Soft margin SVM, if possible.
Pure gold. The lectures are excellent. The humor is spot on. Many many thanks.
wow, you are a good teacher!
Legendary course!
I'm curious if someone was actually stealing all the notes
geez does anyone ever wonder what beast of a middle school Killian went to? 37:29
End score tally: Undergrad = 255, Grad = 310. Grad won!!
I cannot remember the last time I laughed so much while learning about neural networks. They usually make me cry! 😂 On a serious note, I have read about NNs so many times but could not understand why putting them together was actually improving the performance. Thanks for the visualization!
Not sure why this playlist doesn't come up on top when one searches for ML on YouTube. Andrew Ng might be a good researcher, but he's not a very good teacher. Kilian teaches in such a good manner that I never once felt bored or felt as if I'm studying. Thanks Kilian, you're a gem.
Is there any way we can get access to the projects for this course?
Amazing!!!!!!!!!!!!!!!!!!!!!!!!
We have a quiz question in the lecture notes: "How does k affect the classifier? What happens if k = n? What happens if k = 1?" I do not think it is discussed in the lectures. In my opinion, k is the only hyperparameter in this algorithm. For k = n, we are taking the mode of the entire dataset's labels as the output for the test point, whereas for k = 1 the test point is assigned the label of its closest neighbor. I have a doubt here: since we are using a distance metric, what if we have 2 points (for simplicity) that are at equal distance to the test point and have different labels? What happens in that case for k = 1? Similarly, for k = n, if we have an equal proportion of binary class labels, how does the mode work in that case?
Yes, for k = n it is the mode and k = 1 is the nearest neighbor. If the label assignment is a draw (e.g. two points are equidistant), a common option is to break ties randomly.
@@kilianweinberger698 Thank you for the answer Prof. Weinberger and for this amazing series as well.!
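The random tie-breaking convention from the answer above can be made concrete. Here is a minimal k-NN classifier sketch (illustrative code, not from the course materials) where equally common labels among the k neighbors are resolved uniformly at random — this covers both the equidistant k = 1 case and the equal-proportion k = n case:

```python
import math
import random
from collections import Counter

def knn_predict(train, labels, x, k, rng=random):
    # Sort training points by distance to x and take the k closest.
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    votes = Counter(labels[i] for i in order[:k])
    top = max(votes.values())
    # Break ties among equally common labels uniformly at random.
    winners = [lab for lab, count in votes.items() if count == top]
    return rng.choice(winners)

X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
y = ["a", "a", "b", "b"]
print(knn_predict(X, y, (0.2, 0.1), k=1))  # nearest point is (0, 0) → "a"
print(knn_predict(X, y, (0.2, 0.1), k=4))  # k = n with a 2-2 split → random
```

With k = 1 and two exactly equidistant neighbors, `sorted` keeps them in index order, so strictly equidistant ties would need the same random treatment applied at the distance level; the vote-level tie-break shown here is the common and simplest convention.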
Thank you, Professor.
Everyone seems to have a different version. AdaGrad doesn't always work: the running sum of squared gradients gets too big unless one scales it down. Also, AdaGrad works best with a line search. All variations work best with a line search.
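For reference, the standard AdaGrad update keeps a per-coordinate running sum of squared gradients and divides each step by its square root, which is the accumulation the comment above refers to. A generic sketch (the learning rate `lr` and the stabilizer `eps` are the usual knobs, not the commenter's specific variant):

```python
import math

def adagrad_step(w, g, accum, lr=0.1, eps=1e-8):
    # accum accumulates squared gradients per coordinate; each
    # coordinate's effective step shrinks as its gradient history grows.
    new_accum = [a + gi * gi for a, gi in zip(accum, g)]
    new_w = [wi - lr * gi / (math.sqrt(a) + eps)
             for wi, gi, a in zip(w, g, new_accum)]
    return new_w, new_accum

# Minimize f(w) = w0^2 + 10*w1^2, whose gradient is (2*w0, 20*w1).
w, accum = [1.0, 1.0], [0.0, 0.0]
for _ in range(200):
    g = [2 * w[0], 20 * w[1]]
    w, accum = adagrad_step(w, g, accum)
print(w)  # both coordinates shrink toward 0
```

Because the denominator only grows, the step size decays like lr/sqrt(t); that built-in decay is also why later variants (RMSProp, Adam) replace the raw sum with an exponential moving average.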
i am wondering teenager hahaha!