Kilian Weinberger
  • 41 videos
  • 1,469,525 views
The DBSCAN Clustering Algorithm Explained
DBSCAN has become one of my favorite clustering algorithms.
The original paper is here: www.dbs.ifi.lmu.de/Publikationen/Papers/KDD-96.final.frame.pdf
(This video is part of the CS4780 Machine Learning Class.)
3,340 views
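The algorithm described in the video and the linked KDD-96 paper can be sketched in a few lines of Python. This is a minimal, unoptimized illustration (with a naive O(n²) neighbor search), not the paper's pseudocode; `eps` and `min_pts` stand in for the paper's ε and MinPts:

```python
from math import dist

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN sketch: returns one cluster id per point, -1 for noise."""
    labels = {}                      # point index -> cluster id
    def neighbors(i):
        return [j for j in range(len(points)) if dist(points[i], points[j]) <= eps]
    cluster = 0
    for i in range(len(points)):
        if i in labels:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:      # not a core point; may still become a border point
            labels[i] = -1
            continue
        labels[i] = cluster          # grow a new cluster from this core point
        seeds = [j for j in nbrs if j != i]
        while seeds:
            j = seeds.pop()
            if labels.get(j) == -1:  # previously-noise point becomes a border point
                labels[j] = cluster
            if j in labels:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:   # j is also a core point: keep expanding through it
                seeds.extend(jn)
        cluster += 1
    return [labels[i] for i in range(len(points))]
```

On two tight groups plus one far-away outlier, this yields two clusters and one noise point, without k being specified in advance.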

Videos

CS4780 Transformers (additional lecture 2023)
7K views · 1 year ago
A brief explanation of the Transformer architecture used in GPT-3 and ChatGPT for language modelling. (Uploaded here for those who missed class due to the unusually nice weather :-) )
On the Importance of Deconstruction in Machine Learning Research
6K views · 3 years ago
This is a talk I gave in December 2020 at the NeurIPS Retrospective Workshop. I explain why it is so important to carefully analyze your own research contributions, through the story of 3 recent publications from my research group at Cornell University. In all three cases we first invented something far more complicated, only to realize that the gains could be attributed to something far simpl...
Machine Learning Lecture 18 "Review Lecture II" -Cornell CS4780 SP17
11K views · 4 years ago
In-class Kaggle Competition in less than 5 Minutes
12K views · 5 years ago
The Fall 2018 version of CS4780 featured an in-class Kaggle competition. The students had 3 weeks to beat my submission, for which I only had 5 minutes. Some students challenged me to show a screencast of me actually training and uploading the model in time, so here you go. Happy Xmas.
Machine Learning Lecture 22 "More on Kernels" -Cornell CS4780 SP17
23K views · 5 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote13.html www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote14.html
Machine Learning Lecture 37 "Neural Networks / Deep Learning" -Cornell CS4780 SP17
15K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote20.pdf
Machine Learning Lecture 36 "Neural Networks / Deep Learning Continued" -Cornell CS4780 SP17
14K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote20.pdf
Machine Learning Lecture 35 "Neural Networks / Deep Learning" -Cornell CS4780 SP17
20K views · 6 years ago
Machine Learning Lecture 34 "Boosting / Adaboost" -Cornell CS4780 SP17
17K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html
Machine Learning Lecture 33 "Boosting Continued" -Cornell CS4780 SP17
17K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html
Machine Learning Lecture 31 "Random Forests / Bagging" -Cornell CS4780 SP17
46K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote18.html If you want to take the course for credit and obtain an official certificate, there is now a revamped version (with much higher quality videos) offered through eCornell ( tinyurl.com/eCornellML ). Note, however, that eCornell does charge tuition for this version.
Machine Learning Lecture 21 "Model Selection / Kernels" -Cornell CS4780 SP17
28K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote11.html www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote12.html
Machine Learning Lecture 32 "Boosting" -Cornell CS4780 SP17
35K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html
Machine Learning Lecture 30 "Bagging" -Cornell CS4780 SP17
25K views · 6 years ago
Lecture Notes: www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote18.html
Machine Learning Lecture 29 "Decision Trees / Regression Trees" -Cornell CS4780 SP17
44K views · 6 years ago
Machine Learning Lecture 28 "Ball Trees / Decision Trees" -Cornell CS4780 SP17
30K views · 6 years ago
Machine Learning Lecture 27 "Gaussian Processes II / KD-Trees / Ball-Trees" -Cornell CS4780 SP17
30K views · 6 years ago
Machine Learning Lecture 26 "Gaussian Processes" -Cornell CS4780 SP17
71K views · 6 years ago
Machine Learning Lecture 25 "Kernelized algorithms" -Cornell CS4780 SP17
14K views · 6 years ago
Machine Learning Lecture 24 "Kernel Support Vector Machine" -Cornell CS4780 SP17
19K views · 6 years ago
Machine Learning Lecture 23 "Kernels Continued Continued" -Cornell CS4780 SP17
14K views · 6 years ago
Machine Learning Lecture 20 "Model Selection / Regularization / Overfitting" -Cornell CS4780 SP17
21K views · 6 years ago
Machine Learning Lecture 19 "Bias Variance Decomposition" -Cornell CS4780 SP17
48K views · 6 years ago
Machine Learning Lecture 17 "Regularization / Review" -Cornell CS4780 SP17
16K views · 6 years ago
Machine Learning Lecture 16 "Empirical Risk Minimization" -Cornell CS4780 SP17
27K views · 6 years ago
Machine Learning Lecture 15 "(Linear) Support Vector Machines continued" -Cornell CS4780 SP17
26K views · 6 years ago
Machine Learning Lecture 14 "(Linear) Support Vector Machines" -Cornell CS4780 SP17
43K views · 6 years ago
Machine Learning Lecture 13 "Linear / Ridge Regression" -Cornell CS4780 SP17
34K views · 6 years ago
Machine Learning Lecture 12 "Gradient Descent / Newton's Method" -Cornell CS4780 SP17
47K views · 6 years ago

Comments

  • @amorphous8826
    @amorphous8826 1 hour ago

    👍

  • @szaybhattacharya1022
    @szaybhattacharya1022 10 hours ago

    I will start my machine learning journey; I have knowledge of statistics, calculus, and linear algebra. Should I follow this series? From the comments it seems this is a very, very good lecture series. Or would this be too advanced for a newbie like me?

  • @adosar7261
    @adosar7261 21 hours ago

    At 39:53 the example seems a little bit off to me. I would expect Newton's method to converge in a single step (the function looks like a quadratic).
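The commenter's intuition is correct in general: on an exactly quadratic function, one Newton step lands on the stationary point, because Newton's method minimizes the second-order Taylor expansion, which for a quadratic is the function itself. A tiny sketch (a hypothetical 1-D quadratic, not the function from the lecture):

```python
def newton_step(x, f_prime, f_double_prime):
    """One step of Newton's method for 1-D optimization: x - f'(x) / f''(x)."""
    return x - f_prime(x) / f_double_prime(x)

# f(x) = 3x^2 - 4x + 1 is quadratic, with its minimum at x = 2/3.
f_prime = lambda x: 6 * x - 4
f_double_prime = lambda x: 6.0

x = newton_step(10.0, f_prime, f_double_prime)  # a single step from x0 = 10 reaches 2/3
```

If the plotted function only looks quadratic but has higher-order terms, more than one step is needed, which could explain the example in the video.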

  • @amorphous8826
    @amorphous8826 1 day ago

    👍

  • @amorphous8826
    @amorphous8826 2 days ago

    👍

  • @srisaisubramanyamdavanam9912
    @srisaisubramanyamdavanam9912 8 days ago

    The compression comparison for cross entropy is just damn good....

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 15 days ago

    Thanks a lot. Wish I could have attended 'ML for Data Science' as well.

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 15 days ago

    Thank you Professor !!!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 15 days ago

    "If you can't read my handwriting, you are not alone" :)

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 15 days ago

    Thank you Sir !!!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 16 days ago

    Thanks a lot !!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 16 days ago

    Thank you !!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 17 days ago

    Thanks a lot !!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 17 days ago

    Thanks a lot Sir !!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 17 days ago

    Thanks !!!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 17 days ago

    Thanks !!!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 19 days ago

    Thank you Professor !!!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 20 days ago

    The time watching these lectures passes in no time !! Thanks, Professor!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 21 days ago

    Thank you for the amazing Lecture !!!

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 23 days ago

    Thank you so much Sir !!!

  • @shashanksharma1498
    @shashanksharma1498 1 month ago

    He is a fantastic teacher. And all the energetic teaching has given him a bad throat.

  • @jandrzej1264
    @jandrzej1264 1 month ago

    I like your way of teaching more than Andrew's or anyone else's, because they assume students or anyone watching is at least a tiny bit intelligent, which just isn't the case. After all, we are all just stupid, except for a few people, or that one kid in class.

  • @jandrzej1264
    @jandrzej1264 1 month ago

    And here I am rewatching this for the 3rd time, because I didn't take notes. This time I won't make that mistake; I have already filled a whole notebook with this. Thanks Kilian, sad there isn't anything more though.

  • @filiprechtorik4938
    @filiprechtorik4938 1 month ago

    JULIA

  • @gledguri
    @gledguri 1 month ago

    Super cool! Thanks!

  • @lambukushireddy424
    @lambukushireddy424 1 month ago

    best lecture on naive bayes.

  • @filthyfillium
    @filthyfillium 1 month ago

    In the lecture notes there is a spelling error in the concluding summary: "week classifier".

  • @thatsharma1066
    @thatsharma1066 1 month ago

    Lectures so good, Kilian is on my poster wall. :)

  • @Exhora
    @Exhora 1 month ago

    Are the slides for the first part of the class available?

  • @bansaloni
    @bansaloni 1 month ago

    I have been watching your lectures for years now. I must say, your style of teaching is the best! Every time I need a refresher on some topic, your ML series is the first I think of. Thank you for the amazing content! 😃

  • @filipgaming1233
    @filipgaming1233 1 month ago

    wonderful lecture

  • @vocabularybytesbypriyankgo1558
    @vocabularybytesbypriyankgo1558 1 month ago

    A great beginning... excited for the next one!

  • @Dragon-Slay3r
    @Dragon-Slay3r 2 months ago

    I always tried to help; each time I did, the cult took advantage. Now it's their problem, and I don't know if I can help anybody. Everything has changed since last night.

  • @officialstylechild
    @officialstylechild 3 months ago

    Raise your hand if that makes sense… crickets… ok moving on!

  • @andreariboni4242
    @andreariboni4242 3 months ago

    dead mouse got me

  • @IrfanSaleem541
    @IrfanSaleem541 3 months ago

    Prof, please record short videos on SVM, SVR, and soft-margin SVM, if possible.

  • @05me39
    @05me39 3 months ago

    Pure gold. The lectures are excellent. The humor is spot on. Many many thanks.

  • @AakarshNair
    @AakarshNair 3 months ago

    wow, you are a good teacher!

  • @daniilzaytsev2040
    @daniilzaytsev2040 3 months ago

    Legendary course!

  • @rakinbaten7305
    @rakinbaten7305 3 months ago

    I'm curious if someone was actually stealing all the notes

  • @rakinbaten7305
    @rakinbaten7305 4 months ago

    Geez, does anyone ever wonder what beast of a middle school Kilian went to? 37:29

  • @30saransh
    @30saransh 4 months ago

    End score tally: Undergrad = 255, Grad = 310. Grad won!!

  • @WellItsNotTough
    @WellItsNotTough 4 months ago

    I cannot remember the last time I laughed so much while learning about neural networks. They usually make me cry! 😂 On a serious note, I have read about NNs so many times but could not understand why putting them together was actually improving the performance. Thanks for the visualization!

  • @30saransh
    @30saransh 4 months ago

    Not sure why this playlist doesn't come up on top when one searches for ML on YouTube. Andrew Ng might be a good researcher, but he's not a very good teacher. Kilian teaches in such a good manner that I never once felt bored or felt as if I was studying. Thanks Kilian, you're a gem.

  • @30saransh
    @30saransh 4 months ago

    Is there any way we can get access to the projects for this course?

  • @30saransh
    @30saransh 4 months ago

    Amazing!!!!!!!!!!!!!!!!!!!!!!!!

  • @WellItsNotTough
    @WellItsNotTough 4 months ago

    We have a quiz question in the lecture notes: "How does k affect the classifier? What happens if k = n? What happens if k = 1?" I do not think it is discussed in the lectures. In my opinion, k is the only hyperparameter in this algorithm. For k = n, we take the mode of the entire dataset's labels as the output for the test point, whereas for k = 1, the test point is assigned the label of its single nearest neighbor. I have a doubt here: since we are using a distance metric, what if we have 2 points (for simplicity) that are at equal distance to the test point and have different labels? What happens in that case for k = 1? Similarly, for k = n, if we have an equal proportion of binary class labels, how does the mode work in that case?

    • @kilianweinberger698
      @kilianweinberger698 3 months ago

      Yes, for k = n it is the mode and k = 1 is the nearest neighbor. If the label assignment is a draw (e.g. two points are equidistant), a common option is to break ties randomly.

    • @WellItsNotTough
      @WellItsNotTough 3 months ago

      @@kilianweinberger698 Thank you for the answer, Prof. Weinberger, and for this amazing series as well!
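The random tie-breaking described in the reply can be sketched like this. A hypothetical helper, not code from the course: ties are broken randomly both among equidistant neighbors (the k = 1 case asked about) and among tied label votes (the k = n equal-split case):

```python
import random
from collections import Counter
from math import dist

def knn_predict(train, query, k, rng=random):
    """k-NN prediction with random tie-breaking.

    train: list of ((x1, x2, ...), label) pairs.
    Equidistant neighbors are ordered by a random secondary key, and a draw
    in the label vote is also resolved by a random choice.
    """
    order = sorted(range(len(train)),
                   key=lambda i: (dist(train[i][0], query), rng.random()))
    votes = Counter(train[i][1] for i in order[:k])
    top_count = max(votes.values())
    tied_labels = [lab for lab, c in votes.items() if c == top_count]
    return rng.choice(tied_labels)
```

With two equidistant nearest neighbors that share a label, k = 1 is still deterministic; only genuinely ambiguous draws come out random.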

  • @eliasboulham
    @eliasboulham 4 months ago

    Thank you, Professor.

  • @pnachtwey
    @pnachtwey 5 months ago

    Everyone seems to have a different version. AdaGrad doesn't always work. The sum of the dot product of the gradient gets too big UNLESS one scales it down. Also, AdaGrad works best with a line search. All variations work best with a line search.
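For context, the shrinking step size the comment alludes to can be seen in a bare-bones AdaGrad sketch: the accumulated squared gradients `g2` only ever grow, so the effective per-coordinate step size `lr / sqrt(g2)` decays over time. This is a minimal illustration of the update rule, not any particular library's implementation, and the quadratic objective below is a made-up example:

```python
def adagrad(grad, x0, lr=0.5, eps=1e-8, steps=500):
    """Minimal AdaGrad: per-coordinate steps shrink as squared gradients accumulate."""
    x = list(x0)
    g2 = [0.0] * len(x)              # running sum of squared gradients per coordinate
    for _ in range(steps):
        g = grad(x)
        for i in range(len(x)):
            g2[i] += g[i] ** 2
            x[i] -= lr * g[i] / (g2[i] ** 0.5 + eps)
    return x

# Poorly conditioned quadratic f(x) = x0^2 + 10*x1^2; its gradient is (2*x0, 20*x1).
x_min = adagrad(lambda x: [2 * x[0], 20 * x[1]], [3.0, 1.0])
```

On this convex example the iterates approach the minimum at the origin, but because `g2` is monotone the effective learning rate never recovers, which is one reason later variants (e.g. RMSProp-style exponential averaging) replace the raw sum.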

  • @gwonchanyoon7748
    @gwonchanyoon7748 5 months ago

    i am wondering teenager hahaha!