Prof Patrick Winston sadly passed away on July 19, 2019.
Rest in peace; the knowledge you've passed on to thousands of students is your legacy, and it is forever.
Thank you.
Very bad news :-(( RIP.
A real loss. Prof Patrick Winston dedicated himself to sharing knowledge of AI.
Very sad news RIP prof ;(
Oh damn...RIP
Rest in peace!
I've been watching a lot of MIT, Stanford, Harvard, Princeton lectures, but this... This was phenomenal, hands down the best lecture I've ever seen.
Rest in Peace Prof
Tremendous respect for any professor who writes out the entire math on board and does not use notes to do so.
My fluid mechanics professor wasn't that good: she relied on books and notes throughout her lectures to write the equations on the board. However, she had a good way of teaching that made everything understandable.
31:39
Why do we need to find the maximum of L here?
I was looking for the minimum of (1/2)|w|^2, but I don't understand why, once we move to the Lagrangian L, we look for its maximum instead.
@알라알라-p5t Minimizing (1/2)|w|^2 comes with constraints, so you multiply each constraint by a Lagrange multiplier alpha_i and add it to the original objective. You then treat this L as the new quantity to extremize; extremizing L is how you minimize the original objective while respecting the constraints.
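To spell out the step being asked about at 31:39, here is a minimal sketch of the standard Lagrange-multiplier setup; the indexing of the constraints is my own notation, and the lecture's exact symbols may differ slightly. The constrained problem is

    \min_{w,b}\ \tfrac{1}{2}\|w\|^2 \quad \text{subject to} \quad y_i\,(w\cdot x_i + b) \ge 1 \ \ \text{for every sample } i,

and the Lagrangian is

    L(w, b, \alpha) = \tfrac{1}{2}\|w\|^2 \;-\; \sum_i \alpha_i\,\bigl[\,y_i\,(w\cdot x_i + b) - 1\,\bigr], \qquad \alpha_i \ge 0.

L is minimized over w and b (that part carries the original objective) and maximized over the multipliers alpha (that part enforces the constraints, since a violated constraint would let the corresponding alpha_i drive L up without bound). Setting the derivatives of L to zero gives the equations on the board:

    \frac{\partial L}{\partial w} = w - \sum_i \alpha_i y_i x_i = 0 \;\Rightarrow\; w = \sum_i \alpha_i y_i x_i,
    \qquad
    \frac{\partial L}{\partial b} = -\sum_i \alpha_i y_i = 0.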
@pranavtagore Did you really just compare fluid mechanics equations to this basic linear sh*t? No, seriously?
@ChibuRawk Not sure why you have to be a jerk.
RIP, the best storyteller and lecturer of all time!
He was an amazing professor who knew how to stimulate students’ curiosity.
Feel blessed to have attended his lectures live and to have worked under his supervision. Rest in peace, Prof. You will always be an inspiration to me.
I love gun soooo much
I'm jealous of every single student in this class. And thank God I am alive and can watch this on YouTube.
why jealous ?
@@romanemul1 coz he can't use the bathrooms there.
@@romanemul1 of a world class education
I wouldn’t be jealous at all, you’re getting the same education for FREE
Don't be jealous of any initial condition. In the end it won't matter. You will get there if you are a resourceful person anyway, and if you aren't such a person then no initial condition is going to help.
That is the best chalk I've ever seen.
I just came from Andrew Ng's ML course in order to understand SVMs better. I found something quite interesting. Andrew gets the optimization criterion at 21:49 from an altogether different place. He arrives at SVMs by modifying the logistic regression's cost function, and the optimization criterion emerges from the regularization portion of the cost function. He then explains why that leads to a maximum margin. In contrast, this professor starts by obtaining the margin width algebraically with the intention of maximizing it, and then explains why that leads to separating data.
Pretty cool.
Same here. Now I'm trying to interrelate the parameters of the two approaches.
lol same here, cheers. Great course from Prof. Andrew too but I couldn't understand everything so I was looking for alternative lecture
Same here!
I'd argue this approach is working better for me.
Which course are you taking? Thanks in advance!
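For anyone comparing the two derivations mentioned in this thread, here is a rough side-by-side; the soft-margin form attributed to the other course is a paraphrase, not its exact notation. This lecture maximizes the street width, which becomes

    \min_{w,b}\ \tfrac{1}{2}\|w\|^2 \quad \text{s.t.} \quad y_i\,(w\cdot x_i + b) \ge 1 .

The regularization route instead starts from a regularized, hinge-style cost,

    \min_{w,b}\ C\sum_i \max\!\bigl(0,\ 1 - y_i\,(w\cdot x_i + b)\bigr) \;+\; \tfrac{1}{2}\|w\|^2 ,

where the margin-maximizing behaviour comes out of the \tfrac{1}{2}\|w\|^2 regularizer. On separable data with a large C, the two formulations pick out the same maximum-margin separator.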
RIP ! Prof Winston!!!
you inspired lots of ppl!
:(
:(
He dead? That's very sad. He seemed like a nice fellow.
Professor if you ever read this, THANK YOU. I was actually sad the lecture ended eventually. The world needs more teaching like yours.
MIT offering free courses on YouTube in the early moves is the ultimate education move. I respect it.
I didn't really want to watch this video because of how long it was; I just wanted a quick rundown on the topic of SVMs. But I saw the comments and decided to watch the whole thing and, my god, am I glad I did. What an incredible lecturer, and he made the topic crystal clear. Anyone struggling with SVMs should 100% find 50 minutes to just sit down and watch this, and you'll be so glad you did.
This is how things should always be taught. Patience, deep understanding and passion to teach. I wish I would've had a professor like him as a graduate student.
This is why MIT is MIT. Good work Prof and Thank you to the team. We hope to see more lectures related to Machine learning and Data science from MIT.
pushpender pareek, yes. for about $50,000, they will give you access to a year's worth of additional lectures.
+immcguyver07
Don't forget the ability to work with or for top researchers, the exposure you get to other amazing students with a wide variety of programming backgrounds, the clubs you can join to collaborate with these other students, being part of a pipeline that regularly sends people to silicon valley which allows them to pay off student debt within a few years, or being part of a pipeline that regularly sends people to top grad schools which is the only way to get a job in academia.
He's always out of breath, as if about to die. That's very annoying to me. @lobtyu - Just read the papers; you'll be a much better researcher, and for free.
What a gem of a lecture. Trying to understand equations directly just makes you mug up some things without ever understanding them fully. What I found is that when you actually run yourself through a simulation of what the inventor of the equation did and follow in their footsteps, then things start making sense eventually, you arrive at the solution, and you think, huh, that wasn't too hard. Rest in peace, Prof Patrick Winston; the world needs more professors like you, really, man. This is the first video lecture of his I am watching and I can feel what a great man we lost!!
This lecture by Patrick Winston is simply amazing! His way of teaching is one of the most insightful approaches to highly technical subjects that I have ever encountered. I am so grateful to MIT for letting students all over the world learn from people like him. The lecture on boosting is also very good.
This professor is very, very well spoken when it comes to explaining SVMs. Clear, concise, focused on one instance of one issue at a time... Many professors try to show you the entirety of the math while walking you through the conceptual ideas, and it makes SVMs very difficult to learn! This man is quite the opposite! Great work! (And by math I didn't mean just showing the margins, graphs, etc.; I meant proofs of the equations, which two of my professors did in two separate machine learning/data science classes.)
He always teaches in a steady way, with a good sense of humor... you never get tired... you don't just learn algorithms, you learn how to think, how to innovate... Last year I watched 6.034, and right after the day I finished the course I found out that he had passed away exactly one year before... God bless you, Professor. I never had the opportunity to meet you in real life, but you were my best teacher and inspiration, and I will never forget you... rest in peace, Prof Winston.
RIP Patrick!!! It is sad you are no longer with us. You are a great teacher.
The historical part of Vapnik’s story is very inspiring.
Best explanation of SVM on the internet!
It is good!!! I found a better one: ruclips.net/video/SHBFk1ULNlE/видео.html
One of the finest lectures in machine learning that I have ever encountered in my life. Such clarity of concepts and such a clear, concise explanation are truly appreciated. RIP Prof Patrick. You surely amazed your students with the kind of work you have done in this field. Thank you, sir, for sparking an interest in this field.
Simply loved it! Don't have any words for the professor who taught the sophisticated concepts with such simplicity...
RIP Prof Patrick, this lecture is gold. Never saw anyone explain all the tiny details this smoothly in less than an hour.
RIP Professor, the world needs more people like you.
I have an exam tomorrow in India, and Prof Patrick taught me what my Indian professor
couldn't teach me in a whole semester. You, sir, saved my life...
Machine learning is one of the worst taught classes in schools today - lecturers who are too into implementations and don't understand the basics well enough themselves, don't have motivation to teach well, and overcrowded classes because everyone wants to be a data scientist..
Thank you MIT for releasing this gem into the public domain for millions to watch.. This was easily one of the best SVM lectures ever!!
Ackchyually... this is under a Creative Commons license (CC BY-NC-SA) and not in the public domain ...but we are glad you enjoyed it! =D
One of my favourite math lectures on the internet. I probably wrote the same comment some years ago, but here I go again. Thanks to Professor Winston and everybody else who made this available. I do teach math myself and I deeply admire how he boils it down to the essentials without leaving anything important out. Just looking at how little there is on each board and how clearly the beauty of the subject shines through ... a true master class. And I always thought we should teach more math history, so it is great to hear from him how the ideas actually developed.
I am grateful that I was given the opportunity to participate in this lesson. I really put a lot of thought into getting the dual problem of the SVM into my head. Prof. Patrick Winston was the one who made it click for me. It is sad to read in the comments that we lost a great teacher who helped make the world a smarter place.
You are lucky, my friend. I wish I could have.
The "widest street approach" ... oh man! Perhaps the only lecturer that can throw "gems" like that + the story at the end... In three words he explained everything!
When more than half the comments are along the lines of "best SVM explanation I've seen", you know you've stumbled upon a gem of a lecture. Great work, I'll be checking out as many of your other lectures as I can because of this.
What do you expect? This is the difference between going to MIT and going everywhere else. It's not the knowledge that's the difference. It's how the teachers are able to relate the material in a very palatable fashion; that's the difference. I sometimes rue in my old age what I missed (because I didn't go to a good school), because I see the difference in my own understanding of things compared to those that went to good schools. It's not that I am not capable of understanding. It's that they have an in-depth grasp of the material because they sat under the tutelage of people like this professor. That's the difference between MIT and Coursera or any other school, folks.
it's annoying
BRILLIANT. Massive respect for the knowledge and simplicity of the professor here.
damn! this instructor's lines are so damn crisp!
mike johnston Bob ikr
Thank you MIT and Prof Wilson for making this public.
Talk about looking under the hood and discovering the hidden complexities.
I wish my math prof had his sense of humor and conciseness! Maybe I would be doing my math PhD now instead of coding.
Kernel trick, that has to be one of the most beautiful ideas I've seen (so far) in any branch of mathematics.
This single video is more powerful than all the other videos about SVMs available on YouTube. So lucky to have found his lecture. Simplicity of teaching at its best. Love you, Prof. RIP.
RIP Prof. Winston - you are still alive through the knowledge you are spreading.
10:23 the way he interacts with the student. so nice.
Someone give this man a medal. Pure brilliance. Thanks for sharing.
"If you can't explain it simply, you don't understand it well enough."
~ Einstein
Professor Winston clearly understands the topics he teaches.
One of the most inspirational lectures ever. It gave me the same energy and motivation as my first courses at engineering school, trying to bring together finite element methods, approximation theory, functional analysis, and the code in machine language or Fortran.
Like all students everywhere, I was watching this lecture and thinking, "if only I had had a teacher like Prof. Winston when I started in physics, I would have ...." Or at least, I would have had an easier time in all my other courses, and later in doing research, or just learning new things, like modern machine learning.
One of the best videos on SVMs, which also explains the kernelization so well.
Note for myself and others: the reason, in English, why d/dw [(1/2)|w|^2] = w is that dL/dw here acts like a directional derivative. Equivalently, we are rotating the coordinate system so that the w direction is an axis and taking the partial derivative with respect to w. We can then treat |w|^2 just like x^2 in ordinary calculus, particularly because |x|^2 = x^2 for any scalar x.
Thanks a lot, man. I was confused but went on with the lecture since I didn't want to get distracted; I went into the comments anyway and saw this within a single scroll.
You dropped this: 👑
omg thank you so much, I've been looking around for the past few days to get past that step
I guess dL/dw means: consider the vector of partials (dL/dw^1, ..., dL/dw^n) where w = (w^1, ..., w^n). I can't really make sense of your comment.
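In plain coordinates the step is just a componentwise partial derivative (same component notation as the reply above):

    \frac{\partial}{\partial w^k}\ \tfrac{1}{2}\|w\|^2
    = \frac{\partial}{\partial w^k}\ \tfrac{1}{2}\sum_j (w^j)^2
    = w^k
    \qquad\Longrightarrow\qquad
    \nabla_w\ \tfrac{1}{2}\|w\|^2 = w .

Applying the same rule term by term to the Lagrangian gives \partial L / \partial w = w - \sum_i \alpha_i y_i x_i, which is how the board arrives at w = \sum_i \alpha_i y_i x_i.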
This lecture is one of the best lectures I've gotten so far. Everything is wonderfully explained. I finally understand the maths behind SVMs! Thank you so much to Prof. Patrick Winston and MIT for making this lecture accessible!
Helped me with my Statistical Machine Learning class. Thank you Professor. RIP
I came here for a few minutes just to consolidate my SVM math foundations, and I found myself glued to the lecture till the end with a pencil and a sheet of paper to take notes.
Amazing professor. I just learned he passed away. His contribution to those who are thirsty for learning is tremendous.
May God have mercy on him and treat him with kindness.
A combination of this lecture with a 10min lecture on SVMs by Victor Lavrenko worked amazing for me!
I went to that video and found it really useful, thanks for sharing.
Shadi Rahimian .. I really don't know what this is.. :-( I'm very basic.
Thank you Shadi, it is good advice about Victor's video.
Thanks a lot, his video was very useful.
Thanks for telling us
As a side note, what he explained here is called the hard margin; hinge loss then extends the hard margin to the soft margin.
Sometimes math is just so beautiful that I think it deserves a place in museums; it needs to be hung up there for people to appreciate ;)
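For reference, the soft-margin version that comment alludes to is usually written with slack variables; the trade-off constant C is part of that standard formulation, not something introduced in this lecture:

    \min_{w,b,\xi}\ \tfrac{1}{2}\|w\|^2 + C\sum_i \xi_i
    \quad \text{s.t.} \quad y_i\,(w\cdot x_i + b) \ge 1 - \xi_i,\ \ \xi_i \ge 0 ,

which is equivalent to minimizing \tfrac{1}{2}\|w\|^2 + C\sum_i \max\!\bigl(0,\,1 - y_i(w\cdot x_i + b)\bigr), i.e. the hinge loss. Letting C \to \infty recovers the hard margin on separable data.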
Amazing lecture. I watched it almost 3 times back to back; it always gives you refreshing thoughts. It's my honour to see this lecture. Thanks, Prof, for still teaching so many students like me. You are simply great.
The power of the rewind button in learning is actually phenomenal!
For everyone watching, note that there is a mistake on the board. At 19:20, a student asks for a clarification, and they are correct:
w dot xPlus should be (1-b), whereas w dot xMinus should be (-1-b); then you can get the 2/norm(w) equation.
It's written as 1+b because the negative of w dot xMinus is what enters the width, so it's not a mistake.
@sajay96 I guess he corrected it at 19:53.
The perfect lecture on SVMs, imo. Will always share it with my students. At 19:44, it is actually -1-b instead of 1+b, as someone in the audience perhaps pointed out, but that is also what is needed to get the expression for the margin.
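With that correction, the width derivation from the lecture works out as follows (x_+ and x_- denote support vectors on the two gutter lines):

    w\cdot x_+ + b = +1 \;\Rightarrow\; w\cdot x_+ = 1 - b, \qquad
    w\cdot x_- + b = -1 \;\Rightarrow\; w\cdot x_- = -1 - b,

    \text{width} = (x_+ - x_-)\cdot \frac{w}{\|w\|}
    = \frac{(1-b) - (-1-b)}{\|w\|}
    = \frac{2}{\|w\|} .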
I was watching the video and thinking "Wow, the pace of the lectures at MIT is pretty fast. These students must be really bright to follow the professor. No wonder that I'm not studying there". At the end, I found that, unbeknownst to me, I was watching it all along at the 1.25 speed.
eVul6 OMG... Thank you!! I just realized I was watching it at 1.5x
i did the exact same thing
I am watching it with normal speed, I guess I need to set it on 0.5
Thanks for the comment. I watched it at normal speed but thought this guy was very slow, so I put it on 1.25. Now it's fine.
eVul6 donkey
Easily the best video on SVMs available on the internet to date.
May his soul rest in peace.
Wow. Best lecture for SVM I ever watched. Thanks a lot, MIT OpenCourseWare and Patrick Winston.
This teacher has the greatest balls for machine learning!!! Best SVM explanation ever!
For the record: I started by watching one lecture and now I am watching the whole course... Patrick Winston is a marvellous teacher and I wish to watch *everything* this guy has to teach. Are there any other courses he teaches? If so, please record them and put them online!
Best SVM lecture I have seen. This professor does a great job of teaching the concept of SVM and the thought process behind it.
Wow. This guy is the best at teaching SVMs.
00:00 Support vector machines are about dividing a space with decision boundaries
06:34 Introduction to support vector machines
13:11 Maximizing the width of the street separating the pluses from the minuses
19:35 Maximize the width of the street under constraints using Lagrange multipliers
00:33 Vector w is a linear sum of the vectors in the sample set
31:10 Optimization depends only on the dot product of pairs of samples
37:29 Linearly inseparable spaces can be transformed to a more convenient space
43:49 Support vector machines use kernel functions to transform data into another space for a better perspective
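The 31:10 entry is the punchline that sets up the kernel trick: substituting w = \sum_i \alpha_i y_i x_i back into the Lagrangian leaves (the indexing notation is mine, but this mirrors what ends up on the board)

    L(\alpha) = \sum_i \alpha_i \;-\; \tfrac{1}{2}\sum_i \sum_j \alpha_i \alpha_j\, y_i y_j\,(x_i \cdot x_j),
    \qquad \alpha_i \ge 0,\ \ \sum_i \alpha_i y_i = 0 ,

so the samples appear only through dot products x_i \cdot x_j, and the decision rule only needs \sum_i \alpha_i y_i\,(x_i \cdot u) + b \ge 0. That is exactly why x_i \cdot x_j can later be swapped for a kernel K(x_i, x_j).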
Perfect explanation. so much better than anything else online
This is one of the best MIT lectures online!
Amazing explanation!!!
Most other tutors skip the algebra part, which makes learning SVMs a black box, but this step-by-step explanation from Prof Patrick is amazingly simple and thorough.
Thanks, Prof Patrick and MIT OpenCourseWare.
RIP Professor Winston. Still watching these for a refresher. I forgot how clear and yet engaging his 6.034 lectures were.
That's unfortunate
How did he die??
@@cosmicwanderer891 basically natural causes as far as I know. He was still doing research in the hospital according to some of his advisees.
Best explanation I have seen so far. Much better than Andrew Ng in my opinion.
The best explanation of SVM I have come across. Hats off to Prof Patrick.
What a professor, may he rest in peace
After going through many articles and online courses, I still didn't understand the idea of SVMs clearly. This one is surely the best video on SVMs available online. Thanks a lot, Professor.
This is a brilliant lecture!
What an amazing lecture and professor. This is what it's all about. Some people are just blessed lecturers, and he is for sure one of those people.
Best explanation! I've seen many materials that are very hard to understand!
hello
This is one of the best explanations of SVMs I have ever seen. This professor made a complex concept so easy to understand. KUDOS to him!!
MAN, that was good!! probably the best introduction to SVM available online.
These lectures on YouTube make the world a better place to live in 🥰🥰🥰🥰
Thank you for providing us the content, MIT OCW.
One of the best lectures I've seen, so concise and easy to follow :D
Thought I would add to the voices of approval.
I've just completed an elementary Machine Learning course (SVM wasn't on it), and have watched quite a few youtube videos, including those from Andrew Ng.
The clarity of language, display, sequence of demonstration and speed of this lesson are absolutely spot on.
Thanks !
Amazing lecture. Thank you and MIT in general. we love your priceless support to global education
Absolutely marvelous! Today, my professor at Texas A&M tried to emulate what this gentleman did 7 yrs ago.
I watched it several times back and forth; finally, I THINK I understand.
Excellent explanation Professor Winston. You have the rare skill of explaining both the math and its motivation clearly to a novice audience.
One of the best lectures I ever heard: methodical & extremely helpful! Thank you. I will definitely come back for more; appreciate this.
Genius explanation here, very simple to understand. I love digging into the underlying mathematics that's involved with these algorithms. The idea that a kernel function can transform linearly inseparable data into a higher-dimensional space where it becomes linearly separable is simply mind-boggling to me. Such a cool concept, and really simple to grasp once you get the mathematics behind it.
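A tiny self-contained sketch of that lifting idea; the toy data and the x^2 + y^2 feature are my own illustration, not taken from the lecture:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: class -1 inside a circle of radius 1, class +1 in a ring of radius ~1.5-2.5.
    # Not linearly separable in the original (x, y) plane.
    r_inner = rng.uniform(0.0, 1.0, 100)
    r_outer = rng.uniform(1.5, 2.5, 100)
    theta = rng.uniform(0.0, 2 * np.pi, 200)
    r = np.concatenate([r_inner, r_outer])
    X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
    y = np.concatenate([-np.ones(100), np.ones(100)])

    # Lift to 3-D by appending x^2 + y^2 (the squared radius).
    Z = np.column_stack([X, (X ** 2).sum(axis=1)])

    # In the lifted space a single threshold on the new coordinate separates the classes,
    # i.e. the plane z = 1.25^2 is a perfect linear decision boundary.
    pred = np.where(Z[:, 2] > 1.25 ** 2, 1, -1)
    print("separable after lifting:", np.all(pred == y))   # True

The point is only that a fixed, simple transformation can turn a curved boundary in the original space into a flat one in the lifted space; a kernel lets the SVM do this implicitly.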
12:40 , that guy just saved me from suicide, I was like, "wtf, where did that w vector disappear!!" 😂😂😅😅
Me too......
Same here lol
What a beautiful lecture. Thank you, Prof. RIP Prof Patrick Winston.
Wow, that is an excellent lecture on SVM, thank you! Hang in there until the "miracle" part that starts at 43:30; then he shows the transformations that make SVM amazing.
Best explanation ever! Can't believe he explained all these complicated things (SVMs and kernels) in only 50 minutes! (We took 90 minutes on this part of the ML lecture at our university, and it left me very confused.) Thank you so much, Prof!
12:42 Thanks, Brett, whoever you are. Panicked for a few minutes until you chimed in 😂🙌🏾
lol same.
Really nice class. This professor managed to go through some tricky topics maintaining the simplicity and coherence of his argument.
Excellent! Simple explanation right down to the basics.
I pay my sincere thanks to the professor for an extraordinary lecture. Amazing. Good Teachers are the Gods.
Shotout to Vapnik and Winston, loves and respects from Turkey :)
I am very thankful for all people that worked to bring this amazing lecture from Prof. Patrick Winston to people around the world.
great lecture! Thanks MIT OCW
I liked the story he ended the lecture with. Lecturing is quite a talent.
"This needs to be in a tool bag of every civilized person"
Oh wow, at MIT they have a very specific idea of what 'civilized' means
😂😂😂😂
For a second I was going to try to defend them but honestly, I think they kinda do
Let's imagine that everyone knows how to separate pluses from minuses optimally.
The world would be a... I guess it would be the same.
He stated necessary conditions. Not sufficient.
How beautiful a proof can be if taught in the correct way! Brilliant!
Very good lecture, clear explanation and good pace :)
One correction:
44:30 (u*v+1)^n is a polynomial kernel, not a linear kernel.
Put n = 1.
With n = 1 the kernel is linear; in general, n is the degree of the polynomial, which determines the dimension of the implicit feature space.
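A quick numerical check of that correction, using the standard explicit feature map for the 2-D, degree-2 case; the helper name phi is just for this illustration:

    import numpy as np

    def phi(p):
        # Explicit feature map whose dot product equals (u.v + 1)^2 for 2-D inputs.
        x1, x2 = p
        s = np.sqrt(2.0)
        return np.array([1.0, s * x1, s * x2, x1 ** 2, x2 ** 2, s * x1 * x2])

    u = np.array([0.3, -1.2])
    v = np.array([2.0, 0.7])

    poly2 = (u @ v + 1.0) ** 2          # polynomial kernel, degree n = 2
    lifted = phi(u) @ phi(v)            # same number via the explicit 6-D lift
    linear = (u @ v + 1.0) ** 1         # n = 1: just the (shifted) linear kernel

    print(np.isclose(poly2, lifted))    # True: the kernel is a dot product in a higher-dim space
    print(linear)                       # u.v + 1, i.e. linear in the original space

So (u·v + 1)^n is the polynomial kernel of degree n, and only the n = 1 case reduces to the linear one.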
This is by far the best lecture on support vector machines. Just an amazing lecture; a must-watch.