Support StatQuest by buying my books The StatQuest Illustrated Guide to Machine Learning, The StatQuest Illustrated Guide to Neural Networks and AI, or a Study Guide or Merch!!! statquest.org/statquest-store/
You make statistics and machine learning so much fun. Your channel is binge-watch worthy. Keep spreading good education in a fun way. :)
Wow, thanks
Excellent! There is a higher-dimensional space where this video is linearly separable from everything else on YouTube. What I love is that you use both math and intuition in good measure. You don't sacrifice intuition for math, or math for intuition, like most other attempts. This balance you've got here is excellent.
Wow, thank you!
Holy cow, this one is flying high. The guy who figured out all the math must have been on fire!
bam!
@@hawaiicashew3237 and it sets fire to everyone who wants to learn this 🥵🔥
This video deserves an Oscar. Seriously, that was incredible. Infinite BAM!
Thanks! BAM! :)
This channel is so amazing. For the past few months I have been trying to catch up on concepts in statistics that my university never taught, so that I have enough knowledge to go into the data science and machine learning fields.
The way you teach concepts in clear, concise, and short videos is extremely valuable. I have learned so much in such a short time just from watching your videos and taking handwritten notes. Thank you for all the hard work you have put into delivering this invaluable knowledge! Please continue making videos!
Thank you very much! :)
@@statquest Hey Josh, what would be an intuitive way to understand how the SVM uses the high-dimensional relationship between each pair of points to make the actual classification?
@@arunavsaikia2678 This is a good question. The dot product between two points, which we use to create the "high-dimensional relationships," can be interpreted geometrically as the cosine of the angle between the two points multiplied by their magnitudes (their distances from the origin). With that in mind, check out slide 18 in this PDF: web.mit.edu/6.034/wwwbob/svm.pdf
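For anyone who wants to check the geometric interpretation in the reply above numerically, here is a minimal sketch (the two 2-D vectors are made up for illustration). It computes the angle between the vectors independently with `atan2` and confirms that |a|·|b|·cos(θ) matches the dot product:

```python
import math

# Two made-up 2-D points, treated as vectors from the origin
ax, ay = 1.0, 1.0
bx, by = 3.0, 9.0

# The dot product used to compute "high-dimensional relationships"...
dot = ax * bx + ay * by

# ...equals |a| * |b| * cos(theta), with the angle computed independently
theta = math.atan2(by, bx) - math.atan2(ay, ax)
mag_a = math.hypot(ax, ay)
mag_b = math.hypot(bx, by)
geometric = mag_a * mag_b * math.cos(theta)

print(dot, geometric)  # both equal 12.0 (up to floating point)
```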
My university skimmed over the RBF, but then had a 15-mark question about it on the midterm. Now I'm studying for finals, and I wish I had had this video for the midterm.
Yes please
The mathematical reasoning behind the radial kernel has been plaguing me for so long. Finally, after many tries, it's starting to click, and my mind can better visualize what is happening and why. Thank you so much :)
Bam!
I have gone through 85% of the full list and found this series extremely useful. The instructions are simple to understand and give a sufficient overview of machine learning. Highly recommended for starters like me. Looking forward to the advanced parts, e.g. deep learning. Many thanks!
SVMs and the radial kernel are actually pretty advanced, so you've made huge progress. The current series, on XGBoost, is also very advanced. After this, I'll do deep learning and neural networks.
When I was struggling to intuitively understand what a kernel is used for, I found this video in the StatQuest series. It seems to have fixed my shaky comprehension of the kernel! Your video series is one of my favorite explanations of the basics of ML. I'd be so glad if you'd keep making these kinds of interesting videos at your own pace, BAM!
Thank you very much! :)
Now we can eat snacks! Thank you so much, your visual explanation makes things so much easier to understand.
Glad it was helpful!
I bet there is a place in heaven named StatQuest where you're going to live an eternal life.
Thank you very much!!!! :)
* a place between heaven and earth with the biggest margin possible
@@philipkopylov3058 psst, in a flat affine subspace of dimension 2
You are able to make us visualize the abstract part of the mathematics. You are a genius. Thank you so much for all your amazing videos.
Thanks!
The way you explain the math is astounding! I hope you'll continue making videos like this!
Thanks, will do!
Infinite BAM! This is the most understandable ML video I have ever watched. Thank you for sharing this.
Glad it was helpful!
Watching Josh makes me feel like the Flash of statistical concepts. Every concept skipped in stats class becomes crystal clear here.
bam! :)
What an interesting application of the Taylor Series. Such a beautiful explanation, thank you!
This is actually the 3rd place I've seen the Taylor Series in Machine Learning - so it's a super useful trick.
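For anyone curious how the trick discussed above works numerically, here is a minimal sketch (the input value is chosen arbitrarily) of the Taylor series of e^x around 0. Each extra term brings the partial sum closer to the true value, which is what lets the radial kernel be rewritten as an infinite sum of polynomial terms:

```python
import math

def exp_taylor(x, n_terms):
    # Taylor series of e^x around 0: sum over k of x^k / k!
    return sum(x ** k / math.factorial(k) for k in range(n_terms))

# More terms -> a better approximation of e^1.5
for n in (2, 5, 10, 20):
    print(n, exp_taylor(1.5, n), "vs", math.exp(1.5))
```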
A beautiful video, I had tears of joy after watching this. Sir you are amazing!
Wow, thank you!
This man just answered questions I didn’t even know I had!😂 Excellent job thank you for the videos!
Happy to help!
Nobody explains the concepts better than you do. I have to study ML for a project, and I haven't found a channel better than yours. That is why I have a request: please make a video on Support Vector Regression.
I was finding it hard to understand the concept of the RBF, and this video helped me immensely. Thank you, Josh, for the amazing work that you are doing.
Thank you very much! :)
Your videos are like magic, making such a difficult derivation look so easy. God bless you.
Thanks a lot 😊!
Damn, good job dude. At first I felt like I was being talked down to, but eventually grew to like it lol. You're way better at teaching this stuff than my professor is.
Glad you like it. I try to teach the way I want to be taught myself. I'm not super good at this stuff, so I try to keep it simple.
The calculation noises are so realistic, and this was a horizon-widening experience.
BAM!
The best machine learning statistics video. I came here confused from a Coursera data science course taught by U-Mich faculty, and this video does it 100,000x better. Thank you so much!
Awesome!!! I'm glad my video was helpful. :)
Thank you for making one of the best videos out there for understanding SVMs (and log-likelihood maximization, and countless other concepts). I am going to make a good contribution to your Patreon once I start earning, because you so, so deserve it, omigosh.
nananananananana StatQuest!
Thank you very much! :)
The math for this is very complex in the books, but watching this video helped me get a good idea of the concept, so now when I read those books, I'm able to understand how the math works.
Your visual way of explaining is easy to understand! Thank you so much for these videos; they are very helpful :)
Happy to help! :)
Every time I watch your visualized explanations, I am just amazed.
Thank you!
Oh man, thank you for your videos. I mean, you're really awesome. You not only explain the concepts, but you also keep it real and fun. I have learned a lot from you; when I have money, I will donate every penny of it.
Wow, thanks!
You are the best math professor I have ever had. Thanks a lot!!
Wow, thanks!
What are you Josh? Clear - Done, Concise- Done, Amazing -Done, Infinite BAM!!
HOORAY! :)
Not gonna lie, I have read a few other books to understand how the RBF computes the relationship between data points in infinite dimensions... none of them is as simple and comprehensive as your video.
Thanks a lot
Thank you!
I find some of the beep boop sounds a bit cringe, but it's crazy how good you are at explaining and showing things step by step. Thank you so much!
bam! :)
This video, and in fact the whole machine learning playlist, is amazing. Your way of teaching makes it so easy to understand the mathematics behind these concepts. Don't ever stop making these videos!
Thank you very much!!!! :)
3-hour lectures in 15 minutes, and it's super funny. Super BAM for StatQuest.
Thanks!
One of the most clearly explained proofs I've seen in a while
Thank you very much! :)
As I continue watching your video the satisfaction of understanding BAMSS EXPONENTIALLLY!!!
:)
Thanks for creating this amazing video. After watching the lecture on the RBF from Caltech, I was so lost and felt so bad, since it was the first concept that I didn't understand at all. Your video gave some good intuition for why it works and how. Thank you, StatQuest :D
Awesome! :)
Thank you so much for making all these ML and stats terms so understandable! Great work!
My pleasure!
wow, god bless you
we need good teachers like you
Thank you very much! :)
There is so much effort put into making these videos, and they have come out so well!!
When you die, you'll leave behind a legacy and be known as a legend!!
Thank you very much! :)
I love your videos!!! I understand this content better than my data science lectures at uni. I hope you keep up the great work; I'm officially going to get some StatQuest merch to support this channel.
Awesome! :)
Amazing video, this saved me for my ML midterm. THANK YOU.
Glad it helped!
this video put an instant smile on my face
Wow, thank you!
Such a nice, crystal-clear explanation!! Awesome job!!!
:)
Very clear explanations, and far better than the videos on Udemy!!
Thank you! :)
The initial singing and the double, triple, quadruple BAMs grow on you. I didn't like them much at first, but they are now an essential part of the learning experience for me.
bam
This explanation made it look too easy. Good job. Thanks for making this video.
Thank you! :)
This video is soooooo amazing, I have learned a lot from your videos and they are really funny and incredible!
Thanks!
Thanks!
HOORAY! Thank you so much for supporting StatQuest!!! BAM! :)
I don't understand much English, but I can tell that you have a lot of fun teaching.
Thank you very much! :)
but how underrated is this video
:)
"pipipu pipipu" hits every time 😂
bam! :)
Wow, wow, wow. The relationship between two objects in infinite dimensions. Absolutely beautiful and amazing. Thanks for ML and you :)
Thanks!
I wish I could be taught by you in person. I know nothing about machine learning, and I am going through some of the topics for my internship, and I cannot tell you how easy you are making things for me. Quadruple BAM!!
Bam! I'm glad you enjoy the videos. :)
If I ever get a job in data science, it'll be thanks to this guy.
bam! :)
The guy who came up with RBF is genius.
yep
Thanks!
WOW! Thank you very much for supporting StatQuest!!! TRIPLE BAM! :)
Thanks for the wonderful video! It really helps both with forming the intuition and with connecting the key math concepts together!
Thanks! :)
Hey man, I have to thank you a lot for describing these things so well !
Thank you very much !
Thank you!
Holy shit, I didn't expect a series expansion to come at the end. So cool.
bam! :)
I have benefited so much from your videos. From a Chinese Ph.D.
Awesome! :)
Thank you very much for this video! I learnt a lot from this step-by-step math guide! Great to eat snacks too!
Double bam! :)
At 15:12 you should have a nuclear BAM!!!!!!! for such a revelation. Awesome series; loved every part. Thanks for your good work.
Awesome!
I love this channel. Much love! :)
Thank you! :)
Just amazing stuff, man. God bless you. Love from India..!!
Thank you very much! :)
This channel is absolute gold! Thanks for your help, mate, and also, you should consider teaching mathematics too.
I'll keep that in mind! :)
Worth spending time on! Thank you, Josh!
Thank you!
you deserve 100m subscribers
Thank you!
Thank you... you make things super easy to understand. Amazingly good.
Thanks!
you saved my grades in data mining and machine learning courses
Hooray!
Please take our professor's job. We need you.
:)
This is such a great lecture!!
Thanks! :)
No words for you sir
You are great!
Bam! :)
This was so funny and educational, thanks man
Thanks!
Wow. Just WOW. Hella good explanation!
Glad you liked it!
+ 3 views, thanks for awesome tutorials Josh.
Thanks!
I sure am going to eat some snacks haha
When I finally understood what that 0.11 actually was, I was like 👁👄👁.
Thank you for this jaw-dropping explanation, Mr. Josh
Thanks!
god bless, this channel is amazing
Thank you! :)
Amazing teaching! Thank you sooo much!
Glad it was helpful!
GREAT MAN, GREAT CHANNEL.
Thank you so much 👍
15:18 I would say "INFINITE BAM!!!"
YES!
I was SO hoping for that to happen! hahaha I was expecting this part to be the largest BAM he ever did hahah
@Eyal Barazan I would recommend starting with the first video in this series: ruclips.net/video/efR1C6CvhmE/видео.html
Wow, this actually rocks, but I got lost after the Taylor series expansion of e^x. Nice try; I'll be back here later!
Take your time. It's a lot of material to cover.
@@statquest definitely, thanks for the quests and also replies, you help new data scientists to grow!
Namaste, you are the best person.
Thank you! :)
Can't believe I finally used my knowledge of Taylor series expansions. Thanks for not letting that precious brain space go to waste.
BAM! The Taylor series actually pops up a bunch in machine learning (Gradient Boost and XGBoost etc.)
You're a life saver.
bam!
I would have ended the song like "...I know that it sounds kinda Crazy but its actually NOT THAT LOUSY." Great video
Noted!
I Love the videos you make keep up the good work!! BAM!!
Thanks! Will do!
Ahhh, thank you, electronic engineering, for having difficult mathematics. It makes it easy to branch out into more statistical domains, such as machine learning, and still be able to keep up. It also equips me with other techniques, such as Fourier and Laplace transforms, which can be useful in data analysis and feature extraction. Great derivation, btw.
Thank you!
Awesome explanation 👍
Thanks!
Wtf, I love that part at 9:25. You enlightened me.
:)
This blew my mind. Love you, Josh, thank you!
Thanks!
you should collab with Phoebe to create some StatQuest jingles!!!! love it
:)
I learned about the RBF from Gaussian processes, and it seems the idea of a "kernel" has numerous applications!
Yes!
Hello,
at 14:34 you multiply both parts of the dot product by the square root of the first term to write the radial kernel as a single dot product. Why do you do that? I cannot get it :(
We do this to show how the RBF kernel is equivalent to a polynomial kernel with infinite dimensions.
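The equivalence in the answer above can be checked numerically. Here is a sketch (with gamma fixed at 1/2 and made-up 1-D observations) that truncates the infinite-dimensional feature map arising from the Taylor expansion and shows that its dot product converges to the exact radial kernel value as more dimensions are kept:

```python
import math

def rbf_exact(a, b):
    # Radial kernel with gamma = 1/2: exp(-(a - b)^2 / 2)
    return math.exp(-0.5 * (a - b) ** 2)

def phi(x, n_terms):
    # Truncated infinite-dimensional feature map from the Taylor
    # expansion: phi(x)_k = exp(-x^2/2) * x^k / sqrt(k!)
    scale = math.exp(-0.5 * x ** 2)
    return [scale * x ** k / math.sqrt(math.factorial(k)) for k in range(n_terms)]

def rbf_truncated(a, b, n_terms):
    # Dot product of the truncated feature maps
    return sum(pa * pb for pa, pb in zip(phi(a, n_terms), phi(b, n_terms)))

a, b = 1.0, 2.5
for n in (1, 3, 10, 30):
    print(n, rbf_truncated(a, b, n), "vs exact", rbf_exact(a, b))
```

This works because exp(-(a-b)²/2) factors into exp(-a²/2)·exp(-b²/2)·exp(ab), and exp(ab) expands into the infinite polynomial sum of (ab)^k/k!.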
You help me understand what my professor says in my ML class in Mandarin. Thanks!
:)
Just Loved the way you explained the proof using Taylor series : ]
Thanks! :)
This man is genius
Thanks!
I love the "beep boop boop" part of this video !! 🤣
:)
OK, I got what the dimensional relationship means now. You're the best.
I just regret not finding your channel during my degree.
better late than never! :)