Just wanted to say, thank you so much for this series. It's really good.
Good lecture, I'm not from an English speaking country, but your spelling was really clear and, most importantly, the lecture itself was linear, with a clear reasoning, had a good pace and was not confusing at all. Thank you very much.
Wow, wow... enlightenment. Every detail explained, and it makes sense. This is pure understanding.
wow wow wow! Lucky to have this lecture!
Around 13:28, one can easily choose an extremely high gamma, making the matrix close to diagonal, with just 1s on the diagonal and vanishingly small elements off the diagonal. This makes the solution w_m = y_m for every m, which is quite obvious: all elements in the sum are small except for the one that is exp(0) = 1. If there are repeating points, there are probably some minor tricks for that too, maybe w_m = y_m / k, where k is the repetition count of the point.
For other values of gamma, it is not obvious that the solution always exists.
In fact, I think you can select a different gamma for each dimension, essentially giving the exponent a diagonal quadratic form, or maybe even an arbitrary diagonalizable form (in some other linear basis), and it will still work.
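A quick numerical check of the large-gamma claim above, as a minimal sketch in NumPy with made-up 1-D data (the gamma values are arbitrary):

```python
import numpy as np

# Toy 1-D data set (made-up points, just for illustration).
X = np.array([0.0, 0.3, 0.7, 1.0])
y = np.array([1.0, -1.0, 1.0, -1.0])

def interpolation_weights(X, y, gamma):
    """Exact RBF interpolation: solve Phi w = y,
    where Phi[n, m] = exp(-gamma * (x_n - x_m)^2)."""
    Phi = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    return np.linalg.solve(Phi, y)

for gamma in (1.0, 10.0, 1000.0):
    print(gamma, interpolation_weights(X, y, gamma))
# As gamma grows, Phi approaches the identity matrix, so the solution
# approaches w_m = y_m, exactly as described in the comment above.
```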
This is exactly what the professor mentioned about the impact of the choice of gamma. With an extremely high value of gamma, you virtually generate a discrete function h(x) that maps h(x_n) = y_n. It has no smoothness, hence resulting in a terrible generalization problem.
Loved the RBF model and Lloyd's algorithm! I even started to better understand SVM in this lecture after you've compared it with RBF and neural networks. Thanks.
Excellent explanation, thanks a lot!
Thank you very much, sir. Your lectures give me more interest in Machine Learning :)
Very good lecture...
The book should also cover these topics. :-)
Thank you for your work and for sharing it.
There are e-chapters for these subjects, as mentioned on the professor's webpage: work.caltech.edu/textbook.html
hi, I've just found the additional chapters. They're available here: amlbook.com/support.html
@@tungo7941 It seems that the last e-chapter posted is e-Chapter 9 - Learning Aides. Has another one been uploaded? Thanks for your work.
@@fradamor No, I've just found out these chapters, it seems that chapter 9 is the last one. :)
Absolutely well done and definitely keep it up!!! 👍👍👍👍👍
Best online material about RBF!
Thank you for this!
This rocked!
Thanks!
Really helpful!
16:46 For the graphs regarding the gammas, what do the axes represent? My first thought was x and y, but how do we plot the part of the curve where we do not have points, since we only have the points that we are trying to fit the model to? Thanks in advance.
I think the y-axis is h(x) and the x-axis is x, with the data points x_1 through x_N lying on it. Correct me if I'm wrong.
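To the original question: the data points are only needed to fit the weights. Once w is known, h(x) = sum_n w_n exp(-gamma * ||x - x_n||^2) is defined for every x, so the curve is drawn by evaluating h on a dense grid between (and beyond) the points. A minimal sketch with made-up 1-D data:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical training points (x_n, y_n) and a gamma chosen for illustration.
X = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.array([0.0, 1.0, 0.0, -1.0, 0.0])
gamma = 20.0

# Fit: solve Phi w = y with Phi[n, m] = exp(-gamma * (x_n - x_m)^2).
Phi = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
w = np.linalg.solve(Phi, y)

# Evaluate h on a dense grid -- this fills in the curve between the data points.
grid = np.linspace(-0.2, 1.2, 400)
h = np.exp(-gamma * (grid[:, None] - X[None, :]) ** 2) @ w

plt.plot(grid, h, label="h(x)")
plt.scatter(X, y, color="red", label="training points (x_n, y_n)")
plt.xlabel("x"); plt.ylabel("h(x)"); plt.legend()
plt.show()
```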
Excellent explanation, thanks a lot :)
Outstanding lecture Professor!
This video is awesome. Really helpful
Excellent!!!!
Great Lecture! Thanks a lot!
fantastic
RBF 17:40
Thanks for this input :-)
Good speaker, so clear.
Amazing! Excellent!
This guy is good! Excellent.
Wow, Thank you Sir!
Thanks a lot
👍👍👍👍👍👍👍👍👍
How could SVM be expanded to non-binary classification, and does it still perform as well on non-binary classification as binary classification? I know he said that they don't perform as well on real-valued functions, but does that refer to anything that isn't binary?
As far as I understand, SVMs can definitely be extended to non-binary (i.e., multi-class) classification, and they still perform about as well as they would on a binary classification problem.
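Right; the usual reduction is one-vs-rest (or one-vs-one): train one binary SVM per class and pick the class whose classifier is most confident. A minimal one-vs-rest sketch, assuming scikit-learn's SVC and made-up 2-D data with three classes:

```python
import numpy as np
from sklearn.svm import SVC

# Made-up 2-D data with three classes, just to illustrate one-vs-rest.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.5, size=(30, 2)) for c in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 30)

# One binary SVM per class: "this class" vs. "everything else".
classifiers = []
for cls in np.unique(y):
    clf = SVC(kernel="rbf", gamma=1.0).fit(X, (y == cls).astype(int))
    classifiers.append(clf)

# Predict by taking the class whose binary SVM gives the largest margin score.
def predict(Xnew):
    scores = np.column_stack([clf.decision_function(Xnew) for clf in classifiers])
    return scores.argmax(axis=1)

print(predict(np.array([[0.1, 0.2], [2.8, 0.1], [0.2, 2.9]])))
```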
Center points: 0, 0.5, 1
Evaluation points: 0, 0.5, 0.6, 0.7, 0.8, 1
Shape parameter: 3
Function: e^(sin πx)
Solve this using the MQ (multiquadric) RBF interpolation technique.
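In case it helps, here is a sketch of that exercise in Python, assuming the multiquadric convention phi(r) = sqrt(r^2 + c^2) with shape parameter c = 3 (conventions differ, e.g. sqrt(1 + (c r)^2), so adjust to whatever your course uses):

```python
import numpy as np

centers = np.array([0.0, 0.5, 1.0])
eval_pts = np.array([0.0, 0.5, 0.6, 0.7, 0.8, 1.0])
c = 3.0                                    # shape parameter
f = lambda x: np.exp(np.sin(np.pi * x))    # target function e^(sin(pi x))

def mq(r, c):
    # One common multiquadric convention; others use sqrt(1 + (c*r)**2).
    return np.sqrt(r**2 + c**2)

# Interpolation: solve A w = f(centers), where A[i, j] = phi(|x_i - x_j|).
A = mq(np.abs(centers[:, None] - centers[None, :]), c)
w = np.linalg.solve(A, f(centers))

# Evaluate the interpolant at the evaluation points and compare with the true values.
B = mq(np.abs(eval_pts[:, None] - centers[None, :]), c)
approx = B @ w
print(np.column_stack([eval_pts, approx, f(eval_pts)]))
```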
Great!!
thanks
I think the conjecture is valid for an (n-1)-dimensional space; the proof for an infinite-dimensional space is still awaited... if I'm wrong, I apologise.
Very nice lecture, thanks for sharing.
SVM is inherently a binary classifier, although there are techniques to use it for non-binary classification as well. Is the same true for RBF? The classification example in this lecture was also binary.
From the regularization variational problem, does one get the one with movable centers or does one get the one with data points as centers?
It does not automatically give you the clusters. So yes, it is RBF with the data points as centers.
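For what it's worth, the movable-centers version from the lecture picks K centers first (e.g. with Lloyd's algorithm / K-means, which uses only the inputs) and then fits the weights by least squares; only the data-points-as-centers version falls out of the variational problem directly. A minimal sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(50, 1))          # made-up inputs
y = np.sin(2 * np.pi * X[:, 0])              # made-up targets
K, gamma = 5, 50.0

# Lloyd's algorithm: alternate between assigning points to the nearest
# center and moving each center to the mean of its assigned points.
centers = X[rng.choice(len(X), K, replace=False)]
for _ in range(20):
    assign = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
    for k in range(K):
        if np.any(assign == k):
            centers[k] = X[assign == k].mean(axis=0)

# RBF features on the K centers; weights by least squares (pseudo-inverse).
Phi = np.exp(-gamma * ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1))
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(w)
```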
Requesting an episode: an in-depth analysis of Montage Technology (澜起科技), the mainland China DDR5 memory interface chip company, based on its official website.
The video could have been better if you had shown x, x_n, and y_n on a graph during the initial minutes. It was still helpful, but I had to struggle a lot before catching up with all the notation used, as I don't have a mathematical background.