Thank you for this great video. It has helped me immensely !
NPTEL is serving nation in true sense, we should learn from them.
Machine Learning by Prof. Sudeshna Sarkar
Basics
1. Foundations of Machine Learning (ruclips.net/video/BRMS3T11Cdw/видео.html)
2. Different Types of Learning (ruclips.net/video/EWmCkVfPnJ8/видео.html)
3. Hypothesis Space and Inductive Bias (ruclips.net/video/dYMCwxgl3vk/видео.html)
4. Evaluation and Cross-Validation (ruclips.net/video/nYCAH8b5AQ0/видео.html)
5. Linear Regression (ruclips.net/video/8PJ24SrQqy8/видео.html)
6. Introduction to Decision Trees (ruclips.net/video/FuJVLsZYkuE/видео.html)
7. Learning Decision Trees (ruclips.net/video/7SSAA1CE8Ng/видео.html)
8. Overfitting (ruclips.net/video/y6SpA2Wuyt8/видео.html)
9. Python Exercise on Decision Tree and Linear Regression (ruclips.net/video/lIBPIhB02_8/видео.html)
Recommendations and Similarity
10. k-Nearest Neighbours (ruclips.net/video/PNglugooJUQ/видео.html)
11. Feature Selection (ruclips.net/video/KTzXVnRlnw4/видео.html )
12. Feature Extraction (ruclips.net/video/FwbXHY8KCUw/видео.html)
13. Collaborative Filtering (ruclips.net/video/RVJV8VGa1ZY/видео.html)
14. Python Exercise on kNN and PCA (ruclips.net/video/40B8D9OWUf0/видео.html)
Bayes
16. Bayesian Learning (ruclips.net/video/E3l26bTdtxI/видео.html)
17. Naive Bayes (ruclips.net/video/5WCkrDI7VCs/видео.html)
18. Bayesian Network (ruclips.net/video/480a_2jRdK0/видео.html)
19. Python Exercise on Naive Bayes (ruclips.net/video/XkU09vE56Sg/видео.html)
Logistic Regression and SVM
20. Logistic Regression (ruclips.net/video/CE03E80wbRE/видео.html)
21. Introduction to Support Vector Machine (ruclips.net/video/gidJbK1gXmA/видео.html)
22. The Dual Formulation (ruclips.net/video/YOsrYl1JRrc/видео.html)
23. SVM Maximum Margin with Noise (ruclips.net/video/WLhvjpoCPiY/видео.html)
24. Nonlinear SVM and Kernel Function (ruclips.net/video/GcCG0PPV6cg/видео.html)
25. SVM Solution to the Dual Problem (ruclips.net/video/Z0CtYBPR5sA/видео.html)
26. Python Exercise on SVM (ruclips.net/video/w781X47Esj8/видео.html)
Neural Networks
27. Introduction to Neural Networks (ruclips.net/video/zGQjh_JQZ7A/видео.html)
28. Multilayer Neural Network (ruclips.net/video/hxpGzAb-pyc/видео.html)
29. Neural Network and Backpropagation Algorithm (ruclips.net/video/T6WLIbOnkvQ/видео.html)
30. Deep Neural Network (ruclips.net/video/pLPr4nJad4A/видео.html)
31. Python Exercise on Neural Networks (ruclips.net/video/kTbY20xlrbA/видео.html)
Computational Learning Theory
32. Introduction to Computational Learning Theory (ruclips.net/video/8hJ9V9-f2J8/видео.html)
33. Sample Complexity: Finite Hypothesis Space (ruclips.net/video/nm4dYYP-SJs/видео.html)
34. VC Dimension (ruclips.net/video/PVhhLKodQ7c/видео.html)
35. Introduction to Ensembles (ruclips.net/video/nelJ3svz0_o/видео.html)
36. Bagging and Boosting (ruclips.net/video/MRD67WgWonA/видео.html)
Clustering
37. Introduction to Clustering (ruclips.net/video/CwjLMV52tzI/видео.html)
38. Kmeans Clustering (ruclips.net/video/qg_M37WGKG8/видео.html)
39. Agglomerative Clustering (ruclips.net/video/NCsHRMkDRE4/видео.html)
40. Python Exercise on k-means Clustering (ruclips.net/video/qs7vES46Rq8/видео.html)
Tutorial I (ruclips.net/video/uFydF-g-AJs/видео.html)
Tutorial II (ruclips.net/video/M6HdKRu6Mrc/видео.html )
Tutorial III (ruclips.net/video/Ui3h7xoE-AQ/видео.html)
Tutorial IV (ruclips.net/video/3m7UJKxU-T8/видео.html)
Tutorial VI (ruclips.net/video/b3Vm4zpGcJ4/видео.html)
Solution to Assignment 1 (ruclips.net/video/qqlAeim0rKY/видео.html)
36:07 In the way it is shown in the slides, I think that if the kernel width is large, we are effectively considering a smaller region (instead of a larger one), since the weights will be more damped/smaller. Correct me if I am wrong.
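A quick numerical check may help settle this. Below is a minimal sketch assuming the standard Gaussian kernel form w = exp(−d² / 2h²), where h is the kernel width (the slides may define the width differently, e.g. as a multiplier inside the exponent, which would flip the intuition; the function name `gaussian_weight` and the sample values are my own, not from the lecture):

```python
import math

def gaussian_weight(d, h):
    """Standard Gaussian kernel: weight of a neighbour at distance d,
    with kernel width (bandwidth) h."""
    return math.exp(-d**2 / (2 * h**2))

# Weight given to a neighbour at fixed distance 2.0, for increasing widths.
# With this convention, a larger h damps the weights LESS, so distant
# points matter more and the effective region grows.
for h in [0.5, 1.0, 2.0, 4.0]:
    print(f"h = {h}: weight = {gaussian_weight(2.0, h):.4f}")
```

Under this convention the weight at a fixed distance increases with h, so a larger kernel width means a larger effective neighbourhood; if the slides instead place the width as a multiplier on the distance, your reading would be correct.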
Thank you Madam!
Thank you M'am!
Thank you Prof.Sudeshna Sarkar & Anirban Santara!
Hi Ma'am,
You are such a great teacher; your teaching method has helped me a lot.
Thank you so much for making such great videos.
Great Session
Great Lecture
Your teaching methodology is simply amazing ma'am.
Nice teaching
Very simple and succinct explanation. Thank you very much, Madam.
Lectures on Machine Learning in English / Hindi: ruclips.net/p/PLGeIxG41Dh351Tapkofz0WktplooH5C6s
Great Instructor , poor cameraman
Machine learning made easy with her
Thank you, dear ma'am.
Should we always have K as an odd value? I ask because we classify x based on the class to which the majority of its nearest neighbours belong. If K is even, there is a possibility that an equal number of neighbours belong to multiple classes.
If you have 2 classes, choose an odd k; in general, k should not be a multiple of the number of classes.
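The tie risk is easy to demonstrate. Here is a minimal sketch of a two-class majority vote (the function `majority_vote` and the labels are my own illustration, not from the lecture): with k = 3 there is always a strict majority, while k = 4 can split 2–2.

```python
from collections import Counter

def majority_vote(neighbour_labels):
    """Return the majority label, or None if the vote is tied."""
    counts = Counter(neighbour_labels).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return None  # tie: no strict majority
    return counts[0][0]

print(majority_vote(['A', 'A', 'B']))       # k=3: 'A'
print(majority_vote(['A', 'A', 'B', 'B']))  # k=4: 2-2 tie -> None
```

In practice libraries break such ties somehow (e.g. by the closer neighbour or by class order), but an odd k avoids the problem entirely in the two-class case.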
I have a question. For a new instance we have only the x value, so basically a point like (x, 0), and it will always be nearest to the lowest y for that x, or to some (x2, 0)-type neighbour.
Please let me know if my question is unclear. The main point is: we don't have y, so how is the distance calculated?
You are a little bit confused, and that's totally okay. For a new instance x we of course don't have y; that is exactly what we are trying to find. Applying kNN with, say, k = 3, we find the 3 nearest points (using Euclidean distance on the x features only), and the majority class among those points tells us what y should be.
You can also think of it like this: if the 3 nearest neighbours have y = 1, 2, 2, then the majority is 2, so the y for the new instance x will also belong to the majority class, which is 2.
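The steps in this reply can be sketched in a few lines. Note that the distance is computed between feature vectors only; the label y never enters the distance (the function `knn_predict` and the toy data are my own illustration, not from the lecture):

```python
import math
from collections import Counter

def knn_predict(train, new_x, k=3):
    """train: list of (x_vector, y_label) pairs.
    Distance uses only the feature vector x -- y plays no part in it."""
    dists = sorted((math.dist(x, new_x), y) for x, y in train)
    top_k = [y for _, y in dists[:k]]          # labels of k nearest points
    return Counter(top_k).most_common(1)[0][0]  # majority vote

train = [((1.0,), 1), ((2.0,), 2), ((2.5,), 2), ((9.0,), 1)]
print(knn_predict(train, (2.2,), k=3))  # neighbours have y = 2, 2, 1 -> predict 2
```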
Thank you, Ma'am, your teaching is very good, but the lectures are not in sequence and some are missing.
How do we get a training error here, when we're not even training?
I guess you just evaluate the classifier on the training set itself: classify each training point by its k nearest neighbours among the training points. For 1-NN the training error is 0, since each point's nearest neighbour is itself, so the prediction is trivially correct.
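This is straightforward to verify with a small sketch (the function `knn_training_error`, its `exclude_self` flag, and the toy data are my own illustration, not from the lecture). Including each point as its own neighbour gives 1-NN training error 0; excluding it gives the leave-one-out style estimate people actually use:

```python
import math
from collections import Counter

def knn_training_error(train, k=1, exclude_self=False):
    """Fraction of training points misclassified when each point is
    classified by its k nearest neighbours in the training set."""
    errors = 0
    for i, (x, y) in enumerate(train):
        dists = sorted(
            (math.dist(x, x2), y2)
            for j, (x2, y2) in enumerate(train)
            if not (exclude_self and j == i)
        )
        top_k = [y2 for _, y2 in dists[:k]]
        pred = Counter(top_k).most_common(1)[0][0]
        errors += (pred != y)
    return errors / len(train)

train = [((0.0,), 'A'), ((1.0,), 'A'), ((5.0,), 'B'), ((6.0,), 'B')]
print(knn_training_error(train, k=1))                     # 0.0: each point matches itself
print(knn_training_error(train, k=1, exclude_self=True))  # leave-one-out estimate
```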
the way she pronounces ALGORITHM is funny! :D :D
Yep. Because you don't know how to pronounce it.
Ma'am is speaking the finest English. Go check out her profile and you'll know how good she is
@@705pratik9 okay!
@@OmkaarMuley Yep take care man. Always wear a mask.
What does "classes are spherical" mean?
It means that within each class, the training data is distributed in a roughly spherical shape around the class centre, i.e. spread similarly in every direction of the feature space.
Plot of decision boundaries using MATLAB:
ruclips.net/video/uql5RbM9GHI/видео.html
You got even the heading of the video wrong, Prof. The algorithm is called k-Nearest Neighbours!
Could you send me a link for downloading your textbook or PDF materials, please, Ma'am? It would be useful for my project work.
You can download the lecture transcripts from the NPTEL site.
Try to give numbering for the lectures.
okk Swathi
Are these videos published by you?
Swathi, what do you mean by numbering?
You mean like the alphabet, the order kids start with?
I'm sorry, please don't mind, but it's like that, and the videos are not published by me.
I didn't mean it like that. I mean, try to publish them in an appropriate order; it makes it easier for others to understand. Thank you.
No thanks