You are absolutely the best teacher I have run into since my foray into data science. Bravo!
Correction! @8:30: The probability of an instance getting excluded, i.e. not being selected in any of the n draws, is (1 - 1/n)^n. So it needs a little bit of correction.
True!
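If anyone wants to check this numerically, here's a quick Monte Carlo sketch (my own, using numpy; the dataset size n and the trial count are arbitrary choices) comparing the simulated exclusion probability with (1 - 1/n)^n:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100           # dataset size == bootstrap sample size (arbitrary)
trials = 100_000  # Monte Carlo repetitions

# Count how often data point 0 is absent from a bootstrap sample of size n.
excluded = sum(0 not in rng.integers(0, n, size=n) for _ in range(trials))

print("simulated P(excluded):", excluded / trials)  # ~0.366
print("formula (1 - 1/n)^n:  ", (1 - 1 / n) ** n)   # 0.3660...
print("limit 1/e:            ", np.exp(-1))         # 0.3678...
```

For n = 100 both come out near 0.366, already close to the 1/e ≈ 0.368 limit.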
Ma'am, at 8:21 how can both the sample size and the total dataset size be n? A sample should always be smaller than the full dataset.
Thank you, dear Sudesha. It is very good to see women teaching ML topics; most of the videos are from men.
Thank you Madam for such a lecture on this topic.
Prof Sripati, AoT
Ma'am, you are a god. I passed this subject because of you only... Guruji, you are great.
7:59 How come the probability will be as mentioned? If we are considering both the sample size and the dataset size as n, then all the instances are chosen, so the probability will be 1, right?
I also have the same doubt
Wrong at 6:20. Replacement means that for every bag, samples are drawn from the complete dataset regardless of what went into previous bags. It does not mean that inside each bag you will have repeated observations.
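For reference, here is what standard bootstrap sampling (the sampling used in bagging) looks like in code — a minimal numpy sketch with a hypothetical 10-point toy dataset. Each bag is drawn from the complete dataset independently of the other bags, and because replace=True, a single bag can also contain repeats:

```python
import numpy as np

rng = np.random.default_rng(42)
data = np.arange(10)  # hypothetical toy dataset of 10 points

# Each bag is an independent draw from the FULL dataset (not from what is
# left over after previous bags), and because replace=True, the same point
# can appear more than once inside a single bag.
for b in range(3):
    bag = rng.choice(data, size=len(data), replace=True)
    print(f"bag {b}: {sorted(bag)}")
```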
good explanation madam
Liked the way the weights in Boosting were explained.
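For anyone revisiting that part of the lecture, the reweighting idea can be sketched roughly like this (a minimal AdaBoost-style update; the function name and the toy labels are my own, not from the video):

```python
import numpy as np

# AdaBoost-style reweighting: misclassified points get heavier weights so
# the next weak learner focuses on them. Labels/predictions are in {-1, +1}.
def update_weights(w, y_true, y_pred):
    err = np.sum(w * (y_true != y_pred)) / np.sum(w)  # weighted error rate
    alpha = 0.5 * np.log((1 - err) / err)             # this learner's vote
    w = w * np.exp(-alpha * y_true * y_pred)          # up-weight mistakes
    return w / w.sum(), alpha                         # renormalize to sum 1

w = np.full(5, 1 / 5)                  # start with uniform weights
y_true = np.array([1, 1, -1, -1, 1])
y_pred = np.array([1, -1, -1, 1, 1])   # two mistakes (indices 1 and 3)
w, alpha = update_weights(w, y_true, y_pred)
print("alpha:", round(alpha, 3), "new weights:", np.round(w, 3))
```

Misclassified points end up with larger weights after normalization, so the next weak learner concentrates on them.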
Ma'am, where are you now? 2022 needs you.
Awesome.
Is there any playlist for these lectures, or do we have to search for the boosting and random forest videos separately?
Haha!!
A bit late, but the complete lecture series is available at nptel.ac.in/courses/106/105/106105152/
Ma'am, please consider the feedback in the comments section; you are hampering the image of IIT.
At 2:43, won't the learner underfit, rather than overfit, with a small dataset?
Vivek Viswanath, got it... thanks!
Why are the lectures all over the place? There is no sequence. If each video were given a title, it would be easy to search for and view what is required.
You can follow this course at nptel.ac.in/courses/106/105/106105152/
At 8:30 the probability will be 1 - that quantity
That's what confused me when I watched. The formula she was mentioning should be the probability that it is excluded rather than included in my opinion.
@gregoryjester5167 You're right.
The probability will be approximately 1/e.
@gregoryjester5167 Yes! Thank you!
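Plugging a few values of n into the disputed formula shows where 1/e comes from (a quick sketch; the n values are arbitrary):

```python
import math

# P(a given point is NOT picked in any of n draws) = (1 - 1/n)^n,
# so P(it IS in the bootstrap sample) = 1 - (1 - 1/n)^n -> 1 - 1/e ~ 0.632.
for n in (10, 100, 1000, 10000):
    p_out = (1 - 1 / n) ** n
    print(f"n={n:>5}: excluded={p_out:.4f}, included={1 - p_out:.4f}")
print(f"limit: 1/e = {math.exp(-1):.4f}")
```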
Thank you very much, Ma'am.
Ma'am, please provide notes for every topic.
Ma'am, I have invented a bagging machine; I have some doubts.
The formula is wrong; it should be 1 - (1 - 1/n)^n.
No, that is the equation for a data point not existing in a sample set, while she presented it as the equation for its existence.
Yes, the formula at 8:55 is wrong. It indicates the probability that a sample is not selected in D_i.
Horrible English. I could not understand some words.
@Data Crunching, well said. Every 1 in 10 people we meet comments about the language rather than the subject.
@asjad4, if the meaning is understood, forget about grammar.