Bagging and Boosting

  • Published: 31 Aug 2016

Comments • 37

  • @uchennanwosu5327
    @uchennanwosu5327 2 years ago +2

    You are absolutely the best teacher I have run into since my foray into data science. Bravo!

  • @tigrayrimey6418
    @tigrayrimey6418 2 years ago +6

    Correction! @8:30: The probability of an instance being excluded (never selected) across the n draws is (1 - 1/n)^n. So it needs a little bit of correction.
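
    The corrected exclusion probability above can be checked numerically. A short Python sketch (the variable names and choices of n are illustrative, not from the lecture) shows (1 - 1/n)^n approaching 1/e ≈ 0.368 as n grows:

```python
import math

# P(a fixed instance is never picked in n draws with replacement) = (1 - 1/n)^n
for n in (8, 100, 10_000):
    p_excluded = (1 - 1 / n) ** n
    print(f"n={n:>6}: P(excluded) = {p_excluded:.4f}")

print(f"limit 1/e        = {1 / math.e:.4f}")
```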

  • @SaurabhGupta-id4tw
    @SaurabhGupta-id4tw 5 years ago

    Liked the way the weights in Boosting were explained

  • @capuleto126
    @capuleto126 6 years ago +4

    Thank you, dear Sudesha. It is very good to see women teaching ML topics; most of the videos are from men.

  • @sripatimukhopadhyay5504
    @sripatimukhopadhyay5504 3 years ago +1

    Thank you Madam for such a lecture on this topic.
    Prof Sripati, AoT

  • @tolifeandlearning3919
    @tolifeandlearning3919 2 years ago

    Awesome.

  • @anirudhsrivastava3530
    @anirudhsrivastava3530 2 years ago

    Ma'am, where are you now? 2022 needs you

  • @Rochanism
    @Rochanism 6 years ago

    Thank you very much, ma'am

  • @PratikPrajapati84
    @PratikPrajapati84 5 years ago +4

    Is there any playlist for these lectures, or do we have to do boosting in random forest?

    • @aditya16688
      @aditya16688 4 years ago

      haha!!

    • @deepakkumarshukla
      @deepakkumarshukla 4 years ago

      A bit late, but the complete lecture series is available at
      nptel.ac.in/courses/106/105/106105152/

  • @ashifkasala6897
    @ashifkasala6897 6 years ago +3

    Ma'am, you are a god; I passed this subject because of you only... Guruji, you are great

  • @operatingsystem4996
    @operatingsystem4996 1 year ago

    Good explanation, madam

  • @saurabhchoudhary4572
    @saurabhchoudhary4572 8 months ago

    Ma'am, please consider the feedback in the comments section; you are hampering the image of IIT

  • @harshalbiradar569
    @harshalbiradar569 6 years ago

    Please, ma'am, provide notes for every topic

  • @pragun1993
    @pragun1993 5 years ago

    Wrong at 6:20. Replacement means that, for every bag, samples are drawn from the complete dataset regardless of what went into the previous bag. It does not mean that inside each bag you will have repeated observations.

    • @baapuji
      @baapuji 4 years ago +2

      She never said that "each bag must have repeated observations"; it's just a random example she took. She could have taken an example where none of the items repeated.
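
      The point about replacement can be illustrated with a small sketch (the toy dataset contents and the seed are my own choices, not from the lecture): each bag is drawn independently from the full dataset, so duplicates within a bag are possible but not required.

```python
import random

random.seed(0)  # reproducible illustration only

dataset = list(range(1, 9))  # a toy dataset of 8 instances

# Sampling *with replacement*: every draw, for every bag, is made from the
# complete dataset -- earlier bags remove nothing from the pool.
bags = [[random.choice(dataset) for _ in range(len(dataset))] for _ in range(3)]

for i, bag in enumerate(bags, start=1):
    print(f"bag {i}: {sorted(bag)}")
```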

  • @siddharthmagadum16
    @siddharthmagadum16 2 years ago +1

    7:59 How come the probability will be as mentioned? If we are considering both the sample size and the dataset size as n, then all the instances are chosen, so the probability will be 1, right?

    • @augustine8142
      @augustine8142 9 months ago

      I also have the same doubt

  • @joowlee
    @joowlee 6 years ago +2

    At 2:43, won't the learner underfit, not overfit, with the small dataset?

    • @joowlee
      @joowlee 6 years ago

      Vivek Viswanath got it.. thanks!

  • @sunderrajan6172
    @sunderrajan6172 7 years ago +1

    Why are the lectures all over the place? There is no sequence. If a title were given for each video, it would be easy to search for and view what is required.

    • @joelphilip2942
      @joelphilip2942 3 years ago +1

      You can follow this course: nptel.ac.in/courses/106/105/106105152/

  • @sureshch3667
    @sureshch3667 5 years ago

    Ma'am, I invented a bagging machine; I have some doubts

  • @debojyotisaha196
    @debojyotisaha196 6 years ago +3

    At 8:30, the probability will be 1 minus that quantity

    • @gregoryjester5167
      @gregoryjester5167 5 years ago +9

      That's what confused me when I watched. The formula she mentioned should be the probability that it is excluded rather than included, in my opinion.

    • @AS-hm9km
      @AS-hm9km 5 years ago +2

      @@gregoryjester5167 You're right

    • @jaggu6409
      @jaggu6409 4 years ago +1

      The probability will be 1/8

    • @umang9997
      @umang9997 1 year ago

      @@gregoryjester5167 Yes! Thank you!

  • @bosepukur
    @bosepukur 4 years ago +1

    The formula is wrong; it should be 1 - (1 - 1/n)^n

    • @kedarnath4777
      @kedarnath4777 4 years ago

      No, this is the equation for the non-existence of a data point in a sample set, while she mentioned the existence equation.

    • @AmitSingh-jo8ob
      @AmitSingh-jo8ob 4 years ago +2

      Yes, the formula at 8:55 is wrong. It indicates the case where one sample is not selected in Di.
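
      The thread's conclusion, that P(included) = 1 - (1 - 1/n)^n rather than (1 - 1/n)^n, can be verified with a quick Monte Carlo sketch (the sizes and seed below are my own choices):

```python
import random

random.seed(42)
n = 500        # dataset size and bootstrap-sample size
trials = 2000  # number of bootstrap samples to simulate

# Count how often instance 0 is absent from a bootstrap sample of n draws.
excluded = sum(
    all(random.randrange(n) != 0 for _ in range(n))
    for _ in range(trials)
)

empirical = excluded / trials
theoretical = (1 - 1 / n) ** n  # -> 1/e ~ 0.368 for large n
print(f"empirical  P(excluded) = {empirical:.3f}")
print(f"theoretical (1-1/n)^n  = {theoretical:.3f}")
print(f"P(included) = {1 - theoretical:.3f}")  # the corrected quantity, ~0.632
```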

  • @asjad4u
    @asjad4u 6 years ago +1

    Horrible English. Could not understand some words.

    • @awesome_harish
      @awesome_harish 5 years ago +5

      @Data Crunching, well said. Every 1 in 10 people we meet comment about the language rather than the subject.
      @asjad4u, if the meaning is understood, forget about grammar