Introduction to Ensembles

Comments • 14

  • @anudishjain
    @anudishjain 4 years ago +9

    The error should be less than 0.5 so that accuracy is greater than 0.5 for 2-Class Classification at each weak learner.
    The conditions at 7:42 and 12:54 are contradictory.

    • @veeravignesh18
      @veeravignesh18 4 years ago +3

      At 7:42 she meant accuracy should be greater than 0.5, and at 12:54 she meant error should be less than 0.5.

  • @umang9997
    @umang9997 1 year ago +1

    7:42
    I think there is a correction: it works well only if the individual classifiers'
    error rate is < 0.5 and their errors are independent.
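The error-rate condition discussed in this thread can be checked numerically. Below is a minimal sketch, assuming a majority vote over n independent two-class classifiers that all share the same error rate p (the function name is my own, not from the video):

```python
from math import comb

def majority_vote_error(p: float, n: int) -> float:
    """Probability that a majority vote of n independent classifiers,
    each with error rate p, is wrong (n odd, two classes)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

print(majority_vote_error(0.3, 25))  # far below 0.3: the ensemble helps
print(majority_vote_error(0.6, 25))  # above 0.6: condition violated, ensemble hurts
print(majority_vote_error(0.5, 25))  # exactly 0.5: no gain at the boundary
```

With p < 0.5 the ensemble error shrinks as n grows, and with p > 0.5 it grows, which is exactly why each weak learner must be better than chance.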

  • @imranh1225
    @imranh1225 2 years ago

    18:40 - the probability should be subtracted from 1 to get P(class = 1), right?

  • @tolifeandlearning3919
    @tolifeandlearning3919 2 years ago

    Awesome

  • @karthikvkannan2839
    @karthikvkannan2839 4 years ago

    Should bias and variance both be low?
    Or should they be somewhere in between, neither low nor high, so that our error will be low?

  • @arka.outside
    @arka.outside 5 years ago

    Thank You Madam ^_^

  • @roshanekka9687
    @roshanekka9687 1 year ago

    Thank You Ma'am

  • @chonavalendez7201
    @chonavalendez7201 4 years ago

    Thank you for the information. I am a graduating student of electrical engineering and I want to learn machine learning. Would you be willing to post another video related to machine learning? Thank you very much.

    • @aikanshpriyam3977
      @aikanshpriyam3977 4 years ago

      You can find her machine learning course here: nptel.ac.in/courses/106/105/106105152/

  • @sreekalyani395
    @sreekalyani395 6 years ago +1

    What does "weak learners" mean?

    • @pacuignis6923
      @pacuignis6923 6 years ago +1

      Suppose that the error rate of any classifier ranges between 0 and 1. Ideally we want classifiers whose error rate is close to 0. Suppose that for one kind of classifier the error rate is 0.4 or even 0.3, i.e. well below 0.5 but still not good enough. Such a classifier is a weak classifier. The goal of ensemble learning is to build a strong classifier, with many weak classifiers as bases, whose error is closer to 0. The bases can be the same kind of classifier trained on subsets of the training data and/or with different hyper-parameters, or they can be totally different kinds of classifiers.

    • @RobTheQuant
      @RobTheQuant 5 years ago +4

      A weak learner is a learner that is not very good at predicting; maybe it can get only 60% of its predictions right. By combining weak learners, you can get a strong one that is good at predicting.
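As a rough illustration of the point in this thread, here is a small simulation; the parameters (51 learners, 60% accuracy, 10,000 trials) are my own choices, not from the video. Each weak learner independently gets the answer right 60% of the time, and the ensemble predicts by majority vote.

```python
import random

random.seed(0)

N_LEARNERS = 51   # odd, to avoid ties in the vote
ACC = 0.6         # each weak learner is right 60% of the time
TRIALS = 10_000

correct = 0
for _ in range(TRIALS):
    # Each learner votes independently; a vote is correct with probability ACC.
    right_votes = sum(random.random() < ACC for _ in range(N_LEARNERS))
    if right_votes > N_LEARNERS // 2:
        correct += 1

print(correct / TRIALS)  # well above the individual accuracy of 0.6
```

The independence assumption is doing the work here: if the learners all made the same mistakes, the vote would gain nothing.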