How to find the Entropy and Information Gain in Decision Tree Learning by Mahesh Huddar

  • Published: 28 Sep 2020
    In this video, I will discuss how to find entropy and information gain given a set of training examples in constructing a decision tree.
    Machine Learning - • Machine Learning
    Big Data Analysis - • Big Data Analytics
    Data Science and Machine Learning - • Machine Learning
    Python Tutorial - • Python Application Pro...
    entropy nptel,
    entropy explained,
    entropy data mining,
    entropy data mining example,
    entropy based discretization example data mining,
    entropy machine learning,
    entropy machine learning example,
    entropy calculation machine learning,
    entropy in machine learning in hindi,
    information gain decision tree,
    gain decision tree,
    information gain and entropy in the decision tree,
    information gain,
    information gain and entropy,
    information gain feature selection,
    information gain calculation,
    information gain and Gini index,
    information gain for continuous-valued attributes
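
    For readers who want to compute these quantities themselves, here is a minimal Python sketch (assuming the classic 14-example play-tennis dataset used in the video, with 9 'yes' and 5 'no' labels; the Wind counts below are taken from that dataset):

      import math

      def entropy(counts):
          # Entropy in bits of a class distribution given as raw counts.
          total = sum(counts)
          return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

      def information_gain(parent_counts, subset_counts):
          # Parent entropy minus the weighted average entropy of the subsets.
          total = sum(parent_counts)
          weighted = sum(sum(s) / total * entropy(s) for s in subset_counts)
          return entropy(parent_counts) - weighted

      print(entropy([9, 5]))                             # ~0.940 bits
      # Splitting on Wind: weak = [6 yes, 2 no], strong = [3 yes, 3 no]
      print(information_gain([9, 5], [[6, 2], [3, 3]]))  # ~0.048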

Comments • 80

  • @prateekmishra6245 2 years ago +2

    very lucid explanation. Thanks a ton Prof.

  • @sachinahankari 3 years ago +8

    How the root node depends on the information gain of the attributes... simply super explanation! If you get questions from listeners, you know that they like your videos.

  • @aleksandrg2717 2 years ago +2

    Thank you for the clear explanation!

  • @vighneshchavan9226 3 years ago +2

    Thank you very much sir, completely cleared my doubt.

  • @terryterry3733 3 years ago +24

    Hi sir, this is the best explanation I have seen so far; I shared this link with my friends in my data science class. Thanks for your support. In Issues in Decision Tree Learning, can you please post the answers for the last two questions (handling attributes with different costs, and alternative measures for selecting attributes)?

  • @maheshm8671 3 years ago +2

    Clean and neat explanation. Thank you, sir.

  • @str89z 3 years ago

    thank you. very much appreciated

  • @vanshsharma7498 3 years ago

    Thank you so much, such a helpful video.

  • @siddheshkumbhar389 3 years ago

    Thank u so much sir ❤️🙏🏻

  • @CherryLoveart 3 years ago

    Excellent!

  • @vishaljhaveri6176 2 years ago

    Thank you sir.

  • @itsidiotsdreaminsider3663 3 years ago

    Thank you sir

  • @kelvinkirwa4887 1 year ago +1

    thanks for this wonderful example .

  • @purplishjen 1 year ago +3

    Thank you so much! I was able to submit my assignment in my master's because of this video.

    • @MaheshHuddar 1 year ago +2

      Welcome
      Do like share and subscribe

  • @joaogavazzisp 3 months ago

    Really good explanation, thank you.

    • @MaheshHuddar 3 months ago

      Welcome
      Do like share and subscribe

  • @mahanteshg609 3 years ago

    Very nice

  • @user-wr4yl7tx3w 8 months ago +2

    Your presentation is excellent and clear. Thank you for making these videos available to everyone.

    • @MaheshHuddar 8 months ago +1

      Welcome
      Do like share and subscribe

  • @leemii4363 2 years ago

    Thank you!

    • @MaheshHuddar 2 years ago

      Welcome
      Do like share and subscribe

  • @bhavisri6318 2 years ago

    Superb explanation

    • @MaheshHuddar 2 years ago

      Thank You.
      Do like share and subscribe

  • @apr670 3 months ago

    Fantastic explanation 🎉

    • @MaheshHuddar 3 months ago +1

      Welcome
      Do like share and subscribe

  • @ashutoshmahajan7199 3 years ago +1

    I am a master's student in data science at a German university, and this has helped me thoroughly with my ML exam! Thank you, dear sir!

    • @ankitpathak5 3 years ago

      Brother, brother, brother... I am also marking my attendance here 🤣💁‍♂️ exactly 10 hours before the exam.

    • @inesjesus4577 1 year ago

      @ankitpathak5 And me, 4 hours before the exam xD

  • @EndlesRidge 2 years ago

    Hello Sir, awesome job on this video.

    • @MaheshHuddar 2 years ago

      Thank You
      Do like share and subscribe

  • @testaunni1284 7 months ago

    Thank you very much🙏

    • @MaheshHuddar 7 months ago

      Welcome
      Do like share and subscribe

  • @loveparks 2 years ago +5

    This was so easy to understand!! Thank You so much

    • @MaheshHuddar 2 years ago +1

      Welcome
      Do like share and subscribe

  • @parthsoni7789 1 year ago

    Best explanation

    • @MaheshHuddar 1 year ago

      Thank You
      Do like share and subscribe

  • @poojawarghat261 1 year ago

    Thank you so much sir

  • @morklee8771 2 years ago

    thank you!!

    • @MaheshHuddar 2 years ago

      Welcome
      Do like share and subscribe

  • @avanishkamak9498 2 years ago

    Thank you so much very simply explained

    • @MaheshHuddar 2 years ago

      Welcome
      Do like share and subscribe

  • @samsamsamie 2 years ago

    thanks a lot bro.

    • @MaheshHuddar 2 years ago

      Welcome
      Do like share and subscribe

  • @dr.subhanishaik5022 3 years ago

    good

  • @okeemokee 1 year ago

    Nicely and clearly done. Thanks!

  • @unique_bhanu 5 months ago

    Can you explain this: "One interpretation of entropy from information theory is that it specifies the minimum number of bits of information needed to encode the classification of an arbitrary member of S."?
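
    A short note on that quotation (it appears in Tom Mitchell's Machine Learning textbook): entropy is the average number of bits an optimal code needs to transmit one class label. A quick illustration of the boundary cases in Python:

      import math

      def binary_entropy(p):
          # Entropy in bits of a yes/no distribution with P(yes) = p.
          return -sum(q * math.log2(q) for q in (p, 1 - p) if q > 0)

      print(binary_entropy(1.0))     # 0.0  -> class is certain, no bits needed
      print(binary_entropy(0.5))     # 1.0  -> one full bit per label
      print(binary_entropy(9 / 14))  # ~0.940 -> ~0.94 bits per example on average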

  • @kMuhammadRayanAli 2 years ago

    You are great, brother! (Punjabi: "tussi great ho paji")

    • @MaheshHuddar 2 years ago

      Thank You
      Do like Share and subscribe

  • @archanachinnu3197 3 years ago

    Sir, is this the same for data mining also?

  • @shiva12832 5 months ago +7

    To get Entropy = 0.94, if you computed the terms with natural logs (ln x), you need to divide the whole answer by ln 2, because log2(x) = ln(x) / ln(2).
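
    To make that concrete, a quick check in Python (assuming the 9-yes/5-no split from the video):

      import math

      p_yes, p_no = 9 / 14, 5 / 14

      # Directly in base 2:
      print(-p_yes * math.log2(p_yes) - p_no * math.log2(p_no))  # 0.940...

      # Same value via natural logs, divided by ln 2:
      h_nats = -p_yes * math.log(p_yes) - p_no * math.log(p_no)
      print(h_nats / math.log(2))                                # 0.940...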

  • @subhashsingh8553 2 years ago

    How can this data be used in Python machine learning?
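
    One possible answer, as a hedged sketch: scikit-learn's DecisionTreeClassifier supports criterion='entropy', which selects splits by information gain. The tiny two-attribute subset of the play-tennis data below is only illustrative:

      from sklearn.preprocessing import OrdinalEncoder
      from sklearn.tree import DecisionTreeClassifier

      # Outlook and Wind for six play-tennis days, with the play/no-play label.
      X_raw = [["sunny", "weak"], ["sunny", "strong"], ["overcast", "weak"],
               ["rain", "weak"], ["rain", "strong"], ["overcast", "strong"]]
      y = ["no", "no", "yes", "yes", "no", "yes"]

      enc = OrdinalEncoder()
      X = enc.fit_transform(X_raw)

      # criterion="entropy" makes the tree choose splits by information gain.
      clf = DecisionTreeClassifier(criterion="entropy").fit(X, y)
      print(clf.predict(enc.transform([["overcast", "weak"]])))  # ['yes']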

  • @blessaddo8042 5 months ago

    Ahh, here I am revising for my final data mining exam, wish me luck.

  • @aarushgandhi7324 3 years ago +2

    Since we calculate the entropy of a given dataset from one column only (the classification), can we say the entropy of the dataset remains the same whichever attribute column we consider, as the number of yes and no values stays the same? Please answer fast.

    • @indrajeetjadhav1140 2 years ago

      True. As there is one target variable, the entropy of the whole dataset remains the same. However, when we select a feature, say 'Wind' in this case, the data is divided into two subsets corresponding to 'weak' and 'strong' wind. In each subset, the counts of 'Yes' and 'No' are different, so the entropy of each subset is different.
      Answering your question: the entropy of the subsets depends on the feature we select. If we select 'Humidity' as the feature, the subsets and the corresponding 'Yes'/'No' distributions will be different, and hence so will the entropy (see the sketch just below).
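
      A small sketch of what this reply describes, using the usual play-tennis counts: the parent entropy is fixed at ~0.940, but each feature induces different subsets and therefore a different gain:

        import math

        def entropy(counts):
            total = sum(counts)
            return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

        def gain(parent, subsets):
            total = sum(parent)
            return entropy(parent) - sum(sum(s) / total * entropy(s) for s in subsets)

        parent = [9, 5]  # 0.940 bits, the same for every feature

        # Wind:     weak = [6 yes, 2 no],  strong = [3 yes, 3 no]
        print(gain(parent, [[6, 2], [3, 3]]))  # ~0.048
        # Humidity: high = [3 yes, 4 no],  normal = [6 yes, 1 no]
        print(gain(parent, [[3, 4], [6, 1]]))  # ~0.151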

  • @samratchatterjee972 3 years ago

    Sir, gain is very difficult; please explain it again in a video.

  • @cs-anjan8763 1 year ago

    Thank you sir, tomorrow I have my semester exam 😊

  • @haanbhai3185 1 year ago

    Sir, how do I find entropy using a calculator?
    ...please help
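
    A note for calculator users: most calculators only have log (base 10) and ln, so use log2(x) = log(x) / log(2) = ln(x) / ln(2). For example, log2(9/14) = log(0.6429) / log(2) ≈ -0.1919 / 0.3010 ≈ -0.637.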

  • @amangusain6146 6 months ago

    🎉

  • @bollytainment27 3 years ago +1

    How do I calculate the gain and entropy of continuous data?
    Is it possible to do so?
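
    Yes, it is possible. The usual approach (as in C4.5) is to sort the numeric values, try candidate thresholds (typically midpoints between adjacent values), and keep the threshold with the highest information gain. A hedged sketch with made-up temperature values:

      import math

      def entropy(labels):
          total = len(labels)
          return -sum(labels.count(c) / total * math.log2(labels.count(c) / total)
                      for c in set(labels))

      def best_threshold(values, labels):
          # Try the midpoint between each adjacent pair of sorted values and
          # return (threshold, information gain) for the best binary split.
          pairs = sorted(zip(values, labels))
          parent, n = entropy(labels), len(pairs)
          best = (None, -1.0)
          for (v1, _), (v2, _) in zip(pairs, pairs[1:]):
              if v1 == v2:
                  continue
              t = (v1 + v2) / 2
              left = [l for v, l in pairs if v <= t]
              right = [l for v, l in pairs if v > t]
              g = parent - (len(left) * entropy(left) + len(right) * entropy(right)) / n
              if g > best[1]:
                  best = (t, g)
          return best

      temps = [64, 65, 68, 69, 70, 71, 72, 75, 80, 81, 83, 85]  # made-up values
      labels = ["yes", "no", "yes", "yes", "yes", "no", "no",
                "yes", "no", "yes", "yes", "no"]
      print(best_threshold(temps, labels))  # (threshold, gain)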

  • @user-wr4yl7tx3w 8 months ago

    Actually, I'm not sure if there is a math error: instead of 0.811, should it not be 0.5623?
    ...I see why: you took the log in base 2.
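
    For the record, both numbers describe the same [6+, 2-] subset: -(6/8) log2(6/8) - (2/8) log2(2/8) ≈ 0.811 bits, and the same expression in natural logs gives 0.811 × ln 2 ≈ 0.5623 nats. The video works in bits, so 0.811 is the intended value.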

  • @033husnain5 1 year ago +2

    Reply please.
    I have a doubt about how to calculate the log base 2 terms, i.e., -(9/14) log2(9/14) - ... How do you arrive at the answer 0.94?

    • @princesshanon45 1 year ago

      Hi there! I want to ask if you have figured this out, because same; I never got 0.94, I only got 0.28.

    • @hioh3012 1 year ago

      Use calculator

    • @JamesBandOfficial 1 year ago

      I get 0.12

    • @reverbmusic8444 1 year ago +1

      @JamesBandOfficial They have skipped one minus sign.

    • @shiva12832 5 months ago

      Dude, you need to divide the whole answer by ln 2, because log X base 2 = ln X / ln 2
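
      Worked out step by step for anyone stuck on this: 9/14 ≈ 0.643 and log2(0.643) ≈ -0.637, so the first term is -(0.643)(-0.637) ≈ 0.410; 5/14 ≈ 0.357 and log2(0.357) ≈ -1.485, so the second term is -(0.357)(-1.485) ≈ 0.530. The sum is ≈ 0.940.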

  • @jfehef 3 years ago +1

    At 5:04, in the three examples, you divided 14/14, 9/14, 5/14 and 7/14... why exactly did you divide all these numbers by 14? Where do you get it from, and how?

    • @kireetinunna7253 3 years ago

      The total number of days in the dataset he used is 14 (total instances were 14).

  • @proxxskullxxgaming2955 3 years ago +6

    Sir, I have a doubt on the entropy topic. Can you please help me 👇 sir? 😔

    • @proxxskullxxgaming2955 3 years ago

      @MaheshHuddar Thanks for replying, sir. The question is how to find the entropy: given P1 = 0.1, P2 = 0.2, P3 = 0.3, P4 = 0.4, find the entropy. What is the solution, sir, and how do we find it using the formula? Please help me, sir.

    • @proxxskullxxgaming2955 3 years ago

      Plzz reply sir ?

    • @proxxskullxxgaming2955 3 years ago

      Thanks sir
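
      A worked answer to the question above: H = -Σ p_i log2(p_i) = 0.1(3.322) + 0.2(2.322) + 0.3(1.737) + 0.4(1.322) ≈ 0.332 + 0.464 + 0.521 + 0.529 ≈ 1.846 bits.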

  • @eqtidarma4726 1 year ago

    thank you sir