Decision Tree Solved | Id3 Algorithm (concept and numerical) | Machine Learning (2019)

  • Published: 5 Jan 2025

Comments • 272

  • @poojah4127
    @poojah4127 5 years ago +49

    The best explanation for the ID3 algorithm. Well done.

  • @me-hn4bs
    @me-hn4bs 4 years ago +34

    I have to thank Indians for tutorials; you guys are amazing.

  • @JanithGamageVEVO
    @JanithGamageVEVO 4 years ago +10

    You are way better at teaching than most of the so-called senior lecturers at my university.

    • @CodeWrestling
      @CodeWrestling  4 years ago +1

      Glad it helped!
      We also help in implementing projects. Reach us at codewrestling@gmail.com

  • @kartheekeswar8231
    @kartheekeswar8231 2 years ago +1

    He is the only one who explained ID3 best and in an easy-to-understand way on YouTube.

  • @nikhilnarayane680
    @nikhilnarayane680 5 years ago +6

    Very well explained. Most people leave it after finding the root node. Thanks for showing the entire calculation.

  • @anirudhbharadwaj7909
    @anirudhbharadwaj7909 4 years ago +7

    Excellent! Your emphasis on certain points made it even clearer to understand. Keep up the good work.

  • @MohamedHassan-tk5bq
    @MohamedHassan-tk5bq 3 months ago

    Thank you very much; you explained things more clearly and slowly than my prof.

  • @muhammadhamzanabeel3440
    @muhammadhamzanabeel3440 3 years ago

    Concept is clear, sir. Your teaching method is very impressive. Thanks, sir...

  • @Virus-ke8xj
    @Virus-ke8xj 4 years ago +1

    WOW
    WOW
    Thanks a lot!!!!!!
    Our lecturer took 3 days to explain this, thanks for making this so easy

  • @nige3218
    @nige3218 3 years ago +3

    Wow, thank you, from a zero Math and AI background.
    Looking forward to learning more

  • @surajkulkarni9394
    @surajkulkarni9394 4 years ago

    I had gone through the same example in many courses; everyone explained how to select the root node, but after that nobody explained further splitting till the leaf node. Thank you....

    • @CodeWrestling
      @CodeWrestling  4 years ago

      Glad it helped!
      We also help in implementing projects. Reach us at codewrestling@gmail.com

  • @mehmetkazanc5855
    @mehmetkazanc5855 5 years ago +1

    Clear. Simply explained. Great Job. Thanks

  • @tauhait
    @tauhait 3 years ago

    The best explanation on the whole internet!! Thank you!

  • @furqanhaider1154
    @furqanhaider1154 5 years ago +1

    Thank you ...
    many thanks ...
    You explained it very well and easily, dear brother ..
    May Allah keep you happy ...

  • @keerthithejas.c.9098
    @keerthithejas.c.9098 5 years ago +12

    Thank you for this video. This is by far the best explanation of the ID3 algorithm out there. I have one question, though: if the information gains of two attributes are the same, which one do we use for further splitting?
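
ID3 itself does not fix a tie-break rule for equal gains; a common convention (an assumption here, not something the video states) is simply to keep the first attribute that attains the maximal gain. A minimal sketch in plain JavaScript, with made-up gain values:

```javascript
// Pick the splitting attribute with maximal information gain,
// breaking ties by keeping the first attribute in insertion order.
function pickAttribute(gains) {
  let best = null;
  for (const [name, gain] of Object.entries(gains)) {
    // strict '>' means an equal later gain never displaces the first winner
    if (best === null || gain > gains[best]) best = name;
  }
  return best;
}

// Outlook and Windy are tied here; the first listed (Outlook) wins.
console.log(pickAttribute({ Outlook: 0.247, Temperature: 0.029, Windy: 0.247 }));
```

Any consistent tie-break produces a valid tree; the trees may differ in shape but all fit the training data.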

  • @admail2020
    @admail2020 4 years ago +1

    I feel that you are the one who explained this to us in the Manipal course in summer 2020? Excellent, thank you.

  • @vineethpai4486
    @vineethpai4486 4 years ago +1

    You're a life saver ❤️ Keep making such content. Thanks a lot. Waiting for the rest of the ML lab algorithms. 😊

  • @immy_afridi_7760
    @immy_afridi_7760 5 years ago +5

    Thank you very much for such an easy explanation; you explain very well. Please make more and more videos like this on machine learning. I like your way of teaching and concept delivery.
    Thanks a lot
    Imaad Ullah

  • @vamsikrishnachiguluri8510
    @vamsikrishnachiguluri8510 4 years ago

    The best and most clear-cut explanation that I have ever seen.

  • @ncsctsr
    @ncsctsr 3 years ago +1

    Fantastic explanation. Anybody can understand. Thanks for the video. Please do an RF algorithm with a numerical example like ID3.

  • @sharmass3756
    @sharmass3756 4 years ago +2

    I have an exam tomorrow, thanks for saving me with a super explanation. My teacher took 5 hours but I didn't understand; you explained it in 20 minutes. You are awesome, man!!! Keep going.

    • @CodeWrestling
      @CodeWrestling  4 years ago +1

      Glad it helped!
      We also help in implementing projects. Reach us at codewrestling@gmail.com

  • @132132465465
    @132132465465 5 years ago +3

    Very clear!! Please continue doing this. You are so good.

    • @CodeWrestling
      @CodeWrestling  5 years ago

      Thank you 😊 We will update soon; stay tuned with us.

  • @naeemaanwer3619
    @naeemaanwer3619 5 years ago +1

    Superb explanation of calculating entropy and information gain. Thanks a bunch for making these videos.

    • @CodeWrestling
      @CodeWrestling  5 years ago

      Thanks 😊 We will come up with more videos soon; stay tuned with us...

  • @kamleshprajapati6621
    @kamleshprajapati6621 5 years ago +3

    Awesome lecture on ID3 numerical..
    Thank You 🎉💝🎊

  • @iAndrewMontanai
    @iAndrewMontanai 5 years ago +3

    Preparing for the exam; your videos are the best. Also liked the Candidate Elimination and Find-S explanations. Gotta check the other videos

  • @omrk5622
    @omrk5622 5 years ago +2

    The best explanation of ID3!

  • @sonakshigoswami5622
    @sonakshigoswami5622 4 years ago

    Thank you so much. Not one video explained how to calculate the gains after finding the first node.
    RECOMMEND IT 10/10

  • @trojanhorse2003
    @trojanhorse2003 5 years ago +37

    College is fucked up we have computers for doing all this stuff still they want us to do it manually using pen and paper for just a few marks

  • @midhilak8422
    @midhilak8422 5 years ago +8

    Easily understood.👍 Thank you.
    Please upload C4.5 algorithm also.

  • @rajeshdevadiga2539
    @rajeshdevadiga2539 5 years ago +6

    Awesome explanation, sir; now I understand this algorithm.

  • @sudhadevi3211
    @sudhadevi3211 4 years ago

    Thank you so much for explaining ID3...my doubts are cleared now.

  • @roopagaur8834
    @roopagaur8834 5 years ago +1

    Every single video in ML cleanly explained. Thanks a lot....!!!

  • @Mohitkumar-rb5zr
    @Mohitkumar-rb5zr 3 years ago +1

    @Code_Wrestling, where is the video for the Python implementation?

  • @rkfm2012
    @rkfm2012 5 years ago +2

    Very clearly explained -- excellent -- more videos from you on machine learning please

    • @CodeWrestling
      @CodeWrestling  5 years ago

      The videos are on the way.. Stay Tuned #codewrestling

  • @rohiniam4650
    @rohiniam4650 4 years ago +1

    Thank you so much, sir; the ID3 concept along with the problem is clearly explained.

    • @CodeWrestling
      @CodeWrestling  4 years ago +1

      Thanks for commenting and appreciating. :)

  • @hasibahammad8508
    @hasibahammad8508 4 years ago

    It's indeed a great video on ID3 algorithm. Thanks @Code Wrestling.

  • @shibuvm2127
    @shibuvm2127 5 years ago +1

    The best practical explanation

  • @premstein16
    @premstein16 5 years ago +2

    Good explanation, thanks for taking patience to explain the entire information gain step :)

  • @jaelinmccreary9217
    @jaelinmccreary9217 4 years ago +3

    This helped me so incredibly much. Thank you!!

  • @divyarao4146
    @divyarao4146 4 years ago

    At 15:55, while calculating entropy, shouldn't it be 0 as n = p?

  • @sidharths9416
    @sidharths9416 4 years ago

    Thanks a ton bro. Clearly understood in depth. Excellent

  • @bhumikab6475
    @bhumikab6475 3 years ago

    Very good explanation, sir; understood it really well.

  • @swapnilagashe4715
    @swapnilagashe4715 4 years ago +1

    Thanks for the detailed explanation. Request you to make a similar video for random forests covering all the maths; I haven't been able to find a good resource online for this.

  • @shankrukulkarni3234
    @shankrukulkarni3234 5 years ago +2

    Thank you bro for this Crystal clear explanation..

  • @tanishktripathi8773
    @tanishktripathi8773 3 years ago +35

    This is one hell of an algorithm, simple but time consuming

    • @theprathmeshagashe
      @theprathmeshagashe 3 years ago

      Hard

    • @jihonoh5212
      @jihonoh5212 3 years ago

      The algorithm itself is simple, the time consuming part will be done by a computer iteratively

  • @masakkali9996
    @masakkali9996 3 years ago

    If there are equal positive examples and negative examples, then the entropy is 1, right? At 2:14...
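
Yes: with an equal split the binary entropy comes out to exactly 1 bit. A quick numerical check (a sketch, not code from the video):

```javascript
// Binary entropy from raw positive/negative counts:
// H = -p*log2(p) - n*log2(n), where p and n are class fractions.
const entropy2 = (pos, neg) => {
  const total = pos + neg;
  const p = pos / total, n = neg / total;
  return -p * Math.log2(p) - n * Math.log2(n);
};

console.log(entropy2(7, 7)); // equal split -> exactly 1 bit
console.log(entropy2(9, 5)); // the 9-yes / 5-no play-tennis set -> about 0.940
```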

  • @xDDeeDee
    @xDDeeDee 3 years ago

    Beautifully explained, thank you! :)

  • @saibazriyazahmedmulla3701
    @saibazriyazahmedmulla3701 5 years ago +2

    Bhai, excellent explanation, thanks a lot bro.

  • @matheusfelix5742
    @matheusfelix5742 5 years ago +2

    Thank you so much for the lecture. Great explanation.

  • @Roger14216
    @Roger14216 4 years ago +1

    Marvelous explanation 👌

  • @apoorvasavitha5512
    @apoorvasavitha5512 5 years ago

    All the videos you upload are very useful and helpful; please upload all algorithms in machine learning.

  • @foodiez0217
    @foodiez0217 4 years ago +2

    Explained well, but music during the video is a bad option. Some may lose concentration while listening to the video.

  • @kanhataak1269
    @kanhataak1269 4 years ago +1

    Very, very good explanation. Thanks a lot.

  • @zouhairghazi3541
    @zouhairghazi3541 3 years ago

    What do we do if the average information entropy I for two attributes is equal and maximal? Which one is the root?

  • @rohitchitte5614
    @rohitchitte5614 5 years ago +1

    You deserve more subscribers

  • @NeverStoppedSinging
    @NeverStoppedSinging 5 years ago +1

    Excellent video ! Thank you!

  • @yaminipeddireddi8969
    @yaminipeddireddi8969 4 years ago +1

    great explanation

  • @bhavikdudhrejiya852
    @bhavikdudhrejiya852 5 years ago

    Best video for learning decision trees, superb.

  • @aniketbrahmankar8009
    @aniketbrahmankar8009 4 years ago

    Excellent 👌👍👍👍👍👍👍👍👍👍😀

  • @akarshmalhotra1154
    @akarshmalhotra1154 4 years ago

    Well Explained. 👍👍

  • @1UniverseGames
    @1UniverseGames 2 years ago

    Why is there no temperature in the final decision tree? Any idea?

  • @mushfc2343
    @mushfc2343 5 years ago +1

    Thank you, brother, that was very useful for me, and I will subscribe.

  • @keremylmaz9595
    @keremylmaz9595 5 years ago

    Why aren't we checking the case where outlook is sunny and humidity is high, and why aren't we checking temperature or windy? Aren't those important? How can we select the second branch?

  • @Areeva2407
    @Areeva2407 4 years ago

    Very Clear, Very systematic

  • @MahmudulHasan-jm9gw
    @MahmudulHasan-jm9gw 5 years ago +1

    Best explanation sir.

  • @dingshuoyang2421
    @dingshuoyang2421 3 years ago +1

    Awesome video; it would be better if there were no BGM while you're talking.

  • @whiitehead
    @whiitehead 3 years ago

    To calculate entropy, just paste this in your browser console:
    const entropy = (...counts) => {
      const total = counts.reduce((a, b) => a + b, 0);
      return counts.reduce((sum, c) => sum - (c / total) * Math.log2(c / total), 0);
    };
    After pasting it, you can calculate the entropy of Outlook with:
    entropy(5, 4, 5)

  • @vidhipunjabi3637
    @vidhipunjabi3637 4 years ago +5

    Why is the "temperature" attribute not part of the final decision tree?

    • @kregg34
      @kregg34 3 years ago

      attributes can be irrelevant to the decision and hence don’t show up

  • @sunilkumarsk7670
    @sunilkumarsk7670 5 years ago +2

    You taught better than our lecturer 😊

  • @nitinshiv
    @nitinshiv 4 years ago +1

    best explanation

  • @ashwinchandore7676
    @ashwinchandore7676 2 years ago

    Thank you for this video. Very good explanation.

  • @seed5340
    @seed5340 4 years ago

    Thank you sir for your amazing explanation!

  • @anbinh3967
    @anbinh3967 5 years ago +1

    Incredible explanation!

  • @farmeenfathima5627
    @farmeenfathima5627 4 years ago

    great explanation!!

  • @naturelifestyle9836
    @naturelifestyle9836 5 years ago

    One of the best explanations ever, bro. Thanks so much!

    • @CodeWrestling
      @CodeWrestling  5 years ago +1

      Thanks for appreciating...!! #CodeWrestling

  • @carlitoz450
    @carlitoz450 4 years ago

    really well explained, thanks a lot sir

  • @nagesh4utube
    @nagesh4utube 4 years ago +1

    Great explanation, understood the concept in a single go. Can you share the name of the music used in the background? It's stuck in my head and I want to download it.

  • @Linaiz
    @Linaiz 4 years ago +1

    Amazing, this helped me so much!!!

  • @sankethkini6192
    @sankethkini6192 5 years ago

    nice and clear explanation.....want a video on instances for using different classifiers

  • @shivlal121
    @shivlal121 3 years ago

    Very well explained

  • @Brsir999
    @Brsir999 5 years ago

    Wow... Amazing explanation 💕😍

  • @abilash305
    @abilash305 4 years ago

    What about the attributes Humidity and Temperature? Are they neglected?

  • @pratiksharma1655
    @pratiksharma1655 5 years ago +1

    Great Video. Well done.

  • @scaratlas3347
    @scaratlas3347 4 years ago

    I'm not getting the same entropy values through the calculator

  • @ssuriset
    @ssuriset 3 years ago

    Exactly what I needed

  • @reibalachandran4775
    @reibalachandran4775 4 years ago +1

    great explanation, thanks a lot!

  • @Ghostalking
    @Ghostalking 4 years ago

    Life saver, thank you man!

  • @lilcoconut3116
    @lilcoconut3116 4 years ago

    Really helpful 😊 Thank you so much.

  • @samiranroy4188
    @samiranroy4188 5 years ago +1

    Very nice, big thumbs up. Can you please share the other algorithms for decision trees?

  • @anudeepk7390
    @anudeepk7390 5 years ago +5

    I understood this algorithm thanks to you

  • @fabioliani7337
    @fabioliani7337 4 years ago

    Very good and detailed explanation, thanks!

  • @Muddassir_Ahmed
    @Muddassir_Ahmed 5 years ago

    What should we do if the gain of two features is the same? Can we make either one the root?

  • @simmysingh563
    @simmysingh563 5 years ago +1

    Thanks a lot, it is an amazing video, helped a lot!!!

  • @arjungoud3450
    @arjungoud3450 2 years ago +1

    Superb bro thanks a lot

  • @sidharthrunwal6833
    @sidharthrunwal6833 3 years ago

    great video! thanks

  • @DeathRacerSH97
    @DeathRacerSH97 2 years ago

    Great video, learned a lot. The way I do it, there can be attribute values that have no positive or negative examples; in that case a 0 would appear inside a log2(). What am I missing?
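
On the question above: the usual convention is to treat 0 · log2(0) as 0 (its limiting value as p → 0), so an empty class simply contributes no entropy. A minimal sketch of an entropy function applying that convention (an editorial illustration, not code from the video):

```javascript
// Entropy over class counts, applying the convention 0 * log2(0) = 0.
const safeEntropy = (...counts) => {
  const total = counts.reduce((a, b) => a + b, 0);
  return counts.reduce((sum, c) => {
    if (c === 0) return sum;         // lim p->0 of p*log2(p) is 0, so skip
    const p = c / total;
    return sum - p * Math.log2(p);
  }, 0);
};

console.log(safeEntropy(4, 0)); // pure node -> 0, no NaN from log2(0)
console.log(safeEntropy(3, 3)); // even split -> 1
```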

  • @dasarisandhya361
    @dasarisandhya361 3 years ago

    I got two gains the same; what should I do?

  • @balsongorai1521
    @balsongorai1521 5 years ago +2

    Thank you... we understood it too :-)

  • @joelphilip2942
    @joelphilip2942 4 years ago +1

    A decent explanation!!!!

  • @higiniofuentes2551
    @higiniofuentes2551 5 years ago +2

    Best presentation ever for ID3 decision tree definition and calculation! Thank you.
    I found a difference with the entropy formula: the theory says ...log2(1/p) but you have ...log2(p). Is there any reason? Thanks a million!!

    • @CodeWrestling
      @CodeWrestling  5 years ago +1

      Thank you! Regarding the log: note the minus sign. If you take the reciprocal inside the log, a minus sign comes out in front, by the log property.

    • @vib2810
      @vib2810 5 years ago +1

      If the final classification had 3 types (here we have only 2, positive and negative), then it would have been log3(), as entropy should be 1 if the data is equally distributed.
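
Both replies above check out numerically: log2(1/p) = -log2(p), and with three equally likely classes the base-2 entropy is log2(3) ≈ 1.585 bits, which becomes exactly 1 only when measured in base 3. A small verification sketch (an editorial illustration, not from the video):

```javascript
// The reciprocal inside the log just flips the sign: log2(1/p) == -log2(p).
const p = 5 / 14;
console.log(Math.log2(1 / p), -Math.log2(p)); // same number

// Entropy of 3 equally likely classes, measured in bits (base 2):
const h2 = 3 * (-(1 / 3) * Math.log2(1 / 3)); // = log2(3), about 1.585

// Dividing by log2(3) converts to base 3, where it is exactly 1.
console.log(h2, h2 / Math.log2(3));
```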