Introduction to Decision Trees

  • Published: 29 Jun 2024

Comments • 33

  • @diligentjohn
    @diligentjohn 2 years ago +5

    How great our teachers are: the more the knowledge, the more the humility. May God bless you.

  • @sanchayana2007
    @sanchayana2007 5 years ago +1

    Awesome explanation. I finally understood the basic reason for performing the DT from your lecture; other tutorials just give an example. Thanks a lot.

  • @kumarc4853
    @kumarc4853 4 years ago +2

    Thank you, ma'am, for teaching future ML practitioners and innovators.

  • @anuradharao2559
    @anuradharao2559 4 years ago +2

    Super explanation, ma'am. Thank you so much.

  • @divyamishra4834
    @divyamishra4834 3 years ago

    Super explanation, ma'am. Thank you so much.

  • @Sensible_Money
    @Sensible_Money 1 year ago

    Thank you. Your videos literally changed my life 🥰

  • @anvarisamukhamedov4961
    @anvarisamukhamedov4961 4 years ago +1

    What a sweet lecturer you are. Thanks for your class.

  • @SuchetanaGupta
    @SuchetanaGupta 7 years ago +1

    Is this the next video after mon01lec05?

  • @Madhu9738
    @Madhu9738 5 years ago +3

    Hi Madam, your explanation is really good.
    I have one question about the 9th minute: you selected the age ranges of 30, 30 to 40, and >40.
    How can we decide this for a large sample? Is there a formula behind it, or are we selecting randomly? If randomly, how can we depend on that random number?
    I will be waiting for your response.
    Thanks in advance.

    • @moidamer281
      @moidamer281 3 years ago

      You have to analyse the data so that you end up with a small number of ranges/categories that carry the most information.
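
      In practice the cut points for a continuous attribute such as age are not picked randomly: the learner sorts the values, scans candidate thresholds (midpoints between neighbouring values), and keeps the split with the highest information gain. Here is a minimal sketch of that scan; the ages and labels below are made-up illustration data, and applying the scan recursively on each side is how cut points like the video's 30 and 40 can arise:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_threshold(values, labels):
    """Return the threshold on a numeric attribute with maximum information gain."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best_gain, best_t = 0.0, None
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no boundary between equal values
        t = (pairs[i - 1][0] + pairs[i][0]) / 2  # candidate midpoint
        left = [y for x, y in pairs if x <= t]
        right = [y for x, y in pairs if x > t]
        weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if base - weighted > best_gain:
            best_gain, best_t = base - weighted, t
    return best_t, best_gain

ages = [22, 25, 28, 31, 35, 38, 42, 48, 55]   # hypothetical ages
buys = ["no", "no", "no", "yes", "yes", "yes", "no", "no", "no"]
print(best_threshold(ages, buys))             # -> (29.5, ...), i.e. roughly "age <= 30"
```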

  • @MukeshRajput1982
    @MukeshRajput1982 6 years ago +1

    Ma'am, in the last example (video time 21:20), why are you taking the Outlook attribute as the root node and leaving the other attributes as internal nodes?
    Please reply.

    • @mallikarjun3425
      @mallikarjun3425 6 years ago

      Mukesh Rajput: choosing an attribute is based on the reduced entropies of the features.

    • @CheemaAwaisZafar
      @CheemaAwaisZafar 6 years ago

      It has been selected as the root node because it has the maximum information gain. Please check out the 2nd lecture on decision trees.

    • @mathavanmuthaiyan
      @mathavanmuthaiyan 4 years ago

      The algorithm decides the root node based on the entropy values.
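
      Concretely, ID3 picks as root the attribute with the highest information gain, Gain(S, A) = H(S) − Σ_v (|S_v|/|S|) H(S_v). A minimal check using the classic play-tennis counts from Quinlan's dataset (14 examples, 9 yes / 5 no; Outlook splits them as Sunny 2+/3−, Overcast 4+/0−, Rain 3+/2−):

```python
import math

def entropy(pos, neg):
    """Entropy of a node from its binary class counts."""
    total = pos + neg
    return -sum((c / total) * math.log2(c / total) for c in (pos, neg) if c)

# Play-tennis dataset: 14 examples, 9 yes / 5 no overall.
base = entropy(9, 5)                                   # ~0.940

# Outlook partitions the data as Sunny(2+,3-), Overcast(4+,0-), Rain(3+,2-).
branches = [(2, 3), (4, 0), (3, 2)]
weighted = sum((p + n) / 14 * entropy(p, n) for p, n in branches)
print("Gain(Outlook) =", round(base - weighted, 3))    # ~0.247
```

      Repeating the same computation for Temperature, Humidity, and Wind gives smaller gains (about 0.029, 0.151, and 0.048), which is why Outlook ends up at the root.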

  • @SuchetanaGupta
    @SuchetanaGupta 7 years ago +1

    Where are the videos for Part A? This one only has Part B.

  • @avejantzero9090
    @avejantzero9090 7 years ago +6

    There is a mistake in the subtitles: not "bullion", but "boolean" function.

  • @naveenchowdary7959
    @naveenchowdary7959 7 years ago +3

    Excellent, ma'am. I want to bow before you.

  • @punitjha6599
    @punitjha6599 5 years ago

    What is the term for a decision tree learning method that creates multiple trees?
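
    Methods that grow many trees and combine their votes are called ensemble methods; bagging and random forests are the standard answers (boosting is another). A minimal sketch with scikit-learn's RandomForestClassifier, on made-up toy data, just to show the idea:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy data purely for illustration.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# A random forest fits many decision trees on bootstrap samples
# (each split also considers a random subset of features) and
# aggregates their predictions by majority vote.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)
print(len(forest.estimators_), "trees trained")   # -> 100 trees trained
```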

  • @SuperShalender
    @SuperShalender 6 years ago +1

    If you took her notes away, she couldn't teach for even a minute... As a professor at IIT, at least you should have enough knowledge to teach without notes.

    • @jyotbamania2112
      @jyotbamania2112 5 years ago +7

      She did her PhD at Stanford University.

    • @faraza5161
      @faraza5161 5 years ago +14

      Then go tell your father to come and teach instead.

    • @kartikkamboj295
      @kartikkamboj295 5 years ago

      @jyotbamania2112 Wow! That says a lot about her stature.

    • @manaspeshwe8297
      @manaspeshwe8297 4 years ago

      Memorizing everything is not knowledge, brother... you have to understand things. And ma'am is sharing her knowledge on an open platform, so more respect is expected.

    • @rajs4990
      @rajs4990 4 years ago +1

      Immature comment