Feature Importance using Random Forest and Decision Trees | How is Feature Importance calculated

  • Published: 19 Nov 2024

Comments • 35

  • @MohitChaudhary-bc9wb
    @MohitChaudhary-bc9wb 3 years ago +25

    Sir, great respect for you. I would rate your course higher than many top Coursera courses. I have watched all your ML videos so far. I am doing my master's at IIT Kanpur. If I get a job in this domain, much of the credit will go to you and your dedication. Hats off 👒

    • @AjayBharathRathiMCI
      @AjayBharathRathiMCI 1 year ago +4

      Even being at IIT Kanpur, you're still using the word "if"!

    • @hill_climber
      @hill_climber 7 days ago

      Did you get a job in this domain?

  • @guitarkahero4885
    @guitarkahero4885 2 years ago +2

    Thanks for putting in so much effort. Appreciate your work! Worth watching.

  • @akashdeepmishra7835
    @akashdeepmishra7835 2 years ago +4

    Thanks for the informative video. I'd just like to kindly point out that in the 2nd example, the tree with 15 data points and 2 features, there was a slight error in the node importance formula for the 2nd and 3rd nodes. As per the formula you mentioned earlier, the impurity weight should have been 3/9 instead of 3/15. That explains the discrepancy in the feature importance numbers between your calculations and the package.
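
For reference, scikit-learn's own computation can be reproduced from the fitted tree: each split node contributes its weighted impurity decrease, with every weight taken over the *total* sample count, and the per-feature sums are then normalised. A minimal sketch (the iris dataset is an arbitrary example choice):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(random_state=0).fit(X, y)

t = clf.tree_
importances = np.zeros(X.shape[1])
for node in range(t.node_count):
    left, right = t.children_left[node], t.children_right[node]
    if left == -1:  # leaf: no split here, nothing to add
        continue
    # weighted impurity decrease at this node: parent impurity minus the
    # children's impurities, each weighted by its own sample count
    importances[t.feature[node]] += (
        t.weighted_n_node_samples[node] * t.impurity[node]
        - t.weighted_n_node_samples[left] * t.impurity[left]
        - t.weighted_n_node_samples[right] * t.impurity[right]
    )
importances /= t.weighted_n_node_samples[0]  # divide by total samples
importances /= importances.sum()             # normalise to sum to 1

print(importances)
print(clf.feature_importances_)  # matches the manual computation
```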

  • @samriddhlakhmani284
    @samriddhlakhmani284 8 months ago +5

    @20:32 Based on the formula it should be 3/9 * 0.44.

  • @Ankit-hs9nb
    @Ankit-hs9nb 2 years ago +1

    So awesome, sir! You explained everything in detail! Thanks!

  • @SaifAbbas-c9p
    @SaifAbbas-c9p 2 months ago

    Great discussion. Thanks.

  • @dpchand
    @dpchand 3 years ago

    Awesome, really helpful. I was looking for such an easily understandable video.

  • @AagamaS
    @AagamaS 1 year ago

    It's a great video! Thanks for explaining in detail. It would be very helpful if you did a similar video on how permutation importance is calculated.
    One more question: does this 'feature importance' help in finding the root cause of a problem?

  • @nrted3877
    @nrted3877 13 days ago

    Thank you, Nitish sir.

  • @123arskas
    @123arskas 1 year ago

    Man, you're a beast. Awesome.

  • @rafibasha4145
    @rafibasha4145 2 years ago

    Hi Nitish, please cover feature selection, XGBoost and s

  • @prashastvaish4688
    @prashastvaish4688 16 days ago

    Won't the OOB samples be 0 when we use the default RandomForestClassifier(), since all rows are fed to each decision tree?
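
On this question: with scikit-learn's default `bootstrap=True`, each tree is trained on a bootstrap sample drawn *with replacement*, so roughly a third of the rows are out-of-bag for any given tree. A quick sketch (the synthetic dataset is an arbitrary choice):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Default bootstrap=True: each tree trains on a bootstrap sample drawn
# WITH replacement, so roughly 37% of the rows are out-of-bag per tree.
rf = RandomForestClassifier(oob_score=True, random_state=0).fit(X, y)
print(rf.oob_score_)  # OOB accuracy, estimated without a hold-out set

# With bootstrap=False every tree would see every row and there would be
# no OOB samples; sklearn then disallows oob_score=True.
```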

  • @dineshjoshi4100
    @dineshjoshi4100 2 years ago

    Hello, thanks for the explanation. I have one question: does using the best features help reduce the training data set? Say I do not have a large dataset, but I can construct an independent variable that is highly correlated with the dependent variable; will it help me reduce my training data set? Your response will be highly valuable.

  • @ankitasonkar42
    @ankitasonkar42 9 months ago

    Can we do landslide prediction with this?

  • @adnanwalayat7895
    @adnanwalayat7895 4 months ago

    Awesome ❤❤❤

  • @ComicKumar
    @ComicKumar 2 years ago

    What if there are more than 2 columns? How will the importance be calculated then, and what will the x/(x+y) formula for the nodes look like?

  • @supriyachaudhary5112
    @supriyachaudhary5112 1 year ago

    Can we also plot feature importance for an SVM classifier and kernel SVM?
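
On this question: a kernel SVM exposes no `feature_importances_` attribute, but scikit-learn's model-agnostic `permutation_importance` works for any fitted estimator. A minimal sketch (iris and the RBF kernel are arbitrary example choices):

```python
from sklearn.datasets import load_iris
from sklearn.inspection import permutation_importance
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
svm = SVC(kernel="rbf").fit(X, y)

# Permutation importance: shuffle one column at a time and measure how
# much the model's score drops; bigger drop = more important feature.
result = permutation_importance(svm, X, y, n_repeats=10, random_state=0)
print(result.importances_mean)

# For a LINEAR SVM, the magnitudes of svm.coef_ can also serve as a
# rough importance measure on standardised features.
```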

  • @kindaeasy9797
    @kindaeasy9797 8 months ago

    Thanks sir, I enjoyed it.

  • @datasciencegyan5145
    @datasciencegyan5145 2 years ago

    For the decision trees used in a random forest, can we find out during training how many there are, or are they selected randomly depending on the rows and columns chosen during row and feature sampling?

  • @studywithamisha9903
    @studywithamisha9903 11 months ago

    Please explain golden features.

  • @gowthamsr3545
    @gowthamsr3545 2 years ago

    Hi sir, how can we check which features contributed most to each individual prediction?
    Suppose we built a model to predict whether a loan should be given or not. If a person asks why their application got rejected, the feature importance will differ from person to person. So, how do we check feature importance for each prediction?
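
On this question: global `feature_importances_` averages over all splits, so per-prediction ("local") attribution needs a different tool, such as the SHAP or treeinterpreter libraries. As a minimal self-contained sketch, for a single tree one can at least list which features were actually tested on one sample's path (iris and the depth are arbitrary choices, not the video's loan example):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

sample = X[:1]                    # explain a single row
path = clf.decision_path(sample)  # sparse indicator of visited nodes
visited = path.indices
# keep only internal nodes (leaves have children_left == -1) and
# collect the features tested along this sample's route down the tree
used = {int(clf.tree_.feature[n]) for n in visited
        if clf.tree_.children_left[n] != -1}
print("features tested for this prediction:", used)
```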

  • @stevegabrial1106
    @stevegabrial1106 3 years ago

    Hello sir, please update this series. Thanks.

  • @sptest4298
    @sptest4298 2 months ago

    How is feature importance calculated in a decision tree for regression?
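
On this question: for regression trees the bookkeeping is identical to the classification case; only the impurity measure changes from Gini/entropy to the node's MSE (the variance of the targets). A quick sketch (the diabetes dataset and depth are arbitrary choices):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.tree import DecisionTreeRegressor

X, y = load_diabetes(return_X_y=True)
reg = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

# Same weighted-impurity-decrease formula as for classification; the
# impurity at each node is just the MSE of the targets there.
print(reg.feature_importances_)
print(reg.feature_importances_.sum())  # normalised to sum to 1
```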

  • @rafibasha4145
    @rafibasha4145 2 years ago

    Please cover feature selection, XGBoost, KNN, DBSCAN and CatBoost.

  • @sanamsharma8886
    @sanamsharma8886 2 years ago

    Hi, that was very informative. I have a question regarding the above problem:
    In the decision tree, if a feature is more important, as we saw the 1st feature was, shouldn't it be the root node? Is there any relation between the order of the nodes and the feature importance?

    • @kindaeasy9797
      @kindaeasy9797 8 months ago

      There are parameters that decide which feature is used for the root node. In a random forest we have multiple decision trees, so it comes down to feature sampling.

  • @maniteja8561
    @maniteja8561 2 years ago

    If the zeroth column has more feature importance, then why can't it be the root node?

  • @shivoham5939
    @shivoham5939 1 year ago

    20:46 Look behind you!

  • @yashjain6372
    @yashjain6372 1 year ago

    best

  • @PratapO7O1
    @PratapO7O1 3 years ago +1

    High cardinality as in numeric data too, or only categorical data?

  • @PratapO7O1
    @PratapO7O1 3 years ago

    How is this different from mutual information?