How to find the Entropy and Information Gain in Decision Tree Learning by Mahesh Huddar
- Published: 28 Sep 2020
In this video, I will discuss how to find entropy and information gain given a set of training examples in constructing a decision tree.
Machine Learning - • Machine Learning
Big Data Analysis - • Big Data Analytics
Python Tutorial - • Python Application Pro...
entropy nptel,
entropy explained,
entropy data mining,
entropy data mining example,
entropy based discretization example data mining,
entropy machine learning,
entropy machine learning example,
entropy calculation machine learning,
entropy in machine learning in hindi,
information gain decision tree,
gain decision tree,
information gain and entropy in the decision tree,
information gain,
information gain and entropy,
information gain feature selection,
information gain calculation,
information gain and Gini index,
information gain for continuous-valued attributes
Very lucid explanation. Thanks a ton, Prof.
Simply super explanation of how the root node depends on the information gain of the attributes. If you get questions from listeners, you can tell that they like your videos.
Thank you for the clear explanation!
Thank you very much sir, completely cleared my doubt.
Hi sir, this is the best explanation I have seen so far. I shared this link with my friends on my data science course. Thanks for your support.
In Issues in Decision Tree Learning, can you please post the answers for the last two questions (handling attributes with different costs, and alternative measures for selecting attributes)?
Clean and neat explanation. Thank you, sir.
Thank you, very much appreciated.
Thank you so much, such a helpful video.
Thank you so much, sir ❤️🙏🏻
Excellent!
Thank you sir.
Thank you sir
Thanks for this wonderful example.
Welcome
Do like share and subscribe
Thank you so much! I was able to submit my assignment in my masters because of this video
Welcome
Do like share and subscribe
Really good explanation, thank you.
Welcome
Do like share and subscribe
Very nice
Your presentation is excellent and clear. Thank you for making these videos available to everyone.
Welcome
Do like share and subscribe
Thank you!
Welcome
Do like share and subscribe
Superb explanation
Thank You.
Do like share and subscribe
Fantastic explanation 🎉
Welcome
Do like share and subscribe
I am a master's student in data science at a German university, and this has helped me thoroughly for my ML exam! Thank you, dear sir!
Brother, I'm also marking my attendance here 🤣💁♂️ exactly 10 hours before the exam
@@ankitpathak5 and me 4 hours before the exam xD
Hello Sir, awesome job on this video.
Thank You
Do like share and subscribe
Thank you very much🙏
Welcome
Do like share and subscribe
This was so easy to understand!! Thank You so much
Welcome
Do like share and subscribe
Best explanation
Thank You
Do like share and subscribe
Thank you so much sir
Welcome
Do like share and subscribe
thank you!!
Welcome
Do like share and subscribe
Thank you so much, very simply explained.
Welcome
Do like share and subscribe
thanks a lot bro.
Welcome
Do like share and subscribe
good
Nicely and clearly done. Thanks!
Welcome
Do like share and subscribe
Can you explain this: "One interpretation of entropy from information theory is that it specifies the minimum number of bits of information needed to encode the classification of an arbitrary member of S."
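A rough illustration of that interpretation (my own toy example, not from the video): with k equally likely classes you need exactly log2(k) bits to name a class, and entropy reproduces that number; a skewed distribution can be encoded with fewer bits per example on average.

```python
import math

# With k equally likely classes, entropy = log2(k): exactly the number of
# bits needed to name one class (e.g. 4 classes -> 2 bits: 00, 01, 10, 11).
for k in (2, 4, 8):
    h = -sum((1 / k) * math.log2(1 / k) for _ in range(k))
    print(k, h)  # 2 -> 1.0, 4 -> 2.0, 8 -> 3.0

# With a skewed distribution (9 yes / 5 no, as in the video) the average
# drops below 1 bit, because an optimal code can give shorter codewords
# to the more frequent class.
p = 9 / 14
print(-(p * math.log2(p) + (1 - p) * math.log2(1 - p)))  # ~0.94 bits
```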
You are great, brother!
Thank You
Do like share and subscribe
Sir, is this the same for data mining also?
To get Entropy = 0.94, you need to divide the whole answer you worked out with ln x by ln 2, because log₂ x = ln x / ln 2.
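For example, in Python (assuming the 9 yes / 5 no split from the video's dataset), both routes give the same number:

```python
import math

p_yes, p_no = 9 / 14, 5 / 14

# Directly in base 2:
h = -(p_yes * math.log2(p_yes) + p_no * math.log2(p_no))
print(round(h, 3))  # 0.94

# Same result via natural logs, using log2(x) = ln(x) / ln(2):
h_ln = -(p_yes * math.log(p_yes) + p_no * math.log(p_no)) / math.log(2)
print(round(h_ln, 3))  # 0.94
```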
Thank you so much
How can this data be used in Python machine learning?
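One possible answer (a sketch, not the video's method): one-hot encode the categorical columns and fit scikit-learn's DecisionTreeClassifier with criterion="entropy". This is not exactly ID3's multiway splits, but it uses the same entropy idea. The table below is the standard 14-day Play Tennis dataset from Mitchell's textbook, with only three attributes kept for brevity.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# The classic 14-day Play Tennis table (three attributes kept for brevity).
data = pd.DataFrame({
    "Outlook":  ["Sunny", "Sunny", "Overcast", "Rain", "Rain", "Rain",
                 "Overcast", "Sunny", "Sunny", "Rain", "Sunny", "Overcast",
                 "Overcast", "Rain"],
    "Humidity": ["High", "High", "High", "High", "Normal", "Normal",
                 "Normal", "High", "Normal", "Normal", "Normal", "High",
                 "Normal", "High"],
    "Wind":     ["Weak", "Strong", "Weak", "Weak", "Weak", "Strong",
                 "Strong", "Weak", "Weak", "Weak", "Strong", "Strong",
                 "Weak", "Strong"],
    "PlayTennis": ["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
                   "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"],
})

# One-hot encode the categorical attributes so the tree can split on them.
X = pd.get_dummies(data.drop(columns="PlayTennis"))
y = data["PlayTennis"]

# criterion="entropy" makes sklearn split on information gain, as in the video.
clf = DecisionTreeClassifier(criterion="entropy", random_state=0)
clf.fit(X, y)
print(clf.predict(X.iloc[[0]]))  # prediction for day 1
```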
Ahh, here I am revising for my final data mining exam, wish me luck
Since we calculate the entropy of the dataset from one column only, namely the classification, can we say the entropy of the dataset stays the same when we calculate it for each column, as the numbers of yes and no stay the same? Please answer fast.
True. As there is one target variable, the entropy of the whole dataset remains the same. However, when we select a feature, say 'Wind' in this case, the data is divided into two subsets corresponding to 'weak' and 'strong' wind respectively. In each subset, the counts of 'Yes' and 'No' are different, so the entropy of each subset is different.
To answer your question: the entropy of the subsets depends on the feature we select. If we select 'Humidity' as the feature, the subsets and their corresponding 'Yes'/'No' distributions will be different, and hence the entropy will be different.
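A small sketch of the reply above in Python, assuming the standard Play Tennis counts (whole set: 9 yes / 5 no; Wind = Weak: 6 yes / 2 no; Wind = Strong: 3 yes / 3 no):

```python
import math

def entropy(pos, neg):
    """Entropy of a subset with pos positive and neg negative examples."""
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count:  # treat 0 * log2(0) as 0
            p = count / total
            h -= p * math.log2(p)
    return h

h_s = entropy(9, 5)       # whole dataset, ~0.940
h_weak = entropy(6, 2)    # Wind = Weak subset, ~0.811
h_strong = entropy(3, 3)  # Wind = Strong subset, 1.0

# Information gain = dataset entropy minus the weighted subset entropies.
gain_wind = h_s - (8 / 14) * h_weak - (6 / 14) * h_strong
print(round(gain_wind, 3))  # ~0.048
```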
Sir, gain is very difficult. Please explain it again in a video.
Thank you sir, I have a semester exam tomorrow 😊
Welcome
Do like share and subscribe
Sir, how do I find entropy using a calculator?
...please help
🎉
How do I calculate the gain and entropy of continuous data?
Is it possible to do so?
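It is possible. The usual approach (my sketch, not covered in this particular video) is to sort the values, consider candidate thresholds between adjacent points, and keep the threshold with the highest information gain; the attribute then behaves like a binary test such as Temperature > t.

```python
import math

def entropy(labels):
    h = 0.0
    for c in set(labels):
        p = labels.count(c) / len(labels)
        h -= p * math.log2(p)
    return h

def best_threshold(values, labels):
    """Score midpoints between adjacent sorted values as split thresholds."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best_t, best_gain = None, -1.0
    for i in range(1, len(pairs)):
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [lab for v, lab in pairs if v <= t]
        right = [lab for v, lab in pairs if v > t]
        gain = (base
                - len(left) / len(pairs) * entropy(left)
                - len(right) / len(pairs) * entropy(right))
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

# Hypothetical continuous attribute (temperatures) with yes/no labels:
t, g = best_threshold([64, 65, 68, 69, 70, 71],
                      ["yes", "no", "yes", "yes", "yes", "no"])
print(t, round(g, 3))
```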
Actually, I'm not sure if there is a math error. Instead of 0.811, should it not be 0.5623?
I see why: you took the log in base 2.
Reply please
I have a doubt about how to calculate the log base 2 terms. How do I do this calculation? I mean, (9/14) log₂(9/14) − ...... How do I get an answer like 0.94?
Hi there! I want to ask if you have figured this out, because same, I never got 0.94, I only got 0.28
Use a calculator
I get 0.12
@@JamesBandOfficial They skipped one minus sign.
Dude, you need to divide the whole answer by ln 2, because log₂ x = ln x / ln 2
At 5:04, in the three examples, you divided 14/14, 9/14, 5/14 and 7/14... why exactly did you divide all these numbers by 14? Where do you get it from, and how?
The total number of days in the dataset he used is 14 (the total number of instances was 14).
Sir, I have a doubt on the entropy topic. Can you please help me 👇 sir? 😔
@@MaheshHuddar Thanks for the reply, sir. The question is: how to find the entropy for P1 = 0.1, P2 = 0.2, P3 = 0.3, P4 = 0.4?
What is the solution, sir? And how do we find the entropy using the formula? Please help me, sir.
Please reply, sir?
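For anyone else stuck on that exercise: plug the probabilities into H = −Σ pᵢ log₂ pᵢ. A quick check in Python:

```python
import math

probs = [0.1, 0.2, 0.3, 0.4]
h = -sum(p * math.log2(p) for p in probs)
print(round(h, 3))  # ~1.846 bits
```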
Thanks sir
thank you sir
Welcome
Do like share and subscribe