Hi, at 5:41: I understand the Bayes formula leads to the final two numbers in green. But can we explain why the sum of the two probabilities is not equal to 1? Is there a non-zero intersection of P(yes|X) and P(no|X)? If so, what does it even mean? Thanks!
My professor spent several hours talking about this and I had no idea what he was talking about; I watched your video for just 12 minutes and fully comprehended it. Thank you for guiding me through my assignment, I was struggling until I watched your video.
The most important point for NB is that it can be trained incrementally as new evidence comes in. That is a giant drawback of other classifiers, in which you have to retrain on the whole dataset.
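To illustrate the point above: a categorical Naive Bayes model reduces to class and feature-value counts, so folding in one new labeled example is a constant-time update that never touches the rest of the dataset. A minimal sketch (the class name `IncrementalNB` is my own, not from any particular library):

```python
from collections import Counter, defaultdict

class IncrementalNB:
    """Sketch of incremental Naive Bayes training: the model is just
    counts, so absorbing a new labeled example never requires
    revisiting the earlier data."""

    def __init__(self):
        self.class_counts = Counter()            # class -> count
        self.feature_counts = defaultdict(int)   # (feature_idx, value, class) -> count

    def update(self, x, cls):
        """Fold in one labeled example in O(len(x)) time."""
        self.class_counts[cls] += 1
        for i, value in enumerate(x):
            self.feature_counts[(i, value, cls)] += 1

nb = IncrementalNB()
nb.update(("Sunny", "Hot", "High", "Weak"), "No")      # new evidence arrives...
nb.update(("Overcast", "Hot", "High", "Weak"), "Yes")  # ...and is absorbed immediately
```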
What an amazing video. If the education system is to be changed, I would very much like it to become like this.
Enjoyed every second of it. Thanks
Not my favorite type of educational video, but I still liked it because it was extremely easy to understand and quite informative.
To beginners (like myself), I suggest you watch this video several times if you don't understand it at first. Also learn about this concept from another source and then come back to this video; it will help you understand more.
Anyway, this is a great video, thanks!
Nice video. Apparently I understand it better from you than from my teacher; the fact that you use illustrations helps me a lot to visualize the idea and better understand how it works.
A lot of effort has been put into creating such a nice explanatory video. Thanks a lot for making it so easy to understand.
I'm really grateful for your comment☺️ thank you so much.
This video was soooo sooo useful to me. I was breaking my head over a bad video from my university course and after watching this it all became soo simple. Keep up the good work!!
Thank you so much, it means a lot to me :). I really appreciate it.
It's a nice tutorial; you made it easy to quickly grasp the idea. Thank you!
+Ephrem Tadesse thank you, I'm glad it was easy to grasp. I appreciate it. :)
Best tutorial on Naïve Bayes. Easy to understand.
Watched for 20 seconds and I knew I had to subscribe immediately if I wanted to increase my knowledge at all! Thanks man! Fantastic video for those of us who find it hard to understand by reading a textbook.
Really good and a very easy way to teach. Thank you so much!
Such an awesome video! You made it look so easy. And your video itself is fun to watch. Thanks!
Funny thing - I used a Naive Bayes library in Python that attempts to guess whether a statement is positive or negative and gave it two very similar sentences:
"That is a dog."
"That is a cat."
The sentence with 'dog' came back as 67% positive, while the sentence with 'cat' was reported as 58% negative.
It seems Thomas Bayes preferred dogs! :-D
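The comment doesn't name the library, but TextBlob's NaiveBayesAnalyzer is one Python library that behaves this way; treating that choice as an assumption, a reproduction might look like this:

```python
# Assumes: pip install textblob, plus the NLTK corpora the analyzer
# trains on (python -m textblob.download_corpora). The first call is
# slow because NaiveBayesAnalyzer trains on the movie-review corpus.
from textblob import TextBlob
from textblob.sentiments import NaiveBayesAnalyzer

for sentence in ("That is a dog.", "That is a cat."):
    blob = TextBlob(sentence, analyzer=NaiveBayesAnalyzer())
    # .sentiment is (classification, p_pos, p_neg) for this analyzer
    print(sentence, blob.sentiment)
```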
Good tutorial, but I'm fairly sure you made a mistake when calculating P(X) to normalize.
The value should have been the sum of your initial two equations: 0.0053 + 0.0206 = 0.0259.
Then dividing 0.0053/0.0259 = 20.5% for Play = Yes
against 0.0206/0.0259 = 79.5% for Play = No,
and these probabilities collectively add up to 100%, or 1.
In your example, you have the probabilities 0.2424 + 0.9421, which is > 1 and is just wrong.
Otherwise, as I said... a good and easy-to-follow tutorial... so thank you.
This should be a pinned comment :). The normalisation isn't done correctly in this tutorial.
Thanks for this correction. Actually, there isn't a need here to calculate the denominator as we are classifying. 0.0206 > 0.0053 shows already that we should not play the game. I suppose it is for completeness. I agree, nicely done tutorial, great production values.
It should definitely be a pinned comment, there should even be an annotation to the video or something of that sort... I did it the way he does on my assignment paper and didn't get any points for the exercise for exactly that reason. On the other hand: Now I certainly won't make the same mistake in the exam.
Thanks!
+AYUSH RASTOGI Dividing by P(X) (also often referred to as the "evidence") is meant to normalize the values. So yes, after normalization they have to add up to 1. It is true that this normalization can be skipped if it is a constant and you only want to classify an observation, but seeing as that is not how the Naïve Bayes classifier originally works, and that this is a teaching video, he should probably apply the algorithm correctly.
Also, instead of simply not dividing by the evidence at all (which is, as said, what some people do to avoid unnecessary effort), he uses a completely wrong value for P(X). So I guess it's safe to say that this is actually a mistake and not just "saving time".
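A quick numerical check of the correction above, using the video's two unnormalized scores:

```python
# Unnormalized Naive Bayes scores from the video:
# P(X|Play=Yes) * P(Play=Yes) and P(X|Play=No) * P(Play=No)
score_yes = 0.0053
score_no = 0.0206

# The evidence P(X) is the sum of the scores over all classes
# (law of total probability), not the value used in the video.
evidence = score_yes + score_no  # 0.0259

p_yes = score_yes / evidence  # ~0.205
p_no = score_no / evidence    # ~0.795
print(p_yes, p_no, p_yes + p_no)  # the posteriors now sum to exactly 1.0
```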
Awesome video. Finely explained using a numerical example.
I assume I need a lot of prior knowledge, because I really didn't understand a thing (since I'm looking this up for college, I assume it's downhill from here). Tips on where to begin are appreciated.
Best video ever for naive Bayes
This is the BEST explanation of NB I've ever seen.
Thank you so much. I really appreciate it 😊
Really great video, this guy is amazing at machine learning.
Great video - I would tone the music down just a tad - but the content is superb!
wooow, you literally rescued my life 😂😂😂 THANK YOUU SOO MUCH SIR
I'm glad I could help 😁
This is straight fire, I love this video. This is how all of ML should be taught. Kudos!
Thank you Nishka😁. I agree!
Great channel for educational videos, the best, very interesting !!
Clearly explained. Looking forward to more such videos.
+Gopala Krishna thanks Gopala, I will be uploading every week. Please subscribe to see more =)
Best tutorial I've seen for Naïve Bayes. Thanks
Excellent, well explained thank you sir.
I am fairly new to Naive Bayes - however, shouldn't 'P(Outlook = sunny | Play = Yes)' be interpreted as "probability that the outlook is sunny given that we can play"? and not the other way around?
Indeed it was easy and fun to learn ML. Thanks !
Great work, easy to understand, and kept me interested throughout the video
I think this was a very good overview of how this algorithm works, however given the length of the video I'm assuming a lot of things have been left out. It works nicely as a "primer" - a jumping-off point for people who want to get an idea of how it works.
Nice video. But I have a question: shouldn't P(Play=Yes|X) + P(Play=No|X) be 1?
Yes, it should. There is a mistake in the video on that step.
totally agree
I was wondering why it was not 1....
Theoretically it should, but in this case it doesn't have to, because of the assumption that the X features are independent, which is often not the case. In other words, the statement P(X1, X2, ..., Xn | y) = P(X1|y) * P(X2|y) * ... * P(Xn|y) is usually not true. The identity P(y|X1, X2, ..., Xn) + P(y'|X1, X2, ..., Xn) = 1 only holds when you DON'T break P(X1, X2, ..., Xn | y) into P(X1|y) * P(X2|y) * ... * P(Xn|y) when applying Bayes' theorem. Naive Bayes, however, assumes this factorization is exact; when it is not (most of the time), P(y|X1, X2, ..., Xn) + P(y'|X1, X2, ..., Xn) = 1 no longer holds. So there's really nothing wrong in the video.
@@dstwo6539 Nice explanation, thanks!
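A small numerical sketch of the point discussed above, with made-up likelihoods (the numbers are hypothetical, purely to show that the raw naive products need not sum to 1, while normalizing by their sum restores a proper distribution):

```python
# Hypothetical per-feature likelihoods for two classes y and y'
p_x_given_y = [0.3, 0.5]    # P(X1|y), P(X2|y)
p_x_given_y2 = [0.6, 0.2]   # P(X1|y'), P(X2|y')
prior_y = prior_y2 = 0.5    # class priors

# Naive Bayes multiplies the per-feature terms together:
score_y = prior_y * p_x_given_y[0] * p_x_given_y[1]      # 0.075
score_y2 = prior_y2 * p_x_given_y2[0] * p_x_given_y2[1]  # 0.060

print(score_y + score_y2)  # 0.135 -- the raw scores do not sum to 1

# Dividing each score by their sum gives posteriors that do:
evidence = score_y + score_y2
print(score_y / evidence + score_y2 / evidence)  # 1.0
```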
This video made me subscribe to the channel!
But also, what next? I'd love to know the next step toward mastering this classifier. I'm starting to program a bit; any good resource to help me debug on my own?
Correct me if I'm wrong, but the C in your function could be confused with the classes 'yes' and 'no'. I've seen some other examples that use C to denote yes and no.
Why do you choose Outlook = "Sunny", Temperature = "Cool", Humidity = "High", Wind = "Strong" when there is no such row in the table? Or is it a golfing condition?
Awesome! Super easy to understand. Thanks for making this video!
Thank you man. Just watched it before my exam.
NPTEL?
Good luck with that.
Fantastic presentation . Many thanks.
Thank you Poorsorkh
Very important and easy to learn.
Please decrease the volume of the background music.
Or remove it completely; it feels like a yoga tutorial. Although I like your style, I could not watch more than a few minutes.
So easily understood. Thanks!
Excellent.
Liked, subscribed, commented. This has to be the best explanation of the naive Bayes classifier ever. Thank you, sir, and cheers! But please clarify why 0.2424 + 0.9421 turns out to be > 1.
Thank you so much for this amazing video!
I'm glad that I could help 😊.
Hi, at 7:40: "We can view the probability that we play golf given that it is sunny, P(Yes|Sunny), as equal to the probability that it is sunny given a yes, P(Sunny|Yes), times the probability of it being sunny, P(Sunny), divided by that of a yes, P(Yes)." Given the theorem, shouldn't it be P(Yes|Sunny) = P(Sunny|Yes) * P(Yes) / P(Sunny)?
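For reference, Bayes' theorem as the commenter writes it, with a worked check assuming the usual 14-row golf table used in most versions of this example (9 Yes days, 5 Sunny days, 2 of which are Yes):

```latex
P(\text{Yes}\mid\text{Sunny})
  = \frac{P(\text{Sunny}\mid\text{Yes})\,P(\text{Yes})}{P(\text{Sunny})}
  = \frac{(2/9)(9/14)}{5/14}
  = \frac{2}{5}
  = 0.4
```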
Excellent explanations with examples, pros/cons and applicability ! Covered it all !
+Vivek Kumaresan thank you so much , I really appreciate it :)
very nice explanation! Thank you so much for the video
Thank you, I'm really glad you enjoyed it :)
Wow so fun video!
At 3:43: the higher the probability of yes, the higher the probability that we can play. Then why did they select the options where the probability is lower, if the goal is higher chances to play?
Best video for the intuition behind an ML algorithm.
Really good. A class of his own.
Thanks Vijanth 😁🤟
Thank you, very easy to understand and learn. Hoping you can make a video on Bayesian networks too, since you already covered Naive Bayes and Bayes' theorem :)
Thank you! It helped a lot 🇧🇷
Excellent explanation. Thank you.
Awesome Tutorial.. this video made my day!
I'm new to machine learning and I would just like to know if Bayes Classifier is a non-linear algorithm, thanks :D
thank you 🙏🏻🙏🏻🙏🏻🙏🏻🙏🏻🙏🏻
At 3:30, why are you taking only the 5 probability conditions for Play=Yes? For example, why didn't you take P(Outlook=Overcast | Play=Yes) = 4/9? Please help.
That was an awesome learning experience with your stuff.... : )
+M Anbazhagan I'm really glad you enjoyed it :). You can check out my playlist for other fun and easy machine learning tutorials.
Gotta learn Bayes.
Respect from China. This tutorial is more useful than what my professor did in a whole hour!
Thanks for it. Much appreciated.
Brilliant video!
This is the best tutorial I've seen on YouTube. Please keep uploading new videos!
+Lin Wyatt I'm glad you're enjoying it :) I will keep uploading.
Thank you! It helped a lot
I'm glad it helped 😁
It's really cool.
Thank you 😊
Your paintings are super cool!
Brilliant! How did you make the slides? It was fun.
Videoscribe
Thank you, great explanation; it helped me at the last minute.
How did you get that formula to calculate P(X)?
Best explanation; the thing that sets it apart is that there is theory plus a well-defined numerical example on the data.
keep going
Great tutorial, thank you. Could you please tell me the name of the software you used to make this tutorial?
Great explanation!!
Handy video and perfect for understanding the mathematical principles we learned in class much better. Thank you :)
YouTube's best Naive Bayes explanation video... love it
Thank you, I really appreciate the comment :)
Thanks indeed!
Thank you... you make it possible to learn while having fun.
Shouldn't they add up to 1, even though normalization isn't necessary?
Because it assumes independence for all features, even if this is not entirely the case
Yeah, almost right. Normalization would have shown that P(Play=No|X) = 0.7954 > P(Play=Yes|X) = 0.2046
I am confused: if we take Outlook = Overcast, then what will the equation be for No? Because it has a zero, which will make the whole product zero...???
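The standard fix for this zero-frequency problem (not shown in the video) is Laplace (add-one) smoothing. A minimal sketch, assuming Outlook has 3 possible values and Overcast never co-occurs with Play=No in the table:

```python
def smoothed_likelihood(count_value_and_class, count_class, n_values, alpha=1.0):
    """Laplace-smoothed estimate of P(feature=value | class).

    Adding alpha to every count guarantees no conditional probability
    is exactly zero, so a single unseen combination (e.g. Overcast
    with Play=No) can no longer zero out the whole product."""
    return (count_value_and_class + alpha) / (count_class + alpha * n_values)

# Overcast never occurs with Play=No (0 of the 5 'No' rows); Outlook
# has 3 values (Sunny, Overcast, Rain):
print(smoothed_likelihood(0, 5, 3))  # 0.125 instead of 0
```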
There are some mistakes when calculating P(Ci|X) = P(Play=Yes|X) = P(X|Play=Yes) * P(Play=Yes), because we were calculating only one case (Ci -> Play=Yes) for each setting of the attributes. The missing cases include:
+ P(Overcast|Play=Yes), P(Rainy|Play=Yes), P(Hot|Play=Yes)...
and the same for Ci -> Play=No.
After calculating P(X|Ci), we calculate P(Ci|X) = P(X|Ci) * P(Ci).
These probabilities should satisfy P(Play=Yes|X) + P(Play=No|X) = 1; in some situations, when X is specified by a condition, this sum will not be 1 or 100%.
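Putting the whole calculation discussed in this thread together: a compact end-to-end sketch over the classic 14-row golf table. These rows reproduce the video's unnormalized scores of 0.0053 and 0.0206, so they appear to be the same data, but treat them as an assumption:

```python
from collections import Counter, defaultdict

# (Outlook, Temperature, Humidity, Wind, Play) -- the classic golf table
data = [
    ("Sunny", "Hot", "High", "Weak", "No"),
    ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"),
    ("Rain", "Mild", "High", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"),
    ("Rain", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"),
    ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"),
    ("Rain", "Mild", "High", "Strong", "No"),
]

class_counts = Counter(row[-1] for row in data)
# feature_counts[(i, value, cls)] = how often feature i == value within cls
feature_counts = defaultdict(int)
for row in data:
    cls = row[-1]
    for i, value in enumerate(row[:-1]):
        feature_counts[(i, value, cls)] += 1

def posterior(x):
    """Normalized P(class | x) for x = (outlook, temp, humidity, wind)."""
    scores = {}
    for cls, n_cls in class_counts.items():
        score = n_cls / len(data)  # prior P(class)
        for i, value in enumerate(x):
            score *= feature_counts[(i, value, cls)] / n_cls  # P(x_i | class)
        scores[cls] = score  # unnormalized: 0.0053 for Yes, 0.0206 for No
    evidence = sum(scores.values())  # P(x): sum of scores over classes
    return {cls: s / evidence for cls, s in scores.items()}

print(posterior(("Sunny", "Cool", "High", "Strong")))
# -> roughly {'No': 0.795, 'Yes': 0.205}, matching the corrected numbers above
```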
Subbed. Thanks for the excellent tutorial.
Brilliant, well presented!
Excellent video, I understood everything
Thank you, great video!
The best video
+kamran shaik thank you so much. I'm really glad you enjoyed this video. :) I really appreciate it.
Loved the video. So easy to understand and everything explained beautifully! Thanks a lot for creating this video! :)
You are most welcome :)
What if a column contains only numerical values? How do you do prediction on that?
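One standard answer (not given in the thread): for a numeric column, Gaussian Naive Bayes models P(x_i | class) as a normal density fitted to that class's observed values. A minimal sketch with hypothetical temperature readings:

```python
import math

def gaussian_likelihood(x, values):
    """P(x | class) under a normal distribution fitted to the class's
    observed values (sample mean and sample variance)."""
    mu = sum(values) / len(values)
    var = sum((v - mu) ** 2 for v in values) / (len(values) - 1)
    return math.exp(-((x - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical temperatures on 'Yes' days; this likelihood then slots
# into the usual Naive Bayes product in place of the count ratio.
print(gaussian_likelihood(66, [69, 70, 64, 68, 65, 75, 75, 72, 81]))
```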
P > 1 is such a fundamental mistake :D please change it.
Good Video! Thanks
Simple and clear tutorial, thank you!
thx
nice ...
great video, helped me a lot
I'm really glad that it helped 😀
thank you
What a nice presentation!!
+junaid shahid thank you Junaid. I really appreciate it. :)
Nicely Explained!
Thank you Kamlesh :)
thanks a lot
simple and good
Thank you so much Angus 😀. I really appreciate it
commendable efforts brother! :)
Thanks man I really appreciate it 😊👍😁