Naïve Bayes Classifier - Fun and Easy Machine Learning

  • Published: Oct 29, 2024

Comments • 225

  • @Augmented_AI
    @Augmented_AI  3 years ago +4

    ⭐ If you enjoy my work, I'd really appreciate a Coffee 😎☕ - augmentedstartups.info/BuyMeCoffee

    • @sunway1374
      @sunway1374 1 year ago

      5:41 Hi. I understand the Bayes formula leads to the final two numbers in green. But can we explain why the sum of the two probabilities is not equal to 1? Is there a non-zero intersection of P(yes|X) and P(no|X)? If yes, what does it even mean? Thanks!

  • @christopherchan5357
    @christopherchan5357 5 years ago +2

    My professor took several hours to talk about this and I had no idea what he was talking about; I watched your video for just 12 minutes and fully comprehended it. Thank you for guiding me through my assignment, I was struggling until I watched your video.

  • @teckyify
    @teckyify 5 years ago +8

    The most important point for NB is that it can be trained incrementally as new evidence comes in. That is a giant drawback of other classifiers, in which you have to retrain on the whole data set.

  • @syedshahab8471
    @syedshahab8471 3 years ago +1

    What an amazing video. If the education system is to be changed, I would very much like it to become like this.
    Enjoyed every second of it. Thanks

  • @uzKantHarrison
    @uzKantHarrison 5 years ago +1

    Not my favorite type of educational video, but I still liked it because it was extremely easy to understand and quite informative.

  • @asadulhaqmshani4737
    @asadulhaqmshani4737 5 years ago +3

    To beginners (like myself), I suggest you watch this video several times if you don't understand it at first. And also learn about this concept from another source and then come back to this video, it will help you understand more.
    Anyway, this is a great video, thanks!

  • @groovytau
    @groovytau 6 years ago +1

    Nice video. Apparently I understand it better from you than from my teacher; the fact that you use illustrations helps me a lot to visualize the idea and better understand how it works.

  • @ankitshah008
    @ankitshah008 5 years ago +4

    A lot of effort has been put into creating such a nice explanatory video. Thanks a lot for creating such an easy-to-understand video.

    • @Augmented_AI
      @Augmented_AI  5 years ago

      I'm really grateful for your comment☺️ thank you so much.

  • @apoorvasrini2196
    @apoorvasrini2196 6 years ago +8

    This video was soooo sooo useful to me. I was breaking my head over a bad video from my university course and after watching this it all became soo simple. Keep up the good work!!

    • @Augmented_AI
      @Augmented_AI  6 years ago

      Thank you so much, it means a lot to me :). I really appreciate it

  • @ephremtadesse3195
    @ephremtadesse3195 7 years ago +12

    It's a nice tutorial; you made it easy to quickly grasp the idea. Thank you!

    • @Augmented_AI
      @Augmented_AI  7 years ago

      +Ephrem Tadesse thank you, I'm glad it was easy to grasp. I appreciate it. :)

  • @wajay2006
    @wajay2006 7 years ago

    Best Tutorial on Naïve Bayes . Easy to Understand.

  • @GauravKumar-vu9td
    @GauravKumar-vu9td 6 years ago +1

    Watched for 20 seconds and I knew I had to subscribe immediately if at all I wanted to increase my knowledge! Thanks man! Fantastic video for scum like me who find it hard to understand by reading textbooks

  • @relaxingminds3530
    @relaxingminds3530 5 years ago +1

    Really too good and a very easy way to teach. Thank you so much

  • @krishnakanjee6239
    @krishnakanjee6239 7 years ago +7

    Such an awesome video! You made it look so easy. And your video itself is fun to watch. Thanks!

  • @SamuelLawson
    @SamuelLawson 5 years ago +1

    Funny thing - I used a Naive Bayes library in Python that attempts to guess whether a statement is positive or negative and gave it two very similar sentences:
    "That is a dog."
    "That is a cat."
    The sentence with 'dog' came back as 67% positive, while the sentence with 'cat' was reported as 58% negative.
    It seems Thomas Bayes preferred dogs! :-D

  • @usscork
    @usscork 6 years ago +133

    Good tutorial, but I'm fairly sure you made a mistake when calculating P(X) to normalize.
    The value should have been the sum of your initial two equations: 0.0053 + 0.0206 = 0.0259.
    Then dividing 0.0053/0.0259 = 20.5% for Play = Yes
    against 0.0206/0.0259 = 79.5% for Play = No,
    and these probabilities collectively add up to 100%, or 1.
    In your example, you have the probabilities 0.2424 + 0.9421, which is >1 and is just wrong.
    Otherwise, as I said... a good and easy to follow tutorial... so thank you.

    • @ishansoni9819
      @ishansoni9819 6 years ago +6

      This should be a pinned comment :). The normalisation isn't done correctly in this tutorial.

    • @PierLim
      @PierLim 6 years ago +7

      Thanks for this correction. Actually, there isn't a need here to calculate the denominator as we are classifying. 0.0206 > 0.0053 shows already that we should not play the game. I suppose it is for completeness. I agree, nicely done tutorial, great production values.

    • @barcode628
      @barcode628 6 years ago +2

      It should definitely be a pinned comment, there should even be an annotation to the video or something of that sort... I did it the way he does on my assignment paper and didn't get any points for the exercise for exactly that reason. On the other hand: Now I certainly won't make the same mistake in the exam.

    • @werbungschrott4050
      @werbungschrott4050 6 years ago

      Thanks!

    • @barcode628
      @barcode628 6 years ago +1

      +AYUSH RASTOGI Dividing by P(X) (also often referred to as "evidence") is meant to normalize the values. So yes, after normalization they have to add up to 1. It is true that this normalization can be ignored if it is a constant and if you only want to classify an observation, but seeing as that is not how the 'Naïve Bayes Classifier' originally works and that this is a teaching video, he should probably apply the algorithm correctly.
      Also, instead of just not dividing by the evidence at all (which is, as said, what some people do to avoid unnecessary effort), he uses a completely wrong value for P(X). So I guess it's safe to say that this is actually a mistake and not just "saving time".
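The correction described in this thread is easy to check numerically. A minimal Python sketch, using the two unnormalized joint scores quoted above:

```python
# The two unnormalized joint scores from the video, as quoted in the thread:
score_yes = 0.0053  # P(X | Play=Yes) * P(Play=Yes)
score_no = 0.0206   # P(X | Play=No)  * P(Play=No)

# P(X) is the sum of the joint scores over all classes (law of total probability).
p_x = score_yes + score_no  # 0.0259

p_yes_given_x = score_yes / p_x  # ~0.205, i.e. 20.5% for Play = Yes
p_no_given_x = score_no / p_x    # ~0.795, i.e. 79.5% for Play = No

# After dividing by the same P(X), the two posteriors sum to exactly 1.
print(round(p_yes_given_x, 3), round(p_no_given_x, 3))
```

As PierLim notes, the normalization does not change which class wins; it only turns the scores into proper probabilities.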

  • @kundansahuji
    @kundansahuji 5 years ago

    Awesome video. Finely explained using a numerical example.

  • @renammartinez
    @renammartinez 4 years ago +1

    I assume I need a lot of prior knowledge, because I really didn't understand a thing (since I'm looking this up for college, I assume it's downhill from here). Tips on where to begin are appreciated.

  • @tajrianbintajahid7561
    @tajrianbintajahid7561 4 years ago

    Best video ever for naive Bayes

  • @lizravenwood5317
    @lizravenwood5317 4 years ago +2

    This is the BEST explanation of NB I've ever seen.

    • @Augmented_AI
      @Augmented_AI  4 years ago +1

      Thank you so much. I really appreciate it 😊

  • @jamesturban6944
    @jamesturban6944 4 years ago

    Great video, this guy is amazing at machine learning

  • @pascalbercker7487
    @pascalbercker7487 3 years ago +1

    Great video! I would tone the music down just a tad, but the content is superb!

  • @FlvckoJr
    @FlvckoJr 4 years ago +2

    wooow, you literally rescued my life 😂😂😂 THANK YOUU SOO MUCH SIR

  • @nishkaarora6343
    @nishkaarora6343 5 years ago +1

    This is straight fire. I love this video. This is how all of ML should be taught. Kudos!

  • @haroldfelipezuluagagrisale3875
    @haroldfelipezuluagagrisale3875 4 years ago

    Great channel for educational videos, the best, very interesting!!

  • @kitsadda
    @kitsadda 7 years ago

    Clearly Explained. Looking for more such videos

    • @Augmented_AI
      @Augmented_AI  7 years ago

      +Gopala Krishna thanks Gopala, I will be uploading every week. Please subscribe to see more =)

  • @kaviramsamy3708
    @kaviramsamy3708 7 years ago

    Best tutorial I've seen for Naïve Bayes. Thanks

  • @NKG_Creations
    @NKG_Creations 4 years ago +1

    Excellent, well explained. Thank you, sir.

  • @akino.3192
    @akino.3192 6 years ago +1

    I am fairly new to Naive Bayes - however, shouldn't 'P(Outlook = sunny | Play = Yes)' be interpreted as "probability that the outlook is sunny given that we can play"? and not the other way around?

  • @TheExcessivemhz
    @TheExcessivemhz 6 years ago +1

    Indeed it was easy and fun to learn ML. Thanks!

  • @azula203
    @azula203 7 years ago

    Great work, easy to understand, and kept me interested throughout the video

  • @Skyfox94
    @Skyfox94 4 years ago

    I think this was a very good overview of how this algorithm works, however given the length of the video I'm assuming a lot of things have been left out. It works nicely as a "primer" - a jumping-off point for people who want to get an idea of how it works.

  • @mingchengao
    @mingchengao 6 years ago +20

    Nice video. But I have a question. Shouldn't P(Play=Yes|X)+P(Play=No|X) be 1?

    • @shyresearcher
      @shyresearcher 6 years ago +6

      Yes, it should. There is a mistake in the video on that step.

    • @Jack-dx7qb
      @Jack-dx7qb 6 years ago +1

      totally agree

    • @krutznutz1215
      @krutznutz1215 5 years ago

      I was wondering why it was not 1....

    • @dstwo6539
      @dstwo6539 5 years ago +2

      Theoretically it should, but in this case it's not necessary, given that the assumption is that the X features are independent, which is often not the case. In other words, the statement P(X1, X2, ...Xn|y) = P(X1|y) * P(X2|y) * ... * P(Xn|y) is usually not true. P(y|X1, X2,...Xn) + P(y'|X1,X2,...,Xn) = 1 only holds when you DON'T break P(X1, X2, ...Xn|y) into P(X1|y) * P(X2|y) * ... * P(Xn|y) when applying Bayes' theorem. However, Naive Bayes assumes this is true; in the cases where it is not (most of the time), P(y|X1, X2,...Xn) + P(y'|X1,X2,...,Xn) = 1 no longer holds. So there's really nothing wrong in the video.

    • @saadahmad485
      @saadahmad485 5 years ago

      @@dstwo6539 Nice explanation, thanks!

  • @alexisnlavergne
    @alexisnlavergne 4 years ago +1

    This video made me subscribe to the channel!
    But also, what next? I'd love to know the next step to start mastering this classifier. I'm starting to program some, but is there any good resource to help me debug myself?

  • @lexispanks7189
    @lexispanks7189 3 years ago

    Correct me if I'm wrong, but the C in your function could be confused with the classes 'yes' and 'no'. I've seen some other examples that use C to denote yes and no.

  • @duongminhnguyet7308
    @duongminhnguyet7308 3 years ago

    Why do you choose Outlook = "Sunny", Temperature = "Cool", Humidity = "High", Wind = "Strong" when there is no such row in the table? Or is it a golfing condition?

  • @yhx89757
    @yhx89757 7 years ago

    Awesome! Super easy to understand. Thanks for making this video!

  • @saqibcs
    @saqibcs 5 years ago

    Thank you man. Just watched this before my exam

  • @dr.mehdipoorsorkh752
    @dr.mehdipoorsorkh752 5 years ago

    Fantastic presentation. Many thanks.

  • @jayananuranga5485
    @jayananuranga5485 4 years ago +1

    Very important and easy to learn

  • @vaisakhv9916
    @vaisakhv9916 5 years ago +43

    Please decrease the volume of the background music

    • @Kyodu
      @Kyodu 4 years ago +8

      Or remove it completely; it feels like a yoga tutorial. Although I like your style, I could not watch more than a few minutes.

  • @sajidhasan1161
    @sajidhasan1161 5 years ago

    So easily understood. Thanks!

  • @ssshukla26
    @ssshukla26 4 years ago +1

    Excellent.

  • @chloekimball536
    @chloekimball536 6 years ago

    Liked, subscribed, commented. This has to be the best explanation of the naive Bayes classifier ever. Thank you sir. And cheers! But please clarify why 0.2424 + 0.9421 is turning out to be >1

  • @benphua
    @benphua 5 years ago +2

    Thank you so much for this amazing video!

    • @Augmented_AI
      @Augmented_AI  5 years ago +1

      I'm glad that I could help 😊.

  • @slava-keshkov
    @slava-keshkov 6 years ago +1

    Hi 7:40: 'We can view the probability that we play golf, given that it is sunny P ( Yes | Sunny) equals the probability that we play golf given a yes P (Sunny | Yes) times the probability of it being sunny P (Sunny) divided by it being a yes P (Yes). Given the theorem, shouldn't it be that P(Yes|Sunny) = P(Sunny|Yes) * P(Yes) / P(Sunny)?
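For reference, the form the commenter suggests is the standard statement of Bayes' theorem in this notation:

```latex
P(\text{Yes} \mid \text{Sunny})
  = \frac{P(\text{Sunny} \mid \text{Yes})\, P(\text{Yes})}{P(\text{Sunny})}
```

That is, the prior of the class (Yes) sits in the numerator and the evidence (Sunny) in the denominator, which is the opposite of what the quoted narration says.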

  • @TheVivek1978
    @TheVivek1978 6 years ago

    Excellent explanations with examples, pros/cons and applicability ! Covered it all !

    • @Augmented_AI
      @Augmented_AI  6 years ago

      +Vivek Kumaresan thank you so much, I really appreciate it :)

  • @rodrik1
    @rodrik1 6 years ago

    Very nice explanation! Thank you so much for the video

    • @Augmented_AI
      @Augmented_AI  6 years ago

      Thank you, I'm really glad you enjoyed it :)

  • @EmapMe
    @EmapMe 4 years ago +1

    Wow, such a fun video!

  • @urjadamodar4093
    @urjadamodar4093 5 years ago +1

    At 3:43: the higher the probability of yes, the higher the probability we can play. Then why did they select the options where the probability is lower for having higher chances to play?

  • @rajatietl
    @rajatietl 4 years ago +2

    Best video for the intuition behind the ML algorithm

  • @vijanth
    @vijanth 4 years ago +1

    Really good. In a class of his own

  • @nazimrazali2773
    @nazimrazali2773 5 years ago

    Thank you, very easy to understand and learn. Hoping you can make a video on Bayesian networks too, since you already showed naive Bayes and Bayes' theorem :)

  • @luanacs37
    @luanacs37 3 years ago

    Thank you! It helped a lot 🇧🇷

  • @kodieswaria528
    @kodieswaria528 6 years ago

    Excellent explanation. Thank you.

  • @rampetajraji
    @rampetajraji 7 years ago +3

    Awesome tutorial... this video made my day!

  • @elliuslurenzpino2005
    @elliuslurenzpino2005 6 years ago +1

    I'm new to machine learning and I would just like to know if the Bayes classifier is a non-linear algorithm, thanks :D

  • @sembutininverse
    @sembutininverse 3 years ago +1

    thank you 🙏🏻🙏🏻🙏🏻🙏🏻🙏🏻🙏🏻

  • @techwellness6142
    @techwellness6142 5 years ago

    At 3:30, why are you taking only the 5 probability conditions for Play=Yes? For example, why didn't you take P(Outlook=Overcast | Play=Yes) = 4/9? Please help

  • @mailanbazhagan
    @mailanbazhagan 7 years ago +1

    That was an awesome learning experience with your stuff.... : )

    • @Augmented_AI
      @Augmented_AI  7 years ago

      +M Anbazhagan I'm really glad you enjoyed it :). You can check out my playlist for other fun and easy machine learning tutorials.

  • @jasonperhaps
    @jasonperhaps 5 years ago +1

    Gotta learn Bayes

  • @weicao4101
    @weicao4101 5 years ago

    Respect from China. This tutorial is more useful than what my professor did in a whole hour!

  • @balamurugann1461
    @balamurugann1461 4 years ago +1

    Thanks for it. Much appreciated.

  • @danielbassett7933
    @danielbassett7933 5 years ago

    Brilliant video!

  • @linwyatt9302
    @linwyatt9302 7 years ago

    This is the best tutorial I've seen on YouTube. Please keep uploading new videos!

    • @Augmented_AI
      @Augmented_AI  7 years ago

      +Lin Wyatt I'm glad you're enjoying it :) Will keep uploading

  • @danielacarrapico2034
    @danielacarrapico2034 5 years ago +1

    Thank you! It helped a lot

  • @engr.alidawoodi6461
    @engr.alidawoodi6461 4 years ago +2

    It's really cool

  • @mkgamesartvisuals
    @mkgamesartvisuals 6 years ago

    Your paintings are super cool!

  • @mostafanakhaei2487
    @mostafanakhaei2487 4 years ago

    Brilliant! How did you make the slides? It was fun.

  • @ereniacastellan4587
    @ereniacastellan4587 6 years ago

    Thank you, great explanation. It helped me at the last minute

  • @aashwinbhushal3467
    @aashwinbhushal3467 4 years ago

    How did you get that formula to calculate P(X)?

  • @vivekchoudhary8745
    @vivekchoudhary8745 5 years ago

    Best explanation. The thing that sets it apart is that there is theory plus a well-defined numerical example on the data.
    Keep going

  • @numidian19
    @numidian19 4 years ago

    Great tutorial, thank you. Could you please tell me the name of the software that you used to make this tutorial?

  • @aspdeepak4yt
    @aspdeepak4yt 6 years ago

    Great explanation!!

  • @timobohnstedt5143
    @timobohnstedt5143 5 years ago

    Handy video, and perfect for understanding the mathematical principles we learned in class much better. Thank you :)

  • @udal100
    @udal100 6 years ago

    YouTube's best Naive Bayes explanation video... love it

    • @Augmented_AI
      @Augmented_AI  6 years ago

      Thank you, I really appreciate the comment :)

  • @GlobalContentThatWeNeed
    @GlobalContentThatWeNeed 6 years ago +1

    Thanks indeed!

  • @SapnaGupta-vv5fr
    @SapnaGupta-vv5fr 5 years ago +1

    Thank you... you make it possible to learn while having fun.

  • @renzocoppola4664
    @renzocoppola4664 6 years ago +7

    Shouldn't it add up to 1? Even though normalization isn't necessary.

    • @Filmsuper95
      @Filmsuper95 5 years ago +1

      Because it assumes independence for all features, even if this is not entirely the case

    • @moyube7475
      @moyube7475 4 years ago

      Yeah, almost right. Normalization would have shown that P(Play=No|X) = 0.7954 > P(Play=Yes|X) = 0.2046

  • @anubhavsrivastava850
    @anubhavsrivastava850 4 years ago

    I am confused: if we take Outlook = Overcast, then what will the equation be for No? Because it has a zero, which will make the whole product zero...???
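A common remedy for the zero-count problem in this question is Laplace (add-one) smoothing. This is not shown in the video; the sketch below is a standard fix, with counts assumed from the classic play-golf table, where Overcast never occurs among the 5 "No" rows and Outlook takes 3 values:

```python
# Laplace (add-one) smoothing: pretend each feature value was seen alpha extra
# times per class, so no conditional probability is ever exactly zero.
def smoothed_prob(count, class_total, n_values, alpha=1):
    """P(feature=value | class) with add-alpha smoothing."""
    return (count + alpha) / (class_total + alpha * n_values)

# Assumed counts: Overcast appears 0 times among the 5 "No" rows,
# and Outlook has 3 possible values (Sunny, Overcast, Rainy).
p = smoothed_prob(count=0, class_total=5, n_values=3)
print(p)  # 0.125 instead of 0.0, so the product no longer collapses to zero
```

With smoothing, the "No" product stays small but nonzero, and the comparison between classes remains meaningful.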

  • @binhnguyenam2546
    @binhnguyenam2546 4 years ago

    There are some mistakes when calculating P(Ci|X) = P(Play=Yes|X) = P(X|Play=Yes)*P(Play=Yes), because we were calculating only one case (Ci -> Play=Yes) for each setting of the attributes. The missing situations include:
    + P(Overcast|Play=Yes), P(Rainy|Play=Yes), P(Hot|Play=Yes)...
    and the same conditions with (Ci -> Play=No).
    After calculating P(X|Ci), we calculate P(Ci|X) = P(X|Ci)*P(Ci).
    The probabilities P(Play=Yes|X) + P(Play=No|X) should sum to 1. In some situations, when X is specified by a condition, this sum will not be 1, or 100%.

  • @sasankv9919
    @sasankv9919 5 years ago +1

    Subbed. Thanks for the excellent tutorial

  • @bellicose2009
    @bellicose2009 7 years ago +5

    Brilliant, well presented!

  • @karanshukla6889
    @karanshukla6889 5 years ago +4

    Excellent video, I understood everything

  • @allall02
    @allall02 6 years ago +1

    Thank you, great video!

  • @kamranshaik7049
    @kamranshaik7049 7 years ago +7

    The best video

    • @Augmented_AI
      @Augmented_AI  7 years ago +1

      +kamran shaik thank you so much. I'm really glad you enjoyed this video. :) I really appreciate it.

  • @sandeepranote9711
    @sandeepranote9711 5 years ago +1

    Loved the video. So easy to understand and everything explained beautifully! Thanks a lot for creating this video! :)

  • @soumachatterjee9399
    @soumachatterjee9399 5 years ago

    What if a column contains numerical values only? How do you do prediction on that?
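For numeric columns, a common approach (not covered in the video) is Gaussian naive Bayes: model each numeric feature per class as a normal distribution estimated from that class's training rows, and use its density in place of the count-based conditional probability. A minimal sketch with hypothetical temperature values:

```python
import math

# Gaussian likelihood for one numeric feature: density of Normal(mean, std)
# at x, used in place of a count-based P(x | class) in the naive Bayes product.
def gaussian_likelihood(x, mean, std):
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# Hypothetical temperatures from the "Yes" rows of a play-golf-style table.
temps_yes = [21.0, 23.0, 24.0, 22.0]
mean = sum(temps_yes) / len(temps_yes)
var = sum((t - mean) ** 2 for t in temps_yes) / len(temps_yes)
std = math.sqrt(var)

# This density multiplies into the class score exactly like the categorical
# probabilities in the video do.
print(gaussian_likelihood(23.5, mean, std))
```

The density can exceed 1 for very peaked distributions; that is fine, since only the relative class scores matter for classification.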

  • @grigolperadze7732
    @grigolperadze7732 5 years ago +19

    P > 1, it's such a fundamental mistake :D Please change it

  • @sindhuchinniah4363
    @sindhuchinniah4363 6 years ago

    Good Video! Thanks

  • @soufianebenkhaldoun7765
    @soufianebenkhaldoun7765 5 years ago

    Simple and clear tutorial, thank you!

  • @bariskocer
    @bariskocer 4 years ago +1

    thx

  • @zkzk5334
    @zkzk5334 5 years ago +2

    nice ...

  • @victorlima8018
    @victorlima8018 6 years ago

    Great video, helped me a lot

    • @Augmented_AI
      @Augmented_AI  6 years ago

      I'm really glad that it helped 😀

  • @raneemyad3181
    @raneemyad3181 5 years ago

    thank you

  • @junaid5388
    @junaid5388 7 years ago +1

    What a nice presentation!!

    • @Augmented_AI
      @Augmented_AI  7 years ago

      +junaid shahid thank you Junaid. I really appreciate it. :)

  • @kamleshbhalui
    @kamleshbhalui 6 years ago +1

    Nicely Explained!

  • @infiniteunconditionallove1620
    @infiniteunconditionallove1620 5 years ago

    thanks a lot

  • @AngusLou
    @AngusLou 6 years ago

    simple and good

    • @Augmented_AI
      @Augmented_AI  6 years ago

      Thank you so much Angus 😀. I really appreciate it

  • @a_level_math704
    @a_level_math704 4 years ago +1

    Commendable efforts, brother! :)

    • @Augmented_AI
      @Augmented_AI  4 years ago

      Thanks man I really appreciate it 😊👍😁