Naive Bayes classifier: A friendly approach

  • Published: 2 Oct 2024

Comments • 197

  • @08ae6013
    @08ae6013 5 years ago +157

    This is the best explanation of Naive Bayes & Bayes' theorem ... you rocked it ... Thanks for this

  • @Adenosene
    @Adenosene 5 years ago +33

    This is explained so well, this video is so beautiful that I want to cry

    • @milanico2309
      @milanico2309 4 years ago +2

      same here

    • @OzScout66
      @OzScout66 4 years ago +4

      I am crying as I type this line...... *snif* .....so good !

  • @sharathnatraj
    @sharathnatraj 3 years ago +4

    No concept is too difficult to understand if it's explained in a way that can be comprehended. Great job Luis! I keep coming back to your videos whenever I am stuck. Your style of explanation with examples is amazing.

  • @drranjitha
    @drranjitha 5 years ago +10

    "So if you like formulas..." OMG! Thank you so much, Dr. Serrano. You helped my brain find the missing piece in my puzzle. The whole explanation was so clear but the formula helped me transition from Bayes to Naive Bayes. I was looking for the missing piece in youtube and somehow landed on your video. I actually came here after attending your AIND class.

  • @sayakpaul3152
    @sayakpaul3152 5 years ago +22

    Beautiful Luis. You clearly draw the distinction between an educator and an instructor.

  • @AbhishekJain-bv6vv
    @AbhishekJain-bv6vv 3 years ago

    I have to say, this is the best explanation of Naive Bayes on youtube. All other videos start off with the formula, while you started with an example then went on to the formulae. Keep posting videos man, you are one gem of a teacher!!

  • @lucp2
    @lucp2 4 years ago

    Without any doubt the best explanation I've ever seen on Naive Bayes. Thank you

  • @ichallengemydog
    @ichallengemydog 5 years ago +62

    The two people who disliked were looking for baes, but got Bayes.

    • @BharCode09
      @BharCode09 4 years ago

      haha! Imagine how they would have felt by the end of the video!

    • @romanemul1
      @romanemul1 3 years ago +5

      they were naive.

    • @AIConcepts-tk3yq
      @AIConcepts-tk3yq 5 months ago

      😂

  • @Ntho1994
    @Ntho1994 5 years ago

    You have some of the best explanations of the topic in the entire business (literally hundreds of courses and YT series and stuff is out there). But you are the best

  • @LocuraRosa987
    @LocuraRosa987 4 years ago

    Can't thank you enough for this beautiful explanation on this kinda confusing topic. I mean, you didn't make any "naive" assumption about the watcher's background and explained it all in detail.
    Thanks a lot!

  • @obilorjim
    @obilorjim 5 years ago

    Beautiful and easy. Gentle speaking but excellent teaching ability. Luis Serrano, thank you!

  • @aztmln
    @aztmln 5 years ago +1

    Thanks Luis. This was a lot easier to follow than most of my profs, to be honest. The fact that you explained first and then put it in equation terms now helps me remember the equation and understand it better. Many, many thanks! And God bless.

  • @murad4622
    @murad4622 1 year ago

    Wow ... I love the way you present this topic. Thank you very much.

  • @sidharthadaggubati438
    @sidharthadaggubati438 3 years ago

    best Explanation. This Channel is a hidden gem

  • @yunqingwang2761
    @yunqingwang2761 4 years ago

    indeed! best explanation of Naive Bayes & Bayes' theorem

  • @tanimadebmallik8750
    @tanimadebmallik8750 4 years ago

    This is the simplest and most effective video on NB

  • @meirgoldenberg5638
    @meirgoldenberg5638 5 years ago +1

    Maybe I missed this part in the video, but Naive Bayes assumes only conditional independence. For example, this training set suggests that the words "Buy" and "Cheap" are far from being (unconditionally) independent. Namely, P("Buy")=P("Cheap")=25/100=1/4. So, if the two words were independent, we would expect P("Buy" and "Cheap")=1/16=6.25%. However, there are 12 emails containing both words out of 100, which is 12%.

    • @MasterGreg82
      @MasterGreg82 4 years ago

      I have exactly the same remark
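
The arithmetic in this thread checks out. A minimal sketch in Python of the same calculation, using only the counts the commenter quotes from the video's example (100 emails, 25 containing each word, 12 containing both):

```python
# Checking the thread's arithmetic with the counts quoted from the video:
# 100 emails total, 25 containing "buy", 25 containing "cheap", 12 with both.
n_total, n_buy, n_cheap, n_both = 100, 25, 25, 12

p_buy = n_buy / n_total      # P("buy")  = 0.25
p_cheap = n_cheap / n_total  # P("cheap") = 0.25

# If the two words were unconditionally independent, their joint probability
# would factor into the product of the marginals:
p_joint_if_independent = p_buy * p_cheap  # 0.0625, i.e. 6.25%

# The observed joint probability is nearly twice that, so the words are
# correlated overall; Naive Bayes only assumes independence *given the class*.
p_joint_observed = n_both / n_total       # 0.12, i.e. 12%

print(p_joint_if_independent, p_joint_observed)
```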

  • @MrSuraj2776
    @MrSuraj2776 4 years ago

    Great explanation...It was very easy to understand Naive Bayes ...Thank you very much for this video...!!

  • @jamesimmordino4855
    @jamesimmordino4855 7 months ago

    Great explanation, perfectly paced.

  • @dossierk6406
    @dossierk6406 4 years ago

    The best explanation ever of Naïve Bayes

  • @ghalibashraf
    @ghalibashraf 5 years ago +1

    This was a very clear and beginner-friendly explanation. Thank you!

  • @shailesh_joshi
    @shailesh_joshi 4 years ago +1

    Absolutely fantastic explanation. Thank you so much for this.

  • @mrtandon5278
    @mrtandon5278 5 years ago +13

    It's been around 8 months that I've been moving towards ML, and your guidance and teaching strategy are playing a major role in it.
    I can't simply say thank you.
    Stay blessed.

    • @SerranoAcademy
      @SerranoAcademy  5 years ago +1

      Thank you, that's really nice to hear! Keep up the good work in ML!

  • @johnhutchinson9445
    @johnhutchinson9445 4 years ago

    Thank you! This is so much more clear than my textbook!

  • @NitinPandya26
    @NitinPandya26 5 years ago +8

    Great Explanation for Bayes Theorem, I have never understood naive Bayes so well....Thanks for this Luis

  • @PuzzledMei
    @PuzzledMei 5 years ago +1

    you made this so easy to understand, thank you!

  • @junecnol79
    @junecnol79 5 years ago +3

    thank you for excellent explanation !

  • @milanico2309
    @milanico2309 4 years ago

    You Sir are a hero!

  • @ammanuelbekeletilahun9526
    @ammanuelbekeletilahun9526 1 year ago

    An excellent explanation.

  • @jomondal
    @jomondal 4 years ago

    I must say thank you so much for this fantastic upload..

  • @nikhithasagarreddy
    @nikhithasagarreddy 4 years ago

    Great Beginning!!

  • @arunabhadeb5916
    @arunabhadeb5916 4 years ago +4

    Such a brilliant explanation. Thank you Luis. Kindly add more lectures on traditional ML related topics.

  • @mjb48219
    @mjb48219 4 years ago

    Great explanation! Thanks!

  • @obheech
    @obheech 5 years ago +1

    Wonderful explanations !!

  • @nilsmelchert776
    @nilsmelchert776 4 years ago

    Would love to see you explain the Kalman filter

  • @AnilAnvesh
    @AnilAnvesh 2 years ago

    Thank You for this video. You are an inspiration ❤️

  • @panamacherry6845
    @panamacherry6845 4 years ago

    Awesome explanation 👍 👍👍👏👏👏

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 4 years ago

    Great explanation!

  • @kunalr_ai
    @kunalr_ai 5 years ago +1

    Thanks for the episode

  • @omarmohand-amer660
    @omarmohand-amer660 4 years ago

    you're such a legend, thank you for this video !

  • @SRAVANAM_KEERTHANAM_SMARANAM
    @SRAVANAM_KEERTHANAM_SMARANAM 3 years ago

    Kindly Make a video on Expectation Maximization

  • @esspi9
    @esspi9 4 years ago

    Beautiful and Effective.
    Thank you!

  • @mumbaiashutosh
    @mumbaiashutosh 4 years ago +3

    Awesome session.. Thanks a million!! God Bless..

  • @usamakhawaja8571
    @usamakhawaja8571 5 years ago +1

    You brother! have my respect.

  • @chaitakmukherjee
    @chaitakmukherjee 5 years ago +2

    This is really the best explanation of naive bayes... it beats even Andrew Ng's and many others'...

  • @karrde666666
    @karrde666666 4 years ago +3

    so much clearer than my professor explaining it for 80 minutes

  • @yingzhu505
    @yingzhu505 3 years ago

    Thanks A LOT! great video!

  • @TheStarbalaji
    @TheStarbalaji 5 years ago

    Great explanation

  • @chyldstudios
    @chyldstudios 5 years ago

    You're crushing it!

  • @shreyasarojkar5267
    @shreyasarojkar5267 2 years ago

    thank you so much for this !!

  • @arashalizade9583
    @arashalizade9583 4 years ago

    It was gold! thanks

  • @anilpillai9180
    @anilpillai9180 5 years ago +4

    Awesome explanation...👍

  • @NitinPandya26
    @NitinPandya26 5 years ago +4

    Hey Luis, can you explain Bayes theorem with Laplace corrections applied when the conditional probability is zero?

    • @questforprogramming
      @questforprogramming 4 years ago

      Just add 'alpha' to the numerator and 'k'*'alpha' to the denominator for each class probability, where k is the number of classes (here binary, so 2) and alpha is typically a hyperparameter (it varies between 10^-3 and 1).
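
The reply's smoothing rule can be sketched in a few lines of Python. The function name and the example counts below are hypothetical, chosen only to illustrate the rule:

```python
def smoothed_probability(count, total, k, alpha=1.0):
    """Laplace-corrected probability estimate, per the reply above.

    count: raw occurrences (e.g. spam emails containing a word)
    total: size of the class (e.g. number of spam emails)
    k:     number of possible outcomes (binary here, so 2)
    alpha: smoothing hyperparameter, typically between 1e-3 and 1
    """
    return (count + alpha) / (total + k * alpha)

# A word never seen in spam no longer forces the whole product to zero:
print(smoothed_probability(0, 25, 2))   # 1/27 instead of 0
```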

  • @MH_HD
    @MH_HD 5 years ago

    very nice explanation, can you please mention how to find the naive Bayesian classifier's uncertainty?

  • @paedrufernando2351
    @paedrufernando2351 2 years ago +1

    you are the coach who could teach students two weeks before the exams and get all of them distinctions. I only wish you took up master classes.. people would be pouring in to take your master classes

  • @pedramasshabi8551
    @pedramasshabi8551 4 years ago +1

    I love these easy explanations, though even when it's that easy I can't connect them to the formulas and stuff I read before. It would have been great if you'd done that too.

  • @aashishluthra8124
    @aashishluthra8124 4 years ago +1

    Explained with great simplicity, thanks for this!

  • @animatedmath4307
    @animatedmath4307 2 years ago

    I think the probabilities you picked might be a bit confusing, didactically: P(S|B) happens to be equal to P(B|S) in the Bayes formula at 17min.

  • @MrDejurado
    @MrDejurado 5 years ago +1

    This video is SOLID!!! Thank you so much for this and please keep making more videos! You made this concept so digestible it is not even funny.

  • @filipeoliveira5489
    @filipeoliveira5489 3 years ago +1

    Fantastic explanation! Such amazing teaching skills! I wish every teacher was like you! Great work and thank you!

  • @tubewatcher77
    @tubewatcher77 5 years ago +1

    Finally, I noticed that there is a difference between Bayes theorem and Naive Bayes.

  • @MichelC2000
    @MichelC2000 5 years ago +2

    This is really good! Thank you so much for your time and effort to make these topics accessible to the masses 🙂

  • @yusufsalk1136
    @yusufsalk1136 4 years ago

    Perfect!

  • @JRobertoArt
    @JRobertoArt 5 years ago +2

    Great video dude, helped a lot. Thank U.

  • @frogfrog1993
    @frogfrog1993 5 years ago

    the best of best

  • @pratikdeoolwadikar5124
    @pratikdeoolwadikar5124 4 years ago

    Thanks a lot

  • @artem_skok
    @artem_skok 2 months ago

    Thank you very much for this video. I've spent days trying to work out intuition on how to apply the Naive Bayes for spam detection, but all other videos just repeat the Bayes probability formula and show you the answer. Formulas give you 0 understanding unless you figure out the logic behind the approach, and only then they become useful.

  • @piobr
    @piobr 1 month ago

    Thank you, Luis. Your classes are amazing, keep the good work.
    Best regards from Brazil.

  • @ameygujre7674
    @ameygujre7674 2 years ago

    Came here from "codebasics" youtube channel.
    Pretty amazingly explained by you, man.. Thanks a lot..

  • @gemini_537
    @gemini_537 5 months ago

    Gemini: This video is about Naive Bayes classifier, a spam detector which is based on Bayes theorem.
    The video uses an example of building a spam detector to illustrate the concept. The idea is that we can classify an email as spam or not spam based on the presence of certain words in the email. For instance, emails containing the word "buy" are more likely to be spam than those which do not contain "buy".
    Bayes theorem allows us to calculate the probability of an event (e.g. an email being spam) given another event (e.g. the email containing the word "buy"). The video uses a simple example with two properties (presence of "buy" and presence of "cheap") to illustrate this concept.
    However, the challenge arises when we want to consider more than two properties at the same time. Ideally, we would like to calculate the probability of an email being spam given the presence of all the properties we are considering (e.g. "buy", "cheap", "work").
    But calculating the probability of all these properties appearing together becomes cumbersome as the number of properties increases. This is where the Naive Bayes assumption comes in. Naive Bayes assumes that all these properties are independent of each other. This assumption, although not always true, simplifies the calculation significantly.
    The video concludes by explaining how the Naive Bayes classifier works with this assumption and shows how to calculate the probability of an email being spam given multiple properties.
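
The pipeline this summary describes can be condensed into a short Python sketch of the naive (conditional independence) assumption. The word counts below are made up for illustration, not the video's exact table:

```python
# A minimal sketch of the Naive Bayes spam detector described above.
spam_total, ham_total = 25, 75          # hypothetical training counts
word_counts = {
    # word: (spam emails containing it, non-spam emails containing it)
    "buy":   (20, 5),                   # illustrative numbers
    "cheap": (15, 10),
    "work":  (5, 30),
}

def spam_probability(words):
    """P(spam | words), treating the words as independent given the class."""
    # Start from the priors P(spam) and P(not spam).
    p_spam = spam_total / (spam_total + ham_total)
    p_ham = ham_total / (spam_total + ham_total)
    # Multiply in P(word | class) for each observed word, as if independent.
    for w in words:
        in_spam, in_ham = word_counts[w]
        p_spam *= in_spam / spam_total
        p_ham *= in_ham / ham_total
    # Normalize: Bayes' theorem with the naive likelihoods.
    return p_spam / (p_spam + p_ham)

print(spam_probability(["buy", "cheap"]))   # ≈ 0.947: strongly spam
```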

  • @jacksugood7684032su
    @jacksugood7684032su 4 years ago +1

    Thanks for your detailed and friendly explanation. It really helps me a lot :)

  • @leenabhandari5949
    @leenabhandari5949 5 years ago

    Thank you!

  • @Shadowdoctor117
    @Shadowdoctor117 5 years ago +1

    Amazing breakdown! I liked how you made it visually first and gradually turned it into the formula. That really made it click in my head!

  • @ajazhussain2919
    @ajazhussain2919 1 year ago

    Really amazing sir❤ love from India, watching your videos on 3G internet 😅

  • @albertli7044
    @albertli7044 5 years ago +1

    your video should be the first result of "naive bayes"

  • @gyangauchan6111
    @gyangauchan6111 2 years ago

    Great Explanation! Can someone please explain how we get P(B|S) = 20/25? @ 17:45

  • @totetronico
    @totetronico 2 years ago

    Thank you for such a clear and well-structured explanation!

  • @ManzoorKhan-kk6qk
    @ManzoorKhan-kk6qk 4 years ago

    Appreciate the way (visualization) you explained a more complicated concept.

  • @corymaklin7864
    @corymaklin7864 5 years ago +1

    Very well done thank you

  • @rashminkumarmadan3393
    @rashminkumarmadan3393 3 years ago

    First I saw the video from 3b1b then from statquest. Both of them are great videos. But I was not able to find a connection between them. Your video helped me to connect all the dots

  • @Kev1nTheCoder
    @Kev1nTheCoder 4 years ago +1

    This is really FRIENDLY. Thank you!

  • @feggak
    @feggak 5 years ago +1

    Holy, what a good explanation of the concept, damn dude. Thanks a lot!

  • @hebashakeel7161
    @hebashakeel7161 1 year ago

    First time ever I understood this Naive Bayes. Thank you so much

  • @ShaidaMuhammad
    @ShaidaMuhammad 4 years ago

    How is 10% of 5% equal to 0.5?

  • @himaninarula584
    @himaninarula584 4 years ago

    Thank you so much for this amazing explanation

  • @ganeshhegde4049
    @ganeshhegde4049 4 years ago

    Explained intuitively! Thanks! :)

  • @mrj4949
    @mrj4949 2 years ago

    Really great 🙏 my understanding is much better now 😊

  • @kjoemack7013
    @kjoemack7013 5 years ago +1

    no thank you Luis you are an excellent educator

  • @oybekabdulazizov419
    @oybekabdulazizov419 4 years ago

    awesome explanation. thank you very much :)

  • @rishidixit7939
    @rishidixit7939 6 months ago

    Why is it assumed that 0.5% of the words contain buy and cheap?

  • @deepikadasara8238
    @deepikadasara8238 4 years ago

    amazing tutorial! very easy to follow!

  • @AvielLivay
    @AvielLivay 1 year ago

    1:31 I counted 80 non spam mails, you should have chopped off one column there!

  • @anjalibudhiraja235
    @anjalibudhiraja235 3 years ago

    At 6:55 how did you conclude 0.5% ? Please make it clear

  • @tariqo6756
    @tariqo6756 4 months ago

    wow, that was a wonderful explanation. Thanks!

  • @Questionbang
    @Questionbang 5 years ago

    Awesome!

  • @meysamamini9473
    @meysamamini9473 3 years ago

    Best Naive Bayes explanation ever! thank U Luis

  • @mitramir5182
    @mitramir5182 4 years ago

    Thank you very, very much; I would probably have liked your video twice if it were possible. It's so clear and plain that after a while, I came back to it again for reviewing naive bayes.

  • @dragolov
    @dragolov 5 years ago +1

    Thank you so much for this video, Luis! Respect!

  • @israilzarbaliev7024
    @israilzarbaliev7024 3 years ago

    Please, could you send a link to download this presentation?

  • @EGlobalKnowledge
    @EGlobalKnowledge 2 years ago

    Excellent explanation. Thank you 🙏