Gaussian Naive Bayes, Clearly Explained!!!

  • Published: 5 Jul 2024
  • Gaussian Naive Bayes takes care of all your Naive Bayes needs when your training data are continuous. If that sounds fancy, don't sweat it! This StatQuest will clear up all your doubts in a jiffy!
    NOTE: This StatQuest assumes that you are already familiar with...
    Multinomial Naive Bayes: • Naive Bayes, Clearly E...
    The Log Function: • Logs (logarithms), Cle...
    The Normal Distribution: • The Normal Distributio...
    The difference between Probability and Likelihood: • Probability is not Lik...
    Cross Validation: • Machine Learning Funda...
    For a complete index of all the StatQuest videos, check out:
    statquest.org/video-index/
    If you'd like to support StatQuest, please consider...
    Buying my book, The StatQuest Illustrated Guide to Machine Learning:
    PDF - statquest.gumroad.com/l/wvtmc
    Paperback - www.amazon.com/dp/B09ZCKR4H6
    Kindle eBook - www.amazon.com/dp/B09ZG79HXC
    Patreon: / statquest
    ...or...
    YouTube Membership: / @statquest
    ...a cool StatQuest t-shirt or sweatshirt:
    shop.spreadshirt.com/statques...
    ...buying one or two of my songs (or go large and get a whole album!)
    joshuastarmer.bandcamp.com/
    ...or just donating to StatQuest!
    www.paypal.me/statquest
    Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
    / joshuastarmer
    0:00 Awesome song and introduction
    1:00 Creating Gaussian distributions from Training Data
    2:34 Classification example
    4:46 Underflow and Log() function
    7:27 Some variables have more say than others
    Corrections:
    3:42 I said 10 grams of popcorn, but I should have said 20 grams of popcorn given that they love Troll 2.
    #statquest #naivebayes
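For readers who want to try the idea from the video in code, here is a minimal sketch using scikit-learn's GaussianNB. The popcorn/soda pop/candy numbers below are made up for illustration; they are not the values from the video.

```python
# Minimal Gaussian Naive Bayes sketch with made-up data
# (the numbers are illustrative, NOT the values used in the video).
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Columns: popcorn (grams), soda pop (ml), candy (grams)
X = np.array([
    [25.0, 700.0,   1.0],  # loves Troll 2
    [20.0, 750.0,   2.0],  # loves Troll 2
    [30.0, 800.0,   1.5],  # loves Troll 2
    [ 2.0, 150.0,  90.0],  # does not love Troll 2
    [ 4.0, 200.0, 100.0],  # does not love Troll 2
    [ 3.0, 180.0, 110.0],  # does not love Troll 2
])
y = ["loves"] * 3 + ["does not love"] * 3

# Fit one Gaussian per feature per class, then classify a new person who
# ate 20 g of popcorn, drank 500 ml of soda pop, and ate 25 g of candy.
model = GaussianNB().fit(X, y)
print(model.predict([[20.0, 500.0, 25.0]]))  # prints ['does not love']
```

Under the hood this works just as the video describes: the prior probability of each class is combined with the per-feature normal likelihoods (in log space, to avoid underflow), and the class with the larger score wins.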

Comments • 472

  • @statquest
    @statquest  4 years ago +19

    NOTE: This StatQuest is sponsored by JADBIO. Just Add Data, and their automatic machine learning algorithms will do all of the work for you. For more details, see: bit.ly/3bxtheb BAM!
    Corrections:
    3:42 I said 10 grams of popcorn, but I should have said 20 grams of popcorn given that they love Troll 2.
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @phildegreat
      @phildegreat 3 years ago

      website not working?

    • @statquest
      @statquest  3 years ago

      @@phildegreat Thanks! The site is back up.

    • @anirbanpatra3017
      @anirbanpatra3017 1 year ago

      8:15 There's a minor error in the slide ('help use decide').
      You really are a great teacher. Wish I could meet you in person some day.

  • @rohan2609
    @rohan2609 3 years ago +125

    4 weeks back I had no idea what machine learning was, but your videos have really made a difference in my life. They are all so clearly explained and fun to watch. I just got a job, and I mentioned some of the things I learned from your channel. I am grateful for your contribution to my life.

  • @raa__va4814
    @raa__va4814 1 year ago +24

    I'm at the point where my syllabus doesn't require me to look into all of this, but I'm just having too much fun learning with you. I'm glad I took this course up and found your videos

  • @mildlyinteresting1925
    @mildlyinteresting1925 4 years ago +68

    Following your channel for over 6 months now sir, your explanations are truly amazing..

    • @statquest
      @statquest  4 years ago +2

      Thank you very much! :)

  • @amirrezamousavi5139
    @amirrezamousavi5139 2 years ago +2

    What little knowledge I have of machine learning could not have been gained without your tutorials. Thank you very much

  • @tassoskat8623
    @tassoskat8623 4 years ago +58

    This is by far my favorite educational YouTube channel.
    Everything is explained in a simple, practical and fun way.
    The videos are full of positive vibes just from the beginning with the silly song entry. I love the catch phrases.
    Statquest is addictive!

    • @statquest
      @statquest  4 years ago +2

      Thank you very much! :)

  • @TheVijaySaravana
    @TheVijaySaravana 3 years ago +2

    I have watched over 2-3 hours of lecture about Gaussian Naive Bayes. Now is when I feel my understanding is complete.

  • @zitravelszikazii894
    @zitravelszikazii894 14 days ago +1

    Thank you for the prompt response. I’m fairly new to Stats. But this video prompted me to do a lot more research and I’m finally confident on how you got to the result. Thank you for your videos. They are so helpful

    • @statquest
      @statquest  14 days ago

      Glad it was helpful!

  • @minweideng4595
    @minweideng4595 3 years ago +8

    Thank you Josh. You deserve all the praise. I have been struggling with a lot of the concepts in traditional textbooks, as they tend to "jump" quite a lot. Your channel brings all of them to life vividly. This is my go-to reference source now.

    • @statquest
      @statquest  3 years ago +2

      Awesome! I'm glad my videos are helpful.

  • @Godofwarares1
    @Godofwarares1 1 year ago +10

    This is crazy. I went to school for Applied Mathematics, and it never crossed my mind that what I learned was machine learning. As ChatGPT came into the limelight I started looking into it, and almost everything I've seen so far is basically everything I'd learned before, but in a different context. My mind is just blown that I assumed ML was something unattainable for me, and it turns out I've been doing it for years

  • @leowei2575
    @leowei2575 7 months ago +2

    WOOOOOOW. I watched every video of yours recommended in the description of this video, and now this video. Everything makes much more sense now. It helped me a lot to understand the Gaussian Naive Bayes algorithm implemented and available from scikit-learn for applications in machine learning. Just awesome. Thank you!!!

    • @statquest
      @statquest  7 months ago +1

      Wow, thanks!

  • @samuelbmartins
    @samuelbmartins 2 years ago +3

    Hi, Josh.
    Thank you so much for all the exceptional content from your channel.
    Your work is amazing.
    I'm a professor in Brazil of Computer Science and ML and your videos have been supporting me a lot.
    You're an inspiration for me.
    Best.

  • @mohit10singh
    @mohit10singh 3 years ago +3

    I am a beginner in the Machine Learning field, and your channel has helped me a lot. I went through almost all the videos; very nice way of explaining. I really appreciate you making these videos and helping everyone. You just saved me... Thank you very much...

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @sakhawath19
    @sakhawath19 3 years ago +7

    Whenever I recall the names of the best educators on YouTube, yours always comes first! You are a flawless genius!

  • @yuxinzhang4228
    @yuxinzhang4228 3 years ago +5

    It's amazing! Thank you so much !
    Our professor had us teach ourselves Gaussian naive Bayes, and I absolutely didn't understand her slides with many, many math equations. Thanks again for your vivid videos !!

    • @statquest
      @statquest  3 years ago

      Glad it was helpful!

  • @tianhuicao3297
    @tianhuicao3297 3 years ago +9

    These videos are amazing !!! Truly a survival pack for my DS class👍

  • @georgeruellan
    @georgeruellan 3 years ago +3

    This series is helping me so much with my dissertation, thank you!!

    • @statquest
      @statquest  3 years ago +1

      Awesome, and good luck with your dissertation!

  • @joganice2197
    @joganice2197 11 days ago +1

    this was the best explanation I've ever seen in my life (I'm not even a native English speaker, I'm Brazilian lol)

    • @statquest
      @statquest  10 days ago +1

      Muito obrigado! :)

  • @pinesasyg9894
    @pinesasyg9894 2 years ago +2

    amazing knowledge with incredible communication skills.. the world would change if every student had such a great teacher

  • @sudhashankar1040
    @sudhashankar1040 3 years ago +1

    This video on Gaussian Naive Bayes has been very well explained. Thanks a lot.😊

  • @qbaliu6462
    @qbaliu6462 2 months ago +1

    This channel has helped me so much during my studies 🎉

    • @statquest
      @statquest  2 months ago

      Happy to hear that!

  • @sampyism
    @sampyism 16 days ago +1

    Your videos and voice make ML and statistics fun to learn. :)

    • @statquest
      @statquest  15 days ago

      Glad you like them!

  • @sairamsubramaniam8316
    @sairamsubramaniam8316 3 years ago +1

    Sir, this playlist is a one-stop solution for quick interview preparations. Thanks a lot sir.

    • @statquest
      @statquest  3 years ago

      Good luck with your interviews! :)

  • @argonaise_jay
    @argonaise_jay 2 years ago +1

    One of the best channels for learners that the world can offer..

  • @anje889
    @anje889 1 year ago +1

    The contents are excellent, and I love your intro quite a lot (it's super impressive for me) btw. Thanks for doing this in the first place; as a beginner, some concepts are literally hard to understand, but after watching your videos things are a lot better than before. Thanks :)

    • @statquest
      @statquest  1 year ago

      I'm glad my videos are helpful! :)

  • @Adam_0464
    @Adam_0464 3 years ago +1

    Thank you, You have made the theory concrete and visible!

  • @jiheonlee4065
    @jiheonlee4065 4 years ago +2

    Thank you for another excellent Statquest !~

  • @WorthyVII
    @WorthyVII 1 year ago +2

    Literally the best video ever on this.

  • @akashchakraborty6431
    @akashchakraborty6431 3 years ago +1

    You have really helped me a lot. Thanks Sir. May you prosper more and keep helping students who cant afford paid content :)

  • @MrRynRules
    @MrRynRules 3 years ago +1

    Daym, your videos are so good at explaining complicated ideas!! Like holy shoot, I am going to use this multiple-predictors idea to figure out the ending of Inception: was it a dream, or was it not a dream!

  • @chonky_ollie
    @chonky_ollie 2 years ago +1

    Your videos are more helpful than my Machine Learning lectures were. Man, you are Gigachad of Machine Learning

  • @tcidude
    @tcidude 3 years ago

    Josh, I love your videos. I've been following your channel for a while. Your videos are absolutely great!
    Would you consider covering more of Bayesian statistics in the future?

    • @statquest
      @statquest  3 years ago

      I'll keep it in mind.

  • @hli2147
    @hli2147 3 years ago +2

    This is the only lecture that makes me feel not stupid...

  • @haofu1673
    @haofu1673 3 years ago +2

    Great video! If people were willing to spend time on videos like this rather than TikTok, the world would be a much better place.

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @liranzaidman1610
    @liranzaidman1610 4 years ago +5

    How do people come up with these crazy ideas? it's amazing, thanks a lot for another fantastic video

  • @heteromodal
    @heteromodal 3 years ago +1

    Thank you Josh for another great video! Also, this (and other vids) makes me think I should watch Troll 2, just to tick that box.

    • @statquest
      @statquest  3 years ago

      Ha! Let me know what you think!

  • @ahhhwhysocute
    @ahhhwhysocute 3 years ago +1

    Thanks for the video !! it was very helpful and easy to understand

    • @statquest
      @statquest  3 years ago

      Glad it was helpful!

  • @chenzhiyao834
    @chenzhiyao834 2 years ago +1

    You explained it much more clearly than my lecturer did in the ML lecture.

  • @Mustafa-099
    @Mustafa-099 2 years ago +1

    Hey Josh, I hope you are having a wonderful day. I was searching for a video on "Gaussian mixture models" on your channel but couldn't find one. I have a request for that video, since the concept is a bit complicated elsewhere.
    Also, btw, your videos enabled me to get one of the highest scores on the test conducted recently at my college. All thanks to you Josh, you are awesome

    • @statquest
      @statquest  2 years ago +1

      Thanks! I'll keep that topic in mind.

  • @AmanKumar-oq8sm
    @AmanKumar-oq8sm 3 years ago

    Hey Josh, Thank you for making these amazing videos. Please make a video on the "Bayesian Networks" too.

    • @statquest
      @statquest  3 years ago

      I'll keep it in mind.

  • @Vivaswaan.
    @Vivaswaan. 4 years ago +1

    The demarcation of topics in the seek bar is useful and helpful. Nice addition.

    • @statquest
      @statquest  4 years ago +1

      Glad you liked it. It's a new feature that YouTube just rolled out, so I've spent the past day (and will spend the next few days) adding it to my videos.

    • @anitapallenberg690
      @anitapallenberg690 4 years ago +2

      @@statquest We really appreciate all your dedication into the channel!
      It's 100% awesomeness :)

    • @statquest
      @statquest  4 years ago

      @@anitapallenberg690 Hooray! Thank you! :)

  • @konmemes329
    @konmemes329 2 years ago +1

    Your video just helped me a lot !

  • @rogertea1857
    @rogertea1857 3 years ago +1

    Another great tutorial, thank you!

  • @camilamiraglia8077
    @camilamiraglia8077 3 years ago

    Thanks for the great video!
    I would just like to point out that in my opinion if you are talking about log() when the base is e, it is easier (and more correct) to write ln().

    • @statquest
      @statquest  3 years ago +1

      In statistics, programming and machine learning, "ln()" is written "log()", so I'm just following the conventions used in the field.

  • @ADESHKUMAR-yz2el
    @ADESHKUMAR-yz2el 3 years ago +1

    i promise i will join the membership and buy your products when i get a job... BAM!!!

    • @statquest
      @statquest  3 years ago

      Hooray! Thank you very much for your support!

  • @meysamamini9473
    @meysamamini9473 3 years ago +1

    I'm Having great time watching Ur videos ❤️

  • @nzsvus
    @nzsvus 4 years ago

    BAM! thanks, Josh! It would be amazing if you can make a StatQuest concerning A/B testing :)

    • @statquest
      @statquest  4 years ago +1

      It's on the to-do list. :)

  • @MinhPham-jq9wu
    @MinhPham-jq9wu 2 years ago +1

    So great, this video is so helpful

    • @statquest
      @statquest  2 years ago +1

      Glad it was helpful!

  • @har_marachi
    @har_marachi 2 years ago +1

    😅😅😅😅It's the "Shameless Self Promotion" for me... Thank you very much for this channel. Your videos are gold. The way you just know how to explain these hard concepts in a way that 5-year-olds can understand... To think that I just discovered this goldmine this week.
    God bless you😇

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @samuelschonenberger
    @samuelschonenberger 1 year ago +2

    These gloriously weird examples really are needed to understand a concept

  • @tagoreji2143
    @tagoreji2143 2 years ago +1

    Tqsm Sir for the Very Valuable Information

  • @prashuk-ducs
    @prashuk-ducs 1 month ago

    Why the fuck does this video make it look so easy and makes 100 percent sense?

  • @RFS_1
    @RFS_1 3 years ago +1

    Love the explanation. BAM!

  • @auzaluis
    @auzaluis 4 years ago +1

    The world needs more Joshuas!

  • @WillChannelUS
    @WillChannelUS 4 years ago +2

    This channel should have 2.74M subscribers instead of 274K.

    • @statquest
      @statquest  4 years ago

      One day I hope that happens! :)

  • @justinneddie9437
    @justinneddie9437 3 years ago +1

    well the little intro made me cry laughing. I don't know why... awesome

  • @sayanbhowmick9203
    @sayanbhowmick9203 3 months ago +1

    Great style of teaching & also thank you so much for such a great video (Note : I have bought your book "The StatQuest illustrated guide to machine learning") 😃

    • @statquest
      @statquest  3 months ago +1

      Thank you so much for supporting StatQuest!

  • @vinaykumardaivajna5260
    @vinaykumardaivajna5260 1 year ago +1

    Awesome as always

  • @patrycjakasperska7272
    @patrycjakasperska7272 7 months ago +1

    Love your channel

  • @mukulsaluja6109
    @mukulsaluja6109 3 years ago +1

    Best video i have ever seen

  • @diraczhu9347
    @diraczhu9347 2 years ago +1

    Great video!

  • @Theviswanath57
    @Theviswanath57 3 years ago +3

    In the Stats playlist, we used the notation P(Data | Model) for probability and L(Model | Data) for likelihood;
    here we are writing the likelihood as L(popcorn=20 | Loves), which I guess is L(Data | Model).

    • @statquest
      @statquest  3 years ago +2

      Unfortunately the notation is somewhat flexible and inconsistent - not just in my videos, but in the field in general. The important thing is to know that likelihoods are always the y-axis values, and probabilities are the areas.

    • @Theviswanath57
      @Theviswanath57 3 years ago +1

      @@statquest understood; somewhere in the playlist you mentioned that likelihood is relative probability, and I guess that neatly summarizes how likelihood and probability relate.

    • @radicalpotato666
      @radicalpotato666 1 year ago

      I just had the exact same question when I started writing the expression in my notebook. I am more acquainted with the L(Model | Data) notation.
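That distinction (likelihood = y-axis value, probability = area) can be made concrete with Python's standard library. A small sketch; the mean and standard deviation are arbitrary example values:

```python
# Likelihood = y-axis value of the normal curve; probability = area under it.
# The example numbers (mean, sd) are arbitrary.
import math

mean, sd = 24.0, 4.0

def normal_pdf(x):
    # y-axis coordinate at x: a likelihood
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def normal_cdf(x):
    # area under the curve to the left of x, via the error function
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

likelihood = normal_pdf(20.0)                      # a y-axis value
probability = normal_cdf(26.0) - normal_cdf(22.0)  # an area, between 22 and 26
print(likelihood, probability)
```

Note that the probability of any single exact value (e.g. exactly 20.0) is zero, which is why continuous Naive Bayes works with the y-axis likelihoods instead.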

  • @worksmarter6418
    @worksmarter6418 3 years ago +1

    Super awesome, thank you. Useful for my Intro to Artificial Intelligence course.

    • @statquest
      @statquest  3 years ago

      Glad it was helpful!

  • @yuniprastika7022
    @yuniprastika7022 3 years ago +1

    can't wait for your channel to BAAM! going worldwide!!

  • @rrrprogram8667
    @rrrprogram8667 4 years ago +1

    Thanks for the awesome video..

  • @ahmedshifa
    @ahmedshifa 3 months ago

    These videos are extremely valuable, thank you for sharing them. I feel that they really help to illuminate the material.
    Quick question though: where do you get the different probabilities, like for popcorn, soda pop, and candy? How do we calculate those in this context? Do you use the soda a person drinks and divide it by the total soda, and same with popcorn, and candy?

    • @statquest
      @statquest  3 months ago

      What time point are you asking about (in minutes and seconds)? The only probabilities we use in this video are for whether someone loves or doesn't love Troll 2. Everything else is a likelihood, which is just a y-axis coordinate.

  • @therealbatman664
    @therealbatman664 1 year ago +1

    Your videos are really great !! my prof made it way harder!!

  • @ArinzeDavid
    @ArinzeDavid 2 years ago +1

    awesome stuff for real

  • @YesEnjoy55
    @YesEnjoy55 8 months ago +1

    Great so much Thanks!

    • @statquest
      @statquest  8 months ago

      You're welcome!

  • @taetaereporter
    @taetaereporter 1 year ago

    thank you for ur service T.T

  • @jianshue9240
    @jianshue9240 3 years ago

    Thanks dude

  • @arjunbehl9771
    @arjunbehl9771 3 years ago +1

    Great stuff : )

  • @haneulkim4902
    @haneulkim4902 3 years ago

    Amazing video! Thank you so much!
    One question: what if the distribution of candy or another feature does not follow a normal distribution?

    • @statquest
      @statquest  3 years ago

      Just use whatever distribution is appropriate, and you can mix and match distributions for different variables.

  • @MrElliptific
    @MrElliptific 4 years ago

    Thanks for this super clear explanation. Why would we prefer this method for classification over a gradient boosting algorithm? When we have too few samples?

    • @statquest
      @statquest  4 years ago

      With relatively small datasets it's simple and fast and super lightweight.

  • @kicauburungmania2430
    @kicauburungmania2430 3 years ago

    Thanks for the awesome explanation. But I have a question: can GNB be used for sentiment analysis?

    • @statquest
      @statquest  3 years ago

      Presumably you could use GNB, but I also know that normal NB (aka multinomial naive bayes) is used for sentiment analysis.

  • @taotaotan5671
    @taotaotan5671 4 years ago

    Hi, Josh. Thanks for this clear explanation. Since Naive Bayes can be applied to the Gaussian distribution, I guess it could also be applied to other distributions, like the Poisson distribution, right? Then a question is: how do you determine the distribution of a feature? I believe this is quite important for building a reasonable model.
    Thanks again for the nice video.

    • @statquest
      @statquest  4 years ago

      One day (hopefully not too long from now), I'm going to cover the different distributions, and that should help people decide which distributions to use with their data.

  • @deepshikhaagarwal4125
    @deepshikhaagarwal4125 1 year ago +1

    Thank you Josh, your videos are amazing! How do I buy study guides from StatQuest?

    • @statquest
      @statquest  1 year ago

      See: statquest.gumroad.com/

  • @linianhe
    @linianhe 1 year ago +1

    dude you are awesome

  • @sheebanwasi2925
    @sheebanwasi2925 3 years ago +1

    Hey Josh, thanks for making such amazing videos. Keep up the work. I just have a quick question if you don't mind:
    I can't understand how you got the likelihood, e.g. L(soda = 500 | LOVES). How are you calculating that value?

    • @statquest
      @statquest  3 years ago

      We plugged the mean and standard deviation of soda pop for people that loved Troll 2 into the equation for a normal curve and then determined the y-axis coordinate where the x-axis value = 500.
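In code, that likelihood is just the normal density evaluated at x = 500. A minimal sketch with the standard library; the mean and standard deviation below are placeholders, not the video's actual values:

```python
# Sketch: the likelihood L(soda pop = 500 | Loves) is the y-axis coordinate
# of the normal curve at x = 500. The mean and sd here are placeholders.
import math

def normal_likelihood(x, mean, sd):
    # height of the normal curve at x (a likelihood, not a probability)
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

# placeholder mean/sd of soda pop for people who love Troll 2
print(normal_likelihood(500.0, mean=750.0, sd=100.0))
```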

  • @jonathanjacob5453
    @jonathanjacob5453 6 months ago +1

    Looks like I have to check out the quests before getting to this one😂

  • @dipinpaul5894
    @dipinpaul5894 4 years ago +1

    Excellent explanation. Any NLP series coming up ? Struggling to find good resources.

    • @statquest
      @statquest  4 years ago +4

      I'm working on Neural Networks right now.

    • @ragulshan6490
      @ragulshan6490 4 years ago +1

      @@statquest it's going to be BAM!!

  • @Steve-3P0
    @Steve-3P0 3 years ago +1

    +5000 for using an example as obscure and as obscene as Troll 2.

  • @Geza_Molnar_
    @Geza_Molnar_ 4 years ago

    Hi - another great explanation!
    I wonder what the result would be if you normalised the probabilities of the 3 values.
    - Would it affect the outcome of the example in this video?
    - Which ranges of values are affected, i.e. where do the non-normalised and normalised distributions (= probability or likelihood here) give different outcomes?

    • @statquest
      @statquest  4 years ago

      Interesting questions! You should try it out and see what you get.

    • @Geza_Molnar_
      @Geza_Molnar_ 4 years ago

      @@statquest Hi, that only makes sense with real data. Without it, just juggling equations and abstract parameters, the thing isn't 'visual' enough, IMO. Though I could run through the calculations with e.g. 2x scale, 10x scale and 100x scale... maybe when I have a few free hours.

  • @jacobwinters6648
    @jacobwinters6648 23 days ago

    Hello! Does it matter if the data in one of the columns (say popcorn) is not normally distributed? Or should the assumption be that we will have a large enough sample size to use the central limit theorem?
    Thanks for all of your videos! I love them and can’t wait for your book to be delivered (just ordered it yesterday).

    • @statquest
      @statquest  23 days ago +1

      It doesn't matter how the data are distributed. As long as we can calculate the likelihoods, we are good to go. BAM! :) And thank you so much for supporting StatQuest!!! TRIPLE BAM!!! :)

  • @sejongchun8350
    @sejongchun8350 4 years ago +2

    Troll 2 is an awesome classic, and should not be up for debate. =)

  • @shichengguo8064
    @shichengguo8064 3 years ago

    Looks like it also works when both multinomial and Gaussian predictors exist in the prediction dataset.

    • @statquest
      @statquest  3 years ago

      Yes, you are correct. And thanks for supporting StatQuest!

  • @61_shivangbhardwaj46
    @61_shivangbhardwaj46 3 years ago +1

    Thnx sir 😊

  • @konstantinlevin8651
    @konstantinlevin8651 10 months ago +1

    I'm a simple man. I watch StatQuests in the night, leave a like, and go chat about it with ChatGPT. That's it.

  • @jacquelineb.2468
    @jacquelineb.2468 3 years ago

    thanks for creating this helpful video! is your sample data available somewhere? would love to calculate things by hand for practice!

    • @statquest
      @statquest  3 years ago

      Thanks! Unfortunately, the raw data is not available :(

  • @mohammadelghandour1614
    @mohammadelghandour1614 2 years ago

    Great work! At 8:11, how can we use cross validation with Gaussian Naive Bayes? I have watched the cross validation video, but I still can't figure out how to use cross validation to determine that candy makes the best classification.

    • @statquest
      @statquest  2 years ago

      To apply cross validation, we divide the training data into different groups. Then we use all of the groups, minus 1, to create a Gaussian naive Bayes model, and use that model to make predictions on the held-out group. Then we repeat, each time using a different group to test the model.
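A hypothetical sketch of that procedure with scikit-learn, using synthetic data (all numbers made up), comparing Gaussian naive Bayes models built from each feature on its own:

```python
# Hypothetical sketch: compare Gaussian naive Bayes models built from each
# feature alone via cross-validation. All data here are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(42)
n = 40  # people per class

# Columns: popcorn, soda pop, candy (candy separates the classes best here)
loves = np.column_stack([rng.normal(25, 8, n), rng.normal(750, 200, n), rng.normal(2, 1, n)])
not_loves = np.column_stack([rng.normal(15, 8, n), rng.normal(600, 200, n), rng.normal(100, 10, n)])
X = np.vstack([loves, not_loves])
y = np.array(["loves"] * n + ["does not love"] * n)

scores = {}
for i, name in enumerate(["popcorn", "soda pop", "candy"]):
    # 5-fold cross-validation on a model that only sees this one feature
    scores[name] = cross_val_score(GaussianNB(), X[:, [i]], y, cv=5).mean()
    print(f"{name}: {scores[name]:.2f}")
```

The feature with the best cross-validated accuracy is the one that, on its own, classifies best.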

  • @alanamerkhanov6040
    @alanamerkhanov6040 7 months ago +1

    Hi, Josh. Troll 2 is a good movie... Thanks

  • @user-bz8nm6eb6g
    @user-bz8nm6eb6g 4 years ago +3

    Can you talk about Kernel estimation in the future?? Bam!

    • @statquest
      @statquest  4 years ago +2

      I will consider it.

  • @roymillsdixton7941
    @roymillsdixton7941 1 year ago

    A nice video on the Gaussian Naive Bayes classification model. Well done! But I have a quick question for you, Josh. I understand that lim ln(x) as x approaches 0 is negative infinity, but how is the natural log of a really small number very close to zero equal to -115 or -33.6, as in the case of L(candy=25 | Loves Troll 2) and L(popcorn=20 | does not Love Troll 2) respectively? What measure was used to determine these values?

    • @statquest
      @statquest  1 year ago

      log(1.1*10^-50) = -115 and log(2.5*10^-15) = -33.6
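(Here log() is the natural log, following the machine learning convention.) The two values can be checked quickly in Python:

```python
import math

# Natural logs of very small likelihoods stay in a manageable range,
# which is why taking logs prevents underflow.
print(math.log(1.1e-50))  # ≈ -115.03
print(math.log(2.5e-15))  # ≈ -33.62
```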

  • @piyushdadgal
    @piyushdadgal 2 years ago +1

    Thanku bam🔥🔥

  • @mahadmohamed2748
    @mahadmohamed2748 2 years ago

    Thanks for these great videos! Quick question: other resources define the likelihood as the probability of the data given the hypothesis, rather than the likelihood of the data given the hypothesis. Which one is correct, or is it fine to use either?

    • @statquest
      @statquest  2 years ago +1

      Generally speaking, you use the likelihoods of the data. However, we can normalize them to be probabilities. This does not offer any advantage and takes longer to do, so people usually omit that step and just use the likelihoods.

  • @marisagonzalez679
    @marisagonzalez679 3 years ago +4

    You always get the like after the intro song hahaha

    • @statquest
      @statquest  3 years ago +1

      Bam! Thank you very much! :)

  • @peteerya7839
    @peteerya7839 2 years ago

    Hi,
    Your video is amazing!!! I have a quick question. When you said to use cross validation to help us decide which thing (popcorn, soda pop, or candy) to use, I think the training data part can "only" help decide the prior probability, and then we use the testing data to do the confusion matrix comparisons, all conditioned on each scenario, right? For example, we will have three confusion matrices, for popcorn, soda pop and candy, based on the test data. What do you think?

    • @statquest
      @statquest  2 years ago

      That sounds about right.

  • @shailukonda
    @shailukonda 4 years ago +1

    Could you please make a video on Time Series Analysis (the ARIMA model)?

    • @statquest
      @statquest  4 years ago +1

      One day I'll do that.

  • @xmartazi
    @xmartazi 15 days ago +1

    I love you bro !

  • @tanishasethi7363
    @tanishasethi7363 2 years ago +1

    The shameless self promotion got me lol, u're so funny