EM Algorithm : Data Science Concepts

  • Published: 8 Aug 2024

Comments • 135

  • @aaroojyashin3790
    @aaroojyashin3790 6 months ago +26

    This could be by far the best explanation I have seen for the EM algorithm. The way you have connected the intuitive view to the mathematical explanation is so, so commendable!!!! Thank you so much for your efforts

    • @ritvikmath
      @ritvikmath  6 months ago

      Glad it was helpful!

    • @chainetravail2439
      @chainetravail2439 3 days ago

      @@ritvikmath I confirm, thank you for helping lost students

  • @anuraganand9675
    @anuraganand9675 2 years ago +68

    The world needs to see this. Thanks Ritvik, I honestly have utmost respect and love for the amount of hard work you put in your videos. Cheers :)

    • @e555t66
      @e555t66 1 year ago +2

      It would take a lot of time to develop these intuitions on your own.

  • @simeonvince8013
    @simeonvince8013 1 year ago +5

    Thank you for the high-quality content that you have produced over the past few years. Most of the time, it really did help me get the intuition and understanding of what was going on with the theoretical concepts I was seeing in my courses.
    Once again, thank you!

  • @rachelhobbs6189
    @rachelhobbs6189 1 year ago +15

    Your channel and way of teaching is so amazing!! Very inviting, inclusive, and friendly. Thank you so much for such good vibes 💕

  • @albertonieto2557
    @albertonieto2557 1 year ago +1

    Brilliant explanation. I especially appreciate you first providing the intuition of the method in the verbal explanation of the E and M steps. I struggled with seeing the math first in other lectures until seeing your video. Thanks for posting this.

  • @nnyabavictoria6332
    @nnyabavictoria6332 2 years ago +4

    Thank you Ritvik for simplifying the EM algorithm like this. This is the best video I have seen so far.

  • @adrian-mu3jr
    @adrian-mu3jr 2 years ago +5

    That's a really great way to look at EM. I'm an engineering graduate but new to ML, and the build-up explanation before dropping into the maths is excellent. Thanks

  • @fotiskamanis8592
    @fotiskamanis8592 9 months ago

    Thank you! By far the best channel for providing clear explanations to fairly complex problems.

  • @alizarean5080
    @alizarean5080 7 months ago +2

    I have an exam tomorrow and this video was the thing I needed. I can't thank you enough dude.

  • @tb1131
    @tb1131 1 year ago

    Your explanations are soooo clear! really appreciate the effort you put into your videos. Thank youu!!

  • @mohammed2250
    @mohammed2250 2 years ago

    Yes, please go on with the proof, that will be an interesting topic. I went through Andrew Ng's video a couple of times, but I couldn't understand it better than here!! You're a rock star at simplifying complex concepts!!

  • @peterhopkinson3040
    @peterhopkinson3040 1 year ago +1

    Your videos are unreal, simple explanations of complex problems, it's insane.

  • @navyadewan8680
    @navyadewan8680 4 months ago

    your understanding and explanation of such a complicated concept is impeccable

  • @aniketsakpal4969
    @aniketsakpal4969 1 year ago +2

    Incredible explanation! Was trying to understand the intuition behind EM for a long time! Thanks for the video! Keep Going!!

  • @bassoonatic777
    @bassoonatic777 2 years ago +1

    Much better explanation than what I normally see. I would also be interested in seeing you go through the proof.

  • @Murphyalex
    @Murphyalex 2 years ago +1

    I had a jolt of excitement when I saw you had decided to do a video on this topic. It's something I've had to revisit time and time again, always understanding the intuition, but always getting lost in the formulas. Your post did a great job of helping to explain the intuition. I did struggle a bit with your non-conventional likelihood notation, though. It threw me off a little, but I understand why you had to have it that way and quickly adjusted. The care you took in explaining why there is mu and mu0 just shows why you are a fantastic teacher.

  • @christophb9616
    @christophb9616 2 years ago +5

    Thanks for the very clear explanation! A follow-up video on how the EM algorithm can be used in Gaussian mixture models or Bayesian networks would be awesome!

  • @aidanheffernan652
    @aidanheffernan652 1 year ago +1

    It's 4am and I saw this video and had to watch....... really great explanation bro.....you're a natural teacher.....thanks for this......subscribed

  • @rishi71095
    @rishi71095 1 year ago +2

    It would take me two more lives to be able to explain it this well to someone, kudos! Great job buddy!

  • @andrashorvath2411
    @andrashorvath2411 1 year ago +2

    Awesome explanation. I'd like to extend yours with my intuition regarding the E-step: the first term, p(x|m0), shows the probability of x happening for the chosen m0, and the second term, the log-likelihood, shows the probability of x happening for the computed m, and we want to maximize both, because we want a choice with high probability from every aspect. That's why we multiply them together: the multiplication can weight between them. If one of them is small, then the result will be small; it can be high only if both are high. (A numerical sketch of this weighting follows below.)

    • @ritvikmath
      @ritvikmath  1 year ago +1

      thanks for the additional inputs!
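
A minimal numerical sketch of the weighting described in the thread above, assuming the running example these comments refer to (observed points 1 and 2, one missing point x, unit-variance Gaussian); the grid bounds and variable names are illustrative only:

```python
import numpy as np
from scipy.stats import norm

observed = np.array([1.0, 2.0])       # the two known data points
xs = np.linspace(-8.0, 10.0, 2001)    # grid over the unknown point x
dx = xs[1] - xs[0]

def q(mu, mu0):
    """E-step objective: integral of p(x | mu0) * log-likelihood(mu; 1, 2, x) dx."""
    weight = norm.pdf(xs, loc=mu0, scale=1.0)              # first factor: p(x | mu0)
    loglik = (norm.logpdf(observed, mu, 1.0).sum()
              + norm.logpdf(xs, mu, 1.0))                  # second factor: log L(mu; 1, 2, x)
    return np.sum(weight * loglik) * dx                    # Riemann-sum approximation

# The product rewards a mu that scores well wherever p(x | mu0) puts its mass:
print(q(1.5, 1.5))    # both factors agree -> relatively high Q
print(q(1.5, -4.0))   # mass far from the data -> Q is dragged down
```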

  • @michalistsatsaronis8728
    @michalistsatsaronis8728 2 years ago +9

    Thanks! You explained such a complicated subject so clearly!!!!

  • @sakshamsingh2005
    @sakshamsingh2005 2 months ago +1

    By far the best explanation, amazing.

  • @louighi91
    @louighi91 2 years ago +1

    Holy, I can't believe how good this video was :) thank you so much

  • @vantongerent
    @vantongerent 2 years ago +5

    YES! I have a quiz on this NEXT WEEK!

  • @paramjeetsingh3444
    @paramjeetsingh3444 2 years ago

    Can't express how happy I am after seeing your videos. Thanks a lot!

  • @pouce902
    @pouce902 1 year ago

    You are just amazing! What would be super useful is an EM video based on your "Maximum Likelihood" one.

  • @user-rx3fo8ne7m
    @user-rx3fo8ne7m 1 year ago

    Thank you for all the work you put in your videos to make lives like mine easier. Cheers man!

  • @ehsanmon
    @ehsanmon 2 years ago +4

    Thanks so much for this great explanation! I would definitely be interested in the proof. It would be great if you could do a video on Gaussian mixture models as well and how they are solved using the EM algorithm.

  • @MN-zs8lc
    @MN-zs8lc 3 months ago

    Although there is more to fully understand, I was able to grasp the concept thanks to your video!

  • @TanmayGhonge
    @TanmayGhonge 11 months ago +1

    Broke down the most complicated algorithm in the simplest terms. Wow!

  • @underlecht
    @underlecht 1 year ago

    Interesting way of looking at the EM problem. Thank you.

  • @sajedehaghababaei5300
    @sajedehaghababaei5300 6 months ago

    Thanks a lot! That is a great explanation!!! I was struggling with EM for a long time!! :))
    I'd be grateful if you also talked about the proof of convergence!

  • @bbcailiang7
    @bbcailiang7 7 months ago

    Excellent explanations!

  • @nuamaaniqbal6373
    @nuamaaniqbal6373 2 years ago

    I thank GOD I found your channel. A big thanks to YouTube and to you!!

  • @nawfalguefrachi2071
    @nawfalguefrachi2071 1 year ago +1

    Thank you Ritvik, the best videos are on this channel.
    Very interesting way of teaching, thank you from TUNISIA

  • @H1K8T95
    @H1K8T95 2 years ago +2

    Ngl my favorite rapper-turned-algorithm

  • @commentor93
    @commentor93 2 years ago

    Thanks for your explanation.
    I think my main mental knot was wondering why you alter N instead of looking for the best guess for x to pin down the unknown value. Realizing that x doesn't change, and that the power of the algorithm lies in finding the optimal solution for the learner without caring about the actual value of x, was what I needed for it all to make sense.

  • @isciaberger5017
    @isciaberger5017 1 year ago

    this helped so much, thank you a lot!!

  • @user-xi6gn4ud9h
    @user-xi6gn4ud9h 5 months ago

    Excellent explanation!!

  • @srinivasbringu67
    @srinivasbringu67 1 year ago

    Lovely, that's very intuitive. Thank you so much.

  • @nikospelekanos3903
    @nikospelekanos3903 2 months ago

    Amazing explanation!

  • @MegaNightdude
    @MegaNightdude 2 years ago +3

    Great video!

  • @evgenyivanko9680
    @evgenyivanko9680 7 months ago

    Ritvik, you are doing a great job, thanks

  • @sahilagarwal1871
    @sahilagarwal1871 2 years ago +2

    Would love to see a proof video! Keep up the great work!

  • @vijayrajan5792
    @vijayrajan5792 1 year ago

    Very compelling ... Brilliant

  • @tungbme
    @tungbme 2 years ago

    Thank you so much for your explanation, it helps me a lot

  • @sharmilakarumuri6050
    @sharmilakarumuri6050 2 years ago +3

    Very well explained

  • @fabianomenezes5892
    @fabianomenezes5892 1 year ago

    What an amazing channel, honestly

  • @biswajit-k
    @biswajit-k 1 year ago

    Got this crystal clear. Thanks a lot!

  • @dadimanoj9051
    @dadimanoj9051 1 year ago

    Thank you for explaining very clearly

  • @davidforman4080
    @davidforman4080 1 year ago

    So clear -- wow!

  • @abhishekganapure6456
    @abhishekganapure6456 2 months ago

    The only explanation you need to understand the EM algorithm, proper chad explanation!

  • @yaochung-chen
    @yaochung-chen 1 year ago

    Really nice explanation! Thank you!

  • @sasakevin3263
    @sasakevin3263 1 year ago

    Awesome! Best explanation of the EM algorithm for a beginner!

  • @yulinliu850
    @yulinliu850 2 years ago +3

    Excellent. Thank you so much! 👍

  • @awangsuryawan7320
    @awangsuryawan7320 2 years ago +3

    Can't wait to see the proof

  • @apresunreve
    @apresunreve 1 year ago

    amazing, thanks for such a clear explanation :)

  • @skanoun
    @skanoun 2 months ago

    Great teacher❤

  • @sneggo
    @sneggo 1 year ago

    Great video! Thank you so much

  • @sgklee2664
    @sgklee2664 1 year ago

    This explanation is amazing for getting the concept

  • @bradleyadjileye1202
    @bradleyadjileye1202 4 months ago +1

    Amazing, thank you for that !

  • @tecnom7133
    @tecnom7133 3 months ago +2

    The best. Thanks man

  • @oliviernussli3799
    @oliviernussli3799 2 years ago

    really great explanation! thank you :-)

  • @ericwilson8665
    @ericwilson8665 1 year ago

    Absolutely fantastic. I agree w/ other comments... The DS world needs to see this. Thank you.

  • @juliachu4685
    @juliachu4685 1 year ago

    THANK YOU. You're literally saving my ML undergraduate course

  • @lordscourge-jp8ch
    @lordscourge-jp8ch 9 months ago

    Thank you so much BRO

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 1 year ago

    This is explained so well

  • @kulbhushansingh1101
    @kulbhushansingh1101 1 year ago

    great man, ultra great

  • @rachidwatcher5860
    @rachidwatcher5860 1 year ago

    Thank you Ritvik for your explanation! Does it work only for the normal distribution, or can we apply it to other kinds of distributions?

  • @akshiwakoti7851
    @akshiwakoti7851 2 years ago

    Great videos. Got it in one go! Could you do Gaussian Mixture Models? Thanks.

  • @Martyr022
    @Martyr022 2 years ago +5

    I'm interested in the proof!

  • @deepshikhaagarwal4125
    @deepshikhaagarwal4125 1 year ago

    Thanks so much Ritvik! Your videos are amazing... do you have a playlist for machine learning to connect the dots in ML concepts? I see a playlist for data science but not for machine learning.
    Thanks

  • @Tyokok
    @Tyokok 2 years ago

    Thanks for the great lecture. One question if I may: at 2:20, why do you put the best guess 1 here instead of a random draw from your known distribution?

  • @dannyzilberg3101
    @dannyzilberg3101 5 months ago

    Thank you so much for these videos!
    One question: how do you estimate and maximize the integral in practice? That was the elephant for me...
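
In case it helps later readers: one common practical recipe is quadrature over a grid for the E-step integral and a grid search (or a closed-form update where one exists) for the M-step. A sketch under the same assumed 1-2-x Gaussian setup as above, not taken from the video:

```python
import numpy as np
from scipy.stats import norm

observed = np.array([1.0, 2.0])
xs = np.linspace(-8.0, 10.0, 2001)     # quadrature grid for the unknown x
dx = xs[1] - xs[0]
mus = np.linspace(-2.0, 5.0, 701)      # candidate mu values for the M-step

mu0 = -1.0                             # initial guess
for _ in range(20):
    weight = norm.pdf(xs, loc=mu0, scale=1.0)   # E-step weights p(x | mu0)
    # Expected log-likelihood Q(mu | mu0) for every candidate mu:
    q = [np.sum(weight * (norm.logpdf(observed, mu, 1.0).sum()
                          + norm.logpdf(xs, mu, 1.0))) * dx
         for mu in mus]
    mu0 = mus[int(np.argmax(q))]                # M-step: argmax over the grid
print(mu0)   # approaches 1.5, the fixed point of (1 + 2 + x)/3 = x
```

In higher dimensions a grid becomes infeasible, and the integral is usually replaced by a Monte Carlo average or, as in mixture models, by a finite sum over the latent classes.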

  • @sharmilakarumuri6050
    @sharmilakarumuri6050 2 years ago +4

    Can you do the proof too, please?

  • @AhmedMohamed-sp4mm
    @AhmedMohamed-sp4mm 1 year ago

    Thank you so much for these amazing vids.
    Would you kindly provide any MATLAB codes that illustrate these concepts?

  • @neelabhchoudhary2063
    @neelabhchoudhary2063 4 months ago

    oh my god. this was so helpful

  • @xiaogangcho6261
    @xiaogangcho6261 2 years ago

    Thanks for the easy-to-understand intuition of the EM algorithm. Could you explain the coin-flip example along with your formulation steps ② and ③?

  • @n.m.c.5851
    @n.m.c.5851 2 years ago

    thank you!!!

  • @kaurnavjeet943
    @kaurnavjeet943 4 months ago

    Amazing.

  • @akshaygulabrao6516
    @akshaygulabrao6516 2 years ago

    OH MY GOD THANK YOU

  • @dougcree6486
    @dougcree6486 1 year ago

    A worked example of the final process would be invaluable.

  • @michaelrainer7487
    @michaelrainer7487 1 year ago

    On step 2, what does the dx do at the end of that equation?
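
For anyone else pausing here: step 2 is an expectation over the unknown point, i.e. an integral, and dx is its integration variable. A reconstruction consistent with the setup these comments describe (observed points 1 and 2, unit variance):

```latex
Q(\mu \mid \mu_0)
  = \mathbb{E}_{x \sim \mathcal{N}(\mu_0,\,1)}\big[\log L(\mu;\, 1, 2, x)\big]
  = \int p(x \mid \mu_0)\,\log L(\mu;\, 1, 2, x)\,dx
```

The dx signals that the missing value x is averaged out, weighted by how plausible each x is under the current guess mu_0, so Q depends only on mu and mu_0 and not on any particular value of x.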

  • @nipa161991
    @nipa161991 1 year ago

    Thanks for the video! What was not clear to me is whether we calculate E(LL|M) for all Ms so that we can calculate the argmax in step 3.

  • @pallavibothra9671
    @pallavibothra9671 2 years ago +3

    Hi Ritvik, thank you very much for the awesome videos. Could you please make some videos on SQL?

    • @ritvikmath
      @ritvikmath  2 years ago +2

      thanks! and please check out my full SQL playlist here:
      ruclips.net/p/PLvcbYUQ5t0UFAZGthysGOAtZqLl3otZ-k

    • @pallavibothra9671
      @pallavibothra9671 2 years ago

      @@ritvikmath Awesome! Thanks a lot. Could you please add SQL with window functions to the playlist, if possible?

  • @EW-mb1ih
    @EW-mb1ih 1 year ago

    Is the EM algorithm the best algorithm to use for some specific problems (compared, for example, to gradient descent)?

  • @eceserin
    @eceserin 2 years ago +2

    Example with Python coming anytime?

  • @sairaamvenkatraman5998
    @sairaamvenkatraman5998 6 months ago

    Could you please do the derivation or intuition for EM for clustering? I notice that it is described in many textbooks, but not in such a cool way. 😅

  • @jalaluddinmohammad262
    @jalaluddinmohammad262 2 years ago +2

    Please make a proof video

  • @Fredfetish
    @Fredfetish 1 year ago

    Can you explain EM algorithm in terms of compositional data please?

  • @shivambindal4809
    @shivambindal4809 2 years ago

    The expression for the expectation seems similar to Bayesian theory, where we have a prior belief (P(x|mu)) and a likelihood, and we multiply both to get the posterior. Is this the same concept?

  • @kevinshao9148
    @kevinshao9148 1 year ago

    Thanks for the great video! One question: if you have (1+2+x)/3 = x, then you have a closed-form solution, so why do you still need a numerical approach?
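
Solving the fixed-point equation quoted in this comment does give a closed form; a short worked version:

```latex
\frac{1 + 2 + x}{3} = x
\;\Longrightarrow\; 3 + x = 3x
\;\Longrightarrow\; x = \frac{3}{2}
```

So in this toy example the iteration is not strictly needed and converges to the same answer, 1.5. The numerical loop illustrates the general procedure, which matters in models where the M-step has no such closed form.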

  • @grjesus9979
    @grjesus9979 1 year ago

    What if x is high dimensional? How would the integral change?

  • @ehsanmon
    @ehsanmon 2 years ago

    How would the problem change if we didn't know the variance either?

  • @Thefare1234
    @Thefare1234 7 months ago

    Great explanation. However, the way you have written it, there is no difference between the likelihood function and the probability function. I think for clarity you should swap x, 1, 2 and \mu. Also, you should use ; instead of | so that the likelihood function is not confused with a conditional probability.

  • @Sushanta-Das
    @Sushanta-Das 3 months ago

    Sir, I know that in the E-step we estimate the unknown x, but you are calculating the likelihood. How are these connected?

  • @Andres186000
    @Andres186000 2 years ago +3

    Nice to see the theorem guaranteeing convergence for sequences that are increasing and bounded being used to prove this. I do have a more pragmatic question which is how somebody would go about finding the argmax in the M step. Would gradient descent be used on the expectation of log-likelihood function (I would imagine in this case the expectation of log-likelihood would have to be convex for this to work) to find the argmax?

    • @Michael-vs1mw
      @Michael-vs1mw 2 years ago +2

      Yep, you can use any optimization method. For Gaussian mixture models there are explicit formulas for the M step which are obtained in the usual way by setting the gradient of the expected log-likelihood to zero.
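
A minimal sketch of those explicit M-step formulas for a one-dimensional, two-component Gaussian mixture (names and data are illustrative; the updates follow from setting the gradient of the expected log-likelihood to zero):

```python
import numpy as np
from scipy.stats import norm

def em_step(x, pi, mu, sigma):
    """One EM iteration: x is the data array; pi, mu, sigma hold the
    component weights, means, and standard deviations."""
    # E-step: responsibilities r[i, k] = P(component k | x[i]).
    dens = pi * norm.pdf(x[:, None], mu, sigma)      # shape (n, K)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: closed-form argmax of the expected log-likelihood.
    nk = r.sum(axis=0)                               # effective counts per component
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])
pi, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(50):
    pi, mu, sigma = em_step(x, pi, mu, sigma)
print(mu)   # should land near the true means of -2 and 3
```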

  • @Mrfrikha1
    @Mrfrikha1 1 year ago

    Why x = -1 and not 0? I'm kind of confused about his first guess on the first question.

  • @mohammadrezanargesi2439
    @mohammadrezanargesi2439 1 year ago

    Something does not seem right to me in the E-step.
    I think the likelihood should be written given the latent variable, which is "x" in this case, but you have written it given mu...
    I'm confused.
    Also, I don't understand how to solve the M-step.
    When I write it down in this case I cannot update x at all🤦🤦🤦
    I only update mu 🤦🤦
    I'm completely confused

  • @fowadkhan7605
    @fowadkhan7605 2 years ago

    Sir, I only know basic statistics. Where should I start watching your videos? Is there any order to them? I am not familiar with the concepts you are mentioning.