Markov Chain Monte Carlo (MCMC) : Data Science Concepts

  • Published: 24 Nov 2024

Comments • 135

  • @cissygu4088 · 3 years ago +126

    I had two different university professors explaining MCMC, but I didn't quite get them until watching your video! Best explanation ever!

  • @gufo__4922 · 4 months ago +13

    My dude, I don't often need your teachings, but when I do you are able to single-handedly overshadow most of my past professors.
    I've watched a good chunk of your videos in the past 4 years, and there isn't a single one from which I didn't gain some new view, even if small, on the topic.
    Keep up the work.

  • @tomleyshon8610 · 3 years ago +46

    Fantastic! Note the lack of cuts and edits - this guy knows his stuff.

  • @trong9402 · 3 years ago +4

    I don't know what it is, but I really like this guy. Clearly knows his stuff, and articulate too. Great presentation, thank you

  • @baoanhvu8356 · 3 years ago +38

    I gotta say your videos have been super helpful for a stats subject I took last semester (which involved time series, ARIMA model, stationarity etc.) and now MCMC came out at the perfect timing. You have such a gift for explaining the intuition behind statistical concepts, and I'm looking forward to future videos from you. Your channel is a treasure!

    • @ritvikmath · 3 years ago +3

      Glad I could help!

    • @cao2106 · 1 year ago

      Does anyone have a python code that uses MCMC to predict closing prices? Can I have it, thanks

  • @arrau08 · 3 years ago +9

    Thank you so much, I'm a scientist myself and have used some mcmc package blindly. Now, applying what I have been doing to every step of this video made me understand the full concept super clearly.

  • @edwardhartz1029 · 2 years ago +1

    You have a gift for explaining things. Every question that pops into my head gets immediately answered.

  • @songchaerin5407 · 2 years ago +1

    I'm very impressed by how clear the explanation is.

  • @tianjoshua4079 · 3 years ago +12

    Hi Ritvik, your explanations are great in many ways. One of the best things is they are very logically coherent, leaving no gaps that require the listener to figure out. Please do keep up the splendid work. This is a major good deed for so many.

    • @ritvikmath · 3 years ago +4

      Thanks a ton!

    • @dhinas9444 · 2 years ago +1

      Exactly. Was about to write the same thing!

  • @fdsfkdj · 3 years ago +1

    Finally someone explained why we need Markov chains. Thank you!

  • @jamesmckenna6165 · 3 years ago +9

    Really excellent series of videos - been scratching my head over sampling methods for ages, but you explain it so succinctly and clearly it is finally making sense. Thanks for these!

  • @rahul-qo3fi · 2 years ago +2

    wow!! The continuity in the explanation is just phenomenal , thanks a ton!

  • @proxyme3628 · 2 years ago +2

    Thanks for making this video. Finally came across one that explains MCMC in plain words without dumping math formulas. Hope other videos and articles follow suit.

  • @mk_upo · 2 years ago +2

    Your channel is so underrated, you are making absolutely sick content!

  • @prashantkumar-ue7up · 3 years ago +19

    The interpretation of this entire series is very helpful for understanding these topics. Could you please make a video on Bayesian Regression using MCMC?

  • @香港地舖購物 · 2 years ago

    Without your video, I think I would never have understood the key idea behind MCMC! Thanks for the good work...

  • @itdepends5906 · 2 years ago

    One of my favorite guys. Has a great knack for knowing the right balance of intuition and rigor/formal definitions.

  • @cianr8452 · 2 years ago

    This video has significantly improved my base understanding of MCMC, thank you so much

  • @XxPaRaZiTzZxX · 3 years ago +5

    You're an awesome professor. I have finally understood MCMC and Metropolis Hastings thanks to you

  • @catherinepuellomora8041 · 1 year ago

    I had been reading a 37-page paper for two hours without understanding a thing, and you've been clear in 12 mins!!! Amazing job, many thanks

  • @daveamiana778 · 3 years ago +4

    I found this series on MCMC really helpful for my project! Thank you for your very kind support in giving good content.

  • @murphp151 · 2 years ago +1

    I've watched a load of your videos in the last 4 or 5 days.
    They are absolutely brilliant!!

  • @hochungyip1123 · 5 months ago +1

    A complement about why the detailed balance condition is valid if a distribution is stationary: it comes from Bayes' rule.
    Recall the equation P(a|b) = P(b|a)p(a)/p(b);
    some rearrangement gives: p(b)P(a|b) = p(a)P(b|a).
    If the chain is stationary, p(a) and p(b) are constant, so the equation holds; we call this the detailed balance condition.
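
    The commenter's rearrangement can be sanity-checked numerically; a minimal sketch, with an invented two-state transition matrix (not from the video):

```python
import numpy as np

# An invented 2-state transition matrix; each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Its stationary distribution pi satisfies pi = pi @ P; here pi = (2/3, 1/3).
pi = np.array([2 / 3, 1 / 3])
assert np.allclose(pi @ P, pi)

# Detailed balance: pi(a) * P(a -> b) == pi(b) * P(b -> a) for every pair.
for a in range(2):
    for b in range(2):
        assert np.isclose(pi[a] * P[a, b], pi[b] * P[b, a])
print("detailed balance holds for this chain")
```

    One caveat: detailed balance implies stationarity, but not every stationary chain satisfies detailed balance; this particular chain happens to be reversible.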

  • @gaprof4300 · 25 days ago +1

    REQUEST: Please organize this playlist in sequential / logical order. Example: The first video of this playlist is Markov Chains (MCMC) which refers to a previous video for accept-reject sampling; but that video is 13th in this playlist. So it's like watching random stuff here.

  • @cementheed · 3 years ago +9

    Dude! That was the clearest explanation of MCMC I've ever heard. Thanks!

  • @andrashorvath2411 · 9 months ago

    You are a great presenter, it is very easy to follow you, clean logic of how you build up the reasoning step by step, I like it very much, thank you.

  • @paultrow7266 · 3 years ago +1

    Great video! Much clearer than anything else I've seen or read about MCMC.

  • @thisisadiman · 1 year ago

    I have never seen such an in-depth explanation of the MCMC! Thanks a lot bro.

    • @cao2106 · 1 year ago

      Do you have any python code that uses MCMC to predict closing prices? Can I have it, thanks

  • @chuckbecker4983 · 1 year ago

    You, Sir, are a brilliant instructor...I am awed. Thank you!

  • @faijro9260 · 1 month ago

    At the very end it took me a second watch to realize that of course the sum of all probabilities for x given y would be 1, and thus you would get p(y) on the right-hand side (so obvious when you type it out :') ). Once again a great video. I think you really hit a sweet spot where people with basic math skills can benefit from your succinct yet in-depth explanations.
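
    Spelled out (in the discrete case), the step being described is just marginalization:

```latex
\sum_x p(x \mid y)\, p(y) \;=\; p(y) \sum_x p(x \mid y) \;=\; p(y) \cdot 1 \;=\; p(y)
```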

  • @skate456park · 3 years ago +3

    This is going to be super helpful for a future interview :) Thanks!

  • @hadeerahmed2477 · 2 years ago

    I love your videos and you really simplify concepts; my only comment is that sometimes I get confused or don't know the applications of the concept

  • @geoffreyanderson4719 · 2 years ago +1

    I expect that by watching this video, my successful uptake of this material is so much better than with any textbook alone. YT and presenters like ritvikmath are the way to learn new STEM stuff for sure. Much faster and easier this way. It's like when they finally translated the Bible from Latin to English, and now I'm not needing to suffer with the Latin version any more. haha

  • @yulinliu850 · 3 years ago +3

    Awesome! Looking forward to more on McMC.

  • @zalooooo · 3 years ago +12

    Fantastic. Are you just going through Chris Bishop's book and making videos to help us out? I'm reading it atm and keep finding content on your channel. It really is quite helpful in providing intuition for a very dense subject.

  • @raveeshaperera3829 · 3 years ago +3

    Thank you so much for this video. This is really helpful for my undergraduate research work. One thing I'm finding difficult to understand is: why do we use "thinning" in MCMC? From what I have read so far, it aims to reduce autocorrelation - but why? Please tell me your thoughts on this problem. I appreciate it a lot. TIA
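
    For what it's worth, the autocorrelation being asked about is easy to see on a toy example; the sketch below uses an AR(1) series as a stand-in for correlated MCMC draws (the chain parameters and the thinning factor are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# An AR(1) series stands in for highly autocorrelated raw MCMC draws.
n, phi = 100_000, 0.95
eps = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

def lag1_autocorr(s):
    """Sample lag-1 autocorrelation of a 1-D series."""
    s = s - s.mean()
    return (s[:-1] @ s[1:]) / (s @ s)

thinned = x[::20]  # thinning: keep every 20th draw

print(lag1_autocorr(x))        # near phi = 0.95
print(lag1_autocorr(thinned))  # much lower, roughly phi**20
```

    Thinning does not add information (the full chain estimates expectations at least as well), but it cuts storage and leaves draws closer to independent, which some downstream procedures assume.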

  • @MiaoQin-m2u · 2 months ago

    Thanks for sharing. I begin to love learning.

  • @kylec1813 · 2 years ago

    Great stuff. I'll be running through all your videos.

  • @danielwiczew · 3 years ago +11

    Urging for it more than for a new Netflix series!

  • @OwenMcKinley · 2 years ago +2

    I'm speechless; your presenting style and explanatory power is insane!!! Thank you so much, I'm just getting into this stuff and the reading is tricky
    Liked, subbed, etc. 👍👌😁

  • @jaquelinemoreira7385 · 2 months ago

    This video just saved my day

  • @pavybez · 3 years ago

    I like the way you teach. Thanks for these videos.

  • @purefeel · 2 years ago

    I wish Ian Goodfellow's book explained MCMC like you do. And I wish my professors back in university could teach and give intuition like this video. I would have been much more interested in stats and data science if it was taught properly.

  • @stefan5128 · 2 years ago +1

    Fantastic explanation! Now I got all the intuition I need to work through the formulas in our lecture :)

  • @lennyatomz8389 · 3 years ago +2

    Thank you for making this video! Your explanation is superb and easy to follow. Much appreciated!!

  • @ankushkothiyal5372 · 2 years ago

    That clears everything, thank you.

  • @dragolov · 6 months ago

    You are great teacher! Deep respect!

  • @SnoZe95 · 10 months ago

    That's a very clear explanation. Thank you bro

  • @MohammadYoussof · 3 years ago

    Very clear description. Thank you!

  • @PatrickSVM · 2 years ago

    Thanks, very informative! I really like the way you explain things.

  • @xiaoweidu4667 · 2 years ago

    This guy is really fantastic

  • @maxgotts5895 · 2 years ago

    Shit… good stuff! I've just gone through 4 of your videos instead of going to pick up dinner. Bravo sir!

  • @user-wr4yl7tx3w · 2 years ago

    Brilliant. One word.

  • @geoffreyanderson4719 · 2 years ago

    To be clear, the following comment is in no way a criticism; rather it's a line of thinking about how I can use this tool on some project. Can you also demonstrate a powerful application or two of this powerful method, on real data from a business, institution, or science dataset? So then, is this machinery intended for making better simulations? Such as...? Compared against a baseline case that does not use it, how much better is the answer to the problem? Accordingly, in this vein, some excellent-looking software frameworks to help use MCMC were recently very well described by Kapil Sachdeva, also on YouTube, particularly PyMC3, Stan, NumPyro, and TFProb. (Sorry, but I expect YT will interfere with this comment if I provide a URL linking directly to Kapil Sachdeva.)

  • @kirillolkhovsky9160 · 2 years ago

    bro you're literally saving lives here, thx

  • @richardbabley2544 · 3 years ago +2

    So the Monte Carlo part refers to the eventual sampling from the stationary Markov Chain? I kind of missed where it comes in, except for the board title.

    • @ritvikmath · 3 years ago +2

      The Monte Carlo part refers to simulating steps through the Markov Chain. So we design a Markov Chain with some transition probabilities and then we start at some x0 and step from one state to the next which is the Monte Carlo part.
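
      That loop, as a minimal Metropolis-Hastings sketch (the standard-normal target, proposal scale, and burn-in length here are illustrative choices, not from the video):

```python
import numpy as np

rng = np.random.default_rng(42)

def target(x):
    # Unnormalized density we want to sample from; a standard normal
    # is used purely for illustration.
    return np.exp(-0.5 * x**2)

x = 0.0            # the starting state x0
draws = []
for _ in range(50_000):
    proposal = x + rng.normal(scale=1.0)   # symmetric random-walk proposal
    # Accept with probability min(1, target(proposal) / target(x)).
    if rng.uniform() < target(proposal) / target(x):
        x = proposal
    draws.append(x)                        # a rejection repeats the current state

draws = np.array(draws[5_000:])            # discard a burn-in prefix
print(draws.mean(), draws.std())           # roughly 0 and 1 for this target
```

      Because the target only ever appears in a ratio, its normalizing constant cancels, which is the whole appeal of the method.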

  • @ritvikkharkar7586 · 3 years ago +15

    Hey all; here is the Markov Chain Stationary Distribution Video Link: ruclips.net/video/4sXiCxZDrTU/видео.html

  • @seminkwak · 2 years ago

    this is an amazing explanation!

  • @landmaster420 · 1 year ago

    Great video! Really liked the high-level explanation to get us comfortable with the ideas behind these methods. Quick question: I'm assuming we don't know p(x), so how do we construct a stationary distribution about p(x)?

  • @zareef5583 · 1 year ago

    Loved your explanation, but can you please organise, in order, the videos I need to watch before the "Markov Chain Monte Carlo (MCMC) : Data Science Concepts" video? All the videos are scattered all over the place.

  • @yinstube · 3 years ago +2

    Hey your videos are the best!

  • @muhammadibrahim7668 · 4 months ago

    I like your concepts. Do you have any references (books) I can cite if I want to include your formulae in my presentation?

  • @PatrickSVM · 2 years ago

    Also, could you maybe make a video on where in Data Science sampling techniques like MCMC (Gibbs, Metropolis ...) are useful? Missing data imputation? Would be highly appreciated!

  • @Oceansteve · 2 years ago

    Thanks for this, really enjoyed your explanation

  • @sharmilakarumuri6050 · 3 years ago +1

    Awesome, thanks a tonne! Waiting for further videos on MCMC. Could you please do a video on Hamiltonian Monte Carlo too?

  • @Pmaisterify · 2 years ago

    Really great video. A quick question though, what if I want to approximate f(x)? Currently I am using a form of MCMC to do this to estimate the state probability of n samples.

  • @bezaeshetu5454 · 2 years ago

    Thank you, you are always the best. I am working on Bayesian network structure learning using Gibbs sampling, Could you suggest the best book or video which will help me to go through this please. Thank you.

  • @outtaspacetime · 2 years ago

    exceptional content!

  • @lauravargasgonzalez9317 · 2 years ago +1

    Amazing !

  • @brofessorsbooks3352 · 3 years ago

    KING you are KING

  • @moimonalisa5129 · 2 years ago +1

    I get a philosophy from here. The objective is actually to design the appropriate transition probability. It's like building workout and healthy-eating habits if you want a body goal.

  • @zhixiangwang7165 · 2 years ago

    Great lectures! Awesome!

  • @priyankakaswan7528 · 3 years ago +1

    you are god send!

  • @aminmohammadigolafshani2015 · 2 years ago

    How do we know the p(x) that should be the steady state of our MC? Because I think p(x) is a black box that we do not know and want to sample from in order to find it. And if we do have p(x), what is the obstacle that prevents us from sampling from it directly? This is a little bit confusing for me in all the sampling videos on YouTube.

  • @juanete69 · 2 years ago

    One of the hypotheses of "rejection sampling" is that samples must be independent. But here, in MCMC, they are not independent.
    I can't understand why this is still acceptable.

  • @itsgerm2183 · 3 years ago

    @ritvikmath by any chance would you happen to have some notes presenting the topic in more depth? I have a general idea of the method but am having trouble wrapping my head around some methods presented in papers. If not, it's okay!

  • @thepenghouse · 3 years ago

    you're a legend

  • @faresziad7593 · 1 year ago

    Excellent teacher

  • @alaasmarneh7811 · 3 years ago

    Thank you, this helped me a lot

  • @TheNazem · 1 year ago

    it's fun to stay at the mcmc

  • @sorsdeus · 3 years ago

    What a great video.

  • @itsrainbowoutside · 1 year ago

    Thank you! Very helpful for me.

  • @geoffreyanderson4719 · 2 years ago

    How exactly should the end of the burn-in be detected and decided by an iterative algorithm, when the quantity being monitored is a random variable that jumps around (so you can't just check whether it goes flat compared to prior values), and you don't even have the true value to compare with - because otherwise you'd already have your goal in hand at the very beginning?
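
    One standard workaround (not covered in the video) is to run several chains from dispersed starting points and compare between-chain to within-chain variance, i.e. the Gelman-Rubin R-hat statistic; values near 1 suggest the chains have mixed past their burn-in. A rough sketch, with simulated stand-in chains:

```python
import numpy as np

rng = np.random.default_rng(1)

def r_hat(chains):
    # chains: (m, n) array of m chains with n draws each.
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    chain_vars = chains.var(axis=1, ddof=1)
    W = chain_vars.mean()                    # within-chain variance
    B = n * chain_means.var(ddof=1)          # between-chain variance
    var_plus = (n - 1) / n * W + B / n       # pooled variance estimate
    return np.sqrt(var_plus / W)

# Stand-in "converged" chains: all sampling the same distribution.
good = rng.normal(size=(4, 2_000))
# Stand-in "unconverged" chains: stuck near different starting points.
bad = rng.normal(size=(4, 2_000)) + np.array([[-3.0], [-1.0], [1.0], [3.0]])

print(r_hat(good))   # close to 1.0
print(r_hat(bad))    # well above 1.1, signalling non-convergence
```

    No single diagnostic is definitive, but an R-hat well above ~1.01-1.1 (thresholds vary by source) is a common signal that more burn-in or more iterations are needed.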

  • @wafike1 · 3 years ago

    love the intro

  • @honshingandrewli7632 · 1 year ago

    Can you do a lesson on Gaussian Copula, please?

  • @alexiapr9861 · 2 years ago

    Clear. Thank you.

  • @sharmilakarumuri6050 · 3 years ago

    Could you please make a video on Sequential Monte Carlo (SMC) and Hamiltonian Monte Carlo (HMC)?

  • @SpazioAlpha · 2 years ago

    Thanks again!

  • @shivampatel8928 · 3 years ago +1

    Very useful!

  • @ninadpimparkar9035 · 3 years ago

    When are you going to do Hamiltonian MCMC? It's so hard to understand.

  • @samson6707 · 1 month ago

    the hat is dope

  • @porelort09 · 1 year ago

    Thank you!

  • @jordanwilson8277 · 2 years ago

    Any chance of doing the EM algorithm?

  • @rithviksunku6298 · 3 years ago +1

    Goated

  • @juanete69 · 2 years ago

    So here you are saying that stationary does not mean having the same probability, the same number, but having the same p(x), which is a distribution, a function?

  • @sharmilakarumuri6050 · 3 years ago

    Could you please make a video on Sequential Monte Carlo (SMC)?

  • @graceguo5288 · 2 years ago

    Question - where does the first sample come from?

  • @yoshcn · 1 year ago

    amazing channel, thanks

  • @Jamesssssssssssssss · 2 years ago +1

    I'm just here because there is a gun in Destiny 2 called Monte Carlo, which in turn has a perk called Markov Chain.
    I get why it was called that now

    • @ritvikmath · 2 years ago +1

      Lol

    • @Jamesssssssssssssss · 2 years ago

      @@ritvikmath I watched the whole video, really well done. While most of it went over my head, the concept was well explained.

  • @mujtabaalam5907 · 3 months ago

    Why not use the full history of previous samples instead of the immediate previous sample?