A Beginner's Guide to Markov Chain Monte Carlo (MCMC) Analysis (2016)

  • Published: 4 Aug 2016
  • Presented by Dr. David Kipping (Columbia)

Comments • 35

  • @crysis1234567
    @crysis1234567 7 years ago +158

    6:15 skip to MCMC

  • @boseongcho62
    @boseongcho62 2 years ago +6

    Thank you for your help!
    9:36 posterior
    16:11 Bayes' theorem
    29:18 Metropolis rule
    40:35 walker (emcee)
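
    As a hedged aside on the 9:36 and 16:11 timestamps: the posterior can be sketched by evaluating Bayes' theorem on a grid. The coin-flip model below is a toy illustration for this thread, not an example taken from the talk.

```python
import math

# Toy model (an assumption, not the talk's example): infer a coin's
# heads probability p after observing k heads in n flips, by applying
# Bayes' theorem on a grid of candidate values of p.
n, k = 10, 7
grid = [i / 100 for i in range(1, 100)]

def likelihood(p):
    # Binomial likelihood P(data | p)
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

prior = 1.0  # flat prior over (0, 1)

# Unnormalised posterior = likelihood x prior; the evidence is just the
# sum that turns it into a proper probability distribution.
unnorm = [likelihood(p) * prior for p in grid]
evidence = sum(unnorm)
posterior = [u / evidence for u in unnorm]

p_best = grid[posterior.index(max(posterior))]
print(p_best)  # -> 0.7, matching k/n
```

    Grid evaluation like this only works for one or two parameters; the point of MCMC is to get the same posterior when the grid would be hopelessly large.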

  • @lexparsimoniae2107
    @lexparsimoniae2107 3 years ago

    Very clear and helpful lecture! Thank you!

  • @nemmaadeni
    @nemmaadeni 4 years ago +3

    This was super helpful. Thanks!

  • @cadenhong245
    @cadenhong245 2 years ago

    Quite clear and clever explanation. Thanks a lot for sharing.

  • @EliseLucy92
    @EliseLucy92 4 years ago +6

    I think the biggest problem with this video is that the presenter is using a laser pointer which we can't see. So he says "this is this and this is that" and we never know which elements on the slides to follow. There are also technical problems with things not showing up on the slides.

  • @karannchew2534
    @karannchew2534 3 years ago +6

    MCMC starts 6:17

  • @broadwaybrian
    @broadwaybrian 3 years ago +7

    It would be nice to see a video on Nested Sampling. I've read some literature and coded some models in R, but I'm still a little shaky on some things (e.g., dynamic nested sampling, extracting parameter samples proportional to posterior density [I'm getting good estimates with +/- sd, but would prefer output similar to MCMC]). Throwing it out there since there are not a ton of resources, and because of the tease at the beginning :)

  • @TheRossspija
    @TheRossspija 3 years ago +1

    He conveys a great point, and I understand that rigor is not the goal of this video, but the statistician in me was screaming most of the time.

  • @hfkssadfrew
    @hfkssadfrew 4 years ago +7

    28:06 The presenter should mention that he cannot compute the posterior itself, only compare the ratio between the trial point and the previous one. Otherwise it can confuse people: since it looks like you have computed P, why bother?
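
    The commenter's point can be made concrete with a hedged sketch (a toy 1D Gaussian target, not the talk's code): random-walk Metropolis only ever uses the ratio of posterior values, so the evidence, the normalising constant we cannot compute, cancels out.

```python
import math
import random

random.seed(0)

def log_unnorm_posterior(theta):
    # Log-posterior of a standard Gaussian UP TO a constant: the
    # evidence is deliberately omitted, since Metropolis never needs it.
    return -0.5 * theta ** 2

theta = 0.0
samples = []
for _ in range(20000):
    trial = theta + random.gauss(0.0, 1.0)  # symmetric proposal
    # Metropolis rule: accept with probability min(1, P_trial / P_current).
    # In log space the unknown evidence cancels out of this ratio.
    log_ratio = log_unnorm_posterior(trial) - log_unnorm_posterior(theta)
    if log_ratio >= 0 or random.random() < math.exp(log_ratio):
        theta = trial
    samples.append(theta)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # should be close to 0 and 1 for a unit Gaussian
```

    Multiplying the posterior by any constant leaves the ratio unchanged, which is exactly why the sampler can work with an unnormalised P.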

  • @prodbyryshy
    @prodbyryshy 1 year ago

    Is there a video that discusses which situations are best suited for this approach?

  • @andreneves6064
    @andreneves6064 6 years ago +1

    Do you know a beginners guide to Gibbs Sampling?

  • @lillianrthistlethwaite9638
    @lillianrthistlethwaite9638 7 years ago +1

    I had a question about the plots where you show the jumps trying to find the "green zones". The axis labels are "a" and "b", with a_min and a_max as the range for variable a, and b_min and b_max for variable b. Are a and b some sort of high-dimensional space variables, like principal components? Basically, what are these variables you are plotting against to visualize the jumps/path the MCMC algorithm takes?

    • @MadaxeMunkeee
      @MadaxeMunkeee 6 years ago

      Lillian R Ashmore: the variables are the parameters of the model, theta. Here a and b are two model parameters, but there will likely be many others. He's just presenting a simpler scenario.
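
    For context on those plots, here is a hedged sketch of a Metropolis chain wandering through a two-parameter (a, b) space. The straight-line model (slope a, intercept b) and the toy data are assumptions for illustration, not the talk's model.

```python
import math
import random

random.seed(1)

# Toy data roughly following y = 2x + 1 with Gaussian noise (assumed).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

def log_post(a, b):
    # Gaussian log-likelihood with unit noise; flat priors on a and b.
    return -0.5 * sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

a, b = 0.0, 0.0
chain = []
for _ in range(30000):
    a_try = a + random.gauss(0, 0.1)  # small jumps in the (a, b) plane
    b_try = b + random.gauss(0, 0.1)
    log_r = log_post(a_try, b_try) - log_post(a, b)
    if log_r >= 0 or random.random() < math.exp(log_r):
        a, b = a_try, b_try
    chain.append((a, b))

burn = chain[5000:]  # discard burn-in
a_mean = sum(p[0] for p in burn) / len(burn)
b_mean = sum(p[1] for p in burn) / len(burn)
print(a_mean, b_mean)  # should land near a ~ 2, b ~ 1
```

    Plotting the chain's (a, b) points would reproduce the kind of figure the question describes: a path that wanders in from the starting point and then scatters inside the high-probability "green zone".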

  • @aymanhijazy9668
    @aymanhijazy9668 6 years ago +1

    Great lecture! Really nice introduction to MCMC.

  • @WahranRai
    @WahranRai 3 years ago

    31:10 What about the case (Metropolis rule) where P_trial = P_i? Do we accept, setting theta_{i+1} = theta_i?
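
    For what it's worth, with the standard Metropolis rule the acceptance probability is min(1, P_trial/P_i), so the equality case gives a ratio of exactly 1 and the trial point is always accepted. A tiny sketch of that standard rule (the exact convention on the slide may differ):

```python
import random

def accept(p_trial, p_current):
    # Standard Metropolis rule: accept with probability min(1, ratio).
    ratio = p_trial / p_current
    return ratio >= 1 or random.random() < ratio

# Equal probabilities: ratio == 1, so the move is always accepted,
# i.e. theta_{i+1} is set to the trial point rather than theta_i.
print(all(accept(0.5, 0.5) for _ in range(1000)))  # -> True
```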

  • @jonathanstudentkit
    @jonathanstudentkit 6 years ago +1

    Laplace started Bayesian statistics at least as much as Bayes did!

  • @sandeepmane5779
    @sandeepmane5779 2 years ago

    Video starts at 6:15

  • @rizwanniaz9265
    @rizwanniaz9265 6 years ago

    How do you calculate the odds ratio in Bayesian ordered logistic regression? Please tell me.

  • @sitendugoswami1990
    @sitendugoswami1990 3 years ago +1

    This is definitely not a beginner's guide. I know a little bit about Bayes' theorem, priors, posteriors, etc., but Dr. Kipping ensured that I had to open up the books to sort out the mess he created in my mind. Definitely won't recommend it to anyone.

  • @hfkssadfrew
    @hfkssadfrew 4 years ago +3

    The way to teach MCMC should start from the definition of each letter and explain how it computes the posterior, not messy terminology mixed with an extremely personal viewpoint on MCMC'ers. I have been in similar situations, and I know what the worst teaching looks like.

  • @golgoli4351
    @golgoli4351 7 years ago +12

    The main concern is that the presenter is not familiar with how to deliver his points. Hope this feedback helps you.

  • @edwardbrown7469
    @edwardbrown7469 6 years ago +20

    Stopping after 14 minutes, this is a garbled mess.

  • @jiayicox2755
    @jiayicox2755 4 years ago +1

    Too many unnecessary details and no clear logical presentation. The slides are hard to follow; it just makes people who understand MCMC more confused.

  • @powerdriller4124
    @powerdriller4124 1 year ago

    Too many conceptual jumps, leaving big blanks. Something as important and as central to Bayesian analysis as how to define the likelihood function is left unexplained. And at 27:35 he says that Metropolis does not consider the evidence!! ABSURD!! It is precisely in the likelihood definition where the evidence is considered.

  • @JohnSmith-lf5xm
    @JohnSmith-lf5xm 7 years ago +20

    This is not for beginners. You did not explain anything. You just keep saying "Monte Carlo does this or that" but you do not explain anything.

    • @terrysworld5651
      @terrysworld5651 5 years ago +4

      I think this is a talk designed for people who have started a PhD in astrophysics. We use a very terse definition of the word 'beginner' in astrophysics...

    • @yevgenydevine
      @yevgenydevine 5 years ago +11

      People on YouTube think "MCMC for beginners" means "MCMC for those who missed all math classes", smh.

  • @farshadgoldoust6548
    @farshadgoldoust6548 4 years ago

    You tried; anyway, it's difficult ;)

  • @m_sh_oh
    @m_sh_oh 1 year ago +1

    Presentation PDF download:
    github.com/davidkipping/sagan2016/blob/master/MCMC.pdf
