11f Machine Learning: Bayesian Regression Example

  • Published: 22 Aug 2024

Comments • 26

  • @IndigentMartian • 3 years ago • +6

    Nice - very clear description of confidence vs credible intervals, thanks!

  • @MrDgketchum • 2 years ago

    Really liking the videos. Dove in head first to PyMC3 following the suggestion of an advisor. These videos are helping me understand what's going on under the hood, and why not to just use OLS on my messy data. Thanks!

  • @nb9797 • 3 years ago

    What a lovely series of lectures on this. Thank you!

  • @garethedwards6781 • 3 years ago • +1

    Thanks for the videos! Your explanations are very clear and succinct.

  • @luishernangarciapaucar683 • 3 years ago

    Very grateful, this has been a great explanation and motivation to explore MCMC and more Bayesian approaches!!

  • @sourabhsharma9830 • 3 years ago • +1

    What an explanation SIR 🙏🙏🙏🙏

  • @jiey2271 • 4 years ago • +1

    Hi, thanks for the great lectures! I really like the way you explain things. Could you talk a bit about GLMs?

  • @cahalmurphy • 3 years ago • +1

    Hi, great video.
    At 8:20, just a question: in the 2nd chart, is the black line called Bayes Posterior Prediction also the Posterior Predictive Distribution?
    I am just a bit confused about the notation, I think.
    So in our Bayes formula for Bayesian regression we have P(B | Y, X) = P(Y, X | B) P(B) / P(Y, X); and what you get there with the new grain size data point is P(Y_hat | Y, X=40) = integral P(Y_hat, B | Y, X=40) dB = integral P(Y_hat | B, X=40) P(B | Y, X) dB.
    Thanks!

    • @binli2174 • 2 years ago

      I think the difference between the two predictive distributions is that the narrow one assumes no measurement variance (y is free of noise, only W has uncertainty), while in the broader one the measurements y also have variance.
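
A minimal sketch of the point made in this thread (not the video's code; the posterior samples below are synthetic stand-ins for a pymc3 trace): the narrow band carries only the parameter uncertainty in the mean response, while the posterior predictive band also adds the measurement noise.

    # Compare the two predictive bands given posterior samples of intercept,
    # slope, and noise sigma (synthetic stand-ins for trace['intercept'], etc.).
    import numpy as np

    rng = np.random.default_rng(0)
    intercept = rng.normal(5.0, 1.0, size=4000)      # assumed posterior samples
    slope = rng.normal(0.2, 0.02, size=4000)
    sigma = np.abs(rng.normal(3.0, 0.3, size=4000))

    x_new = 40.0  # the new grain size value from the example

    # Narrow band: uncertainty in the mean response (parameter uncertainty only).
    mean_response = intercept + slope * x_new

    # Wide band: posterior predictive for a new observation (adds a noise draw).
    new_observation = mean_response + rng.normal(0.0, sigma)

    print("95% interval, mean response:  ", np.percentile(mean_response, [2.5, 97.5]))
    print("95% interval, new observation:", np.percentile(new_observation, [2.5, 97.5]))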

  • @basicmachines • 2 years ago

    These videos are really helping, thanks. I have a question about the slide starting at 6 min, where you show the PDFs of the uncertainty of each parameter. How are these computed? They don’t seem to be normal-distribution approximations but the true PDFs. Thanks.
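
On the question above: in typical McMC workflows (an assumption here, not a confirmed description of the video's code), the per-parameter posterior curves are kernel density estimates of the trace samples rather than fitted normal curves, for example:

    # The plotted posterior "PDF" for a parameter is typically a kernel density
    # estimate (KDE) of its McMC samples, not a parametric fit.
    import numpy as np
    from scipy.stats import gaussian_kde
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(1)
    slope_samples = rng.gamma(4.0, 0.05, size=4000)   # stand-in for trace['slope']

    kde = gaussian_kde(slope_samples)                 # smooth the samples into a density
    grid = np.linspace(slope_samples.min(), slope_samples.max(), 200)

    plt.hist(slope_samples, bins=40, density=True, alpha=0.3, label='McMC samples')
    plt.plot(grid, kde(grid), label='KDE (the plotted PDF)')
    plt.xlabel('slope'); plt.ylabel('density'); plt.legend()
    plt.show()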

  • @barbapapaplouf5419 • 4 years ago • +1

    Very nice and clear, thanks! It would have been great to have a link to the script you used for this example. It's very hard to find on your GitHub (I didn't find it).

    • @GeostatsGuyLectures • 4 years ago • +2

      github.com/GeostatsGuy/PythonNumericalDemos/blob/master/SubsurfaceDataAnalytics_BayesianRegression.ipynb
      Good point, I should start including the links to the code in the video comment!

    • @barbapapaplouf5419 • 4 years ago

      @GeostatsGuyLectures thanks!

    • @GeostatsGuyLectures • 4 years ago • +1

      @barbapapaplouf5419, great idea, I have added links to all the videos in this playlist.

  • @WahranRai • 3 years ago • +2

    Could you show the code, or how you generate these samples, etc.?

    • @GeostatsGuyLectures • 3 years ago • +1

      Check out the link in the description. You'll need to install the pymc3 python package for the McMC sampling.

    • @WahranRai • 3 years ago • +2

      @GeostatsGuyLectures Thank you, and happy new year!
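
For readers who want a starting point before opening the linked notebook, here is a minimal, self-contained pymc3 sketch of a Bayesian linear regression with McMC sampling (the synthetic data and priors are illustrative assumptions, not the published workflow):

    # Minimal pymc3 Bayesian linear regression (illustrative sketch only).
    import numpy as np
    import pymc3 as pm

    # Synthetic data standing in for the grain-size example in the video.
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 100.0, size=30)
    y = 5.0 + 0.2 * x + rng.normal(0.0, 3.0, size=30)

    with pm.Model() as model:
        intercept = pm.Normal('intercept', mu=0.0, sigma=10.0)   # weak priors
        slope = pm.Normal('slope', mu=0.0, sigma=10.0)
        sigma = pm.HalfNormal('sigma', sigma=10.0)

        mu = intercept + slope * x                               # linear model
        pm.Normal('y_obs', mu=mu, sigma=sigma, observed=y)       # likelihood

        trace = pm.sample(2000, tune=1000, chains=4)             # McMC sampling
        print(pm.summary(trace))                                 # posterior summary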

  • @afzansa8469 • 3 years ago

    Thanks very much for the explanation. Where can I learn the MCMC algorithm without using the pymc3 package? Thanks
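
One way to see the algorithm itself without pymc3 is to write a random-walk Metropolis sampler by hand. The sketch below targets the same kind of linear-regression posterior; the data, priors, and step sizes are illustrative assumptions:

    # Hand-written random-walk Metropolis sampler for Bayesian linear regression.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 100.0, size=30)
    y = 5.0 + 0.2 * x + rng.normal(0.0, 3.0, size=30)

    def log_posterior(theta):
        intercept, slope, log_sigma = theta
        sigma = np.exp(log_sigma)
        # Gaussian likelihood plus weak Gaussian priors on intercept and slope.
        log_lik = (-0.5 * np.sum(((y - intercept - slope * x) / sigma) ** 2)
                   - len(y) * np.log(sigma))
        log_prior = -0.5 * (intercept ** 2 + slope ** 2) / 100.0
        return log_lik + log_prior

    theta = np.array([0.0, 0.0, 1.0])                       # starting point
    step = np.array([0.5, 0.01, 0.05])                      # hand-tuned step sizes
    samples = []
    for _ in range(20000):
        proposal = theta + step * rng.normal(size=3)        # symmetric random walk
        # Accept with probability min(1, posterior ratio), computed in log space.
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)

    samples = np.array(samples)[5000:]                      # drop burn-in
    print('posterior means (intercept, slope, log sigma):', samples.mean(axis=0))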

  • @albost1 • 3 years ago

    Great video, very nicely explained! Do you have material on Bayesian hypothesis testing? More specifically, how to carry out a hypothesis test for the parameters of interest (for example, in this case) and interpret the results.

    • @GeostatsGuyLectures • 3 years ago • +1

      This is a great idea, Alejando2. I work in the frequentist and Bayesian domains, but I need to do more Bayesian lectures. That's on my list with Bayesian Neural Networks!

  • @luisandrade5126 • 3 years ago

    Very illustrative lecture, sir. I explored and executed your Python code. However, not all of it ran correctly, apparently because of incompatibility with pymc3 3.11.1, so I had to make some changes:
    import arviz as az
    Substitute the commands {pm.stats.summary, pm.traceplot, pm.plot_posterior, pm.forestplot} with {az.summary, az.plot_trace, az.plot_posterior, az.plot_forest}, respectively.
    I haven't figured out why the model sampled 2 chains instead of 4, because at that point I hadn't changed any code from the published version. I don't know how to set it manually, if that is possible.
    I couldn't update the unresponsive pm.quantile command, so I dropped it and then had to call the substitute arviz commands az.plot_posterior and az.plot_forest without the credible_interval settings.
    Any help is appreciated.

    • @GeostatsGuyLectures • 3 years ago

      Howdy Luis, thank you for this. I've been super busy these days. I just ran the workflow and it completed without issue; I'm using pymc3 3.8. I hope this helps, Michael
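
For reference, a sketch of the renamed calls under pymc3 3.11.x with a recent arviz (the model and data here are illustrative, not the published notebook). Two assumptions worth flagging: recent arviz versions renamed the old credible_interval argument to hdi_prob, and pm.sample defaults the number of chains to the number of available cores (minimum two), which is likely why only two chains ran; chains=4 forces four.

    # pymc3 3.11.x + recent arviz equivalents of the older plotting/summary calls.
    import numpy as np
    import pymc3 as pm
    import arviz as az

    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 100.0, size=30)                 # illustrative data
    y = 5.0 + 0.2 * x + rng.normal(0.0, 3.0, size=30)

    with pm.Model():
        b0 = pm.Normal('b0', mu=0.0, sigma=10.0)
        b1 = pm.Normal('b1', mu=0.0, sigma=10.0)
        sigma = pm.HalfNormal('sigma', sigma=10.0)
        pm.Normal('y_obs', mu=b0 + b1 * x, sigma=sigma, observed=y)

        trace = pm.sample(2000, tune=1000, chains=4)     # request 4 chains explicitly
        idata = az.from_pymc3(trace)                     # convert inside the model context

    print(az.summary(idata, hdi_prob=0.95))   # replaces pm.stats.summary
    az.plot_trace(idata)                      # replaces pm.traceplot
    az.plot_posterior(idata, hdi_prob=0.95)   # replaces pm.plot_posterior(..., credible_interval=...)
    az.plot_forest(idata, hdi_prob=0.95)      # replaces pm.forestplot(..., credible_interval=...)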

  • @univuniveral9713 • 3 years ago

    I don't understand what Markov chains have to do with this. Is it a time series?

    • @IndigentMartian • 3 years ago

      Try the previous video! youtube.com/watch?v=7QX-yVboLhk

  • @joshualaferriere4530 • 3 years ago

    I watched all 3 of your videos (11d-f), and you don't actually show how to create a Bayesian linear regression (including sampling) from start to finish.

  • @steadyknowledge3132 • 3 years ago

    Sir, please explain Differential Evolution Markov Chain.