You’ve somehow been randomly timing these videos right as I’ve been learning about the subjects and I love it
A happy coincidence!
Thanks for your work, you are gradually becoming the go to channel for a comprehensive view of Julia and more. Congratulations and keep it up! Cheers from Spain
Thanks for the kind words! I'll try my best to live up to them!
The best way to learn!!!
I'm reading the book "Statistical Rethinking" and your videos are awesome to understand the topics, thanks!
Cool! I have a Probabilistic Programming playlist that covers some of the content in "Statistical Rethinking": ruclips.net/p/PLhQ2JMBcfAsgU7kZ-Ee_SDrjhJIehICmR
Wonderful! These days I've been reading about probabilistic programming and Turing, so this tutorial comes in really handy. I'll be waiting for the next one.
It's a really interesting subject!
This was really interesting, thank you! During my internship project last semester, I was doing Monte Carlo ray tracing to estimate radiation heat transfer in a high-temperature environment. One of my biggest struggles was correctly (analytically) defining all of the probability density functions, and also their cumulative distribution functions, which are used for sampling. Based on your video, I think Turing.jl could be very useful for defining pdf's and cdf's that could later be used for sampling. :-)
Sounds like a really interesting project! Good luck!
Hello Niko! Your project sounds interesting! If possible, can you share more information with us? I'd appreciate it.
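As a rough sketch of that idea: Distributions.jl, the package Turing.jl builds its distributions on, already exposes pdf, cdf, and quantile, so inverse-CDF sampling takes only a few lines. The truncated normal below is a made-up stand-in for an emission-angle distribution, not the project's actual pdf:

using Distributions

# Illustrative only: a truncated normal standing in for some
# emission-angle distribution from the ray-tracing project.
d = truncated(Normal(0.0, 0.5), 0.0, pi/2)

pdf(d, 0.3)    # density at 0.3
cdf(d, 0.3)    # cumulative probability up to 0.3

# Inverse-CDF sampling: push a uniform draw through the quantile function.
u = rand()
x = quantile(d, u)

# Or let Distributions.jl sample directly.
xs = rand(d, 10_000)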
Great video. Looking forward to more videos on probabilistic programming.
Thanks! The next video will cover Probabilistic Programming concepts!
Awesome!
👏👏👏
Fab as always
Thanks! And thank YOU for your support!
cointoss(y) = cointoss(; tosses = length(y)) | (; y)
I'd like to learn more about how this conditioning feature works using the vertical bar. Are there resources for this? I couldn't find documentation for it.
This has to do with Conditional Probability, Bayes' Theorem and Bayesian Statistics. I cover these subjects in the next video, but here's the short answer: Bayes' Theorem defines the Posterior Probability as P(A | B), a Conditional Probability that reads "the Probability of observing Event A 'given that ( | )' B is true." In Bayesian Statistics, "A" and "B" are replaced with "theta" and "y", giving P(theta | y), where "theta" represents some unknown parameter(s) and "y" represents your observed data. So the code is just syntactic sugar that Turing.jl uses for the Bayesian Statistics definition of the Posterior Probability, P(theta | y), or P(unknown_parameter_p | data), i.e., the Probability of observing the "unknown parameter p" given the "data".
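To make that concrete, here's a minimal sketch of the conditioning syntax. The Beta prior and Bernoulli likelihood are my guess at the video's model, not necessarily identical to it; the | operator itself is documented as condition in DynamicPPL.jl, the package underneath Turing.jl:

using Turing

# Coin-toss model: p is the unknown probability of heads.
@model function cointoss(; tosses)
    p ~ Beta(1, 1)                       # prior on the unknown parameter p
    y ~ filldist(Bernoulli(p), tosses)   # likelihood for the tosses
end

# The | operator conditions the model on observed data, giving P(p | y).
cointoss(y) = cointoss(; tosses = length(y)) | (; y)

# Sample from the posterior given some observed tosses.
data = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
chain = sample(cointoss(data), NUTS(), 1_000)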
Can you please do a review of Julia 1.9 and how close Julia is to making wait times for package precompilation a non-issue?