The Box-Muller Transform (Proof and Python implementation)
- Published: 25 Jul 2024
- In this video you will learn how the Box-Muller method for generating Normal random variables works, and how to implement it in Python.
Chapters:
00:00 Intro
00:38 Theory
20:03 Implementation in Python
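For readers who want to try the method before watching, here is a minimal sketch of a Box-Muller generator in Python. The function name and signature are my own; the video's actual code may differ.

```python
import numpy as np

def box_muller(n, rng=None):
    """Generate 2*n standard Normal samples via the Box-Muller transform.

    Two independent Uniform(0, 1) samples are mapped to a radius and an
    angle, which together give two independent N(0, 1) samples.
    """
    rng = np.random.default_rng(rng)
    u1 = rng.uniform(size=n)
    u2 = rng.uniform(size=n)
    r = np.sqrt(-2.0 * np.log(u1))   # radius: sqrt of an Exponential(1/2) draw
    theta = 2.0 * np.pi * u2         # angle: uniform on [0, 2*pi)
    z0 = r * np.cos(theta)           # first Normal sample
    z1 = r * np.sin(theta)           # second, independent Normal sample
    return np.concatenate([z0, z1])
```

A quick sanity check is to draw a large sample and confirm the empirical mean is near 0 and the standard deviation near 1.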
Great explanation. I didn't even realize the video was that long. Thank you
Thank you for the kind comment! I was afraid the video was a bit too long, so I am glad to know that is not the case :-D
very clear demonstration, thanks a lot
This is bloody amazing
Are there any books or resources you would recommend? I would like to learn about machine learning and computational science.
Thank you so much. I am reading the book Statistical Inference these days and was stuck on simulating in a direct way for a while; your presentation is super clear and friendly. I also like the Python part, which makes the whole motivation behind this transform very clear.
I'm glad you liked the video. Thank you for the comment :-D
Many thanks for the clear explanation.
Thanks a lot
Thank you... It helped a lot.
Glad it helped!
Wow, thanks for your video. I struggled with this topic for a day,
since my textbook provides the wrong content: it sets r = x^2 + y^2, which leads to the transformation x = r^0.5 cos(theta), y = r^0.5 sin(theta), and makes the Jacobian determinant |J| = 1/2.
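For reference, the Jacobian determinant for the parameterisation mentioned in this comment (with r = x^2 + y^2) can be checked directly; the 1/2 does come out of the computation:

```latex
x = \sqrt{r}\cos\theta, \qquad y = \sqrt{r}\sin\theta
\]
\[
J = \begin{pmatrix}
\dfrac{\partial x}{\partial r} & \dfrac{\partial x}{\partial \theta} \\[4pt]
\dfrac{\partial y}{\partial r} & \dfrac{\partial y}{\partial \theta}
\end{pmatrix}
= \begin{pmatrix}
\dfrac{\cos\theta}{2\sqrt{r}} & -\sqrt{r}\sin\theta \\[4pt]
\dfrac{\sin\theta}{2\sqrt{r}} & \sqrt{r}\cos\theta
\end{pmatrix},
\qquad
\lvert \det J \rvert = \frac{\cos^2\theta}{2} + \frac{\sin^2\theta}{2} = \frac{1}{2}.
```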
This is awful 😩
7:25 I think you are missing the determinant.
5:01 What is this thing called? I mean, how did you know to use the Jacobian matrix etc. to transform the arrow equation above? What's the math concept behind it?
I know it as the Transformation Theorem. It is in no way a trivial result and requires a lot of work to prove, but it is very handy. See the following link for an example of how to write it formally (for the 1- and 2-dimensional cases): online.stat.psu.edu/stat414/lesson/23/23.1
Let's say I have a real dataset, and one of the "features", let's call it X1, follows a Weibull distribution. Can I change X1's distribution from Weibull to Exponential, or to any other continuous distribution (log-normal, Gamma, etc.)?
Is it possible?
Can you explain why you would like to do this? The context might help in getting you an answer.
In the case of the Exponential distribution, you can transform X1 into some E ~ Exponential(lambda) by first transforming X1 into U ~ Uniform(0,1) using the CDF of the Weibull distribution, and then transforming U into E using the inverse CDF of the Exponential distribution. In other words:
Let X ~ Weibull( scale, shape).
Then U = 1 - exp( -[X/scale]^shape) ~ Uniform(0,1) (CDF of Weibull taken on X)
And then E = -log(1-U)/lambda ~ Exponential(lambda) (Inverse cdf of Exponential)
However, this is only possible because we can explicitly invert the CDF of the Exponential distribution. This is generally not the case for the Gamma or log-normal distributions.
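The CDF/inverse-CDF chain above can be sketched in a few lines of Python. The parameter values below are illustrative assumptions, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(42)
scale, shape = 2.0, 1.5   # assumed Weibull parameters for X
lam = 0.5                 # assumed rate for the target Exponential

# X ~ Weibull(scale, shape): numpy's weibull() has scale 1, so rescale.
x = rng.weibull(shape, size=100_000) * scale

# U ~ Uniform(0, 1): apply the Weibull CDF to X.
u = 1.0 - np.exp(-(x / scale) ** shape)

# E ~ Exponential(lam): apply the inverse Exponential CDF to U.
e = -np.log(1.0 - u) / lam
```

Checking the empirical mean of `e` against 1/lambda is a quick way to confirm the transformed sample really is Exponential(lambda).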
@@statmontecarlo3858 To be honest, I'm struggling with data science, especially with probability distributions.
Can I send you an email?
@@BlueSkyGoldSun You can send me an email on statmontecarlo@gmail.com. I can't guarantee that I can help you, but I will definitely take a look at your mail.
@@statmontecarlo3858 Thank you, I have sent it.
Thanks for the teaching, but I have to say the tutorial is done in a horrible way, with both terrible handwriting and notation and poorly prepared scripts. But at least someone is attempting to display the beauty of the maths.