Great! A research paper I read gave me the final formula for some observables and parameters assuming a Gaussian, but didn't explain how we actually get there. Instead of searching through books, I stumbled upon this gem and it saved so much of my time. Thank you!
@5:24, the covariance matrix is not in general equal to the inverse of the Fisher information matrix. Equality holds only in the Gaussian case.
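A quick numerical sanity check of this point, sketched in Python with made-up numbers (Gaussian data with known σ, and the mean as the only parameter), where the Gaussian likelihood means the estimator's scatter should match the inverse Fisher information:

```python
import numpy as np

# Toy check: for a Gaussian likelihood the Cramer-Rao bound is saturated,
# so the variance of the ML estimator equals the inverse Fisher information.
# Numbers here are illustrative, not from the video.
rng = np.random.default_rng(0)
sigma, n_data, n_trials = 2.0, 100, 20000

# Fisher information for the mean of N Gaussian draws with known sigma: F = N / sigma^2
fisher = n_data / sigma**2

# The ML estimate of the mean is the sample mean; measure its scatter by Monte Carlo
means = rng.normal(0.0, sigma, size=(n_trials, n_data)).mean(axis=1)
print(means.var(), 1.0 / fisher)  # both approximately sigma^2 / N = 0.04
```

For a non-Gaussian likelihood this equality generally holds only asymptotically, which connects to the follow-up question below.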
Is it equal to the asymptotic covariance matrix in general?
It was really helpful to clear some basic likelihood and Fisher matrix concepts which are used to constrain cosmological parameters... Thanks a lot sir! :)
Great video! Where do I find more videos like these explaining concepts in cosmology?
Thanks very much, a really useful intro. I was coming across references to Fisher matrices in the literature without knowing what they were. Your video means I now have some idea what they are talking about at least. I have ordered the book you mention.
I'm a physics graduate in cosmology and this was helpful! Thank you very much :), such a great use of the internet.
Thank you, I've been studying and this helped tons.
Oh man! Thanks a lot for such guidance. It is really valuable.
The example at the very end is important for understanding the Fisher matrix formula at the beginning (when "d^2 log-likelihood / d lambda^2" is "converted" into "dP/d lambda_alpha dP/d lambda_beta"). This involves a few assumptions that are only made explicit at the end…
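For reference, the standard identity presumably being referred to here, written under the usual regularity assumptions and for Gaussian-distributed data with model prediction $P(\boldsymbol{\lambda})$ and a parameter-independent data covariance $C$ (the fixed-covariance assumption is exactly the kind of assumption the comment mentions):

```latex
F_{\alpha\beta}
  \;=\; -\left\langle \frac{\partial^2 \ln \mathcal{L}}
        {\partial\lambda_\alpha\,\partial\lambda_\beta} \right\rangle
  \;=\; \sum_{ij} \frac{\partial P_i}{\partial\lambda_\alpha}\,
        \left(C^{-1}\right)_{ij}\,
        \frac{\partial P_j}{\partial\lambda_\beta}
```

The first equality is general (expectation taken over data realizations); the second requires the Gaussian likelihood with fixed $C$.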
At 3:50 he mentions that a parabola is not a good approximation, but that in log space it is Gaussian and that is a good approximation… what does he mean?
Very impressive explanation. You have some real teaching skill
I was actually searching for a video to explain the Fisher matrix to me, since I'm studying cosmology, and I found an explanation of the exact piece of the book I was studying hahahaha
Many thanks, very much appreciated!
Dear god thank you jonathpober for explaining this concept clearly and thoughtfully, and not having a heavy Indian accent and sloppy handwriting
Why does someone having an Indian accent affect your comprehension?
Do not be racist. Be thankful to Indians: although English is not their mother tongue, they speak it for your convenience. Most American and British English accents just suck.
I'm really not sure if you guys are joking or serious… I can't understand Indians most of the time, so I'm racist? My spoken English is also not very good, so if you couldn't understand me, you would also be racist, right?
Came here to do my homework about Fisher's information. Stayed for the lesson on cosmology.
great video :D
Thanks
What's that word at 7:17? Sounds like "fedooshal".
Seth Godin teaching Stats? :)
I'm Brazilian, I don't understand what you're saying.
I am so confused about Likelihood. Isn't likelihood defined as P(theory | data) instead, as described in the wiki link below? But I understand why it can be P(data | theory) as well.
en.wikipedia.org/wiki/Likelihood_function#Definition
P(theory|data) is the posterior, which is what you really want to know (constraining theory from data). To get there with a Bayesian approach, you multiply the likelihood by the prior, and with a uniform prior the posterior is just the likelihood up to normalization.
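A minimal sketch of that point in Python (the data values and parameter grid are made up for illustration): on a grid over a single parameter, the normalized posterior with a uniform prior is identical to the normalized likelihood.

```python
import numpy as np

# Toy Gaussian data with unknown mean mu and known sigma (illustrative numbers).
data = np.array([1.2, 0.8, 1.1, 0.9, 1.0])
mu_grid = np.linspace(-2.0, 4.0, 601)
sigma = 0.5

# Log-likelihood ln P(data | mu) for each candidate mu on the grid
loglike = (-0.5 * ((data[None, :] - mu_grid[:, None]) / sigma) ** 2).sum(axis=1)

prior = np.ones_like(mu_grid)          # uniform prior over the grid
post = np.exp(loglike - loglike.max()) * prior
post /= post.sum()                     # normalized posterior P(mu | data)

like = np.exp(loglike - loglike.max())
like /= like.sum()                     # normalized likelihood

print(np.allclose(post, like))        # True: posterior == likelihood up to normalization
```

With a non-uniform prior the two curves would differ, which is exactly where the posterior and the likelihood stop being interchangeable.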
I know, I wanted to watch the Hiroshima one but I can't understand him talk.