Intro to Fisher Matrices

  • Published: 15 Sep 2024
  • An introduction to Fisher matrices, with a focus on astronomical applications.

Comments • 27

  • @ShubhamKejriwal
    @ShubhamKejriwal 4 years ago +4

    Great! A research paper I read gave me the final formula for some observables and some parameters assuming a Gaussian, but didn't say anything about how we actually got there. Instead of searching through books, I stumbled upon this gem and it saved me so much time. Thank you!

  • @PradeepBanerjeeKr
    @PradeepBanerjeeKr 10 years ago +22

    @5:24, the covariance matrix is not in general equal to the inverse of the Fisher information matrix. Equality holds only for the Gaussian case.

    • @yagzdagabak4459
      @yagzdagabak4459 5 years ago

      Is it equal to the asymptotic covariance matrix in general?
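
      The exchange above can be checked numerically. Below is a minimal sketch (all numbers illustrative, not from the video): for a Gaussian likelihood with known width, the inverse Fisher information equals the variance of the maximum-likelihood estimator exactly, which is why forecasts often identify the two; more generally the inverse Fisher matrix is the Cramér–Rao lower bound, attained only asymptotically.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      sigma, n, trials = 2.0, 100, 20000   # illustrative numbers

      # Fisher information for the mean of N(mu, sigma^2) from n samples:
      # I(mu) = n / sigma^2, so the Cramer-Rao bound on Var(mu_hat) is sigma^2 / n.
      fisher_info = n / sigma**2
      cramer_rao = 1.0 / fisher_info

      # Monte Carlo variance of the maximum-likelihood estimator (the sample mean).
      estimates = rng.normal(0.0, sigma, size=(trials, n)).mean(axis=1)
      mc_var = estimates.var()

      print(cramer_rao)   # 0.04
      print(mc_var)       # close to 0.04: the Gaussian case saturates the bound
      ```

      Repeating this with a non-Gaussian likelihood and small n would show the MLE variance exceeding the bound, matching the caveat in the comment above.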

  • @TheRishikeshPandit
    @TheRishikeshPandit 12 years ago +3

    It really helped clear up some basic likelihood and Fisher matrix concepts that are used to constrain cosmological parameters... Thanks a lot, sir! :)

  • @mickalansubramoney7527
    @mickalansubramoney7527 13 days ago

    Great video! Where do I find more videos like these explaining concepts in cosmology?

  • @Orion3T
    @Orion3T 10 years ago +1

    Thanks very much, a really useful intro. I was coming across references to Fisher matrices in the literature without knowing what they were. Your video means I now have some idea what they are talking about at least. I have ordered the book you mention.

  • @threequarcks
    @threequarcks 11 years ago +4

    I'm a physics graduate in cosmology and this was helpful! Thank you very much :), such a great use of the internet.

  • @thecowgoesvroom672
    @thecowgoesvroom672 3 years ago

    Thank you, I've been studying and this helped tons.

  • @carlosastro21
    @carlosastro21 6 years ago

    Oh man! Thanks a lot for such guidance. It is really valuable.

  • @EricLebigot
    @EricLebigot 10 years ago +1

    The example at the very end is important for understanding the Fisher matrix formula at the beginning (when "d^2 likelihood / d lambda^2" is "converted" into "dP/d lambda_alpha dP/d lambda_beta"). This involves a few assumptions that are only made explicit at the end…

    • @wahabfiles6260
      @wahabfiles6260 4 years ago +1

      At 3:50 he mentions that a parabola is not a good approximation, but that in log space it is Gaussian and that is a good approximation... what does he mean?
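
      One reading of the question above, sketched numerically (my interpretation, with illustrative numbers): Taylor-expanding the likelihood itself to second order gives a parabola that goes negative in the tails, which no probability can do, while expanding ln L to second order gives a parabola whose exponential is a Gaussian, always positive. The Fisher matrix is exactly that curvature of ln L at the peak.

      ```python
      import numpy as np

      sigma_true = 1.5                       # illustrative likelihood width
      lam = np.linspace(-6, 6, 1001)         # parameter values around the peak at 0

      L = np.exp(-0.5 * lam**2 / sigma_true**2)   # likelihood (unnormalized)
      lnL = -0.5 * lam**2 / sigma_true**2         # its log: exactly a parabola

      # Second-order expansion of L itself around the peak: 1 - lam^2 / (2 sigma^2)
      parabola_L = 1.0 - 0.5 * lam**2 / sigma_true**2

      print(L.min() >= 0)          # True: a likelihood is never negative
      print(parabola_L.min() < 0)  # True: the parabolic fit to L goes negative
      print(np.allclose(np.exp(lnL), L))   # True: the log-space parabola is exact here
      ```

      For a non-Gaussian likelihood the log-space parabola is only an approximation near the peak, but it is still a far safer one than a parabola fit to L directly.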

  • @bayes40
    @bayes40 8 years ago

    Very impressive explanation. You have some real teaching skill.

  • @98lorevan
    @98lorevan 4 months ago

    I was actually searching for a video explaining the Fisher matrix, since I'm studying cosmology, and I found an explanation of the exact piece of the book I was studying hahahaha

  • @davidpaganin3361
    @davidpaganin3361 5 years ago

    Many thanks, very much appreciated!

  • @markwagner4929
    @markwagner4929 9 years ago +12

    Dear god thank you jonathpober for explaining this concept clearly and thoughtfully, and not having a heavy Indian accent and sloppy handwriting

    • @iwtwb8
      @iwtwb8 7 years ago +5

      Why does someone having an Indian accent affect your comprehension?

    • @manish3889
      @manish3889 6 years ago +10

      Do not be racist. Be thankful to Indians that, although English is not their mother tongue, they speak it for your convenience. Most American and British English accents just suck.

    • @Cashman9111
      @Cashman9111 6 years ago +4

      I'm really not sure if you guys are joking or serious... I can't understand Indians most of the time, so I'm racist? My spoken English is also not really good, so if you couldn't understand me, you'd also be racist, right?

  • @luck3949
    @luck3949 5 years ago

    Came here to do my homework about Fisher's information. Stayed for the lesson on cosmology.

  • @skhaaaan
    @skhaaaan 10 years ago

    great video :D

  • @pereirasomoza
    @pereirasomoza 11 years ago

    Thanks

  • @gggrow
    @gggrow 7 years ago

    What's that word at 7:17? Sounds like "fedooshal".

  • @kunalmandalia1165
    @kunalmandalia1165 8 years ago

    Seth Godin teaching Stats? :)

  • @willianrodrigues6041
    @willianrodrigues6041 7 years ago +1

    I'm Brazilian; I don't understand your speech.

  • @danielc4267
    @danielc4267 5 years ago

    I am so confused about likelihood. Isn't likelihood defined as P(theory | data) instead, as described in the wiki link below? But I understand why it can be P(data | theory) as well.
    en.wikipedia.org/wiki/Likelihood_function#Definition

    • @arrau08
      @arrau08 3 years ago +1

      P(theory | data) is the posterior, which is what you really want to know (constraining theory from data). To get there, if you take a Bayesian approach, you multiply the prior by the likelihood, and with a uniform prior the posterior is just the likelihood up to normalization.
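
      The reply above can be sketched on a toy grid (everything here is illustrative, not from the video): Bayes' theorem gives P(theory | data) ∝ P(data | theory) × P(theory), and with a flat prior the normalized likelihood is the posterior.

      ```python
      import numpy as np

      theta = np.linspace(0.0, 1.0, 101)      # grid of parameter values
      heads, tosses = 7, 10                   # toy data: 7 heads in 10 coin tosses

      # Likelihood P(data | theta) for a binomial model (up to a constant factor).
      likelihood = theta**heads * (1 - theta)**(tosses - heads)

      flat_prior = np.ones_like(theta)        # uniform prior on [0, 1]
      posterior = likelihood * flat_prior
      posterior /= posterior.sum()            # normalize on the grid

      # With a flat prior, the normalized likelihood IS the posterior:
      print(np.allclose(posterior, likelihood / likelihood.sum()))   # True
      print(theta[posterior.argmax()])        # 0.7: peak at the MLE, heads/tosses
      ```

      With a non-flat prior the two curves would differ, which is exactly the distinction between the likelihood and the posterior that the original question is circling.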

  • @tredicidesigns
    @tredicidesigns 9 years ago

    I know, I wanted to watch the Hiroshima one; I can't understand his talking.