Fisher's Information: Examples

  • Published: 27 Oct 2024

Comments • 21

  • @syz911
    @syz911 1 year ago

    Your videos are so precise and accurate. What I appreciate is that you spent so much time writing all of this without compromising accuracy. Many people over-simplify or skip important assumptions when presenting definitions or theorems. I have a request: could you please discuss the Fisher Information and the C-R bound for multiple parameters? Thanks!

    • @statisticsmatt
      @statisticsmatt  1 year ago

      Thanks for your kind words. I'll add these topics to my to-do list, but to be honest, I don't know when I'll be able to get to them. I will eventually. Please be very patient (many thanks). Don't forget to subscribe and let others know about this channel.

  • @nickmillican22
    @nickmillican22 3 years ago +3

    Love your videos, Matt!
    Question: do you have any videos (or is there a brief explanation) explaining why the 'variance of the partial derivative of the log likelihood' and the 'negative expected value of the second derivative of the log likelihood' are equivalent (and thus both define the information matrix)? I can understand the intuition behind the second-derivative definition, but I don't understand the variance definition.
    Keep up the great work!

    • @statisticsmatt
      @statisticsmatt  3 years ago +1

      Many thanks for saying that you love these videos. Much appreciated. First, note this formula (stated without proof): Var(x) = E(x^2) - [E(x)]^2. Let x = "partial derivative of the log likelihood". In this video, ruclips.net/video/xkMstee5gQ0/видео.html, about 3 minutes in, it is shown that E(x) = E(partial derivative of the log likelihood) = 0. Thus, Var(x) = E(x^2), which shows that the first and third expressions are equal. To show that the second equals the first and third, I recommend watching at least the first 7 minutes of the above-mentioned video; a brief sketch of that argument also appears below.
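
      A brief sketch of that argument, for reference: this is the standard derivation under the usual regularity conditions that allow differentiating under the integral sign, written with generic notation f(x; θ) rather than the video's. Since ∫ f(x; θ) dx = 1 for all θ,

      $$
      \mathbb{E}\!\left[\frac{\partial}{\partial\theta}\log f(X;\theta)\right]
      = \int \frac{\partial f(x;\theta)}{\partial\theta}\,dx
      = \frac{d}{d\theta}\int f(x;\theta)\,dx = 0,
      $$

      so the variance of the score equals E[(∂ log f/∂θ)^2]. Differentiating ∫ ∂f/∂θ dx = 0 once more, and using ∂²f/∂θ² = f[∂² log f/∂θ² + (∂ log f/∂θ)²], gives

      $$
      0 = \mathbb{E}\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X;\theta)\right]
      + \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right],
      $$

      which is exactly the statement that the negative expected second derivative equals the variance of the score (so the second expression equals the first and third).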

  • @bhawikajain4022
    @bhawikajain4022 5 months ago

    Hi Matt, can you explain why you have written (n*lambda) to the power {sum xi} in example 1, around 1:31?
    Thank you for such helpful videos!!

    • @statisticsmatt
      @statisticsmatt  5 months ago

      Many thanks for watching! You have found an error in the video, which I highlighted in the description for the video.

  • @erguancho6186
    @erguancho6186 3 years ago +3

    Very useful! Thanks :)

    • @statisticsmatt
      @statisticsmatt  3 years ago

      You're welcome. Many thanks for watching!

  • @xiaowei8546
    @xiaowei8546 2 years ago

    For the Poisson example: for two samples, say 10, 10, 10, 10, 10 and 0, 5, 10, 15, 20, is the Fisher information always the same for different lambdas?

    • @statisticsmatt
      @statisticsmatt  2 years ago

      Fisher's Information for a Poisson distribution is n/lambda. If you know that the two samples came from the same distribution, then Fisher's Information would be the same. However, if you use the sample data to estimate lambda and then plug it into the formula n/lambda, different values would generally be obtained (see the numerical sketch below). Many thanks for watching. Don't forget to subscribe and let others know about this channel.
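
      A quick numerical illustration of the plug-in calculation (a minimal sketch in Python; the variable names, and the use of the sample mean as the MLE of lambda, are assumptions made for illustration):

      ```python
      import numpy as np

      # The two hypothetical samples from the question above.
      samples = {
          "constant sample": np.array([10, 10, 10, 10, 10]),
          "spread sample":   np.array([0, 5, 10, 15, 20]),
      }

      for name, x in samples.items():
          n = len(x)
          lam_hat = x.mean()    # sample mean = MLE of lambda for a Poisson sample
          info = n / lam_hat    # plug-in Fisher information I(lambda_hat) = n / lambda_hat
          print(f"{name}: lambda_hat = {lam_hat:.1f}, I(lambda_hat) = {info:.2f}")
      ```

      For these two particular samples the sample means happen to coincide (both are 10), so the plug-in values agree at 5/10 = 0.5; samples with different means would give different plug-in values.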

    • @xiaowei8546
      @xiaowei8546 2 years ago

      @@statisticsmatt many thanks

  • @csaa7659
    @csaa7659 3 years ago

    Why is the term n*lambda used instead of lambda in the likelihood of the Poisson distribution in example 1?

    • @statisticsmatt
      @statisticsmatt  3 years ago

      You have pointed out an error. Many, many thanks. It should be just lambda; the correct likelihood is written out below. I'm going to put this error in the description and give you credit. Also, don't forget to subscribe to the channel.
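
      For reference, the standard Poisson likelihood for an i.i.d. sample x_1, ..., x_n (a generic statement of the formula, not a transcription of the video's notation):

      $$
      L(\lambda)=\prod_{i=1}^{n}\frac{e^{-\lambda}\lambda^{x_i}}{x_i!}
      =\frac{e^{-n\lambda}\,\lambda^{\sum_{i=1}^{n}x_i}}{\prod_{i=1}^{n}x_i!},
      $$

      so the base raised to the power sum(x_i) is lambda itself; the factor of n appears only in the exponential term e^{-n*lambda}.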

  • @minerva646
    @minerva646 3 years ago

    Hi, is it possible that you share your notes?

    • @statisticsmatt
      @statisticsmatt  3 years ago

      In the past, I've posted my notes on Gumroad. What videos do you want notes from?

    • @minerva646
      @minerva646 3 years ago

      @@statisticsmatt Thanks, this one please "Fisher's Information: Examples"

    • @statisticsmatt
      @statisticsmatt  3 years ago

      Go to this site for a copy of the video notes:
      gumroad.com/statisticsmatt
      Use "Fisher's Information" to search for the notes.

  • @nathanemil10
    @nathanemil10 3 years ago

    Hey Mr. StatisticsMatt, I have a course test on MLE coming up soon. Do you think you could help me out during it?

    • @statisticsmatt
      @statisticsmatt  3 years ago

      You may always ask questions on these videos. Not sure how much I'd be able to help. Please don't forget to subscribe.

  • @lalapanda4216
    @lalapanda4216 1 year ago

    wtf

    • @statisticsmatt
      @statisticsmatt  1 year ago

      I'm not sure if this is a positive or negative comment. Many thanks for watching. Don't forget to subscribe and let others know about this channel.