What is Fisher Information?

  • Published: 6 Sep 2024
  • Explains the concept of Fisher Information in relation to statistical estimation of parameters based on random measurements. Gives an example of parameter estimation in Gaussian noise, and shows the component functions to help build intuition. (A short numerical sketch of this example follows the video list below.)
    Related videos: (see iaincollings.com)
    • What is Least Squares Estimation?
    • What is a Random Variable?
    • What is a Probability Density Function (pdf)?
    • What is a Multivariate Probability Density Function (PDF)?
    • What is the Kalman Filter?
    • What is a Cumulative Distribution Function (CDF) of a Random Variable?
    • What is a Moment Generating Function (MGF)?
    • What is a Random Process?
    • Expectation of a Random Variable Equation Explained
    • What is a Gaussian Distribution?
    • How are Matched Filter (MF), Zero Forcing (ZF), and MMSE Related?
    For a full list of Videos and Summary Sheets, go to: iaincollings.com
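
    A minimal numerical sketch of the Gaussian example described above (the variable names and values are illustrative assumptions, not taken from the video): it draws noisy measurements y = theta + noise and checks by Monte Carlo that the Fisher Information E[(d/dtheta log f(Y;theta))^2] matches the closed form 1/sigma^2.

        import numpy as np

        rng = np.random.default_rng(0)

        theta = 37.0   # true parameter (e.g. a temperature); illustrative value
        sigma = 0.5    # noise standard deviation; illustrative value
        n = 1_000_000  # number of Monte Carlo samples

        # Measurements: y = theta + Gaussian noise with variance sigma^2
        y = theta + sigma * rng.standard_normal(n)

        # Score: d/dtheta log f(y; theta) = (y - theta) / sigma^2 for a Gaussian pdf
        score = (y - theta) / sigma**2

        # Fisher Information is the expected value of the squared score
        print("Monte Carlo estimate: ", np.mean(score**2))  # approx. 4.0
        print("Closed form 1/sigma^2:", 1.0 / sigma**2)     # exactly 4.0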

Comments • 83

  • @akaakaakaak5779
    @akaakaakaak5779 1 year ago +12

    Love the format of the video. Emphasising the intuition just makes everything else clearer

  • @payman_azari
    @payman_azari 12 days ago

    It is a very helpful video. Usually when I see a log in a formula I skip investigating, thinking it's difficult, but you explained the intuition behind it so interestingly, and it's applicable everywhere.

  • @NONAME_G_R_I_D_
    @NONAME_G_R_I_D_ 1 year ago +10

    Great video! I have been following your content for quite a while now and you really always try to give the intuition behind each process. This really helps to understand the material :) Much appreciated!!!

    • @iain_explains
      @iain_explains  1 year ago +1

      I'm so glad to hear that you like the videos, and find the intuition helpful.

  • @sdsa007
    @sdsa007 1 year ago +1

    I'm a visual learner, so the graphs really helped! I almost gave up halfway through the video, but I'm glad I hung on!

    • @iain_explains
      @iain_explains  1 year ago +1

      I'm glad you found the graphs helpful.

  • @user-cl7vh1tz3t
    @user-cl7vh1tz3t 10 months ago +1

    This is really a great explanation. You made a difficult concept (at least for me) very easy to understand. I’ve been watching other videos with animations and all, but I only understood this well after watching your explanation. Thank you very much.

    • @iain_explains
      @iain_explains  9 months ago

      That's so great to hear. I'm glad it was helpful!

  • @ImranMoezKhan
    @ImranMoezKhan 2 years ago +2

    What a wonderful coincidence! Here I am deriving a CRB for a noise variance model I'm researching, and running MLE simulations to verify it, and your video with this great explanation of FI comes up :-). I've read that the multivariate FIM can be considered a metric in the parameter space, and with your explanation of how the derivative takes into account the variation of the PDF wrt the parameter, I can almost visualize it for an intuitive understanding - fascinating concepts. Thanks Iain!

    • @iain_explains
      @iain_explains  2 years ago +1

      I'm so glad it was helpful. The concept is not particularly intuitive, especially when considering the multivariate case with the FIM. Perhaps the FIM can be the topic of a future video.

  • @mujahidali6988
    @mujahidali6988 1 month ago

    Thanks, your intuitive way of explaining things is really superb.

  • @sintumavuya7495
    @sintumavuya7495 10 months ago

    Thank you for explaining the logic behind that formula. Knowing the why helps me remember easily and just makes it all make sense.

    • @iain_explains
      @iain_explains  10 months ago +1

      That's great to hear. I'm glad you found the video helpful.

  • @stillwalking78
    @stillwalking78 10 months ago

    The most informative video on Fisher information I have seen, pun intended! 😄

  • @sharp8710
    @sharp8710 4 months ago

    Thank you for the video. Love the way you used simple examples to explain the theory intuitively and decomposed the expression explaining the meaning of each part!

  • @ZardoshtHodaie
    @ZardoshtHodaie 1 year ago

    The beauty of math becomes evident when a good teacher teaches it :) ... thank you! thank you!

  • @mahtabsv
    @mahtabsv 3 months ago

    Thank you very much for this amazing video! You made understanding this concept very easy.

  • @rohansinghthelord
    @rohansinghthelord 6 months ago

    I'm a little confused as to why we take the log. Specifically, wouldn't we want the part of the function that changes the most to have more weight in the expectation? Aren't small changes not that notable in comparison?

  • @mikewang4626
    @mikewang4626 1 year ago

    Thanks a lot for your intuitive explanation with diagrams. The explanation about why Fisher Information looks like that is quite useful to understand the definition!

    • @iain_explains
      @iain_explains  1 year ago

      That's great to hear. I'm glad you liked the video.

  • @oO_toOomy_Oo
    @oO_toOomy_Oo 9 months ago

    I appreciate your work Mr. Iain, it is very helpful; it gives a great sense of signals and systems.

  • @chengshen7833
    @chengshen7833 2 years ago

    Thanks a lot. This really provides an excellent complementary explanation to S. Kay's book. In the book, Fisher Information is interpreted as the 'curvature of the log-likelihood function', where the expectation of the squared first derivative can be converted to the negative of the expectation of the second derivative, with f viewed as a function of theta for fixed Y (the identity is written out after this thread). The meaning of the natural log becomes more subtle when it comes to the derivation of the CRLB.

    • @iain_explains
      @iain_explains  2 years ago

      Glad it was helpful! It's a concept that took me a long time to get intuition on, when I first learned about it.
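
      For readers following up on the curvature interpretation mentioned in this thread, the standard identity (which holds under the usual regularity conditions) is:

          I(\theta) = E\left[\left(\frac{\partial}{\partial\theta}\ln f(Y;\theta)\right)^{2}\right] = -E\left[\frac{\partial^{2}}{\partial\theta^{2}}\ln f(Y;\theta)\right]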

  • @menglu5776
    @menglu5776 11 months ago

    Thank you so much. I was doing literature research on a novel Cramer-Rao lower bound application. Your video helped me a lot!

  • @ujjwaltyagi3030
    @ujjwaltyagi3030 1 year ago

    Also, in the video we never talked about which measurement gives the most info about theta. Is it the pill, the coffee, or the arm?

  • @SavasErdim-ly8xo
    @SavasErdim-ly8xo 1 year ago

    Great video to understand the Fisher Information intuitively. Thank you Prof. Iain.

  • @bobcoolmen1571
    @bobcoolmen1571 4 months ago

    Excellent video thank you sir.

  • @SignalProcessingWithPaul
    @SignalProcessingWithPaul 1 year ago

    Hey Iain, great content. Have you considered doing a video on the Cramer-Rao bound (and how it relates to Fisher information)? I was thinking of doing some statistical signal processing videos on my channel but you've covered so much already haha.

    • @iain_explains
      @iain_explains  1 year ago

      Thanks for the suggestion. Yes, I've thought about it. It's the inverse of the Fisher information, but there's really not much intuition as to why this is the case - except that the maths turns out that way.

  • @mustaphasadok3172
    @mustaphasadok3172 2 years ago

    Amazing... Thank you professor. In the literature there are few clear books on the subject besides Prof. Steven Kay's collection.

  • @govindbadgoti8678
    @govindbadgoti8678 2 years ago

    I am from India... your video is so informative.

  • @vedantnipane2268
    @vedantnipane2268 9 months ago

    🎯 Key Takeaways for quick navigation:
    00:01 🌡️ *Fisher Information measures information content in measurements about a parameter. An example is given with various methods to measure stomach temperature.*
    03:23 📊 *The Fisher Information formula is explained, involving the expected value of the squared derivative of the log of the probability density function (pdf) of a random variable.*
    07:22 📉 *Fisher Information is inversely proportional to the variance of the noise in measurements. Small noise leads to high information, while large noise results in low information.*
    14:22 🔄 *The log function in the Fisher Information formula enhances small values in the pdf, ensuring contributions from all parts of the distribution and giving a comprehensive measure (a small numerical illustration of this point follows below).*
    16:50 📈 *Fisher Information decreases as noise (standard deviation) increases, illustrated with visualizations of pdf changes and their impact on the information measure.*
    Made with HARPA AI
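
    A small sketch of the log point in the 14:22 takeaway above (illustrative, assuming the Gaussian example with mean theta and variance sigma^2): the derivative of f itself vanishes in the tails where f is small, while the derivative of log f keeps growing there, so low-probability measurements still contribute to the expectation.

        import numpy as np

        theta, sigma = 0.0, 1.0
        y = np.linspace(-4.0, 4.0, 9)  # a few measurement values across the distribution

        # Gaussian pdf and its derivatives with respect to theta
        f = np.exp(-(y - theta)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
        df = f * (y - theta) / sigma**2  # d/dtheta f(y; theta): dies off in the tails
        dlogf = (y - theta) / sigma**2   # d/dtheta log f(y; theta): grows in the tails

        for yi, a, b in zip(y, df, dlogf):
            print(f"y = {yi:+.1f}   df/dtheta = {a:+.5f}   dlogf/dtheta = {b:+.1f}")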

  • @a.nelprober4971
    @a.nelprober4971 1 year ago

    Am I a dummy? For the Fisher info of theta I have computed theta/(population variance). (I have theta instead of 1.)

  • @aliosmancetin9542
    @aliosmancetin9542 1 year ago

    Awesome video! You concentrate on giving the intuition, thanks!

  • @marirsg
    @marirsg 1 year ago

    Beautifully explained ! Thank you

  • @cerioscha
    @cerioscha 1 year ago

    Great video thanks !

  • @qudratullahazimy4037
    @qudratullahazimy4037 2 years ago

    Absolutely great explanation! Made my life easy.

  • @nzambabignoumba445
    @nzambabignoumba445 9 months ago

    Wonderful!!!!

  • @lostmylaundrylist9997
    @lostmylaundrylist9997 1 year ago +1

    Excellent!

  • @niveditashrivastava8374
    @niveditashrivastava8374 1 year ago

    Very informative video. The normal distribution is plotted for a particular value of the mean. How can we perform differentiation wrt the mean? Am I missing something here?

    • @iain_explains
      @iain_explains  1 year ago

      In the definition of Fisher Information there is a log function, which cancels out the exponential in the function f(y;theta), as sketched below.
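
      Spelled out for the Gaussian example (assuming mean theta and variance sigma^2), the log turns the exponential into a quadratic that is straightforward to differentiate:

          \ln f(y;\theta) = -\tfrac{1}{2}\ln(2\pi\sigma^{2}) - \frac{(y-\theta)^{2}}{2\sigma^{2}}, \qquad \frac{\partial}{\partial\theta}\ln f(y;\theta) = \frac{y-\theta}{\sigma^{2}}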

  • @xinpeiwu6086
    @xinpeiwu6086 1 year ago

    absolutely made everything understandable, better than my college professor😁

  • @gamingandmusic9217
    @gamingandmusic9217 2 years ago

    Sir, can you please tell the difference between:
    1. Maximum Likelihood (ML)
    2. Maximum A Posteriori (MAP)
    3. Least Squares (LS)
    4. Minimum Mean Square Error (MMSE)
    5. Zero Forcing (ZF)
    Moreover, are the equalizer and receiver the same?
    If possible, please post a video on this topic sir. Thank you so much for inspiring us sir.

    • @iain_explains
      @iain_explains  2 years ago +2

      Have you checked out my webpage, iaincollings.com? I've already got videos on all the topics you ask about. There are lots of them, but the three most relevant would be: "What are Maximum Likelihood (ML) and Maximum a posteriori (MAP)?" ruclips.net/video/9Ahdh_8xAEI/видео.html and "How are Matched Filter (MF), Zero Forcing (ZF), and MMSE Related?" ruclips.net/video/U3qjVgX2poM/видео.html and "What is Least Squares Estimation?" ruclips.net/video/BZ9VlmmuotM/видео.html

    • @gamingandmusic9217
      @gamingandmusic9217 2 years ago +1

      @@iain_explains Thank you so much sir. You have taken the time to give me a reply and all the links. Thank you so much again sir.

  • @ujjwaltyagi3030
    @ujjwaltyagi3030 1 year ago

    Mistake at 15:07: Fisher information would be 16, as 1/(0.25)^2 = 16.

    • @iain_explains
      @iain_explains  1 year ago

      No, you're mistaken. I said the variance is 0.25. The symbol for variance is sigma^2.

    • @ujjwaltyagi3030
      @ujjwaltyagi3030 1 year ago +1

      @@iain_explains ok thanks my bad
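
      For reference, the arithmetic in this thread: with variance sigma^2 = 0.25, the Fisher Information is 1/sigma^2 = 1/0.25 = 4. The value 16 would follow only if 0.25 were the standard deviation sigma, since then sigma^2 = 0.0625 and 1/sigma^2 = 16.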

  • @user-fk2pc9zp3t
    @user-fk2pc9zp3t 1 year ago

    Very well explained, thank you!

  • @pitmaler4439
    @pitmaler4439 2 years ago

    Thanks. Is the FI only useful when you compare situations with identical PDFs?
    There doesn't seem to be a unit for the FI, or can you compare parameters with different PDFs?

    • @iain_explains
      @iain_explains  2 years ago +2

      Just to clarify, the FI is not "comparing situations". It is a measure of the information in a random variable drawn from a (single) PDF. Of course the FI measure for different random variables (PDFs) can be compared.

    • @pitmaler4439
      @pitmaler4439 2 years ago +1

      @@iain_explains Yes, I just thought: you get a number for the FI, but for what purpose? Without a unit you cannot put the number in relation to anything. Now I read that the value is used for the Cramer-Rao bound.

    • @khalifi2100
      @khalifi2100 2 years ago +2

      Example: for the uniform distribution the FI is zero, and this is why the uniform PDF is called an 'uninformative PDF'. So FI is a helpful measure even outside of Cramer-Rao bound calculations.

  • @arjunsnair4986
    @arjunsnair4986 2 years ago

    Thank you sir

  • @tuongnguyen9391
    @tuongnguyen9391 2 years ago

    Could you kindly explain "what is polarization" in polar codes? :)

    • @iain_explains
      @iain_explains  2 years ago +1

      I'll have to give that one some more thought. I don't really have a good intuitive explanation right now.

    • @tuongnguyen9391
      @tuongnguyen9391 2 years ago +1

      @@iain_explains Thank you professor

  • @InquilineKea
    @InquilineKea 4 months ago

    12:00

  • @musaadalruwaili5772
    @musaadalruwaili5772 2 years ago

    Hi, I really enjoy your videos, and I have learned a lot. I tried to find your email to contact you, but I could only find your university email. I am a Ph.D. student, and I am working on D2D based on the NOMA system. So, could you please explain the D2D system and how it works? Thank you.

    • @iain_explains
      @iain_explains  2 years ago

      Thanks for the suggestion. I'll put it on my "to do" list.