41 - Proof: Gamma prior is conjugate to Poisson likelihood

  • Published: 22 Dec 2024

Comments • 14

  • @xochdt
    @xochdt 3 years ago +1

    This was uploaded 7 years ago, yet it's still saving careers. Thank you so much.

  • @jemimanamale4447
    @jemimanamale4447 3 years ago

    At 1:23, should it be x^(a-1)? I'm seeing different resources using either x or lambda.
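
    A minimal sketch for the notation question above, assuming the shape-rate parameterisation Gamma(a, b) on the Poisson rate (the video's exact symbols may differ): the prior density is a function of lambda, and sources that write x^(a-1) are simply using x as the generic argument of the Gamma pdf.

    % Assumed shape-rate parameterisation: a is the shape, b is the rate.
    \[
      \pi(\lambda) = \frac{b^{a}}{\Gamma(a)}\,\lambda^{a-1} e^{-b\lambda},
      \qquad \lambda > 0,
    \]
    % Multiplying by the Poisson likelihood of x_1, ..., x_n and collecting
    % powers of lambda gives the conjugate update:
    \[
      \lambda \mid x_1, \dots, x_n \sim \mathrm{Gamma}\!\Big(a + \sum_{i=1}^{n} x_i,\; b + n\Big).
    \]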

  • @isyakumuhammad6982
    @isyakumuhammad6982 5 years ago +1

    Thank you very much, this is a very clear video.
    Could you please provide a link that explains the inverse Gaussian distribution with a power normal prior?
    Thanks.

  • @ediltonbrandao8407
    @ediltonbrandao8407 1 year ago

    Thanks from Brazil

  • @toishiki
    @toishiki 8 years ago +1

    Thank you sooo much!! Very clear!

  • @marlonkakurira9931
    @marlonkakurira9931 4 years ago

    Well understood. Many thanks!!!

  • @S4N0I1
    @S4N0I1 6 years ago +2

    I love you for this

  • @alessandro64484
    @alessandro64484 9 years ago

    Hi, I have a question: is it possible to compute the joint distribution of the Poisson, π(lambda, X1, ..., Xn)?
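
    One way to write that joint density, as a sketch under the same assumptions as above (Gamma(a, b) prior on lambda, i.i.d. Poisson observations); it is simply prior times likelihood:

    % Joint density of the rate and the data under the assumed model.
    \[
      \pi(\lambda, x_1, \dots, x_n)
        = \pi(\lambda) \prod_{i=1}^{n} p(x_i \mid \lambda)
        = \frac{b^{a}}{\Gamma(a)}\,\lambda^{a-1} e^{-b\lambda}
          \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}
        \propto \lambda^{a + \sum_i x_i - 1}\, e^{-(b+n)\lambda}.
    \]

    Viewed as a function of lambda, this is the Gamma posterior kernel; integrating lambda out instead gives the marginal distribution of the data.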

  • @mashhoodahmad6101
    @mashhoodahmad6101 3 years ago

    Dear Sir, please tell how we deal with the summation (1 to N)(theta - x), as you said summation (1 to N)(x) = N*xbar.

    • @choendenkyirong8313
      @choendenkyirong8313 3 years ago

      I think you'd just simplify as such:
      sum( theta - x_i ) = sum(theta) - sum(x_i) = n*theta - n*xbar = n(theta - xbar)

    • @mashhoodahmad6101
      @mashhoodahmad6101 3 years ago

      @choendenkyirong8313 Thank you so much, Sir.
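
    Written out in the notation of the reply above (theta for the parameter, x-bar for the sample mean), the step is just splitting the sum:

    % Same identity as in the reply, in display form.
    \[
      \sum_{i=1}^{n} (\theta - x_i)
        = \sum_{i=1}^{n} \theta - \sum_{i=1}^{n} x_i
        = n\theta - n\bar{x}
        = n(\theta - \bar{x}).
    \]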

  • @RobertoMartin1
    @RobertoMartin1 10 years ago +1

    That was excellent

  • @danielmburu6936
    @danielmburu6936 9 years ago

    Wow, thanks so much for the info

  • @lemyul
    @lemyul 5 years ago

    Thanks for sketching it out