2.3.6 Bayesian Inference for the Gaussian - The Mean

  • Published: 9 Sep 2024
  • In the previous section we discussed maximum likelihood estimation of the parameters of a Gaussian. In this section we turn to Bayesian estimation of these parameters, combining the likelihood with prior distributions. Examining the form of the likelihood function motivates the choice of prior for which the posterior distribution will be in the same family as the prior, i.e. the conjugate prior. We determine the posterior distribution for the mean and examine how its mean and variance relate to the parameters of the prior, the number of datapoints, and the maximum likelihood estimate (see the sketch below).
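
    A minimal numerical sketch of this conjugate update for the case of known variance and a Gaussian prior on the mean. The prior parameters, the known variance, and the toy data below are illustrative assumptions, not values from the video:

import numpy as np

def posterior_for_mean(x, sigma_sq, mu_0, sigma0_sq):
    # Posterior over the Gaussian mean, given known variance sigma_sq
    # and a conjugate Gaussian prior N(mu_0, sigma0_sq).
    n = len(x)
    mu_ml = np.mean(x)  # maximum likelihood estimate of the mean
    # Posterior mean: a weighted combination of the prior mean and mu_ml;
    # the data term dominates as n grows.
    mu_n = (sigma_sq * mu_0 + n * sigma0_sq * mu_ml) / (n * sigma0_sq + sigma_sq)
    # Posterior precision = prior precision + n * data precision,
    # so the posterior variance shrinks as more points arrive.
    sigma_n_sq = 1.0 / (1.0 / sigma0_sq + n / sigma_sq)
    return mu_n, sigma_n_sq

# Toy data: true mean 1.0, known variance 0.25, broad prior centred at 0.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=0.5, size=20)
print(posterior_for_mean(x, sigma_sq=0.25, mu_0=0.0, sigma0_sq=1.0))

    With only a few datapoints the posterior mean stays close to the prior mean mu_0; as n grows it approaches the maximum likelihood estimate and the posterior variance goes to zero, which is the behaviour discussed in the video.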

Comments • 4

  • @oojassalunke9421 10 days ago

    Can you update your playlist with the new videos?

  • @kH-ul4hk A month ago

    Love these videos, Sina!
    This book recommends the following mathematical background: multivariate calculus and basic linear algebra. Do you have any recommended resources to get the necessary background ASAP? I feel like you can spend years diving deep into just those two areas, but I just want to know enough to understand this book and do the exercises.

    • @sinatootoonian9129 A month ago

      Have a look at this book: mml-book.github.io/book/mml-book.pdf and supplement as needed with Schaum's outlines, which are cheap and have lots of problems with solutions. The one for linear algebra is particularly good.