I've come across this several times, 'the posterior is proportional to the numerator.' I'm a little lost; don't we need an exact result? Or is there a simpler method to account for the denominator in some desired output?
Thanks for your comment. You've identified the key issue with doing exact Bayesian inference -- that we cannot calculate the denominator in most applied circumstances. This motivates most of the main computational approaches to doing approximate inference, including Markov chain Monte Carlo (MCMC). These methods only require the unnormalised posterior (aka the numerator of Bayes' rule) to generate samples from the posterior. In turn, these samples are used to understand and (approximately) summarise the posterior. Hope that helps! Best, Ben
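The point in the reply above — that MCMC needs only the unnormalised posterior because the denominator cancels in the acceptance ratio — can be illustrated with a minimal Metropolis sampler. This is a sketch, not code from the video: the example target is the unnormalised posterior theta^2 (1 - theta), i.e. a coin observed to give 2 heads and 1 tail under a flat prior (true posterior Beta(3, 2), mean 3/5). All function names and tuning values here are illustrative choices.

```python
import random

def unnormalised_posterior(theta):
    # Numerator of Bayes' rule only (likelihood x prior):
    # 2 heads, 1 tail, flat prior -> theta^2 (1 - theta).
    # The normalising constant (denominator) is never computed.
    if 0.0 < theta < 1.0:
        return theta ** 2 * (1.0 - theta)
    return 0.0

def metropolis(n_samples, step=0.2, theta0=0.5, seed=0):
    rng = random.Random(seed)
    theta = theta0
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.uniform(-step, step)
        # The acceptance ratio uses only the UNNORMALISED posterior;
        # the unknown denominator is the same in numerator and
        # denominator of the ratio, so it cancels.
        ratio = unnormalised_posterior(proposal) / unnormalised_posterior(theta)
        if rng.random() < ratio:
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis(50_000)
print(sum(samples) / len(samples))  # close to the true posterior mean, 3/5
```

The samples can then be used exactly as described in the reply: histograms, means, and intervals computed from them approximate the corresponding summaries of the true posterior.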
@SpartacanUsuals This was what I was wondering too! Thank you Ben! I had another question: is it standard procedure to add a y or x value with alpha? How would a sum that includes the realization x help with the MCMC method? Thanks
If the prior is, for example, P(theta = theta*) = 1, i.e. the person is certain without looking at the data that theta is equal to theta*, e.g. 0.5 (fair coin), would the posterior distribution then always be equal to the prior, so that there is never any updating?
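The answer to the question above is yes: a point-mass ("dogmatic") prior can never be updated, because the posterior is proportional to likelihood times prior, and the prior is zero everywhere except at theta*. A minimal sketch on a discrete grid (the grid, the choice theta* = 0.5, and the 9-heads-in-10-flips data are all illustrative assumptions):

```python
from math import comb

# Discrete grid of candidate values for the coin's heads probability.
thetas = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0

# Degenerate point-mass prior: all belief on theta* = 0.5.
prior = [1.0 if t == 0.5 else 0.0 for t in thetas]

# Likelihood of observing 9 heads in 10 flips, for each candidate theta.
heads, n = 9, 10
likelihood = [comb(n, heads) * t**heads * (1 - t)**(n - heads) for t in thetas]

# Bayes' rule: posterior is proportional to likelihood x prior, then normalise.
unnorm = [lik * p for lik, p in zip(likelihood, prior)]
posterior = [u / sum(unnorm) for u in unnorm]

print(posterior == prior)  # True: even extreme data cannot move a point-mass prior
```

The likelihood strongly favours theta near 0.9, yet the posterior is identical to the prior, since every other value of theta was ruled out in advance with probability zero.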
How do you divide by 0, when theta is equal to 1 or 0?
I got it, it's infinite
Cool !!
Prior and posterior not defined? wtf