Suppose $Y_1, \dots, Y_n \mid \theta \overset{\text{iid}}{\sim} \text{Bernoulli}(\theta)$ and our (conjugate) prior for $\theta$ is $\theta \sim \text{Beta}(a, b)$. The posterior for $\theta$ is given by:

$$p(\theta \mid y_1, \dots, y_n) \propto \theta^{\sum_i y_i} (1 - \theta)^{n - \sum_i y_i} \cdot \theta^{a-1} (1 - \theta)^{b-1}.$$

We let $y = \sum_{i=1}^n y_i$ be the number of successes in those $n$ trials, so that

$$\theta \mid y \sim \text{Beta}(a + y,\; b + n - y).$$
For $m$ future trials, integrating the binomial likelihood over this posterior gives the predictive

$$p(\tilde y \mid y) = \binom{m}{\tilde y} \frac{B(a' + \tilde y,\; b' + m - \tilde y)}{B(a', b')}, \qquad \tilde y = 0, 1, \dots, m.$$

This is known as a Beta-binomial distribution, where $a' = a + y$ and $b' = b + n - y$.
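As a quick sanity check (not part of the original text), the sketch below compares this Beta-binomial pmf against Monte Carlo draws obtained by first sampling $\theta$ from the posterior and then sampling $m$ new trials; the hyperparameters and data are made up for illustration.

```python
import numpy as np
from scipy.stats import betabinom

rng = np.random.default_rng(0)
a, b = 1.0, 1.0                    # Beta prior hyperparameters (illustrative)
n, y = 20, 13                      # observed trials and successes (illustrative)
m = 10                             # number of future trials
a_post, b_post = a + y, b + n - y  # posterior Beta(a', b')

# Monte Carlo: draw theta from the posterior, then m new Bernoulli trials.
theta = rng.beta(a_post, b_post, size=200_000)
y_new = rng.binomial(m, theta)

for k in range(m + 1):
    mc = np.mean(y_new == k)
    exact = betabinom.pmf(k, m, a_post, b_post)  # Beta-binomial pmf above
    print(f"y~={k:2d}  Monte Carlo={mc:.4f}  Beta-binomial={exact:.4f}")
```

The two columns should agree up to Monte Carlo error, which is the defining property of the posterior predictive: it is the likelihood averaged over posterior draws of $\theta$.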
This is the probability of a future observation given the counts that have been observed. The same construction applies to count data: suppose $Y_1, \dots, Y_n \mid \theta \overset{\text{iid}}{\sim} \text{Poisson}(\theta)$ with conjugate prior $\theta \sim \text{Gamma}(a, b)$. For the predictive, we integrate the likelihood over the posterior $\text{Gamma}\!\left(a + \sum_{i=1}^n y_i,\; b + n\right)$,

$$p(\tilde y \mid y_1, \dots, y_n) = \int_0^\infty p(\tilde y \mid \theta)\, p(\theta \mid y_1, \dots, y_n)\, d\theta,$$

which works out to a negative binomial distribution. Note $E[\theta \mid y_1, \dots, y_n] = \frac{a + \sum_i y_i}{b + n}$ and $\operatorname{Var}[\theta \mid y_1, \dots, y_n] = \frac{a + \sum_i y_i}{(b + n)^2}$.
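A minimal simulation sketch of this integral, again with made-up prior values and data: it draws $\theta$ from the posterior Gamma, draws a new Poisson count for each $\theta$, and compares the result to the negative binomial pmf (in SciPy's parameterization, size $a'$ and success probability $p = b'/(b'+1)$).

```python
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(1)
a, b = 2.0, 1.0                  # Gamma prior hyperparameters (illustrative)
y = np.array([2, 4, 3, 5, 1])    # observed counts (illustrative)
a_post, b_post = a + y.sum(), b + len(y)  # posterior Gamma(a', b')

# Monte Carlo: draw theta from the posterior Gamma (rate b' -> scale 1/b'),
# then a new Poisson count for each draw.
theta = rng.gamma(a_post, 1.0 / b_post, size=200_000)
y_new = rng.poisson(theta)

# The integral gives a negative binomial with size a' and p = b' / (b' + 1).
p = b_post / (b_post + 1.0)
for k in range(8):
    print(f"y~={k}  Monte Carlo={np.mean(y_new == k):.4f}  "
          f"negative binomial={nbinom.pmf(k, a_post, p):.4f}")
```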
The following identities are particularly useful with the Normal distribution because the Normal distribution can be described completely by its mean and variance.
$$E[\tilde Y \mid y_1, \dots, y_n] = E\big[\, E[\tilde Y \mid \theta] \mid y_1, \dots, y_n \,\big] \tag{3.2}$$

$$\operatorname{Var}[\tilde Y \mid y_1, \dots, y_n] = E\big[\operatorname{Var}[\tilde Y \mid \theta] \mid y_1, \dots, y_n\big] + \operatorname{Var}\big[E[\tilde Y \mid \theta] \mid y_1, \dots, y_n\big] \tag{3.3}$$
Show that the mean and variance of the predictive of a Poisson likelihood with a gamma prior can be expressed as (where $a' = a + \sum_{i=1}^n y_i$ and $b' = b + n$):

$$E[\tilde Y \mid y_1, \dots, y_n] = \frac{a'}{b'}, \qquad \operatorname{Var}[\tilde Y \mid y_1, \dots, y_n] = \frac{a'}{b'} \cdot \frac{b' + 1}{b'}.$$
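A sketch of one way to verify this, combining (3.2) and (3.3) with the Poisson moments $E[\tilde Y \mid \theta] = \operatorname{Var}[\tilde Y \mid \theta] = \theta$ and the posterior gamma moments noted above:

\begin{align*}
E[\tilde Y \mid y_1, \dots, y_n] &= E\big[E[\tilde Y \mid \theta] \mid y_1, \dots, y_n\big] = E[\theta \mid y_1, \dots, y_n] = \frac{a'}{b'},\\
\operatorname{Var}[\tilde Y \mid y_1, \dots, y_n] &= E[\theta \mid y_1, \dots, y_n] + \operatorname{Var}[\theta \mid y_1, \dots, y_n] = \frac{a'}{b'} + \frac{a'}{(b')^2} = \frac{a'}{b'} \cdot \frac{b' + 1}{b'}.
\end{align*}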
Note that the uncertainty in a future observation is denoted by $\operatorname{Var}[\tilde Y \mid y_1, \dots, y_n]$ and that, by (3.3), the uncertainty of a future observation can be split up into two parts:

1. Uncertainty of the sampling distribution: $E\big[\operatorname{Var}[\tilde Y \mid \theta] \mid y_1, \dots, y_n\big]$
2. Parameter uncertainty: $\operatorname{Var}\big[E[\tilde Y \mid \theta] \mid y_1, \dots, y_n\big]$

Predicting from the MLE alone gives (1) but fails to take (2) into account. Parameter uncertainty gets smaller for large samples: $\operatorname{Var}\big[E[\tilde Y \mid \theta] \mid y_1, \dots, y_n\big] \to 0$ as $n \to \infty$; the simulation sketch below illustrates the decomposition.
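To make the decomposition concrete, here is a minimal sketch for the Poisson-gamma model (prior values and the "true" rate are made up for illustration). It estimates both terms from posterior draws: for a Poisson likelihood both $E[\tilde Y \mid \theta]$ and $\operatorname{Var}[\tilde Y \mid \theta]$ equal $\theta$, so term (1) is the posterior mean of $\theta$ and term (2) is the posterior variance of $\theta$.

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = 2.0, 1.0       # Gamma prior hyperparameters (illustrative)
theta_true = 3.0      # rate used to simulate the made-up data

for n in [5, 50, 500]:
    y = rng.poisson(theta_true, size=n)
    a_post, b_post = a + y.sum(), b + n  # posterior Gamma(a', b')
    theta = rng.gamma(a_post, 1.0 / b_post, size=100_000)

    sampling = theta.mean()   # (1) E[Var(Y~|theta)|y]: Poisson variance is theta
    parameter = theta.var()   # (2) Var(E[Y~|theta]|y): Poisson mean is theta
    closed_form = (a_post / b_post) * (b_post + 1) / b_post
    print(f"n={n:4d}  (1)={sampling:.3f}  (2)={parameter:.4f}  "
          f"sum={sampling + parameter:.3f}  closed form={closed_form:.3f}")
```

The parameter-uncertainty term (2) shrinks roughly like $1/n$ while the sampling term (1) stabilizes near the underlying rate, and their sum matches the closed-form predictive variance $\frac{a'}{b'} \cdot \frac{b'+1}{b'}$.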
Recall that the sum of independent normal random variables is also normal. Therefore, since both $\theta$ and $\tilde\epsilon$, conditional on $y_1, \dots, y_n$ and $\sigma^2$, are normally distributed, so is $\tilde Y = \theta + \tilde\epsilon$. The predictive distribution is therefore

$$\tilde Y \mid y_1, \dots, y_n, \sigma^2 \sim N(\mu_n,\; \tau_n^2 + \sigma^2),$$

where $\mu_n$ and $\tau_n^2$ are the posterior mean and variance of $\theta$.
It is worthwhile to have some intuition about the form of the variance of $\tilde Y$: in general, our uncertainty about a new sample $\tilde Y$ is a function of our uncertainty about the center of the population ($\tau_n^2$) as well as how variable the population is ($\sigma^2$). As $n$ grows we become more and more certain about where $\theta$ is, and the posterior variance $\tau_n^2$ of $\theta$ goes to zero. But certainty about $\theta$ does not reduce the sampling variability $\sigma^2$, and so our uncertainty about $\tilde Y$ never goes below $\sigma^2$.
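A small numerical sketch of this behavior, using the standard conjugate-normal posterior updates with a known $\sigma^2$ (the prior values and data are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
mu0, tau0_sq = 0.0, 4.0   # prior mean and variance for theta (illustrative)
sigma_sq = 1.0            # known sampling variance (illustrative)

for n in [2, 10, 100, 1000]:
    y = rng.normal(1.5, np.sqrt(sigma_sq), size=n)  # made-up data
    # Standard conjugate updates for the posterior N(mu_n, tau_n^2).
    tau_n_sq = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    mu_n = tau_n_sq * (mu0 / tau0_sq + y.sum() / sigma_sq)
    # Predictive variance = parameter uncertainty + sampling variability.
    print(f"n={n:5d}  mu_n={mu_n:.3f}  tau_n^2={tau_n_sq:.5f}  "
          f"Var[Y~|y] = tau_n^2 + sigma^2 = {tau_n_sq + sigma_sq:.5f}")
```

The printed predictive variance decreases toward $\sigma^2 = 1$ from above but never drops below it, exactly as the intuition suggests.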
Note $\tau_n^2 \to 0$ as $n \to \infty$. It can be shown that, for large $n$,

$$\operatorname{Var}[\tilde Y \mid y_1, \dots, y_n] \approx s^2 \left(1 + \frac{1}{n}\right),$$

where $s^2 = \frac{1}{n-1} \sum_{i=1}^n (y_i - \bar y)^2$ is the sample variance.