Let $X$ be a nonnegative integer-valued rv with $p_k = P(X = k)$, for $k = 0, 1, 2, 3, \dots$. Then the probability generating function (pgf) of $X$ is
$$G_X(s) = E(s^X) = \sum_{k=0}^{\infty} p_k s^k.$$
$G_X(s)$ is well defined, i.e. the series is (absolutely) convergent, for any numerical value of $s$ in $[-1, 1]$, because $\sum_{k=0}^{\infty} p_k$ converges. It may, of course, converge in a larger region than this.
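As a numerical illustration (a sketch, not part of the notes): the defining sum can be truncated to evaluate a pgf. For a Poisson rv the pgf has the known closed form $e^{\lambda(s-1)}$, so a truncated sum should match it closely; the values $\lambda = 2$, $s = 0.7$ are arbitrary choices.

```python
import math

def poisson_pmf(k, lam):
    # p_k = P(X = k) for X ~ Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

def pgf(pmf, s, terms=50):
    # G(s) = E(s^X) = sum_k p_k s^k, truncated after `terms` terms
    return sum(pmf(k) * s**k for k in range(terms))

lam, s = 2.0, 0.7
approx = pgf(lambda k: poisson_pmf(k, lam), s)
exact = math.exp(lam * (s - 1))      # known closed form of the Poisson pgf
print(abs(approx - exact) < 1e-12)   # True
```

Truncation is harmless here because the Poisson tail decays faster than geometrically, so 50 terms already agree with the closed form to machine precision.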
A simple but useful example is when $X$ is a Bernoulli variable with
$$P(X = 1) = p, \qquad P(X = 0) = 1 - p = q,$$
with $X$ taking no other values. Then
$$G_X(s) = E(s^X) = q + ps,$$
which is defined for all $s$.
For real $s$ in $[0,1]$, $G_X(s)$ is increasing from $G_X(0) = p_0$ to $G_X(1) = 1$ provided $X$ has a proper distribution. We shall also consider cases where the distribution of $X$ is not proper, so that $G_X(1) = \sum_k p_k < 1$. One example could be a hitting time in the gambler's ruin problem.
There is a unique correspondence (1:1) between a pmf $(p_k)$ and the corresponding pgf $G_X$. So if we know that $X$ has a pgf which may be expanded:
$$G_X(s) = a_0 + a_1 s + a_2 s^2 + a_3 s^3 + \cdots,$$
then $X$ must have the pmf $p_0 = a_0$, $p_1 = a_1$, $p_2 = a_2$, $p_3 = a_3$, ....
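The "read off the coefficients" step can be done mechanically by long division of power series. The sketch below (not from the notes; the geometric pgf and the value $p = 1/3$ are illustrative choices) recovers the pmf of a geometric rv on $\{1, 2, \dots\}$ from its pgf $ps/(1 - qs)$.

```python
from fractions import Fraction

def series_coeffs(num, den, n):
    # First n power-series coefficients of num(s)/den(s),
    # where num, den are coefficient lists (lowest degree first, den[0] != 0).
    c = []
    for k in range(n):
        acc = num[k] if k < len(num) else Fraction(0)
        for j in range(1, min(k, len(den) - 1) + 1):
            acc -= den[j] * c[k - j]
        c.append(acc / den[0])
    return c

# pgf of a geometric rv on {1, 2, ...}: G(s) = p s / (1 - q s)
p, q = Fraction(1, 3), Fraction(2, 3)
coeffs = series_coeffs([Fraction(0), p], [Fraction(1), -q], 6)
print(coeffs[1:4])  # [Fraction(1, 3), Fraction(2, 9), Fraction(4, 27)]
```

Exact rationals (`Fraction`) make the correspondence visible: the coefficients are exactly $p q^{k-1}$, the geometric pmf.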
The moments of a (proper) rv $X$ can be derived from $G_X$:
$$E(X) = G_X'(1),$$
by which we mean $\frac{d}{ds} G_X(s)$ evaluated at $s = 1$. Also
$$E[X(X-1)] = G_X''(1).$$
This gives $E(X^2) = G_X''(1) + G_X'(1)$, so that
$$\operatorname{Var}(X) = G_X''(1) + G_X'(1) - [G_X'(1)]^2.$$
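As a check on these formulas (a sketch, not part of the notes; Binomial$(10, 0.3)$ is an arbitrary choice), the derivatives of $G(s) = (q + ps)^n$ at $s = 1$ reproduce the mean and variance computed directly from the pmf.

```python
from math import comb

n, p = 10, 0.3
q = 1 - p

G1 = n * p * (q + p) ** (n - 1)             # G'(1)  = n p
G2 = n * (n - 1) * p**2 * (q + p) ** (n - 2)  # G''(1) = n(n-1) p^2

# Mean and variance computed directly from the Binomial pmf
mean = sum(k * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))
var = sum((k - mean) ** 2 * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

print(abs(G1 - mean) < 1e-9)                  # True: E(X) = G'(1)
print(abs(G2 + G1 - G1**2 - var) < 1e-9)      # True: Var(X) = G''(1)+G'(1)-G'(1)^2
```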
Distributions of sums of independent rvs can be found. Let $X$ and $Y$ be independent rvs with pmfs $(p_k)$ and $(q_k)$ respectively, and let $Z = X + Y$ have pmf $(r_k)$. Then $Z$ has pgf
$$G_Z(s) = G_X(s)\, G_Y(s),$$
the product, NOT the sum, of the pgfs of $X$ and $Y$.
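This identity can be checked numerically: multiplying the coefficient lists of $G_X$ and $G_Y$ as polynomials performs exactly the convolution $r_k = \sum_j p_j q_{k-j}$ that defines the pmf of $Z = X + Y$. (A sketch; the pmfs below are arbitrary illustrative numbers, not from the notes.)

```python
def poly_mul(a, b):
    # Coefficients of (sum_i a_i s^i)(sum_j b_j s^j); out[k] is the
    # convolution sum over i + j = k of a_i * b_j
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

p = [0.2, 0.5, 0.3]   # pmf of X on {0, 1, 2} (illustrative numbers)
q = [0.6, 0.4]        # pmf of Y on {0, 1}
r = poly_mul(p, q)    # pmf of Z = X + Y on {0, 1, 2, 3}

# r_k should match the convolution done by hand, and sum to 1
by_hand = [0.12, 0.38, 0.38, 0.12]
print(all(abs(a - b) < 1e-12 for a, b in zip(r, by_hand)))  # True
```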
(a) is a result from mathematical analysis which is not proved here.
For example, if $X$ has pgf
$$G_X(s) = \frac{1}{2 - s} = \frac{1}{2} \cdot \frac{1}{1 - s/2} = \sum_{k=0}^{\infty} \frac{s^k}{2^{k+1}},$$
then $X$ has pmf $p_0 = \tfrac12$, $p_1 = \tfrac14$, $p_2 = \tfrac18$, ....
We shall solve some problems by finding an expression for $G_X(s)$, and then obtaining the pmf by expanding $G_X(s)$ as a power series. To do this we shall use some standard expansions. If necessary we use the formula for a Taylor (or Maclaurin) series about $s = 0$.
Proof of (b) - first version. Differentiate term by term:
$$G_X'(s) = \sum_{k=1}^{\infty} k p_k s^{k-1},$$
which on setting $s = 1$ gives
$$G_X'(1) = \sum_{k=1}^{\infty} k p_k = E(X).$$
Proof of (b) - second version. Differentiate inside the expectation:
$$G_X'(s) = \frac{d}{ds} E(s^X) = E\!\left(\frac{d}{ds} s^X\right) = E(X s^{X-1}),$$
which on setting $s = 1$ gives $G_X'(1) = E(X)$. This is possible because expectation is linear, i.e. we can let $h \to 0$ in
$$\frac{G_X(s+h) - G_X(s)}{h} = E\!\left[\frac{(s+h)^X - s^X}{h}\right].$$
Example. Find the mean and variance of a random variable $X$ with pgf $G_X(s) = e^{\lambda(s-1)}$.
$$G_X'(s) = \lambda e^{\lambda(s-1)},$$
and so $E(X) = G_X'(1) = \lambda$. Further,
$$G_X''(s) = \lambda^2 e^{\lambda(s-1)},$$
so $E[X(X-1)] = G_X''(1) = \lambda^2$ and $\operatorname{Var}(X) = \lambda^2 + \lambda - \lambda^2 = \lambda$.
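A quick numerical sanity check (a sketch, with $\lambda = 2.5$ chosen arbitrarily): for the Poisson pgf $G(s) = e^{\lambda(s-1)}$, both the mean $G'(1)$ and the variance $G''(1) + G'(1) - [G'(1)]^2$ come out equal to $\lambda$.

```python
import math

lam = 2.5                              # illustrative value
G1 = lam * math.exp(lam * (1 - 1))     # G'(1)  = lam
G2 = lam**2 * math.exp(lam * (1 - 1))  # G''(1) = lam^2

mean = G1
var = G2 + G1 - G1**2                  # = lam^2 + lam - lam^2 = lam
print(mean, var)  # 2.5 2.5
```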
Proof of (c).
$$G_Z(s) = E(s^Z) = E(s^{X+Y}) = E(s^X s^Y) = E(s^X)\, E(s^Y) = G_X(s)\, G_Y(s).$$
We are using here the fact that $X$ and $Y$ are independent, from which it follows that any function of $X$ is independent of any function of $Y$. In particular here $s^X$ and $s^Y$ are independent, and the expectation of the product of independent rvs is the product of their expectations. ∎
Let $X_1, X_2, \dots, X_n$ be mutually independent rvs, each with the same distribution and therefore the same pgf $G(s)$. Then their sum
$$S_n = X_1 + X_2 + \cdots + X_n$$
has pgf
$$G_{S_n}(s) = [G(s)]^n.$$
This follows from repeated application of the previous result.
Example. Let $X_1, X_2, \dots$ be a Bernoulli process, so that $P(X_i = 1) = p$ and $P(X_i = 0) = q = 1 - p$. Calculate the pgf of $S_n = X_1 + \cdots + X_n$, and hence its distribution.
By 3.1.3:
$$G_{S_n}(s) = (q + ps)^n.$$
Now we can expand this (Binomial expansion) to get
$$G_{S_n}(s) = \sum_{k=0}^{n} \binom{n}{k} (ps)^k q^{n-k}.$$
Reading off the coefficient of $s^k$, we see that $P(S_n = k) = \binom{n}{k} p^k q^{n-k}$, i.e. $S_n$ has a Binomial$(n, p)$ distribution.
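The expansion step can be mirrored in code (a sketch, with $n = 5$, $p = 0.4$ as illustrative values): raise $q + ps$ to the $n$th power by repeated polynomial multiplication and compare the coefficient of $s^k$ with the Binomial pmf.

```python
from math import comb

n, p = 5, 0.4
q = 1 - p

# Coefficients of (q + p s)^n, built by multiplying by (q + p s) n times
coeffs = [1.0]
for _ in range(n):
    nxt = [0.0] * (len(coeffs) + 1)
    for i, c in enumerate(coeffs):
        nxt[i] += c * q       # the q term keeps the power of s
        nxt[i + 1] += c * p   # the p s term raises the power of s by one
    coeffs = nxt

binom = [comb(n, k) * p**k * q ** (n - k) for k in range(n + 1)]
print(all(abs(a - b) < 1e-9 for a, b in zip(coeffs, binom)))  # True
```

The loop is just the pgf multiplication rule applied $n$ times, so the agreement is exactly the statement that the sum of $n$ iid Bernoulli rvs is Binomial$(n, p)$.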