
5.3 Bernoulli random variables

Jacob Bernoulli (1654-1705) was a member of a family of whom as many as 12 contributed to some branch of mathematics or physics, and at least 5 wrote on probability. Jacob and his brother John were great rivals and would communicate only in print, arguing over the correctness of each other's mathematical proofs.

Consider an experiment where the sample space is {0,1} and the probability of outcome 1 is θ, where 0 ≤ θ ≤ 1. A random variable R with this distribution is termed a Bernoulli random variable. Examples include:

  • the number of heads on a single toss of a biased coin,

  • 1 if the next patient has cancer, 0 otherwise,

  • 1 if the next person smokes and is over 6 ft tall, 0 otherwise,

  • 1 if the next baby is a boy, 0 otherwise.

Here the outcomes 0 and 1 are sometimes called failure and success respectively. Note that each of these examples is an indicator function for the event concerned; indicator functions of events are always Bernoulli random variables.

For Bernoulli random variables

P(R=1)=θ, P(R=0)=1-θ,

and P(R=r)=0 otherwise.

The pmf of a Bernoulli random variable R is pR(0) = 1-θ, pR(1) = θ, and pR(r) = 0 otherwise. We say R ∼ Bernoulli(θ).
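This pmf is simple enough to write down directly. The following is a minimal Python sketch (not part of the original notes) of the Bernoulli pmf and of drawing a single realisation; the function names are illustrative, and only the standard library is assumed.

```python
import random

def bernoulli_pmf(r, theta):
    """pmf of R ~ Bernoulli(theta): p_R(1) = theta, p_R(0) = 1 - theta."""
    if r == 1:
        return theta
    if r == 0:
        return 1 - theta
    return 0.0  # p_R(r) = 0 otherwise

def bernoulli_sample(theta, rng=random.random):
    """Draw one realisation of R: 1 with probability theta, else 0."""
    return 1 if rng() < theta else 0

theta = 0.3
print(bernoulli_pmf(1, theta))  # theta itself, i.e. 0.3
print(bernoulli_sample(theta))  # 0 or 1, with P(1) = 0.3
```

Note that `bernoulli_sample` works because `random.random()` is uniform on [0, 1), so the event `rng() < theta` has probability exactly θ.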

Exercise 5.3.

Find the expectation and variance of a Bernoulli random variable.

Solution.
E[R] = Σr r pR(r)
= 0×(1-θ) + 1×θ
= θ.
E[R²] = Σr r² pR(r)
= 0²×(1-θ) + 1²×θ
= θ.
Var(R) = E[R²] - (E[R])²
= θ - θ² = θ(1-θ).
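The derivation above can be checked numerically by summing over the two support points {0, 1} directly. This Python sketch (an illustration, not part of the original notes) computes E[R] and Var(R) exactly for any θ:

```python
def bernoulli_moments(theta):
    """Return (E[R], Var(R)) for R ~ Bernoulli(theta) by summing over {0, 1}."""
    pmf = {0: 1 - theta, 1: theta}
    mean = sum(r * p for r, p in pmf.items())        # E[R] = 0*(1-theta) + 1*theta
    second = sum(r**2 * p for r, p in pmf.items())   # E[R^2], same sum since 1^2 = 1
    return mean, second - mean**2                    # Var(R) = E[R^2] - (E[R])^2

print(bernoulli_moments(0.5))  # (0.5, 0.25), matching theta and theta*(1 - theta)
```

Because the support is just {0, 1}, the sums are exact and agree with E[R] = θ and Var(R) = θ(1-θ) term by term.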

Notice that if θ=0 or θ=1 the variance is 0, whereas if θ=0.5 the variance is maximised. Is this logical? Yes: when θ=0 or θ=1 the outcome is certain, so there is no variability at all, while θ=0.5 makes the two outcomes equally likely and the result maximally unpredictable.
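This behaviour of the variance can be confirmed with a quick numerical check (an illustration, not part of the original notes) that evaluates θ(1-θ) at the endpoints and searches a fine grid for the maximiser:

```python
# Var(R) = theta * (1 - theta) for R ~ Bernoulli(theta).
variance = lambda theta: theta * (1 - theta)

print(variance(0.0), variance(1.0))  # both 0.0: a certain outcome has no spread
print(variance(0.5))                 # 0.25, the largest possible value

# Search a grid of 1001 equally spaced theta values for the maximiser.
grid = [i / 1000 for i in range(1001)]
best = max(grid, key=variance)
print(best)  # 0.5
```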