
4.6 Chebychev’s inequality

A first step to understanding why expectation and variance matter is given by Chebychev’s inequality. Let R be any random variable. Suppose E(R) = m and Var(R) = σ². Let c > 0 be any constant: we will find a bound on the probability

P(|R-m|>cσ)

that R is more than c standard deviations away from its expected value.
Let A be the event that |R-m| > cσ, and let I_A be the indicator of A. Recall that

E(I_A) = 1×P(I_A=1) + 0×P(I_A=0) = P(A).
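The identity E(I_A) = P(A) can be checked directly on a small example. The following sketch uses a fair six-sided die (an illustrative choice, not from the text) and enumerates the outcomes in exact rational arithmetic:

```python
from fractions import Fraction

# Fair six-sided die (illustrative example): R is uniform on {1, ..., 6}.
outcomes = range(1, 7)
p = Fraction(1, 6)

# Event A: the roll is at least 5. The indicator I_A is 1 on A, 0 otherwise.
indicator = {r: 1 if r >= 5 else 0 for r in outcomes}

# E(I_A) = 1*P(I_A = 1) + 0*P(I_A = 0), computed by summing over outcomes.
expectation = sum(indicator[r] * p for r in outcomes)

# P(A), computed directly as the total probability of the outcomes in A.
prob_A = sum(p for r in outcomes if r >= 5)

print(expectation, prob_A)  # both equal 1/3
```

Both quantities come out to 1/3, as the identity predicts.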

Also define the function g(r) = (r-m)²/(cσ)², and notice that

g(r) ≥ 0

for all r, and

g(r) ≥ 1

whenever |r-m| > cσ; in particular, g(R) ≥ 1 whenever A occurs.

So if A does not occur, then I_A = 0 ≤ g(R).
And if A does occur, then I_A = 1 ≤ g(R).
So I_A ≤ g(R) and it follows that
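The pointwise domination I_A ≤ g(R) can be verified exhaustively on a small example. Here a fair die is used, with m = 7/2, σ² = 35/12 and c = 1 (values chosen for illustration, not from the text); the event A is tested in squared form, |r-m|² > c²σ², to avoid computing the irrational σ:

```python
from fractions import Fraction

# Illustration (not from the text): fair die, m = E(R) = 7/2, Var(R) = 35/12.
outcomes = range(1, 7)
m = Fraction(7, 2)
var = Fraction(35, 12)
c = 1  # compare against one standard deviation

ok = True
for r in outcomes:
    g = (r - m) ** 2 / (c ** 2 * var)              # g(r) = (r-m)^2 / (c*sigma)^2
    i_A = 1 if (r - m) ** 2 > c ** 2 * var else 0  # A occurs iff |r-m| > c*sigma
    ok = ok and (i_A <= g)                         # pointwise: I_A <= g(R)

print(ok)  # True: I_A never exceeds g on any outcome
```

On outcomes in A (here r = 1 and r = 6) the value of g exceeds 1, and elsewhere I_A is 0 while g is non-negative, so the domination holds everywhere.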

P(A) = E(I_A)
≤ E[g(R)]
= E[(R-m)²/(cσ)²]
= (1/(c²σ²)) E[(R-m)²]
= 1/c²,

where the last step uses E[(R-m)²] = Var(R) = σ².

We see that

P(|R-m| > cσ) ≤ 1/c².
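As a final sanity check, the bound can be verified exactly for a concrete random variable. The sketch below again uses a fair die (an illustrative choice, not from the text), computing the tail probability P(|R-m| > cσ) for several values of c and comparing it with 1/c²; the event is tested in squared form so everything stays in exact rational arithmetic:

```python
from fractions import Fraction

# Exact check of Chebychev's inequality for a fair die (illustrative choice,
# not from the text): m = 7/2, Var(R) = sigma^2 = 35/12.
outcomes = range(1, 7)
p = Fraction(1, 6)
m = Fraction(7, 2)
var = Fraction(35, 12)

tails = {}
for c in (Fraction(1), Fraction(3, 2), Fraction(2)):
    # P(|R - m| > c*sigma), with the event tested in squared form
    # ((r-m)^2 > c^2 * sigma^2) to avoid the irrational sigma.
    tails[c] = sum(p for r in outcomes if (r - m) ** 2 > c ** 2 * var)
    assert tails[c] <= 1 / c ** 2  # Chebychev's bound

print(tails)
```

For c = 1 the tail probability is 1/3, well under the (trivial) bound of 1; for c = 3/2 and c = 2 the tail is 0, since no outcome of a die lies that far from 7/2. The bound is often far from tight, but it holds with no assumptions beyond a finite variance.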