6 Expectation (II)


6.1 Bivariate Expectations

We know how to obtain expectations for univariate random variables. The definition extends easily to bivariate random variables. The expectation of any function g(X,Y) is given by:

Discrete random variables:
\[ \mathsf{E}[g(X,Y)] = \sum_{s=-\infty}^{\infty} \sum_{t=-\infty}^{\infty} g(s,t)\, p_{XY}(s,t); \]
Continuous random variables:
\[ \mathsf{E}[g(X,Y)] = \int_{s=-\infty}^{\infty} \int_{t=-\infty}^{\infty} g(s,t)\, f_{XY}(s,t)\, \mathrm{d}t\, \mathrm{d}s. \]

In the rest of this section, results are given for the continuous case only; however, they extend immediately to discrete random variables.
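As an illustrative sketch of the discrete definition, the double sum can be evaluated directly from a joint pmf table. The pmf below is a made-up example (not one from the notes), with $g(s,t) = st$:

```python
import numpy as np

# A small, hypothetical joint pmf: X takes values 0, 1 and Y takes
# values 0, 1, 2; p_xy[i, j] = P(X = x[i], Y = y[j]).
x = np.array([0.0, 1.0])
y = np.array([0.0, 1.0, 2.0])
p_xy = np.array([[0.1, 0.2, 0.1],
                 [0.2, 0.3, 0.1]])  # entries sum to 1

# E[g(X, Y)] = sum_s sum_t g(s, t) p_XY(s, t), here with g(s, t) = s * t.
g = np.outer(x, y)        # g evaluated on the grid of (s, t) pairs
e_g = np.sum(g * p_xy)
print(e_g)                # 0.0*... + 1*0.3 + 2*0.1 = 0.5
```

The same pattern works for any $g$: evaluate it on the support grid, multiply by the pmf, and sum.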

Moments of either variable alone can be obtained from the joint distribution or from the relevant marginal.

\[ \mathsf{E}[X] = \int_{s=-\infty}^{\infty} \int_{t=-\infty}^{\infty} s\, f_{XY}(s,t)\, \mathrm{d}t\, \mathrm{d}s = \int_{-\infty}^{\infty} s \left\{ \int_{-\infty}^{\infty} f_{XY}(s,t)\, \mathrm{d}t \right\} \mathrm{d}s = \int_{-\infty}^{\infty} s\, f_{X}(s)\, \mathrm{d}s, \]

and, more generally, for a function g,

\[ \mathsf{E}[g(X)] = \int_{s=-\infty}^{\infty} \int_{t=-\infty}^{\infty} g(s)\, f_{XY}(s,t)\, \mathrm{d}t\, \mathrm{d}s = \int_{-\infty}^{\infty} g(s)\, f_{X}(s)\, \mathrm{d}s. \]

Similarly, for $Y$ and any function $h$ (including $h(Y) = Y$),

\[ \mathsf{E}[h(Y)] = \int_{t=-\infty}^{\infty} \int_{s=-\infty}^{\infty} h(t)\, f_{XY}(s,t)\, \mathrm{d}s\, \mathrm{d}t = \int_{-\infty}^{\infty} h(t)\, f_{Y}(t)\, \mathrm{d}t. \]

Using linearity of integrals we also have for any functions g and h

\begin{align*}
\mathsf{E}[g(X)+h(Y)] &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} [g(s)+h(t)]\, f_{XY}(s,t)\, \mathrm{d}t\, \mathrm{d}s \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(s)\, f_{XY}(s,t)\, \mathrm{d}t\, \mathrm{d}s + \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(t)\, f_{XY}(s,t)\, \mathrm{d}t\, \mathrm{d}s \\
&= \mathsf{E}[g(X)] + \mathsf{E}[h(Y)].
\end{align*}

In particular

\[ \mathsf{E}[X+Y] = \mathsf{E}[X] + \mathsf{E}[Y], \]

regardless of the joint distribution of (X,Y).
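A Monte Carlo sketch makes the "regardless of dependence" point concrete. Here $Y = X^2$ is a hypothetical choice that makes $Y$ completely dependent on $X$; the sample estimates of $\mathsf{E}[X+Y]$ and $\mathsf{E}[X]+\mathsf{E}[Y]$ still agree, because the identity holds sample by sample, mirroring the linearity of the integral:

```python
import numpy as np

# Linearity of expectation holds even under strong dependence.
# Hypothetical choice: X ~ N(0, 1) and Y = X**2, so Y is a function of X.
rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = x**2

lhs = np.mean(x + y)           # estimate of E[X + Y]
rhs = np.mean(x) + np.mean(y)  # estimate of E[X] + E[Y]
print(lhs, rhs)                # both close to E[X] + E[X^2] = 0 + 1 = 1
```

The agreement here is exact up to floating-point error, since averaging is itself linear; the sampling error only affects how close both numbers are to the true value $1$.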

If X and Y are independent we also have for any functions g and h

\begin{align*}
\mathsf{E}[g(X)h(Y)] &= \int_{s=-\infty}^{\infty} \int_{t=-\infty}^{\infty} g(s)h(t)\, f_{XY}(s,t)\, \mathrm{d}t\, \mathrm{d}s \\
&= \int_{s=-\infty}^{\infty} \int_{t=-\infty}^{\infty} g(s)h(t)\, f_{X}(s)\, f_{Y}(t)\, \mathrm{d}t\, \mathrm{d}s \\
&= \int_{s=-\infty}^{\infty} g(s)\, f_{X}(s) \left\{ \int_{t=-\infty}^{\infty} h(t)\, f_{Y}(t)\, \mathrm{d}t \right\} \mathrm{d}s \\
&= \left\{ \int_{-\infty}^{\infty} g(s)\, f_{X}(s)\, \mathrm{d}s \right\} \left\{ \int_{-\infty}^{\infty} h(t)\, f_{Y}(t)\, \mathrm{d}t \right\} \\
&= \mathsf{E}[g(X)]\, \mathsf{E}[h(Y)].
\end{align*}

In particular, if X and Y are independent, then

\[ \mathsf{E}[XY] = \mathsf{E}[X]\, \mathsf{E}[Y]. \]
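The product rule can be checked by simulation. With the arbitrary (hypothetical) choice of independent $X \sim \mathrm{Exp}(1)$ and $Y \sim U(0,1)$, we have $\mathsf{E}[X]\,\mathsf{E}[Y] = 1 \times 1/2 = 1/2$, and the sample estimate of $\mathsf{E}[XY]$ should agree up to Monte Carlo error:

```python
import numpy as np

# Independent draws: X ~ Exp(1), Y ~ Uniform(0, 1) (arbitrary choices).
rng = np.random.default_rng(1)
x = rng.exponential(size=1_000_000)
y = rng.uniform(size=1_000_000)

e_xy = np.mean(x * y)              # estimate of E[XY]
e_x_e_y = np.mean(x) * np.mean(y)  # estimate of E[X] E[Y]
print(e_xy, e_x_e_y)               # both close to 1/2
```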

First, note that for dependent random variables $\mathsf{E}[XY] \neq \mathsf{E}[X]\,\mathsf{E}[Y]$ in general. For example, setting $Y = X$ gives

\[ \mathsf{E}[XY] = \mathsf{E}[X^2] \neq \mathsf{E}[X]^2 = \mathsf{E}[X]\,\mathsf{E}[Y], \]

the difference between the two being $\mathsf{Var}[X]$.
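This gap can be seen numerically. Taking the perfectly dependent case $Y = X$ with the hypothetical choice $X \sim N(2,1)$ (so $\mathsf{Var}[X] = 1$), the sample version of $\mathsf{E}[XY] - \mathsf{E}[X]\,\mathsf{E}[Y]$ lands near $1$:

```python
import numpy as np

# Dependent case Y = X: the gap E[XY] - E[X]E[Y] equals Var[X].
# Hypothetical choice: X ~ N(2, 1), so Var[X] = 1.
rng = np.random.default_rng(2)
x = rng.normal(loc=2.0, size=1_000_000)
y = x                                # perfectly dependent

gap = np.mean(x * y) - np.mean(x) * np.mean(y)
print(gap)                           # close to Var[X] = 1
```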

More subtly, even when 𝖤[XY]=𝖤[X]𝖤[Y], X and Y need not be independent.

Example 6.1.1.

Let $X \sim N(0,1)$ and $Y = X^2 - 1$. Find $\mathsf{E}[XY]$ and $\mathsf{E}[X]\,\mathsf{E}[Y]$.

Solution.  $\mathsf{E}[X] = 0$, so $\mathsf{E}[X]\,\mathsf{E}[Y] = 0$. Also

\[ \mathsf{E}[XY] = \mathsf{E}[X^3 - X] = \mathsf{E}[X^3] - \mathsf{E}[X] = 0 - 0 = 0, \]

since $\mathsf{E}[X^r] = 0$ for any odd integer $r$. So $\mathsf{E}[XY] = \mathsf{E}[X]\,\mathsf{E}[Y] = 0$.

The joint distribution of $(X,Y)$ is illustrated in Figure 6.1. Clearly the variables $X$ and $Y$ are strongly related: given $X$ we know $Y$ exactly.

Figure 6.1: 1000 realisations of $(X,Y)$, where $X \sim N(0,1)$ and $Y = X^2 - 1$. $X$ and $Y$ are uncorrelated ($\rho = 0$) but not independent.
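A short simulation in the spirit of Figure 6.1 confirms the point: with a large sample (here $10^6$ draws, an arbitrary choice), both the estimate of $\mathsf{E}[XY]$ and the sample correlation sit near zero, even though $Y$ is a deterministic function of $X$:

```python
import numpy as np

# X ~ N(0, 1) and Y = X**2 - 1: uncorrelated but fully dependent.
rng = np.random.default_rng(3)
x = rng.normal(size=1_000_000)
y = x**2 - 1

e_xy = np.mean(x * y)          # estimate of E[XY]; true value is 0
rho = np.corrcoef(x, y)[0, 1]  # sample correlation; true value is 0
print(e_xy, rho)
```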
Example 6.1.2.

Find the expected value of X-Y if 𝖤[X]=𝖤[Y]. Does this result depend on other features of the joint distribution of (X,Y)?

Solution.  $\mathsf{E}[X-Y] = \mathsf{E}[X] + \mathsf{E}[-Y] = \mathsf{E}[X] - \mathsf{E}[Y] = 0$. No other assumptions about the joint distribution are needed.

Example 6.1.3.

The random variables (X,Y) have joint pdf

\[ f_{XY}(x,y) = \begin{cases} 1/2 & 0 < x < y,\ 0 < y < 2, \\ 0 & \text{otherwise.} \end{cases} \]

Find $\mathsf{E}[X]$, $\mathsf{E}[Y]$ and $\mathsf{E}[XY]$. Does $\mathsf{E}[XY] = \mathsf{E}[X]\,\mathsf{E}[Y]$?

Solution. 


\begin{align*}
\mathsf{E}[X] &= \int_{s=0}^{2} \int_{t=s}^{2} s \cdot \frac{1}{2}\, \mathrm{d}t\, \mathrm{d}s \\
&= \frac{1}{2} \int_{s=0}^{2} (2s - s^2)\, \mathrm{d}s \\
&= \frac{1}{2} \left[ s^2 - \frac{s^3}{3} \right]_{0}^{2} \\
&= 2/3, \\
\mathsf{E}[Y] &= \int_{s=0}^{2} \int_{t=s}^{2} t \cdot \frac{1}{2}\, \mathrm{d}t\, \mathrm{d}s \\
&= \frac{1}{2} \int_{s=0}^{2} \frac{2^2 - s^2}{2}\, \mathrm{d}s \\
&= \frac{1}{2} \left[ 2s - \frac{s^3}{6} \right]_{0}^{2} \\
&= 4/3, \\
\mathsf{E}[XY] &= \int_{s=0}^{2} \int_{t=s}^{2} st \cdot \frac{1}{2}\, \mathrm{d}t\, \mathrm{d}s \\
&= \int_{s=0}^{2} \left[ \frac{s t^2}{4} \right]_{t=s}^{2} \mathrm{d}s \\
&= \int_{s=0}^{2} \left( s - \frac{s^3}{4} \right) \mathrm{d}s \\
&= \left[ \frac{s^2}{2} - \frac{s^4}{16} \right]_{s=0}^{2} \\
&= 1 \neq \mathsf{E}[X]\,\mathsf{E}[Y] = 8/9.
\end{align*}
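These integrals can be sanity-checked by simulation. Since $f_{XY} = 1/2$ on the triangle $0 < x < y < 2$ is the uniform density there, one way (among others) to sample from it is to take the minimum and maximum of two independent $U(0,2)$ draws:

```python
import numpy as np

# Sample uniformly from the triangle 0 < x < y < 2 (density 1/2) by
# ordering two independent Uniform(0, 2) draws.
rng = np.random.default_rng(4)
u = rng.uniform(0.0, 2.0, size=(1_000_000, 2))
x, y = u.min(axis=1), u.max(axis=1)

ex, ey, exy = np.mean(x), np.mean(y), np.mean(x * y)
print(ex, ey, exy)
# estimates of E[X] = 2/3, E[Y] = 4/3 and E[XY] = 1 (vs E[X]E[Y] = 8/9)
```

Note that $xy = u_1 u_2$ regardless of the ordering, which is a quick way to see why $\mathsf{E}[XY] = 1$ while $\mathsf{E}[X]\,\mathsf{E}[Y] = 8/9$ differs.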