Recall that, formally, we say that two random variables $X$ and $Y$ are independent if the events $\{X \in A\}$ and $\{Y \in B\}$ are independent for all sets $A$ and $B$, i.e.
\[
P(X \in A,\, Y \in B) = P(X \in A)\,P(Y \in B)
\]
for all sets $A$ and $B$.
We have seen that when $X$ and $Y$ are both discrete, they are independent if and only if their joint pmf can be factorised as the product of the marginal pmfs.
Similarly, when $X$ and $Y$ are both continuous, they are independent if and only if their joint pdf can be factorised as the product of the marginal pdfs.
Two continuous random variables $X$ and $Y$ are independent if and only if
\[
f_{X,Y}(x,y) = f_X(x)\,f_Y(y) \quad \text{for all } x, y.
\]
If $X$ and $Y$ are independent then, whatever the values of $x$ and $y$, take $A = (-\infty, x]$ and $B = (-\infty, y]$. Then
\[
F_{X,Y}(x,y) = P(X \le x,\, Y \le y) = P(X \le x)\,P(Y \le y) = F_X(x)\,F_Y(y).
\]
This is true for all $x$ and $y$, and so we may differentiate both sides with respect to $x$ and $y$ to obtain
\[
f_{X,Y}(x,y) = f_X(x)\,f_Y(y).
\]
Conversely, if the joint pdf factorises we get, for arbitrary sets $A$ and $B$,
\[
P(X \in A,\, Y \in B) = \int_A \int_B f_{X,Y}(x,y)\,dy\,dx
= \int_A f_X(x)\,dx \int_B f_Y(y)\,dy
= P(X \in A)\,P(Y \in B).
\]
∎
Factorisation. To check for independence: if we have the joint pdf (or pmf), it is enough to check that it can be factorised as a function of $x$ times a function of $y$:
\[
f_{X,Y}(x,y) = g(x)\,h(y),
\]
and that the range of $X$ does not depend on $y$ (see CW question). We do not have to show that the functions $g$ and $h$ are themselves densities. Also, if the range of $X$ does not depend on $y$, then the range of $Y$ does not depend on $x$, so we only need to check one of the two possibilities.
If the range of $X$ does not depend on $y$ (and vice versa), we say that $X$ and $Y$ are variationally independent.
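As a small numerical sketch of the factorisation check (the density below is an assumed example, not one from the notes): take $f_{X,Y}(x,y) = 6e^{-2x-3y}$ for $x, y > 0$. It factorises as $g(x)h(y)$ with $g(x) = 6e^{-2x}$ and $h(y) = e^{-3y}$; note that neither $g$ nor $h$ is itself a density, which is allowed.

```python
import math

# Hypothetical joint pdf on x > 0, y > 0 (assumed example):
# f(x, y) = 6 exp(-2x - 3y).
def f(x, y):
    return 6.0 * math.exp(-2.0 * x - 3.0 * y)

# One possible factorisation f(x, y) = g(x) h(y); g integrates to 3
# and h to 1/3, so neither is a density -- that does not matter.
def g(x):
    return 6.0 * math.exp(-2.0 * x)

def h(y):
    return math.exp(-3.0 * y)

# Check the factorisation on a grid of points.
for x in [0.1, 0.5, 1.0, 2.0]:
    for y in [0.2, 0.7, 1.5]:
        assert abs(f(x, y) - g(x) * h(y)) < 1e-12
print("f(x, y) = g(x) h(y) at all grid points checked")
```

Since the range $x, y > 0$ is a product set, $X$ and $Y$ are also variationally independent here, so this density corresponds to independent variables.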
The figure below illustrates the joint density
\[
f_{X,Y}(x,y) \propto \mathbb{1}\{(x,y) \in R\},
\]
where the indicator function is one when $(x,y) \in R$ and zero otherwise, for four different regions $R$. In which cases are $X$ and $Y$ independent?
[Unnumbered figure: the four regions $R$, arranged top-left, top-right, bottom-left, bottom-right.]
Solution. Top-left: independent; top-right: not independent; bottom-left: not independent; bottom-right: independent.
Given a joint pdf, a standard way to prove independence is to show factorisation and variational independence. To disprove independence, a counterexample to either suffices. This is straightforward for variational independence, but disproving factorisation is less obvious. The following method is recommended. An alternative is to show that a conditional distribution is not the same as a marginal distribution, but that usually involves more work.
Two point method: note that $f_{X,Y}(x,y)$ can be factorised as a function of $x$ times a function of $y$ if and only if, for all $x_1$, $x_2$, $y_1$, $y_2$:
\[
f_{X,Y}(x_1,y_1)\,f_{X,Y}(x_2,y_2) = f_{X,Y}(x_1,y_2)\,f_{X,Y}(x_2,y_1),
\]
since, in the case of independence, both sides equal $f_X(x_1)\,f_X(x_2)\,f_Y(y_1)\,f_Y(y_2)$.
This is particularly useful for proving that a given joint pdf does not factorise as above: simply find $x_1$, $x_2$ and $y_1$, $y_2$ such that the two sides above are different.
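The two point method can be sketched numerically. As an assumed example (not one of the densities from the notes), take the uniform density on the triangle $\{x \ge 0,\, y \ge 0,\, x + y \le 1\}$, so $f_{X,Y}(x,y) = 2$ on the triangle and $0$ outside:

```python
# Uniform density on the triangle {x >= 0, y >= 0, x + y <= 1}
# (assumed example): f(x, y) = 2 inside, 0 outside.
def f(x, y):
    return 2.0 if (x >= 0 and y >= 0 and x + y <= 1) else 0.0

# Two point method: pick points so that one "diagonal" product is zero
# while the other is not.
x1, y1 = 0.1, 0.1   # (x1, y1) inside the triangle
x2, y2 = 0.8, 0.8   # x2 + y2 > 1, so f(x2, y2) = 0

lhs = f(x1, y1) * f(x2, y2)   # 2 * 0 = 0
rhs = f(x1, y2) * f(x2, y1)   # 2 * 2 = 4

assert lhs != rhs   # the pdf does not factorise, so X and Y are not independent
print(lhs, rhs)
```

Here the failure of factorisation reflects the fact that the range of $X$ depends on $y$; a single pair of points suffices to disprove independence.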
Are the following pairs of random variables $X$ and $Y$ independent?
(a) $f_{X,Y}(x,y) = \ldots$ for $\ldots$,
(b) $f_{X,Y}(x,y) = \ldots$ for $\ldots$,
(c) $f_{X,Y}(x,y) = \ldots$ for $\ldots$
Solution.
(a) Independent: variationally independent, and the joint density factorises as a function of $x$ times a function of $y$.
(b) Not independent: the density factorises BUT the range of $X$ depends on $y$.
(c) Not independent: variationally independent BUT with $x_1 = \ldots$, $x_2 = \ldots$ and $y_1 = \ldots$, $y_2 = \ldots$ we have
\[
f_{X,Y}(x_1,y_1)\,f_{X,Y}(x_2,y_2) \ne f_{X,Y}(x_1,y_2)\,f_{X,Y}(x_2,y_1).
\]
Note: given variational independence, we first try to factorise the joint density; when we cannot, we look for a counterexample via the two point method.
Fewer sets $A$ and $B$: as in the proof of Theorem 5.6.1, setting $A = (-\infty, x]$ and $B = (-\infty, y]$ shows that if $X$ and $Y$ are independent then $F_{X,Y}(x,y) = F_X(x)\,F_Y(y)$ for all $x$, $y$. It turns out (we will not prove this) that for any pair of random variables, whether discrete, continuous or more complicated, ‘$F_{X,Y}(x,y) = F_X(x)\,F_Y(y)$ for all $x$, $y$’ is equivalent to $X$ and $Y$ being independent (i.e. one need only consider a subset of the possible sets $A$ and $B$).
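The factorisation of the joint cdf can be checked empirically. The sketch below (with assumed distributions: $X \sim \mathrm{Uniform}(0,1)$ and $Y \sim N(0,1)$, independent by construction) compares the empirical joint cdf at one point with the product of the empirical marginal cdfs:

```python
import random

# Simulate a large independent sample (assumed example distributions).
random.seed(1)
n = 200_000
xs = [random.random() for _ in range(n)]          # X ~ Uniform(0, 1)
ys = [random.gauss(0.0, 1.0) for _ in range(n)]   # Y ~ N(0, 1), independent of X

# Empirical check of F_{X,Y}(x0, y0) = F_X(x0) F_Y(y0) at one point.
x0, y0 = 0.3, 0.5
joint = sum(1 for x, y in zip(xs, ys) if x <= x0 and y <= y0) / n
fx = sum(1 for x in xs if x <= x0) / n
fy = sum(1 for y in ys if y <= y0) / n

assert abs(joint - fx * fy) < 0.01   # agree up to Monte Carlo error
print(round(joint, 3), round(fx * fy, 3))
```

Of course a simulation can only support, never prove, independence; the point is that the cdf factorisation is easy to probe at a few points.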
Setting $A = (x, \infty)$ and $B = (y, \infty)$ shows that the independence of $X$ and $Y$ implies $P(X > x,\, Y > y) = P(X > x)\,P(Y > y)$ for all $x$, $y$; again, it can be shown that independence is equivalent to the factorisation of the survivor functions.
Let $X$ and $Y$ be independent exponential random variables with parameters $\lambda$ and $\mu$ respectively. Find $P(\min(X, Y) > t)$.
Solution. By independence, for $t \ge 0$, $P(X > t,\, Y > t) = P(X > t)\,P(Y > t) = e^{-\lambda t}\,e^{-\mu t}$. So $P(\min(X, Y) > t) = e^{-(\lambda + \mu)t}$, i.e. $\min(X, Y) \sim \mathrm{Exp}(\lambda + \mu)$.