Suppose we know the joint distribution of $(X, Y)$ but then we find out the value of one of the random variables. What can we say about the other random variable?
We consider the conditional distributions $X \mid Y = y$, i.e. the distribution of $X$ given that $Y = y$, and $Y \mid X = x$, i.e. the distribution of $Y$ given that $X = x$.
Recall that when $X$ and $Y$ were discrete random variables the conditional pmfs were
$$p_{X|Y}(x \mid y) = \frac{p_{X,Y}(x,y)}{p_Y(y)}, \qquad p_{Y|X}(y \mid x) = \frac{p_{X,Y}(x,y)}{p_X(x)}.$$
Similarly, when $X$ and $Y$ are continuous random variables the conditional pdfs are
$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}, \qquad f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x,y)}{f_X(x)}.$$
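These formulas are easy to check numerically. Below is a minimal sketch in Python (my own illustration; the joint pmf table is made up) showing how the conditional pmfs fall out of a joint pmf by dividing by the appropriate marginal.

```python
import numpy as np

# Hypothetical joint pmf p_{X,Y}(x, y): rows are x in {0, 1, 2},
# columns are y in {0, 1}; the entries are made up and sum to 1.
joint = np.array([[0.10, 0.20],
                  [0.15, 0.25],
                  [0.05, 0.25]])

p_X = joint.sum(axis=1)             # marginal pmf of X
p_Y = joint.sum(axis=0)             # marginal pmf of Y

# Conditional pmfs: divide the joint pmf by the appropriate marginal.
p_X_given_Y = joint / p_Y           # column y holds p_{X|Y}(. | y)
p_Y_given_X = joint / p_X[:, None]  # row x holds p_{Y|X}(. | x)

print(p_X_given_Y.sum(axis=0))      # [1. 1.]   -- each conditional pmf sums to 1
print(p_Y_given_X.sum(axis=1))      # [1. 1. 1.]
```

Column $y$ of `p_X_given_Y` holds the pmf of $X$ given $Y = y$; note that each conditional pmf sums to $1$, anticipating the normalization property below.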
Note that since we can only condition on possible values, we don't have to worry about zeros in the denominators: the marginal pmf/pdf has to be positive for the value to occur.
Also note that the conditional pdfs are themselves valid pdfs: they are non-negative and they integrate to $1$. For instance,
$$\int_{-\infty}^{\infty} f_{X|Y}(x \mid y)\,dx = \int_{-\infty}^{\infty} \frac{f_{X,Y}(x,y)}{f_Y(y)}\,dx = \frac{1}{f_Y(y)} \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx = \frac{f_Y(y)}{f_Y(y)} = 1.$$
Similarly, conditional pmfs sum to $1$.
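As a numerical sanity check (an illustration of my own, using the joint pdf $f_{X,Y}(x,y) = x + y$ on the unit square, which is a valid pdf), the conditional pdf of $X$ given $Y = y$ should integrate to $1$ for any fixed $y$:

```python
from scipy.integrate import quad

# Illustrative joint pdf f_{X,Y}(x, y) = x + y on the unit square.
def f_joint(x, y):
    return x + y

def f_Y(y):
    # Marginal pdf of Y: integrate the joint pdf over x.
    return quad(lambda x: f_joint(x, y), 0.0, 1.0)[0]

def f_X_given_Y(x, y):
    return f_joint(x, y) / f_Y(y)

for y in (0.1, 0.5, 0.9):
    total, _ = quad(lambda x: f_X_given_Y(x, y), 0.0, 1.0)
    print(y, total)   # total is 1.0 (up to quadrature error) for every y
```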
When $X$ and $Y$ are independent discrete RVs, then for all $x$, $y$, recall that $p_{X,Y}(x,y) = p_X(x)\,p_Y(y)$, so
$$p_{X|Y}(x \mid y) = \frac{p_{X,Y}(x,y)}{p_Y(y)} = \frac{p_X(x)\,p_Y(y)}{p_Y(y)} = p_X(x),$$
$$p_{Y|X}(y \mid x) = \frac{p_{X,Y}(x,y)}{p_X(x)} = \frac{p_X(x)\,p_Y(y)}{p_X(x)} = p_Y(y).$$
Similarly, if $X$ and $Y$ are independent continuous RVs, then for all $x$, $y$,
$$f_{X|Y}(x \mid y) = f_X(x),$$
$$f_{Y|X}(y \mid x) = f_Y(y).$$
These results conform with intuition: when $X$ and $Y$ are independent, knowing the value of $X$ should tell us nothing about $Y$, and vice versa.
The converse is also true: if the conditional distribution of $Y$ given $X = x$ does not depend on $x$ or, equivalently, the conditional distribution of $X$ given $Y = y$ does not depend on $y$, then $X$ and $Y$ are independent.
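A short simulation (my own sketch; the particular distributions are an arbitrary choice) illustrates the point: for independent $X$ and $Y$, conditioning on the value of $X$ leaves the distribution of $Y$ unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent draws; the distributions here are an arbitrary choice.
x = rng.exponential(1.0, n)   # X ~ Exp(1)
y = rng.standard_normal(n)    # Y ~ N(0, 1), independent of X

# Conditioning on X falling in a narrow bin should leave Y unchanged.
mask = (0.9 < x) & (x < 1.1)
print(y.mean(), y[mask].mean())   # both close to 0
print(y.std(), y[mask].std())     # both close to 1
```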
Example. A piece of string of unit length is tied at one end to a hook. The string is cut at a (uniform) random distance from the hook. The piece remaining is then cut again at a (uniform) random distance from the hook. Given that the remaining length tied to the hook has length $y$, find the pdf of the position of the first cut.
Solution. Model with $X \sim U(0,1)$ and $Y \mid X = x \sim U(0,x)$. We know $f_X(x) = 1$ for $0 < x < 1$, and $f_{Y|X}(y \mid x) = \frac{1}{x}$ for $0 < y < x$.
Now
$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}.$$
We know $f_{X,Y}(x,y) = f_X(x)\,f_{Y|X}(y \mid x) = \frac{1}{x}$ for $0 < y < x < 1$, and $f_{X,Y}(x,y) = 0$ otherwise. So we need $f_Y(y)$:
$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx = \int_y^1 \frac{1}{x}\,dx = -\ln y \qquad \text{for } 0 < y < 1.$$
Hence
$$f_{X|Y}(x \mid y) = \frac{1/x}{-\ln y} = -\frac{1}{x \ln y} \qquad \text{for } y < x < 1.$$
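As a check on this answer (a Monte Carlo sketch of my own, with $y_0 = 0.2$ chosen arbitrarily), we can simulate the two cuts, condition on $Y$ landing near $y_0$, and compare the empirical conditional mean of $X$ with the value $\int_{y_0}^1 x\, f_{X|Y}(x \mid y_0)\,dx = (1 - y_0)/(-\ln y_0)$ implied by the pdf just derived.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000

# Simulate the two cuts: X ~ U(0,1), then Y | X = x ~ U(0, x).
x = rng.uniform(0.0, 1.0, n)
y = x * rng.uniform(0.0, 1.0, n)

# Condition on Y landing near y0 and compare the empirical mean of X
# with E[X | Y = y0] = (1 - y0) / (-ln y0) from the derived pdf.
y0 = 0.2
mask = np.abs(y - y0) < 0.005
print(x[mask].mean())               # Monte Carlo estimate, ~0.497
print((1 - y0) / (-np.log(y0)))     # exact value from f_{X|Y}, ~0.497
```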
Example. Continuous random variables $X$ and $Y$ have joint pdf
Find
the conditional pdf of $X$ given $Y = y$,
.
Solution.
Since $f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}$, we need the marginal pdf $f_Y(y)$.
for Hence for
When we have so