Math330 Additional Exercises 2
This sheet gives additional exercises covering the whole course.
1.
Let $Y_1, \dots, Y_n$ be independent with distributions depending on a two-dimensional parameter $(\theta_1, \theta_2)$ and on known constants.
Calculate the log-likelihood $\ell(\theta_1, \theta_2)$. Hence calculate the MLE, $(\hat{\theta}_1, \hat{\theta}_2)$. Calculate also the Fisher information matrix $I(\theta_1, \theta_2)$. Give a condition on the known constants which leads to the orthogonality of $\theta_1$ and $\theta_2$. (2006 A2)
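For reference, a sketch of the general definitions this question draws on, stated for a generic two-parameter model rather than the specific model of the original question: writing $\theta = (\theta_1, \theta_2)^T$ and $f(y_i; \theta)$ for the density of $Y_i$,
$$\ell(\theta) = \sum_{i=1}^{n} \log f(y_i; \theta), \qquad I(\theta) = -E\left[ \frac{\partial^2 \ell(\theta)}{\partial \theta \, \partial \theta^T} \right].$$
The parameters $\theta_1$ and $\theta_2$ are orthogonal when the off-diagonal entry $[I(\theta)]_{12}$ vanishes, in which case $\hat{\theta}_1$ and $\hat{\theta}_2$ are asymptotically independent.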
2.
Suppose $X_1, \dots, X_n$ are IID Normal random variables with $\theta = (\mu, \sigma^2)$ unknown. Find
(a) the maximum likelihood estimator, $\hat{\theta}$, of $\theta$;
(b) the maximum likelihood estimator of ;
(c) the asymptotic variance matrix for $\hat{\theta}$ by using the observed information at $\hat{\theta}$;
(d) an approximate 95% confidence interval for .
(1997 A3)
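A sketch of the standard route through this question, assuming the usual parametrisation $\theta = (\mu, \sigma^2)$ (the symbols here are generic, not quoted from the original paper):
$$\ell(\mu, \sigma^2) = -\frac{n}{2} \log(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (x_i - \mu)^2,$$
and setting the score to zero gives $\hat{\mu} = \bar{x}$ and $\hat{\sigma}^2 = n^{-1} \sum_{i=1}^{n} (x_i - \bar{x})^2$. The observed information evaluated at $\hat{\theta}$ is diagonal with entries $n/\hat{\sigma}^2$ and $n/(2\hat{\sigma}^4)$, so, for example, an approximate 95% interval for $\mu$ takes the form $\hat{\mu} \pm 1.96 \sqrt{\hat{\sigma}^2 / n}$.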
3.
Consider the normal model
where . Find an expression for the maximum likelihood estimator and calculate its asymptotic distribution.
Particular interest lies in the parameter . Calculate the corresponding maximum likelihood estimator and its asymptotic distribution. (2006 B3 part)
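The general result such asymptotic-distribution questions appeal to, recorded here for reference (the question's own model must be substituted in): under regularity conditions, for large $n$,
$$\hat{\theta} \;\dot\sim\; N\left( \theta, \, I_E(\theta)^{-1} \right),$$
where $I_E(\theta)$ is the expected information matrix; and by invariance the MLE of any function $\psi = g(\theta)$ is $\hat{\psi} = g(\hat{\theta})$.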
4.
Assume that $X_1, \dots, X_n$ are independent random variables with a Normal($\mu$, $\sigma^2$) distribution. It is well known that, in this case, the maximum likelihood estimator of $(\mu, \sigma^2)$ is given by
$$\hat{\mu} = \bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$$
and
$$\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (X_i - \bar{X})^2.$$
Furthermore the expected information matrix is
$$I_E(\mu, \sigma^2) = \begin{pmatrix} n/\sigma^2 & 0 \\ 0 & n/(2\sigma^4) \end{pmatrix}.$$
(a) Find the asymptotic distribution of the maximum likelihood estimator of $(\mu, \sigma^2)$.
(b) Briefly comment on the implications that the diagonality of the information matrix has for parameter estimation.
(c) Give the maximum likelihood estimator of .
(d) Find the asymptotic distribution of the maximum likelihood estimator of , stating the general result used.
(2005 A3)
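The "general result" in part (d) is presumably the delta method; a sketch for a generic differentiable scalar function $g$ (an assumption here, not quoted from the paper): if $\hat{\theta} \;\dot\sim\; N(\theta, I_E(\theta)^{-1})$, then
$$g(\hat{\theta}) \;\dot\sim\; N\left( g(\theta), \, \nabla g(\theta)^T \, I_E(\theta)^{-1} \, \nabla g(\theta) \right).$$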
5.
Suppose $y_1, \dots, y_n$ are observations independently drawn from
where , and $x_i$ and $z_i$ are two known covariates associated with observation $i$.
(a) Explain why is a more suitable specification for than .
(b) Write down the log-likelihood function, $\ell(\theta)$.
(c) Show that the inverse of the expected information matrix is given by
where
Note: You may use the following result
(d) Find the asymptotic distribution of the maximum likelihood estimator of $\mu_i$, where $\mu_i$ is the expected value of $y_i$. Quote all results used.
(2004 B1)
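One standard identity often quoted when inverting an expected information matrix, possibly the result intended in part (c) (an assumption, recorded here for reference), is the inverse of a $2 \times 2$ matrix:
$$\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}, \qquad ad - bc \neq 0.$$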