
13.4 Summary

  1. The score function is the first derivative of the log-likelihood. The observed information is MINUS the second derivative of the log-likelihood. It will always be positive when evaluated at the MLE. (See the worked example after this summary.)

    DO NOT FORGET THE MINUS SIGN!

  2. The likelihood function adjusts appropriately when more information becomes available. Observed information does what it says: it quantifies how much information the observed data carry about the parameter. Higher observed information leads to narrower confidence intervals. This is a good thing, as narrower confidence intervals mean we are more sure about where the true value lies.

    For a continuous parameter of interest, θ, the calculation of the MLE and its confidence interval follows these steps (a numerical sketch follows the summary):

    1. Write down the likelihood, L(θ).

    2. Write down the log-likelihood, l(θ).

    3. Work out the score function, S(θ) = l′(θ).

    4. Solve S(θ̂) = 0 to get a candidate for the MLE, θ̂.

    5. Work out l′′(θ). Check it is negative at the MLE candidate to verify it is a maximum.

    6. Work out the observed information, I_O(θ̂) = −l′′(θ̂).

    7. Calculate the confidence interval for θ_true:

       (θ̂ − 1.96/√(I_O(θ̂)), θ̂ + 1.96/√(I_O(θ̂))).
  3. Changing the data that your inference is based on will change the amount of information, and hence the subsequent inference (e.g. confidence intervals).

  4. A statistic T(𝐱) is said to be sufficient for a parameter θ if the conditional distribution of 𝐱 given T(𝐱) does not depend on θ.

  5. An equivalent, and easier to demonstrate, condition is the factorisation criterion: T(𝐱) is sufficient if and only if the likelihood can be factorised in the form L(θ) = g(𝐱) × h(T(𝐱), θ). (A worked example is given below.)
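
To make the seven-step recipe concrete, here is a standard worked example (not part of the summary itself), assuming an exponential model: suppose x₁, …, xₙ are independent observations from an Exponential(θ) distribution with density f(x) = θe^(−θx). Then:

  1. L(θ) = ∏ᵢ θe^(−θxᵢ) = θⁿ e^(−θΣxᵢ).

  2. l(θ) = n log θ − θ Σxᵢ.

  3. S(θ) = l′(θ) = n/θ − Σxᵢ.

  4. S(θ̂) = 0 gives θ̂ = n/Σxᵢ = 1/x̄.

  5. l′′(θ) = −n/θ² < 0 for all θ, so θ̂ is indeed a maximum.

  6. I_O(θ̂) = −l′′(θ̂) = n/θ̂², which is positive, as expected.

  7. The 95% confidence interval for θ_true is θ̂ ± 1.96/√(n/θ̂²) = θ̂ ± 1.96 θ̂/√n.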
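
The same recipe can also be checked numerically. The following is a minimal Python sketch under the same exponential model; the simulated data, seed and variable names are illustrative only, not from the notes.

  import numpy as np

  rng = np.random.default_rng(seed=1)
  theta_true = 2.0                              # illustrative "true" rate
  x = rng.exponential(scale=1/theta_true, size=100)

  def log_lik(theta):
      # l(theta) = n*log(theta) - theta*sum(x) for the exponential model
      return x.size * np.log(theta) - theta * x.sum()

  # Step 4: solving S(theta) = n/theta - sum(x) = 0 gives the MLE in closed form.
  theta_hat = x.size / x.sum()                  # equals 1 / mean(x)

  # Step 5: finite-difference check that l''(theta_hat) < 0, i.e. a maximum.
  h = 1e-5
  d2 = (log_lik(theta_hat + h) - 2*log_lik(theta_hat) + log_lik(theta_hat - h)) / h**2
  assert d2 < 0

  # Step 6: observed information I_O(theta_hat) = -l''(theta_hat) = n/theta_hat^2.
  obs_info = x.size / theta_hat**2

  # Step 7: 95% confidence interval theta_hat -/+ 1.96/sqrt(I_O(theta_hat)).
  se = 1 / np.sqrt(obs_info)
  print(f"MLE: {theta_hat:.3f}  95% CI: ({theta_hat - 1.96*se:.3f}, "
        f"{theta_hat + 1.96*se:.3f})")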
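
Finally, a standard illustration of the factorisation criterion, assuming n independent Bernoulli(θ) trials x₁, …, xₙ. The likelihood is

  L(θ) = ∏ᵢ θ^(xᵢ) (1−θ)^(1−xᵢ) = θ^(Σxᵢ) (1−θ)^(n−Σxᵢ),

which has the form g(𝐱) × h(T(𝐱), θ) with g(𝐱) = 1, T(𝐱) = Σxᵢ and h(t, θ) = θᵗ (1−θ)^(n−t). Hence the total number of successes, T(𝐱) = Σxᵢ, is sufficient for θ: given this total, the data carry no further information about θ.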