The score function is the first derivative of the log-likelihood with respect to the parameter, $U(\theta) = \dfrac{d\ell(\theta)}{d\theta}$. The observed information is MINUS the second derivative of the log-likelihood, $J(\theta) = -\dfrac{d^2\ell(\theta)}{d\theta^2}$. It will always be positive when evaluated at the MLE, because the log-likelihood has a negative second derivative at a maximum.
DO NOT FORGET THE MINUS SIGN!
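As a minimal worked sketch of these definitions (a binomial example chosen purely for illustration, not from the notes above): with $y$ successes in $n$ trials and success probability $p$,

$$
\begin{aligned}
\ell(p) &= y \log p + (n - y)\log(1 - p) + \text{const}, \\
U(p) &= \frac{y}{p} - \frac{n - y}{1 - p}, \qquad U(\hat{p}) = 0 \;\Rightarrow\; \hat{p} = y/n, \\
J(p) &= \frac{y}{p^2} + \frac{n - y}{(1 - p)^2}, \qquad J(\hat{p}) = \frac{n}{\hat{p}(1 - \hat{p})} > 0.
\end{aligned}
$$

Note the minus sign at work: $J(p) = -\ell''(p)$, and at $\hat{p} = y/n$ it is strictly positive, as claimed.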
The likelihood function adjusts appropriately when more data become available, and the observed information does what its name suggests: it measures how much information the data carry about the parameter, through how sharply curved the log-likelihood is at its peak. Higher observed information leads to narrower confidence intervals. This is a good thing, as narrower confidence intervals mean we are more sure about where the true value lies.
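To see the link concretely, continuing the illustrative binomial sketch above: the observed information at the MLE grows linearly with the sample size, so the interval half-width shrinks like $1/\sqrt{n}$:

$$
J(\hat{p}) = \frac{n}{\hat{p}(1 - \hat{p})}
\quad\Rightarrow\quad
\frac{1.96}{\sqrt{J(\hat{p})}} = 1.96\sqrt{\frac{\hat{p}(1 - \hat{p})}{n}}.
$$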
For a continuous parameter of interest, $\theta$, the calculation of the MLE and its confidence interval follows these steps (a worked example is given after the list):

1. Write down the likelihood, $L(\theta)$.
2. Write down the log-likelihood, $\ell(\theta) = \log L(\theta)$.
3. Work out the score function, $U(\theta) = \dfrac{d\ell(\theta)}{d\theta}$.
4. Solve $U(\theta) = 0$ to get a candidate for the MLE, $\hat{\theta}$.
5. Work out $\dfrac{d^2\ell(\theta)}{d\theta^2}$. Check it is negative at the MLE candidate to verify it is a maximum.
6. Work out the observed information, $J(\hat{\theta}) = -\left.\dfrac{d^2\ell(\theta)}{d\theta^2}\right|_{\theta = \hat{\theta}}$.
7. Calculate the approximate confidence interval for $\theta$: $\hat{\theta} \pm z_{1-\alpha/2}\sqrt{1/J(\hat{\theta})}$, e.g. $\hat{\theta} \pm 1.96\sqrt{1/J(\hat{\theta})}$ for a 95% interval.
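A worked sketch of the seven steps, assuming i.i.d. Poisson data (an illustrative choice, not from the notes): let $X_1, \dots, X_n \sim \text{Poisson}(\lambda)$.

$$
\begin{aligned}
L(\lambda) &= \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}, \qquad
\ell(\lambda) = \Big(\textstyle\sum_i x_i\Big) \log \lambda - n\lambda - \textstyle\sum_i \log x_i!, \\
U(\lambda) &= \frac{\sum_i x_i}{\lambda} - n = 0 \;\Rightarrow\; \hat{\lambda} = \bar{x}, \\
\frac{d^2\ell}{d\lambda^2} &= -\frac{\sum_i x_i}{\lambda^2} < 0 \;\text{(a maximum)}, \qquad
J(\hat{\lambda}) = \frac{\sum_i x_i}{\hat{\lambda}^2} = \frac{n}{\bar{x}}, \\
\text{95\% CI:} \quad & \bar{x} \pm 1.96\sqrt{\bar{x}/n}.
\end{aligned}
$$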
Changing the data that your inference is based on will change the amount of information, and hence the subsequent inference (e.g. the confidence intervals).
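A minimal simulation sketch of this point, continuing the Poisson example above: as the sample grows, the observed information grows and the interval narrows. The seed and "true" rate are arbitrary illustration choices, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed, for reproducibility
true_lambda = 4.0                # hypothetical true rate; any positive value works

for n in (10, 100, 1000):
    x = rng.poisson(true_lambda, size=n)
    mle = x.mean()                          # Poisson MLE: lambda-hat = sample mean
    obs_info = n / mle                      # J(lambda-hat) = n / x-bar, from the worked example
    half_width = 1.96 / np.sqrt(obs_info)   # 95% CI half-width
    print(f"n={n:4d}  lambda_hat={mle:.3f}  J={obs_info:7.2f}  "
          f"95% CI=({mle - half_width:.3f}, {mle + half_width:.3f})")
```

Each tenfold increase in $n$ multiplies $J(\hat{\lambda})$ by roughly ten, so the interval width drops by a factor of about $\sqrt{10}$.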
A statistic $T = T(X)$ is said to be sufficient for a parameter $\theta$ if the distribution of the data $X$ is independent of $\theta$ when conditioning on $T$.
An equivalent, and easier-to-demonstrate, condition is the factorisation criterion: $T$ is sufficient if and only if the likelihood can be factorised in the form $L(\theta; x) = g(T(x), \theta)\, h(x)$.
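As an illustrative sketch of the factorisation (again using the Poisson example, not taken from the notes): for $X_1, \dots, X_n \sim \text{Poisson}(\lambda)$,

$$
L(\lambda; x) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}
= \underbrace{\lambda^{\sum_i x_i}\, e^{-n\lambda}}_{g(T(x),\,\lambda)} \cdot \underbrace{\frac{1}{\prod_i x_i!}}_{h(x)},
$$

so $T(X) = \sum_i X_i$ is sufficient for $\lambda$: once the total count is known, the rest of the data carries no further information about $\lambda$.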