4 Markov chains

4.1 Introduction to Markov chains

Example 4.1.1 (Rainfall in Tel-Aviv (following Cox and Miller)).

A very simple model for the observed sequence of wet and dry days is to assume that:

if today is dry, tomorrow will be wet with probability a=0.25, and

if today is wet, tomorrow will be dry with probability b=0.4,

each of these being irrespective of the pattern of weather on previous days.

Consider a SP {X_t} with value 1 corresponding to a dry day and 2 to a wet day, and suppose today, day 0, is dry, so that X_0 = 1. Then the predicted distribution of X_1 may be written as a vector of values

\[
\bigl(P(X_1=1 \mid X_0=1),\; P(X_1=2 \mid X_0=1)\bigr) = (1-a,\; a) = (0.75,\; 0.25) = \pi_1^T, \text{ say}.
\]

Notice that \pi_1 is actually a conditional distribution: its entries are conditioned on X_0 = 1. What about the day after? By the theorem of total probability,

\begin{align*}
P(X_2=1 \mid X_0=1) ={}& P(X_2=1 \mid X_1=1, X_0=1)\, P(X_1=1 \mid X_0=1) \\
&+ P(X_2=1 \mid X_1=2, X_0=1)\, P(X_1=2 \mid X_0=1),
\end{align*}

which we can simplify, because the weather on day 2 is influenced only by that on day 1 but not that on day 0, to give

\begin{align*}
P(X_2=1 \mid X_0=1) ={}& P(X_2=1 \mid X_1=1)\, P(X_1=1 \mid X_0=1) \\
&+ P(X_2=1 \mid X_1=2)\, P(X_1=2 \mid X_0=1).
\end{align*}

A similar formula holds for P(X2=2|X0=1), and we can express these together as:

\begin{align*}
\bigl(P(X_2=1 \mid X_0=1),\; P(X_2=2 \mid X_0=1)\bigr)
&= (0.75,\; 0.25) \begin{pmatrix} 0.75 & 0.25 \\ 0.4 & 0.6 \end{pmatrix} \\
&= (0.6625,\; 0.3375) = \pi_2^T, \text{ say}.
\end{align*}

A similar argument gives

\begin{align*}
\pi_3^T &= \bigl(P(X_3=1 \mid X_0=1),\; P(X_3=2 \mid X_0=1)\bigr) \\
&= (0.6625,\; 0.3375) \begin{pmatrix} 0.75 & 0.25 \\ 0.4 & 0.6 \end{pmatrix} \\
&= (0.6319,\; 0.3681),
\end{align*}

and so on,

\[
\pi_{n+1}^T = \pi_n^T \begin{pmatrix} 0.75 & 0.25 \\ 0.4 & 0.6 \end{pmatrix}.
\]

Some of the further vectors are:

\[
\pi_4^T = (0.6212,\; 0.3788), \qquad \pi_5^T = (0.6174,\; 0.3826), \qquad \pi_9^T = (0.6154,\; 0.3846),
\]

after which no change takes place to 4 decimals.
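The recursion \pi_{n+1}^T = \pi_n^T P is easy to reproduce numerically. A minimal sketch in plain Python (no external libraries; the helper name `step` is mine):

```python
# Transition matrix for the rainfall chain: state 1 = dry, state 2 = wet.
P = [[0.75, 0.25],
     [0.40, 0.60]]

def step(pi, P):
    """One application of the recursion pi_{n+1}^T = pi_n^T P."""
    return [sum(pi[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P[0]))]

pi = [1.0, 0.0]  # day 0 is dry, so X_0 = 1 with certainty
for n in range(1, 10):
    pi = step(pi, P)
    print(n, [round(x, 4) for x in pi])
```

To 4 decimals the iterates settle at (0.6154, 0.3846); this agrees with the standard result that a two-state chain with switch probabilities a and b has limiting dry-day probability b/(a+b) = 0.4/0.65 = 8/13 ≈ 0.6154.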

Definition 4.1.2.

A discrete Markov chain (MC) is a SP {X_t} for which

\[
P(X_t=x_t \mid X_{t-1}=x_{t-1}, X_{t-2}=x_{t-2}, \ldots, X_1=x_1) = P(X_t=x_t \mid X_{t-1}=x_{t-1})
\]

whatever the values of x_{t-1}, x_{t-2}, \ldots, x_1.

Definition 4.1.3.

The transition probability matrix (TPM) Pt at time t of a MC is defined by its elements to be:

\[
(P_t)_{i,j} = P(X_{t+1}=j \mid X_t=i),
\]

the probabilities of transition to state j at time t+1 from state i at time t.

Remark.

Row i of Pt is the conditional pmf of Xt+1 given Xt=i, so sums to 1. We say that a matrix is stochastic if every row is a pmf.
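This property is easy to test programmatically when setting up a TPM by hand; a small Python sketch (the helper name `is_stochastic` is mine):

```python
def is_stochastic(P, tol=1e-9):
    """Check that every row of P is a pmf: non-negative entries summing to 1."""
    return all(min(row) >= -tol and abs(sum(row) - 1.0) <= tol for row in P)

print(is_stochastic([[0.75, 0.25], [0.4, 0.6]]))  # the rainfall TPM: True
```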

Markov chains appear in many practical applications from economics to genetics.

Definition 4.1.4.

A MC is homogeneous (in time) if P_t = P does not depend on time t.

We shall henceforth assume that all Markov chains we consider are homogeneous unless otherwise stated.

Exercise 4.1.5.

Professor Urmintrude lives in Galgate, and walks to and from work each day. She possesses one umbrella. If it is raining in the morning, and the umbrella is at home, she will take it to work. If it is not raining she does not take her umbrella, whether or not it is at home. Similarly on her return journey, she takes the umbrella if, and only if, it is raining and the umbrella is at work.

Every morning and evening it rains independently with probability p.

Calculate the transition matrix for the Markov chain {X_t} that describes whether the umbrella is at home just before she leaves for work on the morning of day t.

For each day there are two states for the umbrella: Home and Work, which we denote 1 and 2 respectively.

\begin{align*}
P(X_t=1 \mid X_{t-1}=1) &= (1-p) + p^2 && \text{(no rain in morning) + (rain in morning and evening),} \\
P(X_t=1 \mid X_{t-1}=2) &= p && \text{(rain in the evening).}
\end{align*}

Since the transition probability does not depend on t, this is a homogeneous chain with

\[
P = \begin{pmatrix} 1-p+p^2 & p-p^2 \\ p & 1-p \end{pmatrix}.
\]
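The entries can be double-checked by brute force: enumerate the four independent (rain in morning, rain in evening) outcomes, track where the umbrella ends up, and accumulate the probabilities. A sketch in Python (the function name `umbrella_tpm` is mine):

```python
from itertools import product

def umbrella_tpm(p):
    """Build the 2x2 TPM (states: 1 = home, 2 = work) by enumerating
    the independent rain events on the morning and evening journeys."""
    P = [[0.0, 0.0], [0.0, 0.0]]
    for state in (1, 2):
        for rain_am, rain_pm in product([True, False], repeat=2):
            prob = (p if rain_am else 1 - p) * (p if rain_pm else 1 - p)
            at_home = (state == 1)
            # morning: she takes the umbrella iff it rains and it is at home
            if rain_am and at_home:
                at_home = False
            # evening: she takes it back iff it rains and it is at work
            if rain_pm and not at_home:
                at_home = True
            new_state = 1 if at_home else 2
            P[state - 1][new_state - 1] += prob
    return P

print(umbrella_tpm(0.3))  # should match [[1-p+p^2, p-p^2], [p, 1-p]] at p = 0.3
```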
Exercise 4.1.6 (Simulation).

Use a coin to simulate realisations for each of the following three Markov chains. Record your realisation of the state of the Markov chain for 10 to 20 time steps in each case.

  • (i)
    \[ P = \begin{pmatrix} 0 & 0.5 & 0 & 0.5 \\ 0.5 & 0 & 0.5 & 0 \\ 0 & 0.5 & 0 & 0.5 \\ 0.5 & 0 & 0.5 & 0 \end{pmatrix} \]
  • (ii)
    \[ P = \begin{pmatrix} 0 & 0.5 & 0.5 & 0 \\ 0 & 0 & 0.5 & 0.5 \\ 0.5 & 0 & 0 & 0.5 \\ 0.5 & 0.5 & 0 & 0 \end{pmatrix} \]
  • (iii)
    \[ P = \begin{pmatrix} 0.5 & 0 & 0.5 & 0 \\ 0 & 0.5 & 0 & 0.5 \\ 0.5 & 0 & 0.5 & 0 \\ 0 & 0.5 & 0 & 0.5 \end{pmatrix} \]

Note that for each chain there are four states, and you can choose the initial state by tossing a fair coin twice and, for example, choosing 1 if HH appears, 2 if HT appears, etc.
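If a physical coin is not to hand, the same experiment can be sketched in Python. Since every positive entry in these matrices is 0.5, each step is a single fair choice between two candidate states (`random.choice` stands in for the coin; the function name `simulate` is mine):

```python
import random

# Matrix (i) from above; swap in (ii) or (iii) to try the other chains.
P = [[0.0, 0.5, 0.0, 0.5],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.5, 0.0, 0.5, 0.0]]

def simulate(P, steps, rng=random):
    """Simulate the chain: each row has exactly two entries of 0.5, so each
    step is a fair coin flip between the two reachable states."""
    state = rng.randrange(4)          # two coin flips: uniform initial state
    path = [state + 1]                # report states as 1..4
    for _ in range(steps):
        options = [j for j, q in enumerate(P[state]) if q > 0]
        state = rng.choice(options)   # heads/tails between the two candidates
        path.append(state + 1)
    return path

print(simulate(P, 15))
```

Note that `rng.choice` is only a fair coin here because each row has exactly two equal positive entries; a general row would need a weighted draw.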