A state $i$ is persistent (also called recurrent) if
\[
P(X_n = i \text{ for some } n \geq 1 \mid X_0 = i) = 1.
\]
Otherwise a state is transient.
This definition states that a state, $i$, is persistent if the chain, started from state $i$, will always (eventually) return to that state, and transient if it does not necessarily return.
An absorbing state $i$ (such that $p_{ii} = 1$) is persistent by definition. Contrariwise, a state $i$ for which the probability of returning to $i$ is zero (there is no arrow towards $i$ in the transition diagram) is transient.
Quite often, we show that a state $i$ is persistent by showing that the probability of never returning to $i$ is zero, that is,
\[
P(X_n \neq i \text{ for all } n \geq 1 \mid X_0 = i) = 0.
\]
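As a concrete illustration (an addition, not part of the notes), the following Python sketch estimates this return probability by simulation for a small hypothetical three-state chain with an absorbing state; the transition matrix, the number of runs and the truncation at `max_steps` are assumptions chosen purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state chain (assumed for illustration): state 2 is absorbing,
# and states 0 and 1 can leak into it, so 0 and 1 should look transient.
P = np.array([[0.5, 0.3, 0.2],
              [0.6, 0.4, 0.0],
              [0.0, 0.0, 1.0]])

def estimate_return_prob(P, i, n_runs=5_000, max_steps=100):
    """Monte Carlo estimate of P(X_n = i for some n >= 1 | X_0 = i).

    Paths are truncated at max_steps, so very late returns are missed;
    that is acceptable for a rough illustration.
    """
    returns = 0
    for _ in range(n_runs):
        state = i
        for _ in range(max_steps):
            state = rng.choice(len(P), p=P[state])
            if state == i:
                returns += 1
                break
    return returns / n_runs

for i in range(len(P)):
    print(f"state {i}: estimated return probability = {estimate_return_prob(P, i):.3f}")
```

An estimate visibly below 1 suggests transience; because paths are truncated, the estimates are slight undercounts of the true return probabilities.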
The following theorem gives us a way to calculate whether a state is transient or persistent.
A state $i$ is transient iff
\[
\sum_{n=1}^{\infty} p_{ii}(n) < \infty.
\]
For a state $i$, let $V_i$ be the number of visits to $i$. Then, using indicator functions,
\[
V_i = \sum_{n=1}^{\infty} \mathbf{1}\{X_n = i\},
\]
where $\mathbf{1}\{A\}$ is the indicator function, i.e. $\mathbf{1}\{A\} = 1$ if $A$ is true, and $\mathbf{1}\{A\} = 0$ otherwise. So
\[
E[V_i \mid X_0 = i] = \sum_{n=1}^{\infty} P(X_n = i \mid X_0 = i) = \sum_{n=1}^{\infty} p_{ii}(n).
\]
Let
\[
f_i = P(X_n = i \text{ for some } n \geq 1 \mid X_0 = i).
\]
Then, conditional on $X_0 = i$, $V_i$ is a geometric random variable with parameter $1 - f_i$ (each return to $i$ corresponds to a failure, and successive trials are independent and identically distributed by the Markov property). Hence
\[
\sum_{n=1}^{\infty} p_{ii}(n) = E[V_i \mid X_0 = i] = \frac{f_i}{1 - f_i},
\]
and the right-hand side is finite iff $f_i < 1$, i.e. iff $i$ is transient. ∎
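To see the criterion numerically, here is a small sketch (an addition, with an assumed example matrix and cut-off $N$): it accumulates the partial sums $\sum_{n=1}^{N} p_{ii}(n)$ by repeated matrix multiplication.

```python
import numpy as np

# Hypothetical chain (an assumption, not from the notes): state 2 is
# absorbing, hence persistent; states 0 and 1 are transient.
P = np.array([[0.5, 0.3, 0.2],
              [0.6, 0.4, 0.0],
              [0.0, 0.0, 1.0]])

# Accumulate the partial sums of p_ii(n) for n = 1, ..., N via matrix powers.
# By the theorem, these stay bounded exactly for the transient states.
N = 500
Pn = np.eye(len(P))
partial_sums = np.zeros(len(P))
for _ in range(N):
    Pn = Pn @ P                  # Pn is now the n-step transition matrix
    partial_sums += np.diag(Pn)  # add p_ii(n) for every state i

for i, s in enumerate(partial_sums):
    print(f"state {i}: sum of p_ii(n) for n <= {N} is {s:.2f}")
```

For the absorbing state the partial sum grows without bound (roughly linearly in $N$), while for the transient states it settles near the limit $f_i/(1 - f_i)$ from the proof above.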
Recall the definition of communicating classes from Section 4.2. The following theorem shows that transience and persistence are class properties.
The states within a single communicating class are either all transient or all persistent.
Every persistent class is closed.
Every finite closed class is persistent. In particular, all states in a finite irreducible chain are persistent.
Suppose that a class $C$ is not closed. Then there exist $i \in C$ and $j \notin C$ such that $p_{ij} > 0$; since $j \notin C$, we have $j \not\to i$. But then
\[
P(X_n \neq i \text{ for all } n \geq 1 \mid X_0 = i) \geq P(X_1 = j,\ X_n \neq i \text{ for all } n \geq 2 \mid X_0 = i) = p_{ij}\, P(X_n \neq i \text{ for all } n \geq 2 \mid X_1 = j) = p_{ij} > 0,
\]
where the penultimate equality uses the Markov property, the last equality holds since the chain can never return to $i$ from $j$ (as $j \not\to i$), and the final inequality holds since $p_{ij} > 0$. Therefore $P(X_n = i \text{ for some } n \geq 1 \mid X_0 = i) < 1$, and hence $i$ is not persistent.
Suppose that $C$ is a finite closed class and let $i \in C$. The chain started from $i$ never leaves $C$, so, since $C$ is finite, there must exist some $j \in C$ for which
\[
P(X_n = j \text{ for infinitely many } n \mid X_0 = i) > 0.
\]
Since $i$ and $j$ communicate, there exists some $m$ such that $p_{ji}(m) > 0$. But then, by the Markov property applied at time $m$,
\[
P(X_n = j \text{ for infinitely many } n \mid X_0 = j) \geq p_{ji}(m)\, P(X_n = j \text{ for infinitely many } n \mid X_0 = i) > 0.
\]
But this means that $P(V_j = \infty \mid X_0 = j) > 0$, where $V_j$ is the number of visits to $j$, and by the proof of Theorem 4.5.2 this is impossible for a transient state (conditional on $X_0 = j$, the number of visits to a transient state $j$ is geometric and so almost surely finite). Hence $j$ is persistent, and since persistence is a class property, every state of $C$ is persistent.
∎
It is generally not difficult to identify closed classes, so establishing transience or persistence for finite MCs is straightforward.
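The classification can be mechanised. The Python sketch below is one possible implementation (the reachability approach and the example matrix are assumptions, not from the notes): it groups the states into communicating classes, flags each class as closed or not, and then applies the two results above, so that for a finite chain a closed class is persistent and a non-closed class is transient.

```python
import numpy as np
from itertools import product

def classify_states(P):
    """Group states into communicating classes and classify each class.

    Reachability is obtained by a Warshall-style transitive closure of the
    relation 'one-step transition with positive probability'.
    """
    n = len(P)
    reach = (np.asarray(P) > 0) | np.eye(n, dtype=bool)
    for k, i, j in product(range(n), repeat=3):   # k is the outer loop
        if reach[i, k] and reach[k, j]:
            reach[i, j] = True

    classes = []
    seen = set()
    for i in range(n):
        if i in seen:
            continue
        cls = sorted(j for j in range(n) if reach[i, j] and reach[j, i])
        seen.update(cls)
        # A class is closed iff no state in it can reach a state outside it.
        closed = not any(reach[a, b] for a in cls for b in range(n) if b not in cls)
        # Finite chain: closed class => persistent, non-closed class => transient.
        classes.append((cls, "persistent" if closed else "transient"))
    return classes

# Hypothetical example: {0, 1} is not closed (it leaks into 2), {2} is closed.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.0, 1.0]])
print(classify_states(P))   # expected: [([0, 1], 'transient'), ([2], 'persistent')]
```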
(i) For which of the Markov chains in 4.2.2 above is there a transient state?
: state 1.
(ii) What are the transient states of the Markov chain with the following transition matrix?
There are two communicating classes, one of which is transient.
The period of a state, $i$, is defined as
\[
d(i) = \gcd\{\, n \geq 1 : p_{ii}(n) > 0 \,\}.
\]
A state, $i$, has period $d(i)$ if the Markov chain, started at $i$, can only return to $i$ after a multiple of $d(i)$ time-steps.
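The definition translates directly into a small computation. The sketch below approximates $d(i)$ as the gcd of all return times $n$ up to a cut-off for which $p_{ii}(n) > 0$; the cut-off `max_n` and the example matrices are illustrative assumptions, and for small chains the gcd stabilises long before the cut-off.

```python
import numpy as np
from math import gcd
from functools import reduce

def period(P, i, max_n=50):
    """Approximate d(i) = gcd{ n >= 1 : p_ii(n) > 0 } using n <= max_n."""
    return_times = []
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:     # p_ii(n) > 0, up to rounding
            return_times.append(n)
    return reduce(gcd, return_times) if return_times else 0

# Hypothetical chain that flips deterministically between its two states,
# so both states have period 2.
P_flip = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
print([period(P_flip, i) for i in range(2)])   # expected: [2, 2]

# Adding a self-loop destroys the periodicity: now d(i) = 1 for both states.
P_lazy = np.array([[0.1, 0.9],
                   [1.0, 0.0]])
print([period(P_lazy, i) for i in range(2)])   # expected: [1, 1]
```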
All states in the same communicating class of a Markov chain have the same period. In particular, all states of an irreducible Markov chain have the same period.
A Markov chain is aperiodic if all states have period 1. (So an irreducible Markov chain is aperiodic if any state has period 1.)
A periodic chain exhibits some form of deterministic behaviour, for example a Markov chain with period 2 will alternate between two sets of states, one of which it can be in at odd time points, and the other at even time points.
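A short simulation (an illustrative addition with an assumed bipartite transition matrix) makes this alternation visible: started in one set of states, a period-2 chain is in that set at every even time and in the other set at every odd time, which the assertion below checks along a sample path.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical bipartite chain (assumed for illustration): from {0, 1} it
# always jumps into {2, 3} and vice versa, so every state has period 2.
P = np.array([[0.0, 0.0, 0.5, 0.5],
              [0.0, 0.0, 0.3, 0.7],
              [0.6, 0.4, 0.0, 0.0],
              [0.2, 0.8, 0.0, 0.0]])

even_set, odd_set = {0, 1}, {2, 3}
state = 0                              # start in the 'even' set at time 0
for t in range(1, 1000):
    state = rng.choice(len(P), p=P[state])
    expected_set = odd_set if t % 2 == 1 else even_set
    assert state in expected_set       # the chain alternates between the sets
print("visited {0, 1} only at even times and {2, 3} only at odd times")
```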
If an irreducible Markov chain has $p_{ii} > 0$ for some state $i$, then it is aperiodic.
If $p_{ii} > 0$ then $1 \in \{\, n \geq 1 : p_{ii}(n) > 0 \,\}$, so $d(i) = 1$, and since all states of an irreducible chain have the same period, every state has period 1. ∎
Classify the Markov chains in 4.2.2 as either periodic or aperiodic. State the period of the periodic Markov chains.
has period 2; others are aperiodic.
and are aperiodic since each communicating class has a state which can return to itself in 1 step with positive probability. For , is aperiodic for the same reason. The class is aperiodic as it can return to state 2 in 2 steps or 3 steps and gcd(2,3)=1.