Assume that the transition probabilities $p_{ij}(t)$ vary smoothly with time so that the derivative of $P(t)$ exists at $t = 0$, i.e. that
$$Q = P'(0) = \lim_{h \to 0} \frac{P(h) - P(0)}{h} = \lim_{h \to 0} \frac{P(h) - I}{h}.$$
$Q$ is a fixed matrix, not a function of $t$, since it is evaluated at $t = 0$.
$\sum_j q_{ij} = 0$, i.e. the rows of $Q$ sum to zero.
$Q$ is called the transition probability rate matrix or infinitesimal generator.
For small $h$,
$$P(h) \approx P(0) + hP'(0) = I + hQ.$$
In particular $p_{ii}(h) \approx 1 + h q_{ii} \leq 1$, so the diagonal elements of $Q$ must all be negative (or zero). We set $q_i = -q_{ii}$. This is the rate of transition out of state $i$ since $\Pr(\text{leave state } i \text{ in } (t, t+h]) = 1 - p_{ii}(h) \approx q_i h$.
Further, for $j \neq i$, $p_{ij}(h) \approx q_{ij} h \geq 0$, so that $q_{ij} \geq 0$: this is the rate of transition to state $j$ from state $i$. Note that, since the rows of $Q$ sum to zero, $q_i = -q_{ii} = \sum_{j \neq i} q_{ij}$. The outward and inward rates balance.
We shall see below that for the continuous-time Markov process, the rate matrix $Q$ is as important as the probability transition matrix $P$ of the discrete-time Markov chain.
As with the probability transition matrix, for two states $i \neq j$, $q_{ij} > 0$ if and only if it is possible to move directly from state $i$ to state $j$. (The rate matrix can be used to construct diagrams of the dynamics of the Markov chain as in 4.2.1.)
However, the entries in the rate matrix are not probabilities, so the off-diagonal elements can take values greater than 1. (This is similar to the way the pdf of a continuous random variable can exceed 1.) The entries of the rate matrix relate to probabilities via
$$p_{ij}(h) \approx q_{ij} h \quad (i \neq j), \qquad p_{ii}(h) \approx 1 - q_i h,$$
both for small $h$.
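As a numeric sanity check of these approximations, the sketch below compares the exact $P(h) = e^{hQ}$ (computed by a truncated Taylor series) with $I + hQ$ for a small $h$; the 2-state $Q$ used is an arbitrary illustrative choice, not one from the text.

```python
# Check that P(h) = exp(hQ) agrees with I + hQ to order h^2.
# Q below is an arbitrary 2-state rate matrix (rows sum to zero).

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(A, terms=20):
    """Truncated Taylor series for the matrix exponential exp(A)."""
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # I
    power = [row[:] for row in result]
    fact = 1.0
    for k in range(1, terms):
        power = mat_mul(power, A)
        fact *= k
        result = [[result[i][j] + power[i][j] / fact for j in range(n)]
                  for i in range(n)]
    return result

Q = [[-2.0, 2.0],
     [1.0, -1.0]]          # rows sum to zero
h = 0.001
Ph = mat_exp([[h * q for q in row] for row in Q])

# The error of the first-order approximation I + hQ is O(h^2).
for i in range(2):
    for j in range(2):
        approx = (1.0 if i == j else 0.0) + h * Q[i][j]
        assert abs(Ph[i][j] - approx) < 10 * h**2
```

The rows of $P(h)$ remain probability distributions, which gives a further check that the construction is consistent.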
The following two theorems help us to interpret the values of the rate matrix.
Theorem. Let $T_i$ be the length of stay in state $i$ (say from time 0) before a transition to another state occurs. Then $T_i$ has an exponential pdf with mean $1/q_i$ (the inverse of the rate of transition out of state $i$).
Proof. Let $\bar{F}_i(t) = \Pr(T_i > t)$, $t \geq 0$, be the survivor function of $T_i$. Then for small $h$
$$\bar{F}_i(t + h) = \Pr(T_i > t + h) = \Pr(T_i > t + h \mid T_i > t)\Pr(T_i > t) \approx (1 - q_i h)\,\bar{F}_i(t).$$
Thus
$$\frac{\bar{F}_i(t + h) - \bar{F}_i(t)}{h} \approx -q_i \bar{F}_i(t), \qquad \text{and letting } h \to 0, \qquad \bar{F}_i'(t) = -q_i \bar{F}_i(t), \qquad \text{so } \bar{F}_i(t) = C e^{-q_i t}.$$
But $\bar{F}_i(0) = \Pr(T_i > 0) = 1$, giving $C = 1$ and $\bar{F}_i(t) = e^{-q_i t}$. The pdf is $f_i(t) = -\bar{F}_i'(t)$, which is $q_i e^{-q_i t}$, an exponential pdf with mean $1/q_i$. ∎
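The discretisation used in the proof can be checked by simulation: stay in state $i$ for steps of length $h$, leaving in each step with probability $q_i h$, and compare the mean holding time with $1/q_i$. The value of $q_i$ below is an arbitrary choice for illustration.

```python
# Monte Carlo check: discretise time into steps of length h and leave
# state i in each step with probability q_i * h, as in the proof above.
# The resulting holding times should have mean close to 1 / q_i.
import random

random.seed(1)
q_i = 2.0        # illustrative rate of leaving state i
h = 0.001        # small time step
n_runs = 10000

total = 0.0
for _ in range(n_runs):
    t = 0.0
    while random.random() >= q_i * h:   # stay with probability 1 - q_i*h
        t += h
    total += t
mean_holding = total / n_runs

# Theoretical mean of an Exp(q_i) holding time is 1/q_i = 0.5.
assert abs(mean_holding - 1 / q_i) < 0.03
```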
Define $J_0 = 0$ and set
$$J_{n+1} = \inf\{t > J_n : X(t) \neq X(J_n)\}, \qquad n = 0, 1, 2, \ldots$$
We call $J_0, J_1, J_2, \ldots$ the jump times of $X(t)$. The above result tells us that if $X(J_n) = i$, then $J_{n+1} - J_n \sim \mathrm{Exp}(q_i)$. By the Markov property it can be shown that this is independent of the history of the process up to time $J_n$. The difference $J_{n+1} - J_n$ is called the holding time at $J_n$.
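The jump times can be simulated directly: draw each holding time as $\mathrm{Exp}(q_i)$ and choose the next state with probabilities proportional to the off-diagonal rates. The sketch below does this for an arbitrary 3-state rate matrix (not one from the text).

```python
# Simulating the jump times J_n of a continuous-time Markov process:
# holding times are Exp(q_i), and the next state is chosen with
# probabilities q_ij / q_i. Q is an arbitrary illustrative example.
import random

random.seed(0)
Q = [[-3.0, 2.0, 1.0],
     [1.0, -4.0, 3.0],
     [2.0, 2.0, -4.0]]     # rows sum to zero
n = len(Q)

state = 0
jump_times = [0.0]                        # J_0 = 0
for _ in range(5):
    q_i = -Q[state][state]                # rate out of the current state
    hold = random.expovariate(q_i)        # holding time ~ Exp(q_i)
    jump_times.append(jump_times[-1] + hold)
    # next state chosen with probabilities proportional to q_ij, j != i
    weights = [Q[state][j] if j != state else 0.0 for j in range(n)]
    state = random.choices(range(n), weights=weights)[0]

# Jump times are strictly increasing, since holding times are positive.
assert all(a < b for a, b in zip(jump_times, jump_times[1:]))
```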
Theorem. If the continuous-time Markov process is currently in state $i$, then it moves next to state $j \neq i$ with probability $q_{ij}/q_i$.
Proof. For small $h$ we have
$$\Pr\big(X(t+h) = j \mid X(t) = i,\ X(t+h) \neq i\big) = \frac{p_{ij}(h)}{1 - p_{ii}(h)}.$$
So
$$\frac{p_{ij}(h)}{1 - p_{ii}(h)} \approx \frac{q_{ij} h}{q_i h} = \frac{q_{ij}}{q_i},$$
and the result follows because this does not depend on $h$. ∎
Define a discrete stochastic process $(Y_n)$ by $Y_n = X(J_n)$ for $n = 0, 1, 2, \ldots$. The above result says that $(Y_n)$ is a discrete MC with transition matrix $R$ given by
$$r_{ij} = \begin{cases} q_{ij}/q_i & \text{if } i \neq j, \\ 0 & \text{if } i = j. \end{cases}$$
We call $(Y_n)$ the jump chain of the process $X(t)$.
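Extracting the jump chain's transition matrix $R$ from a rate matrix $Q$ is a one-line computation per entry; the sketch below uses an arbitrary 3-state $Q$ for illustration.

```python
# Computing the jump chain transition matrix R from a rate matrix Q:
# r_ij = q_ij / q_i off the diagonal and r_ii = 0.
# Q is an arbitrary illustrative 3-state example (rows sum to zero).

Q = [[-3.0, 2.0, 1.0],
     [1.0, -4.0, 3.0],
     [2.0, 2.0, -4.0]]

n = len(Q)
R = [[0.0] * n for _ in range(n)]
for i in range(n):
    q_i = -Q[i][i]                      # total rate out of state i
    for j in range(n):
        if j != i:
            R[i][j] = Q[i][j] / q_i     # prob. the next state is j

# Each row of R is a probability distribution over the next state.
for row in R:
    assert abs(sum(row) - 1.0) < 1e-12
```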
Example. Consider a 3-state continuous-time Markov process. The distributions of the time that the process remains in states 1, 2 and 3 have means $1/q_1$, $1/q_2$ and $1/q_3$ respectively. On leaving state 1 the process moves to state 2. On leaving state 2 the process is equally likely to go to either state 1 or state 3, and on leaving state 3 the process is twice as likely to go to state 1 as to state 2. Write down the rate matrix $Q$, noting that its entries need not lie between 0 and 1, and that the diagonal entries are negative.
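The construction $q_{ij} = q_i r_{ij}$, $q_{ii} = -q_i$ can be sketched for this example. The mean holding times below ($1/2$, $1/4$, $1/5$) are illustrative stand-ins, not values from the text; the jump probabilities are those stated in the example.

```python
# Building the rate matrix for the 3-state example from mean holding
# times and jump probabilities. The means (1/2, 1/4, 1/5) are an
# ASSUMPTION for illustration; the jump probabilities are from the text.
from fractions import Fraction as F

means = [F(1, 2), F(1, 4), F(1, 5)]      # assumed mean holding times
q = [1 / m for m in means]               # q_i = 1 / (mean holding time)

# Jump-chain probabilities: leave 1 -> always to 2; leave 2 -> 1 or 3
# equally; leave 3 -> state 1 twice as likely as state 2.
R = [[0, F(1), 0],
     [F(1, 2), 0, F(1, 2)],
     [F(2, 3), F(1, 3), 0]]

# q_ij = q_i * r_ij off the diagonal, q_ii = -q_i.
Q = [[q[i] * R[i][j] if i != j else -q[i] for j in range(3)]
     for i in range(3)]

for row in Q:
    assert sum(row) == 0                 # rows of a rate matrix sum to zero
```

Note that with these (assumed) means the entry $q_{31} = 10/3$ exceeds 1 and every diagonal entry is negative, illustrating the closing remark of the example.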