5 Continuous Markov chains

5.6 Poisson processes

Definition 5.6.1.

The temporal Poisson process (P.P.) is a continuous-time homogeneous Markov chain N(t). It is defined to be the number of events in the interval (0,t], where events occur at random, independently, and at a fixed rate λ per unit time.

We describe this, to first order for small $h > 0$, by

P\bigl[N(t+h) = j \mid N(t) = i\bigr] =
\begin{cases}
0 & \text{if } j < i,\\
1 - \lambda h & \text{if } j = i,\\
\lambda h & \text{if } j = i+1,\\
0 & \text{if } j > i+1.
\end{cases}

Consequently, N(t) has transition rate matrix

Q = \begin{pmatrix}
-\lambda & \lambda & 0 & 0 & \cdots \\
0 & -\lambda & \lambda & 0 & \cdots \\
0 & 0 & -\lambda & \lambda & \cdots \\
\vdots & & & \ddots & \ddots
\end{pmatrix}

or, specifically, $Q_{i,i+1} = \lambda$, $Q_{ii} = -\lambda$, and $Q_{ij} = 0$ otherwise.
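As an aside (not part of the original notes), the first-order description above suggests a simple way to simulate sample paths: in each small time step of length h an event occurs with probability roughly λh. A minimal Python sketch, assuming NumPy is available; the values of λ, h and t are arbitrary illustrative choices. The empirical mean of N(t) across paths should then be close to λt, consistent with the stated rate λ per unit time.

import numpy as np

rng = np.random.default_rng(0)
lam, h, t = 2.0, 1e-3, 5.0      # illustrative rate, step size and horizon (assumptions)
n_steps = int(t / h)
n_paths = 2_000

# In each step of length h an event occurs with probability ~ lam * h.
increments = rng.random((n_paths, n_steps)) < lam * h
N_t = increments.sum(axis=1)    # N(t) for each simulated path

print(N_t.mean())               # ≈ lam * t = 10, up to O(h) discretisation error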

Now let the distribution of $N(t)$ be $\pi(t) = (\pi(t)_0, \pi(t)_1, \pi(t)_2, \dots)$, i.e. $\pi(t)_j = P(N(t) = j)$.

Theorem 5.6.2.

For the Poisson process

\pi(t)_j = e^{-\lambda t}\,\frac{(\lambda t)^j}{j!},

i.e. N(t) has a Poisson distribution with mean λt.

Proof.

From the forward equation $\pi'(t) = \pi(t)Q$ we have the first equation

\pi'(t)_0 = -\lambda\,\pi(t)_0
\;\Longrightarrow\; \frac{d\pi_0}{\pi_0} = -\lambda\,dt
\;\Longrightarrow\; \bigl[\log\pi(t)_0\bigr]_0^t = \bigl[-\lambda t\bigr]_0^t
\;\Longrightarrow\; \log\pi(t)_0 = -\lambda t

since $\pi(0)_0 = 1$. This gives the first term in the distribution,

\pi(t)_0 = e^{-\lambda t}.

The equation for the transition to state $k+1$ gives, for all $k \ge 0$:

\pi'(t)_{k+1} = \lambda\,\pi(t)_k - \lambda\,\pi(t)_{k+1}.  (5.1)

This is simultaneously a difference equation (in $k$) and a differential equation (in $t$). To solve it, let

\pi(t)_k = e^{-\lambda t}\,\rho(t)_k,

noting that $\rho(0)_k = \pi(0)_k = 0$ for $k \ge 1$ and $\rho(t)_0 = 1$. By definition, $\pi(t)_{k+1} = e^{-\lambda t}\rho(t)_{k+1}$. Differentiating this gives

\pi'(t)_{k+1} = e^{-\lambda t}\,\rho'(t)_{k+1} - \lambda e^{-\lambda t}\,\rho(t)_{k+1}.  (5.2)

Equating (5.1) and (5.2) gives

e^{-\lambda t}\,\rho'(t)_{k+1} - \lambda e^{-\lambda t}\,\rho(t)_{k+1} = \lambda e^{-\lambda t}\,\rho(t)_k - \lambda e^{-\lambda t}\,\rho(t)_{k+1},

which simplifies by cancellation to

\rho'(t)_{k+1} = \lambda\,\rho(t)_k.

Now $\rho(t)_0 = 1$, so

\rho'(t)_1 = \lambda\,\rho(t)_0 = \lambda \;\Longrightarrow\; \rho(t)_1 = \lambda t

by direct integration. Similarly

\rho'(t)_2 = \lambda\,\rho(t)_1 = \lambda^2 t \;\Longrightarrow\; \rho(t)_2 = \frac{\lambda^2 t^2}{2}.

Continuing in this way (formally by induction) we obtain

\rho(t)_k = \frac{\lambda^k t^k}{k!},

which gives the required result. ∎
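As a numerical cross-check of Theorem 5.6.2 (not part of the notes), one can truncate the state space, form the rate matrix Q above, and compute π(t) = π(0)e^{Qt} via the matrix exponential. A sketch assuming SciPy is available; λ, t and the truncation level K are arbitrary illustrative choices:

import numpy as np
from scipy.linalg import expm
from scipy.stats import poisson

lam, t, K = 2.0, 3.0, 60        # illustrative rate, time and truncation level (assumptions)

# Truncated rate matrix: Q[i, i] = -lam, Q[i, i+1] = lam; the last state is left absorbing,
# which does not affect the lower states since the chain only moves upwards.
Q = np.zeros((K, K))
idx = np.arange(K - 1)
Q[idx, idx] = -lam
Q[idx, idx + 1] = lam

pi0 = np.zeros(K)
pi0[0] = 1.0                    # the process starts in state 0
pi_t = pi0 @ expm(Q * t)        # pi(t) = pi(0) exp(Qt)

# Compare with the Poisson(lam * t) pmf from Theorem 5.6.2.
print(np.max(np.abs(pi_t[:20] - poisson.pmf(np.arange(20), lam * t))))   # ~ 1e-15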

5.6.1 The Exponential and Gamma distributions for waiting times

From Result 5.2.1 on the length of stay in a given state, we can immediately deduce that the intervals between events in the Poisson process all have an exponential distribution with mean 1/λ. These intervals are also independent, since the length of stay in a state does not depend on what went before.
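As an empirical illustration (again not part of the notes), the process can be built directly from independent exponential gaps; the counts in (0, t] then behave like a Poisson(λt) variable, whose mean and variance both equal λt. A sketch assuming NumPy, with arbitrary illustrative parameters:

import numpy as np

rng = np.random.default_rng(1)
lam, t, n_paths = 2.0, 3.0, 50_000   # illustrative rate, horizon and replications (assumptions)

# Inter-event intervals: independent Exponential with mean 1/lam.
# Event times are their cumulative sums; N(t) counts those that fall in (0, t].
gaps = rng.exponential(scale=1 / lam, size=(n_paths, 40))
event_times = np.cumsum(gaps, axis=1)
N_t = (event_times <= t).sum(axis=1)

print(N_t.mean(), N_t.var())          # both ≈ lam * t = 6, as for a Poisson(lam * t) count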

The time $T_n$ to the $n$th event in the process is then the sum of the $n$ independent intervals between these first $n$ events, i.e. the sum of $n$ independent exponential random variables. The relationship between the counting process $N(t)$ and the interval process $T_n$ is captured in probability terms by the equation:

P(T_n \le t) = P(N(t) \ge n).

The cdf for the time to the $n$th event, $F_n(t)$, is therefore

F_n(t) = 1 - e^{-\lambda t}\sum_{i=0}^{n-1}\frac{(\lambda t)^i}{i!}.
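A quick numerical check of the identity P(T_n ≤ t) = P(N(t) ≥ n) together with this cdf formula, assuming SciPy is available (the values n = 3, λ = 0.5, t = 6 are arbitrary illustrative choices):

import math
from scipy.stats import poisson

lam, n, t = 0.5, 3, 6.0               # arbitrary illustrative values (assumptions)

# F_n(t) = 1 - e^{-lam t} * sum_{i=0}^{n-1} (lam t)^i / i!
F_n = 1.0 - math.exp(-lam * t) * sum((lam * t) ** i / math.factorial(i) for i in range(n))

# P(N(t) >= n) as the complementary Poisson(lam * t) cdf at n - 1
tail = 1.0 - poisson.cdf(n - 1, lam * t)

print(F_n, tail)                      # both ≈ 0.5768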

Differentiating this gives the pdf

f_n(t) = \lambda e^{-\lambda t}\sum_{i=0}^{n-1}\frac{(\lambda t)^i}{i!} - e^{-\lambda t}\sum_{i=1}^{n-1}\frac{\lambda(\lambda t)^{i-1}}{(i-1)!}
       = \lambda e^{-\lambda t}\left(\sum_{i=0}^{n-1}\frac{(\lambda t)^i}{i!} - \sum_{i=0}^{n-2}\frac{(\lambda t)^i}{i!}\right)
       = \frac{\lambda^n t^{n-1}}{(n-1)!}\,e^{-\lambda t}.

This is the pdf of a gamma distribution with shape parameter $n$ and rate $\lambda$.
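The differentiation above can be verified symbolically for any fixed n, for example with SymPy (a check sketch; n = 4 is an arbitrary choice):

import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
n = 4                                 # arbitrary fixed n for the check (assumption)

# F_n(t) as above, then differentiate to get the pdf of T_n.
F = 1 - sp.exp(-lam * t) * sum((lam * t) ** i / sp.factorial(i) for i in range(n))
f = sp.diff(F, t)

# Gamma(shape n, rate lam) pdf: lam^n t^{n-1} e^{-lam t} / (n-1)!
gamma_pdf = lam ** n * t ** (n - 1) * sp.exp(-lam * t) / sp.factorial(n - 1)

print(sp.simplify(f - gamma_pdf))     # prints 0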

Exercise 5.6.3.

Given $\lambda = 0.5$, calculate the probability that the third event occurs in the interval $4 < t \le 6$. (First evaluate $P(T_3 > 4)$ and $P(T_3 > 6)$.)

Since $P(T_3 > t) = P(N(t) \le 2)$, with $\lambda t = 2$ at $t = 4$ and $\lambda t = 3$ at $t = 6$, we need

P(T_3 > 4) - P(T_3 > 6) = \sum_{i=0}^{2}\frac{2^i e^{-2}}{i!} - \sum_{i=0}^{2}\frac{3^i e^{-3}}{i!}
                        = e^{-2}\bigl(1 + 2 + 2\bigr) - e^{-3}\bigl(1 + 3 + \tfrac{9}{2}\bigr)
                        = 5e^{-2} - \tfrac{17}{2}e^{-3}.
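A numerical cross-check of this answer (not part of the exercise), using the gamma cdf with shape 3 and rate λ = 0.5, assuming SciPy:

import math
from scipy.stats import gamma

lam = 0.5

exact = 5 * math.exp(-2) - 8.5 * math.exp(-3)     # 5 e^{-2} - (17/2) e^{-3}

# Same probability as P(4 < T_3 <= 6) via the Gamma(shape 3, rate lam) cdf.
check = gamma.cdf(6, a=3, scale=1 / lam) - gamma.cdf(4, a=3, scale=1 / lam)

print(exact, check)                               # both ≈ 0.2535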