2. Markov Property
Definition 1.1 Let $(X_n)_{n \ge 0}$ be a discrete-time stochastic process with a
countable state space $E$.
If for all $n \ge 0$ and all states $i_0, i_1, \dots, i_{n-1}, i, j \in E$
$$\mathbb{P}(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = \mathbb{P}(X_{n+1} = j \mid X_n = i)$$
whenever both sides are well-defined, then the process $(X_n)_{n \ge 0}$ is
called a discrete-time Markov chain.
In addition, if $\mathbb{P}(X_{n+1} = j \mid X_n = i) = p_{ij}$, that is, independently of $n$, the
chain is called homogeneous (HMC).
In this case we define the matrix $P = (p_{ij})$ as the transition matrix.
3. Stochastic matrix
Definition. A matrix $P = (p_{ij})$ is called stochastic if
$$p_{ij} \ge 0, \qquad \sum_{j \in E} p_{ij} = 1.$$
It follows that the transition matrix $P$ is a stochastic matrix.
Even if the state space may have infinitely many states, the product of
stochastic matrices, as well as the product with a vector, are well defined:
$$C = A \cdot B: \quad c_{ij} = \sum_{k \in E} a_{ik} b_{kj}$$
$$y^{T} = x^{T} A: \quad y_j = \sum_{i \in E} x_i a_{ij}$$
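As a small sketch of these operations (on assumed hand-picked matrices, not an example from the lecture), one can check numerically that the product of two stochastic matrices is again stochastic:

```python
# Minimal sketch: the product of two stochastic matrices is stochastic,
# illustrated on two assumed 2x2 matrices.

def mat_mul(A, B):
    """C = A * B with c_ij = sum_k a_ik * b_kj."""
    n, m = len(A), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(m)]
            for i in range(n)]

def is_stochastic(M, tol=1e-12):
    """Non-negative entries, each row summing to 1."""
    return all(all(x >= -tol for x in row) and abs(sum(row) - 1.0) < tol
               for row in M)

A = [[0.5, 0.5], [0.2, 0.8]]   # assumed example
B = [[1.0, 0.0], [0.3, 0.7]]   # assumed example
C = mat_mul(A, B)
```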
4. Example
Example 1.1 Machine Replacement
Let $(U_k)_{k \ge 1}$ be a sequence of i.i.d. random variables with values
in $\{1, 2, \dots, +\infty\}$; $U_k$ models the lifetime of machine $k$.
We assume that after any failure the machine is replaced immediately,
and we denote by $X_n$ the elapsed time the current machine has been
in service.
One can show that $(X_n)_{n \ge 0}$ is a homogeneous Markov chain,
with non-null entries equal to
$$p_{i,i+1} = \mathbb{P}(U_1 > i+1 \mid U_1 > i), \qquad p_{i,0} = 1 - p_{i,i+1}.$$
6. Distribution of HMC
We denote by $\nu$ the distribution of the initial state $X_0$,
$$\nu(i) = \mathbb{P}(X_0 = i),$$
called the initial distribution, and by $\nu_n$ the distribution of the state at time $n$, $X_n$.
Using Bayes's sequential rule, it follows that
$$\mathbb{P}(X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = \nu(i_0)\, p_{i_0 i_1}\, p_{i_1 i_2} \cdots p_{i_{n-1} i_n}$$
or, in matrix form,
$$\nu_n^{T} = \nu^{T} P^{n}.$$
We write $\mathbb{P}_i$ whenever $\nu(j) = \delta_{ij}$, that is, when the chain starts in $X_0 = i$.
Theorem 1.1 Distribution of an HMC
A discrete-time HMC is completely characterized by its initial distribution $\nu$ and its
transition matrix $P$.
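A minimal sketch of the matrix form: the distribution at time $n$ can be obtained by iterating the vector-matrix product $\nu_{n+1}^T = \nu_n^T P$. The two-state transition matrix below is an assumed illustration:

```python
# Hedged sketch: propagate the state distribution nu_n = nu P^n
# one step at a time, for an assumed two-state transition matrix.

def step(nu, P):
    """One step of nu^T <- nu^T P."""
    return [sum(nu[i] * P[i][j] for i in range(len(nu)))
            for j in range(len(P[0]))]

P = [[0.9, 0.1],
     [0.4, 0.6]]          # assumed example transition matrix
nu = [1.0, 0.0]           # chain started in state 0

for _ in range(50):       # nu_n approaches the stationary distribution
    nu = step(nu, P)
```

For this matrix the iterates converge geometrically to $(0.8, 0.2)$, the unique stationary distribution.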
7. Filtration
Let $\mathcal{F}_n$ denote the $\sigma$-algebra generated by the sets
$$\{X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_0 = i_0\}$$
for any $i_0, i_1, \dots, i_n \in E$.
Then we have that $\mathcal{F}_n \subseteq \mathcal{F}_{n+1}$.
The collection $(\mathcal{F}_n)_{n \ge 0}$ is called a filtration.
If $(X_n)_{n \ge 0}$ is an HMC, then for any $A \in \mathcal{F}_n$ we have
$$\mathbb{P}(X_{n+1} = j \mid X_n = i, A) = p_{ij},$$
which can be written in the more succinct form
$$\mathbb{P}(X_{n+1} = j \mid \mathcal{F}_n) = p_{X_n j}.$$
8. Markov Recurrences
Theorem 2.1 HMCs driven by White Noise
Let $(Z_n)_{n \ge 1}$ (the white noise) be an i.i.d. sequence of random variables
with values in $F$, and let
$$f: E \times F \to E$$
be some function. Then, with $X_0$ independent of $(Z_n)_{n \ge 1}$, the recurrence
$$X_{n+1} = f(X_n, Z_{n+1})$$
defines an HMC, with
$$p_{i,j} = \mathbb{P}(f(i, Z_1) = j).$$
The same result follows if $Z_{n+1}$ is conditionally independent of
$X_0, X_1, \dots, X_{n-1}, Z_1, \dots, Z_{n-1}$ given $X_n$; in this case $p_{i,j} = \mathbb{P}(f(i, Z_{n+1}) = j \mid X_n = i)$.
9. Examples
Example 2.1 1-D Random Walk
$Z_n \sim \mathrm{Ber}(p)$, and
$$X_{n+1} = X_n - 1 + 2 Z_{n+1}.$$
Example 2.2 Repair Shop
$(Z_n)_{n \ge 1}$ i.i.d. and non-negative integer-valued, and
$$X_{n+1} = (X_n - 1)^{+} + Z_{n+1}.$$
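Both examples are Markov recurrences $X_{n+1} = f(X_n, Z_{n+1})$ and can be simulated directly. The sketch below assumes $p = 0.5$ for the walk and an arbitrary small workload distribution for the repair shop:

```python
# Hedged sketch: simulating the two Markov recurrences above,
# with assumed parameters (p = 0.5, assumed workload law).
import random

rng = random.Random(0)

def random_walk(n_steps, p=0.5, x0=0):
    """X_{n+1} = X_n - 1 + 2*Z_{n+1}, with Z ~ Ber(p)."""
    xs = [x0]
    for _ in range(n_steps):
        z = 1 if rng.random() < p else 0
        xs.append(xs[-1] - 1 + 2 * z)
    return xs

def repair_shop(n_steps, x0=0):
    """X_{n+1} = max(X_n - 1, 0) + Z_{n+1}, Z an integer workload."""
    xs = [x0]
    for _ in range(n_steps):
        z = rng.choice([0, 0, 1, 2])   # assumed workload distribution
        xs.append(max(xs[-1] - 1, 0) + z)
    return xs

walk = random_walk(100)
queue = repair_shop(100)
```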
10. Examples
Example 2.3 Inventory with (s,S)-strategy
$(Z_n)_{n \ge 1}$ i.i.d. non-negative integer-valued, and
$$X_{n+1} = \begin{cases} (X_n - Z_{n+1})^{+} & \text{if } s \le X_n \le S \\ (S - Z_{n+1})^{+} & \text{if } X_n < s \end{cases}$$
Example 2.4 Branching process (Galton-Watson process)
$(Z_n)_{n \ge 1}$ i.i.d. with $Z_n = \left(Z_n^{(1)}, Z_n^{(2)}, \dots\right)$, where $\left(Z_n^{(k)}\right)_{k \ge 1}$ are
i.i.d. and non-negative integer-valued, and
$$X_{n+1} = \sum_{k=1}^{X_n} Z_{n+1}^{(k)}.$$
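The Galton-Watson recurrence is easy to simulate: each of the $X_n$ current individuals leaves an i.i.d. number of offspring. The offspring law below is an assumption for illustration (mean $3/4 < 1$, so the population is subcritical):

```python
# Hedged sketch of the Galton-Watson recurrence, with an assumed
# offspring law on {0, 1, 2}; state 0 (extinction) is absorbing.
import random

rng = random.Random(1)

def gw_step(x):
    """X_{n+1} = sum_{k=1}^{X_n} Z^{(k)} over i.i.d. offspring counts."""
    return sum(rng.choice([0, 1, 2, 0]) for _ in range(x))

def gw_trajectory(x0, n_steps):
    xs = [x0]
    for _ in range(n_steps):
        xs.append(gw_step(xs[-1]))
    return xs

traj = gw_trajectory(5, 50)
```

Note that `gw_step(0) == 0`, so extinction is absorbing, as it must be for a branching process.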
11. First-Step Analysis
Let us anticipate a definition: a set $A \subseteq E$ is closed if
$$\sum_{j \in A} p_{ij} = 1, \qquad \forall i \in A.$$
First-step analysis is a simple but powerful technique to
compute many properties, such as, for example,
absorption probabilities in closed sets.
Example 3.1 & 3.4 Gambler's Ruin
Example 2 Cat Eats Mouse, Mouse Eats Cheese
12. Hitting probabilities
Definition. Let $T_A(\omega) = \inf\{n \ge 0 : X_n(\omega) \in A\}$ be the hitting time of a set
$A \subseteq E$. We define the hitting probability of $A$ starting in $i \in E$ as
$$u_i^A = \mathbb{P}_i(T_A < \infty).$$
[N'97] Theorem 1.3.2. The vector $u^A = (u_i^A : i \in E)$ of hitting probabilities is
the minimal non-negative solution to the system of linear equations
$$u_i^A = 1 \quad \text{for } i \in A$$
$$u_i^A = \sum_{j \in E} p_{ij}\, u_j^A \quad \text{for } i \notin A$$
(Minimality means that if $x = (x_i : i \in E)$ is another solution with
$x_i \ge 0$ for all $i \in E$, then $x_i \ge u_i$ for all $i \in E$.)
[N'97] Example 1.3.1
[Diagram for Example 1.3.1: chain on states 1, 2, 3, 4 with transition probabilities 1/2]
13. Mean hitting times
Definition. Let $T_A(\omega) = \inf\{n \ge 0 : X_n(\omega) \in A\}$ be the hitting time of a set
$A \subseteq E$. We define the mean hitting time of $A$ starting in $i \in E$ as
$$m_i^A = \mathbb{E}_i[T_A].$$
[N'97] Theorem 1.3.5. The vector of mean hitting times $m^A = (m_i^A : i \in E)$
is the minimal non-negative solution to the system of linear equations
$$m_i^A = 0 \quad \text{for } i \in A$$
$$m_i^A = 1 + \sum_{j \in E} p_{ij}\, m_j^A \quad \text{for } i \notin A$$
We write $T_i$ for $T_{\{i\}}$.
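The same fixed-point iteration gives the mean hitting times. Reusing the assumed four-state gambler's-ruin chain from above, absorbed at both ends:

```python
# Hedged sketch of Theorem 1.3.5: mean hitting times as the limit of the
# iteration m <- 1 + P m on the complement of A, started from m = 0.

def mean_hitting_times(P, A, n_iter=10_000):
    """m_i = E_i[T_A], minimal non-negative solution of the linear system."""
    n = len(P)
    m = [0.0] * n
    for _ in range(n_iter):
        m = [0.0 if i in A else 1.0 + sum(P[i][j] * m[j] for j in range(n))
             for i in range(n)]
    return m

# Assumed example: gambler's-ruin chain on {0,1,2,3}, absorbed at 0 and 3.
P = [[1.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 1.0]]
m = mean_hitting_times(P, A={0, 3})
```

For this chain the exact values are $m_1 = m_2 = 2$ (the classic $i(N-i)$ formula with $N = 3$).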
14. Exercises
Exercise. Given the transition matrix
$$P = \begin{pmatrix}
0 & 1/2 & 0 & 1/2 & 0 \\
1/2 & 0 & 1/2 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 \\
1/3 & 0 & 1/3 & 0 & 1/3 \\
0 & 0 & 0 & 0 & 1
\end{pmatrix}$$
• Draw the transition graph of the chain.
• Compute the absorption probability of the set $\{5\}$ starting
from 1.
15. Exercises
Exercise. Consider the chain in the diagram below.
• Compute the mean absorption time to $c$ starting
from $a$.
[Diagram: a chain from state $a$ to state $c$ through $N$ intermediate states, with transition probabilities $p_1, \dots, p_N$ and $(N+1)^{-1}$]
16. Topology of the Transition Matrix
Definition 4.1 Communication
Let $i, j \in E$. If there exists $m \ge 0$ such that $p_{ij}^{(m)} > 0$, then we say that $j$
is accessible from $i$, and we write $i \to j$.
If $i \to j$ and $j \to i$, then we say that $i$ and $j$ communicate, and we
write $i \leftrightarrow j$.
Properties of the relation of communication:
• $i \leftrightarrow i$ (reflexivity)
• $i \leftrightarrow j \Rightarrow j \leftrightarrow i$ (symmetry)
• $i \leftrightarrow j,\ j \leftrightarrow k \Rightarrow i \leftrightarrow k$ (transitivity)
17. Topology of the Transition Matrix
Definition 4.2 Closed sets
A state $i \in E$ is closed (absorbing) if $p_{ii} = 1$.
A set $C \subseteq E$ is closed if $\sum_{j \in C} p_{ij} = 1,\ \forall i \in C$.
Example 4.1
Definition 4.3 Irreducibility
An HMC is irreducible if it has only one communication class.
[Diagram for Example 4.1: transition graph on states 1-8]
18. Topology of the Transition Matrix
Definition 4.4 Arithmetic definition of Period
Let $i \in E$ be a state and consider the set
$$D_i = \{n \in \mathbb{N} : p_{ii}^{(n)} > 0\}.$$
If $D_i = \emptyset$, we set the period $d_i = \infty$; otherwise
$$d_i = \mathrm{GCD}(D_i).$$
If $d_i = 1$ we call $i$ aperiodic.
Theorem 4.2 Period is a Class property
If $i \leftrightarrow j$ then $d_i = d_j$.
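The arithmetic definition can be sketched numerically: take the GCD of the return lengths $n$ with $p_{ii}^{(n)} > 0$ over a finite horizon (enough in practice for small chains; this is a sketch, not a general-purpose algorithm). The deterministic 3-cycle below is an assumed example:

```python
# Hedged sketch: estimate the period d_i = GCD{ n : p_ii^(n) > 0 }
# by examining powers of P up to a finite horizon.
from math import gcd

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, horizon=50):
    d = 0
    Pn = P                      # Pn holds P^n inside the loop
    for n in range(1, horizon + 1):
        if Pn[i][i] > 0:
            d = gcd(d, n)
        Pn = mat_mul(Pn, P)
    return d if d > 0 else None  # None: no return seen (d_i = infinity)

# Assumed example: deterministic 3-cycle, so every state has period 3.
P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
```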
19. Topology of the Transition Matrix
Theorem 4.1 Cyclic Structure
Any irreducible HMC has a unique partition of $E$ into $d$ classes
$C_0, C_1, \dots, C_{d-1}$ such that $\forall k$ and $\forall i \in C_k$,
$$\sum_{j \in C_{k+1}} p_{ij} = 1$$
(indices taken modulo $d$), where $d$ is maximal and equal to $d_i$, $\forall i \in E$.
20. Topology of the Transition Matrix
Example 4.2
HMC with period $d = 3$.
The general structure of $P$ with period $d = 4$ is the following:
$$P^{4m+1} = \begin{pmatrix} 0 & A_0 & 0 & 0 \\ 0 & 0 & A_1 & 0 \\ 0 & 0 & 0 & A_2 \\ A_3 & 0 & 0 & 0 \end{pmatrix}, \qquad
P^{4m+2} = \begin{pmatrix} 0 & 0 & B_0 & 0 \\ 0 & 0 & 0 & B_1 \\ B_2 & 0 & 0 & 0 \\ 0 & B_3 & 0 & 0 \end{pmatrix},$$
$$P^{4m+3} = \begin{pmatrix} 0 & 0 & 0 & C_0 \\ C_1 & 0 & 0 & 0 \\ 0 & C_2 & 0 & 0 \\ 0 & 0 & C_3 & 0 \end{pmatrix}, \qquad
P^{4m+4} = \begin{pmatrix} D_0 & 0 & 0 & 0 \\ 0 & D_1 & 0 & 0 \\ 0 & 0 & D_2 & 0 \\ 0 & 0 & 0 & D_3 \end{pmatrix}$$
[Diagram for Example 4.2: transition graph on states 1-7 with period 3]
21. Steady state
Definition 5.1 Stationary Distribution
A probability distribution $\pi$ satisfying
$$\pi^{T} = \pi^{T} P \qquad \text{(global balance equations)}$$
is called a stationary distribution of the transition matrix $P$, or of the
corresponding HMC.
The global balance equations imply that for all states $i \in E$,
$$\pi(i) = \sum_{j \in E} \pi(j)\, p_{ji}$$
and, iterating, we get
$$\pi^{T} = \pi^{T} P^{n}.$$
Theorem 5.1 Steady State
An HMC started with a stationary distribution is stationary.
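A sketch of finding $\pi$ numerically for a two-state chain of the form in Example 5.1, with assumed values $\alpha = 0.3$, $\beta = 0.6$: repeatedly applying $\nu \leftarrow \nu P$ drives any initial distribution to the fixed point $\pi = (\beta, \alpha)/(\alpha + \beta)$ when $\alpha, \beta \in (0,1)$.

```python
# Hedged sketch: stationary distribution of the assumed two-state chain
# P = [[1-a, a], [b, 1-b]] with a = 0.3, b = 0.6, found by power iteration.

a, b = 0.3, 0.6
P = [[1 - a, a],
     [b, 1 - b]]

pi = [0.5, 0.5]              # any starting distribution works
for _ in range(200):         # iterate nu <- nu P to the fixed point
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
```

The convergence rate is governed by the second eigenvalue $1 - \alpha - \beta$, so for these values it is very fast.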
22. Examples
Example 5.1
$E = \{1, 2\}$ and $P = \begin{pmatrix} 1-\alpha & \alpha \\ \beta & 1-\beta \end{pmatrix}$, where
$\alpha, \beta \in (0,1)$. Find the stationary distribution.
Example 5.4
More than one stationary distribution may exist.
Example 5.3 Symmetric Random Walk
Example 5.5 Repair Shop
23. Global Balance Equations and Probability Flows
Global balance equations
Consider the component $i$ of $\pi^{T} = \pi^{T} P$,
$$\pi(i) = \sum_{j \in E} \pi(j)\, p_{ji}.$$
Subtracting $\pi(i)\, p_{ii}$ from both sides, we have
$$\sum_{j \in E,\, j \ne i} \pi(i)\, p_{ij} = \sum_{j \in E,\, j \ne i} \pi(j)\, p_{ji},$$
which has a graphical interpretation in terms of probability flows:
in the steady state, the total probability flow out of state $i$ equals the total flow into $i$.
24. Detailed Balance Equations and Probability Flows
Detailed balance equations
If a distribution $\pi$ satisfies, for all $i, j \in E$,
$$\pi(i)\, p_{ij} = \pi(j)\, p_{ji} \qquad \text{(detailed balance equations)}$$
then $\pi$ is a stationary distribution.
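A minimal numerical check of detailed balance, using the Ehrenfest urn mentioned later (Example 6.2 & 5.2) with its standard dynamics, assumed here: with $N$ balls, $p_{i,i+1} = (N-i)/N$, $p_{i,i-1} = i/N$, and stationary distribution $\mathrm{Binomial}(N, 1/2)$.

```python
# Hedged sketch: verify pi(i) p_ij = pi(j) p_ji for the Ehrenfest urn
# with N balls; pi is Binomial(N, 1/2) (standard model, assumed here).
from math import comb

N = 6
pi = [comb(N, i) / 2**N for i in range(N + 1)]

def p(i, j):
    """Ehrenfest transition probabilities: move one ball at a time."""
    if j == i + 1:
        return (N - i) / N
    if j == i - 1:
        return i / N
    return 0.0

balanced = all(abs(pi[i] * p(i, j) - pi[j] * p(j, i)) < 1e-12
               for i in range(N + 1) for j in range(N + 1))
```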
25. Time reversal
Transition matrix of the reversed chain
Take the stationary distribution $\pi$ and define the
stochastic matrix $Q = (q_{ij})$ with entries
$$q_{ij} = p_{ji}\, \frac{\pi(j)}{\pi(i)}.$$
Then $Q$ is the transition matrix of the stationary
time-reversed chain:
$$q_{ij} = \mathbb{P}_\pi(X_{n-1} = j \mid X_n = i).$$
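A sketch of computing $Q$ from $P$ and $\pi$, on an assumed 3-state doubly stochastic chain (so $\pi$ is uniform and the reversal reduces to the transpose), then checking that $Q$ is indeed stochastic:

```python
# Hedged sketch: reversed-chain matrix q_ij = p_ji * pi(j) / pi(i)
# for an assumed doubly stochastic 3-state chain (pi uniform).

P = [[0.0, 0.8, 0.2],
     [0.2, 0.0, 0.8],
     [0.8, 0.2, 0.0]]
pi = [1 / 3, 1 / 3, 1 / 3]   # uniform, since P is doubly stochastic

n = len(P)
Q = [[P[j][i] * pi[j] / pi[i] for j in range(n)] for i in range(n)]

rows_sum_to_one = all(abs(sum(row) - 1.0) < 1e-12 for row in Q)
```

Note this chain is not reversible ($Q \ne P$): it circulates probability around the cycle, and the reversed chain circulates it the other way.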
26. Time reversal
Theorem 6.1 Reversal test
If $P$ and $Q$ are stochastic matrices and $\pi$ is a distribution
satisfying $\pi(i)\, q_{ij} = \pi(j)\, p_{ji},\ \forall i, j \in E$, then $\pi$ is
stationary.
Example 6.2 & 5.2 The urn of Ehrenfest
Example 6.3 Random walk on graphs
Example 6.4 Birth and Death process
27. Strong Markov property
Definition 7.1 Stopping time
A random time $\tau \in \mathbb{N}_0 \cup \{+\infty\}$ is a stopping time with
respect to a stochastic process $(X_n)_{n \ge 0}$ if the event
$$\{\tau = n\} \in \mathcal{F}_n,$$
that is, there is a function $f_n$ with values in $\{0, 1\}$ such
that
$$\mathbb{1}_{\{\tau = n\}} = f_n(X_0, X_1, \dots, X_{n-1}, X_n).$$
Example 7.1 & 7.5 Return times & successive return times
$$T_i = \inf\{n \ge 1 : X_n = i\}$$
Note that the return time $T_i$ differs from the hitting time $T_{\{i\}}$, which allows $n = 0$.
28. Strong Markov property
Theorem 7.1 Strong Markov property
Let $(X_n)_{n \ge 0}$ be an HMC with countable state space $E$
and transition matrix $P$, and let $\tau$ be a stopping time with
respect to it. Then, for any state $i \in E$, given that $X_\tau = i$
(in particular, $\tau < \infty$), the following holds:
• The process after $\tau$ and the process before $\tau$ are
independent.
• The process after $\tau$ is an HMC with transition matrix $P$.
Exercise 2.7.2 Markov chain restricted to a subset of $E$
29. Regeneration
Let
$$N_i = \sum_{n \ge 1} \mathbb{1}_{\{X_n = i\}}$$
be the number of visits to $i \in E$.
Theorem 7.2 Visits to a state
The distribution of $N_i$, given $X_0 = j$, is
$$\mathbb{P}_j(N_i = r) = \begin{cases} f_{ji}\, f_{ii}^{\,r-1} (1 - f_{ii}) & \text{for } r \ge 1 \\ 1 - f_{ji} & \text{for } r = 0 \end{cases}$$
where $f_{ji} = \mathbb{P}_j(T_i < \infty)$, with $T_i$ the return time to $i$.
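A small numerical check of Theorem 7.2, with assumed values $f_{ji} = 0.9$ and $f_{ii} = 0.5$: the probabilities sum to 1, and (since the number of returns after a first visit is geometric) the mean is $f_{ji}/(1 - f_{ii})$.

```python
# Hedged sketch: the law of the number of visits N_i from Theorem 7.2,
# with assumed return probabilities f_ji = 0.9 and f_ii = 0.5.

f_ji, f_ii = 0.9, 0.5

def p_visits(r):
    """P_j(N_i = r): geometric number of returns after a first visit."""
    if r == 0:
        return 1 - f_ji
    return f_ji * f_ii ** (r - 1) * (1 - f_ii)

# truncate the sums far enough that the geometric tail is negligible
total = sum(p_visits(r) for r in range(200))
mean = sum(r * p_visits(r) for r in range(200))
```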
30. Regeneration
Theorem 7.3 Recurrence
For any $i \in E$,
$$\mathbb{P}_i(T_i < \infty) = 1 \iff \mathbb{P}_i(N_i = \infty) = 1$$
and
$$\mathbb{P}_i(T_i < \infty) < 1 \iff \mathbb{P}_i(N_i = \infty) = 0 \iff \mathbb{E}_i[N_i] < \infty.$$
In particular, $\{N_i = \infty\}$ has $\mathbb{P}_i$-probability 0 or 1.
31. Regeneration
Theorem 7.4 Regenerative Cycle Theorem
Let $(X_n)_{n \ge 0}$ be an HMC with initial state 0 that is almost
surely visited infinitely often, $\mathbb{P}_0(N_0 = \infty) = 1$.
Denoting by $\tau_0 = 0, \tau_1, \tau_2, \dots$ the successive times of
visits to 0, called regeneration times, the pieces of
trajectory, called regenerative cycles,
$$\left(X_{\tau_k}, X_{\tau_k + 1}, X_{\tau_k + 2}, \dots, X_{\tau_{k+1} - 1}\right), \qquad k \ge 0,$$
are independent and identically distributed.
32. Bibliography
[B'99] Brémaud, P. (1999). Markov Chains: Gibbs Fields, Monte Carlo Simulation, and Queues. Springer-Verlag.
[N'97] Norris, J. R. (1997). Markov Chains. Cambridge University Press.