PROF. N. MAGAJI, ELE8301, KUT
CHAPTER 5
(CONTINUOUS AND DISCRETE-TIME MARKOV CHAINS)
Topics
Review of Markov Chains
Classification of states
First passage times
Absorbing states
Tutorials
DISCRETE-TIME MARKOV CHAIN
A stochastic process { Xn }, where n ∈ N = { 0, 1, 2, . . . }, is
called a discrete-time Markov chain if
Pr{ Xn+1 = j | X0 = k0, . . . , Xn-1 = kn-1, Xn = i }
= Pr{ Xn+1 = j | Xn = i }    (the transition probabilities)
for every i, j, k0, . . . , kn-1 and for every n.
The future behavior of the system depends only on the
current state i and not on any of the previous states.
WHAT IS A MARKOV CHAIN?
A Markov chain is a stochastic model describing a sequence of possible events in
which the probability of each event depends only on the state attained in the
previous event. In continuous time, it is known as a Markov process.
Markov analysis is a method used to forecast the value of a variable whose future
value is influenced only by its current position or state, not by any prior activity that
led the variable to its current position or state.
The one-step transition matrix for a Markov chain
with states S = { 0, 1, 2 } is
        | p00  p01  p02 |
    P = | p10  p11  p12 |
        | p20  p21  p22 |
where pij = Pr{ X1 = j | X0 = i }
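To make the definition concrete, here is a small illustrative sketch (not from the slides) that stores a 3-state transition matrix with made-up probabilities, checks that each row sums to 1, and simulates a few transitions, drawing each next state from the current state only, as the Markov property requires.

```python
import numpy as np

# Hypothetical 3-state one-step transition matrix (illustrative values only).
# Row i holds p_i0, p_i1, p_i2 = Pr{X1 = j | X0 = i}.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Every row of a transition matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

rng = np.random.default_rng(0)
state = 0                       # start in state 0
for n in range(5):
    # The next state is drawn using only the current state (Markov property).
    state = rng.choice(3, p=P[state])
    print(f"X_{n+1} = {state}")
```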
CLASSIFICATION OF STATES
Accessible: Possible to go from state i to state j (path exists in
the network from i to j).
[State-transition diagram: states 0, 1, 2, 3, 4, ... with forward arcs a0, a1, a2, a3 and backward arcs d1, d2, d3, d4]
Two states communicate if both are accessible from each other. A
system is irreducible if all states communicate.
State i is recurrent if, after leaving it, the system is certain to return to it at
some time in the future.
If a state is not recurrent, it is transient.
CLASSIFICATION OF STATES (CONTINUED)
A state is periodic if it can return to itself only after a number of
transitions that is a multiple of some fixed integer greater than 1.
A state that is not periodic is aperiodic.
[Two state-transition diagrams: (a) each state is visited every 3 iterations; (b) each state is visited in multiples of 3 iterations]
CLASSIFICATION OF STATES (CONTINUED)
An absorbing state is one that, once entered, the system never leaves.
Fig. 5.1 [State-transition diagram: states 0, 1, 2, 3, 4 with winning arcs a1, a2, a3 and losing arcs d1, d2, d3]
This diagram might represent the wealth of a gambler who
begins with $2 and makes a series of wagers for $1 each.
Let ai be the event of winning in state i and di the event of
losing in state i.
There are two absorbing states: 0 and 4.
CLASSIFICATION OF STATES (CONTINUED)
Class: set of states that communicate with each other.
A class is either all recurrent or all transient and may be either
all periodic or aperiodic.
States in a recurrent class communicate only with each other, and no arcs
leave any of the corresponding nodes in the network diagram to nodes outside
the class; arcs may enter the class from outside. A transient class, in
contrast, has at least one arc passing from a node in the class to one outside,
so the process eventually leaves it and never returns.
ILLUSTRATION OF CONCEPTS
Example 1

State | 0 | 1 | 2 | 3
  0   | 0 | X | X | 0
  1   | X | 0 | 0 | 0
  2   | 0 | 0 | 0 | X
  3   | X | 0 | 0 | X

(An X marks a one-step transition with positive probability.)
Every pair of states communicates, forming a single recurrent
class; moreover, the states are not periodic.
Thus the stochastic process is aperiodic and irreducible.
ILLUSTRATION OF CONCEPTS
Example 2

State | 0 | 1 | 2 | 3 | 4
  0   | X | X | 0 | 0 | 0
  1   | X | X | 0 | 0 | 0
  2   | 0 | 0 | X | 0 | 0
  3   | 0 | 0 | X | X | 0
  4   | X | 0 | 0 | 0 | 0
States 0 and 1 communicate and form a recurrent class.
States 3 and 4 each form a separate transient class.
State 2 is an absorbing state and forms a recurrent class.
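As a rough sketch of how these classifications can be checked mechanically (not part of the original slides), the code below encodes the X-marks of Example 2 as a Boolean adjacency matrix, computes which states communicate, and labels each class recurrent or transient, using the fact that in a finite chain a class is recurrent exactly when no arc leaves it.

```python
import numpy as np

# Example 2: X marks (one-step transitions with positive probability).
A = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 1, 0],
    [1, 0, 0, 0, 0],
], dtype=bool)

n = len(A)
# Reachability (Warshall's algorithm): reach[i, j] = True if j is accessible from i.
reach = A | np.eye(n, dtype=bool)
for k in range(n):
    reach |= reach[:, [k]] & reach[[k], :]

# States i and j communicate if each is accessible from the other.
comm = reach & reach.T
classes = {frozenset(int(j) for j in np.flatnonzero(comm[i])) for i in range(n)}

for c in sorted(classes, key=min):
    members = sorted(c)
    # A finite class is recurrent iff it is closed: no arc leaves the class.
    closed = not any(A[i, j] for i in members for j in range(n) if j not in members)
    print(members, "recurrent" if closed else "transient")
```

Running this reproduces the conclusions above: {0, 1} recurrent, {2} recurrent (absorbing), {3} and {4} transient.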
ILLUSTRATION OF CONCEPTS
Example 3

State | 0 | 1 | 2 | 3
  0   | 0 | X | X | 0
  1   | 0 | 0 | 0 | X
  2   | 0 | 0 | 0 | X
  3   | X | 0 | 0 | 0
Every state communicates with every other state, so we
have an irreducible stochastic process.
Periodic? Yes: every return to a state takes a multiple of 3 transitions, so the Markov chain is irreducible and periodic with period 3.
CLASSIFICATION OF STATES
Example

          1    2    3    4    5
    1 | 0.4  0.6  0    0    0   |
    2 | 0.5  0.5  0    0    0   |
P = 3 | 0    0    0.3  0.7  0   |
    4 | 0    0    0.5  0.4  0.1 |
    5 | 0    0    0    0.8  0.2 |

[State-transition network for this matrix]
A state j is accessible from state i if pij(n) > 0 for some n > 0.
In this example, state 2 is accessible from state 1
and state 3 is accessible from state 5,
but state 3 is not accessible from state 2.
States i and j communicate if i is accessible from j
and j is accessible from i.
States 1 & 2 communicate; also
states 3, 4 & 5 communicate.
States 2 & 4 do not communicate
States 1 & 2 form one communicating class.
States 3, 4 & 5 form a second communicating class.
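The accessibility statements above can be verified numerically. The sketch below is illustrative only (state k of the slides corresponds to index k-1): it raises the transition matrix to successive powers and reports whether pij(n) > 0 for some n.

```python
import numpy as np

# Transition matrix of the example above; row/column k corresponds to state k+1.
P = np.array([
    [0.4, 0.6, 0.0, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.3, 0.7, 0.0],
    [0.0, 0.0, 0.5, 0.4, 0.1],
    [0.0, 0.0, 0.0, 0.8, 0.2],
])

def accessible(P, i, j, max_n=None):
    """True if p_ij(n) > 0 for some n between 1 and max_n (default: number of states)."""
    n_states = len(P)
    max_n = max_n or n_states
    Pn = np.eye(n_states)
    for _ in range(max_n):
        Pn = Pn @ P                 # n-step transition probabilities
        if Pn[i, j] > 0:
            return True
    return False

print(accessible(P, 0, 1))   # state 2 accessible from state 1 -> True
print(accessible(P, 4, 2))   # state 3 accessible from state 5 -> True
print(accessible(P, 1, 2))   # state 3 accessible from state 2 -> False
```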
If all states in a Markov chain communicate
(i.e., all states are members of the same communicating class)
then the chain is irreducible.
The current example is not an irreducible Markov chain.
Neither is the Gambler’s Ruin example, which
has 3 classes: {0}, {1, 2, 3} and {4}.
First Passage Times
Let fii = probability that the process will return to state i
(eventually) given that it starts in state i.
If fii = 1, then state i is called recurrent.
If fii < 1, then state i is called transient.
If pii = 1, then state i is called an absorbing state.
The above example has no absorbing states.
States 0 & 4 are absorbing in the Gambler’s Ruin problem.
The period of a state i is the largest k such that
all paths leading back to i have a length that is
a multiple of k (equivalently, the greatest common divisor of the
lengths of all return paths);
i.e., pii(n) = 0 unless n = k, 2k, 3k, . . .
If a process that started in state i can be in state i at both time n and time n + 1,
then state i is aperiodic.
Each of the states in the current example is aperiodic.
Linear Equations for Absorption Probabilities
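The equations themselves are not reproduced in these notes. As a sketch of the standard first-step analysis for the Gambler's Ruin chain of Fig. 5.1, with an assumed win probability p, the probability ui of eventual absorption at state 0 satisfies ui = (1-p)·u(i-1) + p·u(i+1) for the interior states, with u0 = 1 and u4 = 0; the code below solves this small linear system.

```python
import numpy as np

p = 0.5          # assumed win probability (illustrative value)
N = 4            # absorbing barriers at 0 and N

# u[i] = Pr{absorbed at 0 | start in state i}.
# First-step equations: u[i] = (1-p)*u[i-1] + p*u[i+1] for 0 < i < N,
# with boundary conditions u[0] = 1 and u[N] = 0.
A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0], b[0] = 1.0, 1.0          # u[0] = 1
A[N, N], b[N] = 1.0, 0.0          # u[N] = 0
for i in range(1, N):
    A[i, i] = 1.0
    A[i, i - 1] = -(1 - p)
    A[i, i + 1] = -p

u = np.linalg.solve(A, b)
print(u)    # with p = 0.5 this gives [1, 0.75, 0.5, 0.25, 0]
```

The probability of being absorbed at state 4 instead is 1 - u[i], since absorption at one of the two barriers is certain.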
EXAMPLE OF PERIODICITY - GAMBLER’S RUIN
States 1, 2 and 3 each have period 2.
          0    1    2    3    4
    0 |  1    0    0    0    0  |
    1 | 1-p   0    p    0    0  |
P = 2 |  0   1-p   0    p    0  |
    3 |  0    0   1-p   0    p  |
    4 |  0    0    0    0    1  |
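As an illustrative check of the claim that states 1, 2 and 3 have period 2, the sketch below builds this matrix for an assumed p = 0.5 and computes each state's period as the greatest common divisor of the return lengths n with pii(n) > 0, scanned up to a cutoff.

```python
import math
import numpy as np

p = 0.5   # assumed win probability (illustrative value)
P = np.array([
    [1,     0,     0,     0, 0],
    [1 - p, 0,     p,     0, 0],
    [0,     1 - p, 0,     p, 0],
    [0,     0,     1 - p, 0, p],
    [0,     0,     0,     0, 1],
])

def period(P, i, max_n=20):
    """gcd of all return lengths n <= max_n with p_ii(n) > 0 (cutoff-based sketch)."""
    g = 0
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            g = math.gcd(g, n)
    return g

print([period(P, i) for i in range(5)])   # -> [1, 2, 2, 2, 1]
```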
If all states in a Markov chain are recurrent and
aperiodic, and the chain is irreducible, then it is
ergodic.
EXISTENCE OF STEADY-STATE PROBABILITIES
A Markov chain is ergodic if it is aperiodic and allows
the attainment of any future state from any initial state
after one or more transitions. If these conditions hold,
then
πj = lim (n → ∞) pij(n) = steady-state probability for state j

For example, consider the state-transition network with matrix

        | 0.8  0    0.2 |
    P = | 0.4  0.3  0.3 |
        | 0    0.9  0.1 |
Conclusion: chain is ergodic.
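As a sketch of how the steady-state probabilities for this example can be computed, the code below solves π = πP together with the normalization Σ πj = 1, and compares the result with a high power of P, whose rows approach the same limiting values.

```python
import numpy as np

P = np.array([
    [0.8, 0.0, 0.2],
    [0.4, 0.3, 0.3],
    [0.0, 0.9, 0.1],
])

# Steady-state probabilities solve pi = pi P together with sum(pi) = 1.
# Rewrite as (P^T - I) pi = 0 plus one normalization equation.
n = len(P)
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)                               # approximately [0.529, 0.265, 0.206]

# Each row of P^n approaches the same steady-state vector for large n.
print(np.linalg.matrix_power(P, 100))
```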
TUTORIALS
Q1
(i) Draw the graph of this chain.
(ii) Identify the communicating class(es).
(iii) Find the recurrent, transient, and absorbing state(s) of this chain.
(iv) Compute the fraction of time spent at state 0 in the long run.

Solution
(i) [Graph of the chain]
(ii) The communicating classes are {0, 1}, {2}, {3}, and {4}.
(iii) States 3 and 4 are transient, states 0 and 1 are recurrent, and state 2 is absorbing (hence recurrent).
(iv) [Long-run fraction of time at state 0]
TUTORIALS Q2
(i) Draw the graph of this chain. Is the chain reducible?
(ii) Identify the communicating class(es).
(iii) Find the recurrent, transient, and absorbing state(s) of this chain.

Solution
(i) [Graph of the chain]
(ii) The chain is reducible and its communicating classes are {0, 1}, {2} and {3}.
(iii) State 3 is transient, and states 0, 1, 2 are recurrent.
TUTORIALS
Q3 Gambler’s Ruin Example: How do we estimate p?
Consider the probability p of winning on any turn, given a state-transition diagram as in the Gambler’s Ruin example.
Solution
(i) Find the probability transition matrix.
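The slides do not show numbers here; as an illustrative sketch, p can be estimated from a record of individual bets as the fraction won (the data below are made up), and the transition matrix then follows the $1-per-bet Gambler's Ruin pattern with absorbing barriers at 0 and 4.

```python
import numpy as np

# Hypothetical record of individual bets: 1 = win, 0 = loss (illustrative data).
bets = [1, 0, 0, 1, 1, 0, 1, 0, 0, 0]

# Natural estimate of the win probability: fraction of bets won.
p_hat = sum(bets) / len(bets)

def gamblers_ruin_matrix(p, N=4):
    """Transition matrix for a $1-per-bet gambler with absorbing states 0 and N."""
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = P[N, N] = 1.0            # absorbing barriers
    for i in range(1, N):
        P[i, i - 1] = 1 - p            # lose $1
        P[i, i + 1] = p                # win $1
    return P

print(p_hat)
print(gamblers_ruin_matrix(p_hat))
```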
Q4 Credit Evaluation Example
• Every month, credit accounts are checked to determine the state of each customer:
– State 0: Fully paid
– State 1: Payment on account is 1 month (1-30 days) overdue
– State 2: Payment on account is 2 months (31-60 days) overdue
– State 3: Account is written off as bad debt
• The transition probability matrix is given as
TUTORIALS
Q4 Credit Evaluation Example (cont.)
What is the probability an account 1 month overdue eventually gets fully paid? Becomes
a bad debt?
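The transition matrix from the slides is not reproduced above, so the sketch below uses clearly hypothetical probabilities only to show the calculation: with state 0 (fully paid) and state 3 (bad debt) absorbing, the absorption probabilities from the transient states 1 and 2 are the rows of (I - Q)^(-1) R, where Q and R are the transient-to-transient and transient-to-absorbing blocks of P.

```python
import numpy as np

# Hypothetical transition matrix (NOT the one from the slides), ordered as
# states [0: fully paid, 1: one month overdue, 2: two months overdue, 3: bad debt].
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # fully paid: absorbing
    [0.6, 0.2, 0.2, 0.0],   # 1 month overdue
    [0.4, 0.1, 0.3, 0.2],   # 2 months overdue
    [0.0, 0.0, 0.0, 1.0],   # bad debt: absorbing
])

transient = [1, 2]
absorbing = [0, 3]
Q = P[np.ix_(transient, transient)]   # transient-to-transient block
R = P[np.ix_(transient, absorbing)]   # transient-to-absorbing block

# Row i gives Pr{eventually fully paid} and Pr{eventually bad debt}
# for an account starting in transient state transient[i]; each row sums to 1.
F = np.linalg.inv(np.eye(len(Q)) - Q) @ R
print(F)
```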