The document discusses Markov processes and key concepts such as:
1) The states of a system change between time periods according to transition probabilities; 2) the probability of the system being in a particular state in a period depends only on its state in the previous period; 3) steady state probabilities exist when the state probabilities no longer change from period to period. Examples of Markov processes include modeling customer shopping behavior and accounts receivable aging analysis. Absorbing states and fundamental matrices are important concepts for processes with absorbing states.
1. MARKOV PROCESSES
• Trials of the system: Events/ Points of Time
• State of the System
• Transition Probability
• State Probability
• Steady State probability
• Absorbing State
• Fundamental Matrix
2. Markov Processes
• Trials of the Process: The events that trigger transitions of the system from one
state to another. In many applications successive time periods represent the trials
of the process.
• State of the System: The condition of the system at any particular trial or time
period.
• Transition Probability: Given the system is in state i during one period, the
transition probability pij is the probability that the system will be in state j during
the next period.
• State Probability: The probability that the system will be in any particular
state. (That is, πi(n) is the probability of the system being in state i in period n.)
• Steady State Probability: The probability that the system will be in any
particular state after a large number of transitions. Once steady state has been
reached, the state probabilities do not change from period to period.
3. MARKOV PROCESSES
• The evolution of systems over repeated trials
• The state of the system cannot be determined with certainty
• Transition probabilities are used
• Probability of the system being in a particular state at a given time
5. MARKOV PROCESSES
MARKOV CHAIN
Probability of being in a particular state at
any time period depends only on the state
in the immediately preceding time period.
6. MARKOV PROCESSES
A. MARKET SHARE:
Trials of the Process: Shopping Trips (Daily/Weekly/Monthly)
State of the System: Store Selected in a given period.
Two Shopping Alternatives- Two States (Finite)
State 1. Customer Shops at ABC
State 2. Customer Shops at XYZ.
The system is in State 2 at Trial 4 => the customer shopped at XYZ on the fourth shopping trip.
7. MARKOV PROCESSES
MARKET RESEARCH:
Data - 100 Shoppers over 30 days.
Probability of selecting a store (State) in a given
period in terms of the store (state) that was selected
during the previous period.
Of all customers who shopped at ABC in a day,
90% shopped at ABC the following day while 10%
switched to XYZ.
Similarly, of all customers who shopped at XYZ in
a day, 80% shopped at XYZ the following day while
20% switched to ABC.
Transition from a state in a given period to another
state in the following period.
8. MARKOV PROCESSES
TRANSITION PROBABILITIES:
                        Next Period
Current Period       ABC       XYZ
     ABC             0.9       0.1
     XYZ             0.2       0.8
pij = probability of making a transition
from state i in a given period to state j in
the next period.
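As a minimal sketch, the transition table above can be written as a NumPy matrix (the array layout, state ordering, and variable names here are illustrative, not from the slides):

```python
import numpy as np

# Illustrative sketch: state 0 = ABC, state 1 = XYZ (ordering assumed).
P = np.array([[0.9, 0.1],   # ABC shopper: 90% stay, 10% switch to XYZ
              [0.2, 0.8]])  # XYZ shopper: 20% switch to ABC, 80% stay

# Every row of a transition matrix must sum to 1:
# from any state, the system moves to *some* state next period.
assert np.allclose(P.sum(axis=1), 1.0)

# p_ij = P[i, j]: e.g. the chance an ABC shopper switches to XYZ.
print(P[0, 1])  # 0.1
```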
10. MARKOV PROCESSES
πi(n) = probability that the system is in
state i in period n (state probability).
i = state; n = number of transitions.
[π1(0) π2(0)] = [1 0] (starts at ABC) or [0 1] (starts at XYZ)
11. MARKOV PROCESSES
Π(n) = [π1(n) π2(n)] : vector of the state
probabilities of the system in period n.
State probabilities for period n+1:
Π(next period) = Π(current period) P
Π(n+1) = Π(n) P
Π(0) = [1, 0]
therefore, Π(1) = Π(0) P (Period 1)
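The recursion above can be iterated numerically; a small sketch assuming the 0.9/0.1, 0.2/0.8 matrix from the earlier slide and a customer who starts at ABC:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

pi = np.array([1.0, 0.0])    # Pi(0): the customer starts at ABC
for n in range(1, 4):
    pi = pi @ P              # Pi(n+1) = Pi(n) P
    print(f"Pi({n}) =", pi)
# Pi(1) = [0.9 0.1], Pi(2) = [0.83 0.17], Pi(3) = [0.781 0.219]
```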
17. MARKOV PROCESSES
Steady State Probabilities:
The probabilities after a large number of
transitions (independent of the beginning
state of the system).
π1 = Steady state probability for state 1.
π2 = Steady state probability for state 2.

[π1(n+1) π2(n+1)] = [π1(n) π2(n)] | p11 p12 |  =  [π1(n) π2(n)] | 0.9 0.1 |
                                  | p21 p22 |                   | 0.2 0.8 |
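The steady state vector can also be found directly by solving π = πP together with π1 + π2 = 1; a sketch (this linear-algebra setup is an assumption, not shown in the slides):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# pi (P - I) = 0 gives one independent equation for a 2-state chain;
# replace the redundant second one with the normalisation pi1 + pi2 = 1.
A = np.vstack([(P.T - np.eye(2))[0], np.ones(2)])
pi = np.linalg.solve(A, np.array([0.0, 1.0]))
print(pi)       # [0.6667 0.3333]: in the long run, 2/3 of trips go to ABC
print(pi @ P)   # unchanged by a further transition (steady state)
```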
20. MARKOV PROCESSES
B. ACCOUNTS RECEIVABLE ANALYSIS:
Two aging categories:
(i) 0-3 months (ii) 4-6 months
Bad debt: > 6 months.
March 31: Rs. 5,000 accounts receivable.

Date            Rupees
December 10      2000
February 10      1000
March 18          500
March 30         1500

How much bad debt as of March 31?
Total Balance Method.
22. MARKOV PROCESSES
Transition Probabilities:
pij = probability of a rupee in state i in one
month moving to state j in the next month.
States: 1 = paid (collected), 2 = bad debt,
3 = 0-3 month category, 4 = 4-6 month category.

        | p11 p12 p13 p14 |   | 1   0   0   0  |
P  =    | p21 p22 p23 p24 | = | 0   1   0   0  |
        | p31 p32 p33 p34 |   | .4  0   .3  .3 |
        | p41 p42 p43 p44 |   | .4  .2  .3  .1 |
23. MARKOV PROCESSES
Absorbing State:
The probability of a transition to any other state
is 0 (the system remains in that state indefinitely).
For chains with absorbing states, steady state
probabilities are not computed; instead, compute the
absorbing state probabilities for Rs. in the (0-3)
and (4-6) month categories.
27. MARKOV PROCESSES
=> Probability that A/C receivable in states
3 or 4 will eventually reach each of the
absorbing states.
Let B = [b1  b2], where b1 = Rs. in the (0-3) month
state and b2 = Rs. in the (4-6) month state.
If b1 = Rs. 3000 and b2 = Rs. 2000:

B · NR = [3000  2000] | 0.89  0.11 |
                      | 0.74  0.26 |

       = [4150    850]
        Collected  Bad Debt
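The NR matrix quoted above can be reproduced by partitioning P into Q (transitions among non-absorbing states) and R (transitions into absorbing states) and computing the fundamental matrix N = (I - Q)^-1; a sketch, with the state ordering assumed from the slides:

```python
import numpy as np

# State ordering assumed: 0 = paid, 1 = bad debt (both absorbing),
# 2 = 0-3 month category, 3 = 4-6 month category.
P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.4, 0.0, 0.3, 0.3],
              [0.4, 0.2, 0.3, 0.1]])

Q = P[2:, 2:]                     # non-absorbing -> non-absorbing
R = P[2:, :2]                     # non-absorbing -> absorbing
N = np.linalg.inv(np.eye(2) - Q)  # fundamental matrix N = (I - Q)^-1
NR = N @ R                        # absorption probabilities
                                  # ~ [.89 .11; .74 .26] after rounding

B = np.array([3000.0, 2000.0])    # rupees in the 0-3 and 4-6 month states
print(B @ NR)                     # ~ [4150  850]: collected vs. bad debt
```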
28. MARKOV PROCESSES
Credit Policy --- discount for prompt payment
            | 1   0   0   0  |
New P  =    | 0   1   0   0  |
            | .6  0   .3  .1 |
            | .4  .2  .3  .1 |

N = | 1.5  0.17 |     NR = | .97  .03 |
    | 0.5  1.17 |          | .77  .23 |

B · NR = [3000  2000] | .97  .03 |
                      | .77  .23 |
29. MARKOV PROCESSES
= [4450    550]
 Collected  Bad Debt
(net of costs + discounts)
Bad debt falls from Rs. 850 to Rs. 550:
a 6% reduction (Rs. 300 on Rs. 5,000 of receivables).
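The same computation under the revised matrix reproduces the Rs. 4450 / Rs. 550 split (the slide figures are rounded); a sketch, with the same assumed state ordering as before:

```python
import numpy as np

# Revised matrix: the 0-3 month row changes from [.4 0 .3 .3] to
# [.6 0 .3 .1] under the prompt-payment discount.
P_new = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0],
                  [0.6, 0.0, 0.3, 0.1],
                  [0.4, 0.2, 0.3, 0.1]])

Q, R = P_new[2:, 2:], P_new[2:, :2]
NR = np.linalg.inv(np.eye(2) - Q) @ R   # ~ [.97 .03; .77 .23] rounded

B = np.array([3000.0, 2000.0])
print(B @ NR)   # ~ [4450  550]: bad debt drops from ~Rs. 850 to ~Rs. 550
```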
30. Markov Processes
• Absorbing State: A state is said to be absorbing if
the probability of making a transition out of that
state is zero. Thus once the system has made a
transition into an absorbing state, it will remain
there.
• Fundamental Matrix: A matrix necessary for the
computation of probabilities associated with
absorbing states of a Markov Process.