Basic Review of Markov Chains and Hidden Markov Models. The Teradata solution is not limited to these algorithms.
See more on Customer Journey here:
http://www.teradata.com/News-Releases/2016/Teradatas-Customer-Journey-Analytic-Solution-Creates-Behavioral-Insights-to-Deliver-a-Distinct-Customer-Experience/
2. Markov Chains
• Wikipedia definition
– A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event
• Simple explanation
– A sequence is a chain made up of mini event transitions
• We use Markov Chains as a discriminative model (Supervised)
3. Markov Chains - Example
• Customer Journeys
[Diagram: example clickstream journeys through events such as View Account Summary, View Deposit Details, FAQ, Account History, Fund Transfer, and Bill Pay Form, ending in Bill Pay Enrollment - the favorable outcome!]
4. Transition Probabilities with Adjacency Matrix (Training)
• BILL PAY ENROLLMENT
AH AS BPF BPI CS FAQ FT OSE PU VDD
AH 0.012 0.016 0.02 0.007 0.005 0.014 0.011 0.008 0.001 0.016
AS 0.036 0.038 0.057 0.010 0.020 0.030 0.038 0.008 0.016 0.040
BPF 0.001 0.004 0.007 0.000 0.007 0.002 0.004 0.002 0.000 0.002
BPI 0.006 0.005 0.000 0.012 0.001 0.005 0.008 0.004 0.001 0.012
CS 0.007 0.007 0.012 0.006 0.000 0.007 0.004 0.007 0.002 0.006
FAQ 0.011 0.018 0.028 0.005 0.005 0.018 0.013 0.004 0.002 0.018
FT 0.011 0.016 0.024 0.005 0.005 0.014 0.013 0.006 0.010 0.016
OSE 0.006 0.010 0.007 0.004 0.004 0.010 0.004 0.000 0.002 0.005
PU 0.007 0.001 0.006 0.000 0.006 0.002 0.007 0.002 0.000 0.006
VDD 0.018 0.016 0.029 0.007 0.004 0.017 0.012 0.007 0.005 0.012
Zero entries are replaced with small values to make the chain “Ergodic”
5. Transition Probabilities with Adjacency Matrix (Training)
• <NOT> BILL PAY ENROLLMENT
AH AS BPF BPI CS FAQ FT OSE PU VDD
AH 0.026 0.027 0.000 0.001 0.004 0.027 0.027 0.003 0.004 0.027
AS 0.059 0.058 0.000 0.002 0.012 0.059 0.057 0.011 0.012 0.058
BPF 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
BPI 0.001 0.001 0.000 0.001 0.000 0.001 0.001 0.000 0.000 0.001
CS 0.005 0.004 0.000 0.000 0.000 0.005 0.004 0.002 0.002 0.005
FAQ 0.027 0.026 0.000 0.001 0.004 0.027 0.027 0.004 0.004 0.028
FT 0.026 0.027 0.000 0.001 0.004 0.027 0.027 0.004 0.003 0.028
OSE 0.004 0.004 0.000 0.000 0.002 0.004 0.004 0.000 0.002 0.004
PU 0.004 0.004 0.000 0.000 0.002 0.004 0.004 0.002 0.000 0.004
VDD 0.027 0.029 0.000 0.001 0.003 0.026 0.027 0.004 0.004 0.027
Zero entries are replaced with small values to make the chain “Ergodic”
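The training step above can be sketched in a few lines. This is an illustrative implementation, not the Teradata one: the function name `transition_matrix` and the `epsilon` smoothing value are assumptions. It counts first-order transitions per outcome cohort and replaces zero counts with a small value so every state can reach every other state (ergodicity):

```python
def transition_matrix(sequences, states, epsilon=1e-4):
    """Estimate first-order transition probabilities from event sequences,
    replacing zero counts with a small epsilon so the chain is ergodic."""
    counts = {s: {t: 0.0 for t in states} for s in states}
    for seq in sequences:
        for prev, cur in zip(seq, seq[1:]):
            counts[prev][cur] += 1
    matrix = {}
    for s in states:
        # zero counts become epsilon, then each row is normalized to sum to 1
        row = {t: counts[s][t] if counts[s][t] > 0 else epsilon for t in states}
        total = sum(row.values())
        matrix[s] = {t: v / total for t, v in row.items()}
    return matrix

# Toy example with three of the slide's state codes
P = transition_matrix([["AS", "AH", "AS", "BPF"], ["AH", "AS", "BPF"]],
                      ["AH", "AS", "BPF"])
```

Training one such matrix on the "Bill Pay Enrollment" journeys and another on the "<NOT> Bill Pay Enrollment" journeys yields the two tables shown above.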
6. Markov Chains - Prediction
• What’s the likelihood that these event sequences will lead to Bill Pay Enrollment?
[Diagram: two “unknown” event sequences over the Observable States (View Account Summary, View Deposit Details, FAQ, Account History, Fund Transfer, Bill Pay Form), with Bill Pay Enrollment as the outcome in question]
7. Markov Chains – Simple Prediction
• What’s the likelihood that these “unknown” event sequences/paths will lead to Bill Pay Enrollment?
Log Odds Ratio = SUM(LOG(P[Xi|Xi-1]) for BPF) - SUM(LOG(P[Xi|Xi-1]) for NOT BPF), i > 1
[Diagram: the two example sequences score -1.35 and 4.80 respectively]
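The log-odds score above can be sketched as follows. This is a minimal illustration, assuming the two nested transition-probability dicts `p_enroll` and `p_not` were trained as on the earlier slides; the function name and example matrices are made up for the sketch:

```python
import math

def log_odds(seq, p_enroll, p_not):
    """Sum of log transition probabilities under the enrollment model
    minus the same sum under the non-enrollment model.
    Positive scores favor Bill Pay Enrollment."""
    score = 0.0
    for prev, cur in zip(seq, seq[1:]):
        score += math.log(p_enroll[prev][cur]) - math.log(p_not[prev][cur])
    return score

# Tiny illustrative matrices (two states: Account Summary, Bill Pay Form)
p_enroll = {"AS": {"AS": 0.4, "BPF": 0.6}, "BPF": {"AS": 0.5, "BPF": 0.5}}
p_not    = {"AS": {"AS": 0.9, "BPF": 0.1}, "BPF": {"AS": 0.5, "BPF": 0.5}}

score = log_odds(["AS", "AS", "BPF"], p_enroll, p_not)
```

A sequence whose transitions are more probable under the enrollment model scores positive (like the 4.80 example), and one more probable under the non-enrollment model scores negative (like the -1.35 example).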
8. Bimodal Distribution of Scores
[Histogram: # of sequences scored vs. score, from -9 to 10. Non-Bill Pay Enrollment sequences cluster on one side and Bill Pay Enrollment sequences on the other, with a CONFUSION region where the two modes overlap]
9. Markov Chains Implementation – Advanced
• Find the optimal Markov order: test 2, 3, 4, etc.
• Binary classification -> K-class through multiple model scoring
• Regularization technique for sequence-length variability
• Mutual Information score filters
• Stacked ensemble with a second classifier built on multi-model MC scores
• Models exported to real time for scoring
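The first bullet, raising the Markov order, can be sketched by conditioning each event on the previous k events instead of just one. This is an assumed illustration (the function name `ngram_transitions` is not from the deck):

```python
from collections import Counter

def ngram_transitions(sequences, order=2):
    """Estimate order-k transition probabilities by treating the last
    `order` events as the conditioning context,
    e.g. P(X_i | X_{i-1}, X_{i-2}) for order=2."""
    counts = Counter()
    context_totals = Counter()
    for seq in sequences:
        for i in range(order, len(seq)):
            context = tuple(seq[i - order:i])
            counts[(context, seq[i])] += 1
            context_totals[context] += 1
    # normalize each context's counts into probabilities
    return {k: v / context_totals[k[0]] for k, v in counts.items()}

P2 = ngram_transitions([["a", "b", "c", "a", "b", "c"]], order=2)
```

Testing orders 2, 3, 4, etc. then amounts to scoring held-out sequences under each order's model and keeping the order that classifies best; higher orders need the same zero-replacement smoothing as before, since contexts get sparse quickly.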
10. Hidden Markov Model
• A Hidden Markov Model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states
• In Markov Chains, the state is directly visible to the observer, so the state transition probabilities are the only parameters. In a Hidden Markov Model, the state is not directly visible, but the output, which depends on the state, is visible!
• HMM is commonly useful as a generative model (Unsupervised)
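The generative view can be made concrete: an HMM is fully specified by an initial distribution, a hidden-state transition matrix, and per-state emission distributions, and it can generate journeys. A minimal sketch, with made-up state and event names (not the trained model from the deck):

```python
import random

def sample_hmm(pi, A, B, length, seed=0):
    """Generate (hidden, observed) sequences from an HMM:
    pi - initial hidden-state distribution
    A  - hidden-state transition probabilities
    B  - emission probabilities from hidden state to observable event."""
    rng = random.Random(seed)

    def draw(dist):
        return rng.choices(list(dist), weights=list(dist.values()))[0]

    hidden, observed = [], []
    state = draw(pi)
    for _ in range(length):
        hidden.append(state)
        observed.append(draw(B[state]))  # emit a visible event
        state = draw(A[state])           # move to the next hidden state
    return hidden, observed

pi = {"H1": 0.7, "H2": 0.3}
A = {"H1": {"H1": 0.6, "H2": 0.4}, "H2": {"H1": 0.2, "H2": 0.8}}
B = {"H1": {"FAQ": 0.5, "Account History": 0.5},
     "H2": {"Fund Transfer": 0.3, "Bill Pay Form": 0.7}}
hidden, observed = sample_hmm(pi, A, B, length=5)
```

Only `observed` would be visible in the clickstream; the `hidden` states are what the model infers.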
11. Hidden Markov Models - Example
[Diagram: the same Observable States journeys as before (View Account Summary, View Deposit Details, FAQ, Account History, Fund Transfer, Bill Pay Form, Bill Pay Enrollment), with each observed event annotated with a Hidden State: H1 H1 H2 H3 H3 H1 for one journey and H1 H1 H4 H4 H5 H5 for the other]
12. Hidden Markov Model – Transitions & Emissions (5 Unsupervised Hidden States)
[Diagram: hidden states H1–H5 connected by transition probabilities, each emitting observable events (View Account Summary, View Deposit Details, Account History, FAQ, Fund Transfer, Bill Pay Form, Bill Pay Enrollment) with emission probabilities ranging from 10 % to 80 %]
13. Hidden Markov Model – Sequence Prediction
• What’s the likelihood that these “unknown” event sequences will lead to Bill Pay Enrollment?
FORWARD ALGORITHM + MODEL
[Diagram: the two unknown sequences score 20 % and 80 % respectively]
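The forward algorithm named above computes P(observation sequence | model) by summing over all hidden-state paths. A minimal sketch, assuming the same dict-based `pi`, `A`, `B` parameterization as earlier (the function name and toy parameters are illustrative, not the production model):

```python
def forward(obs, pi, A, B):
    """Forward algorithm: likelihood of an observation sequence under an HMM,
    summing over all possible hidden-state paths."""
    states = list(pi)
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: pi[s] * B[s].get(obs[0], 0.0) for s in states}
    for o in obs[1:]:
        alpha = {s: B[s].get(o, 0.0) * sum(alpha[r] * A[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

# Toy model: one hidden state emitting two events with equal probability
pi = {"H1": 1.0}
A = {"H1": {"H1": 1.0}}
B = {"H1": {"FAQ": 0.5, "Bill Pay Form": 0.5}}
likelihood = forward(["FAQ", "Bill Pay Form"], pi, A, B)
```

Scoring a journey under an HMM trained on enrollment journeys and one trained on non-enrollment journeys, then comparing the two likelihoods, yields the kind of 20 % / 80 % split shown on the slide. In practice long sequences need log-space arithmetic or scaling to avoid underflow.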