1. HIDDEN MARKOV MODEL
Presented by: Vo Quang Tuyen, Le Quang Hoa, Truong Hoang Linh
Problem 2: MOST PROBABLE PATH
2. Markov Models
A Markov model is specified by
the set of states $S = \{s_1, s_2, \dots, s_{N_s}\}$,
and characterized by
the prior probabilities $\pi_i = P(q_1 = s_i)$: the probability of $s_i$ being the first state of a state sequence; collected in the vector $\pi$;
the transition probabilities $a_{ij} = P(q_{n+1} = s_j \mid q_n = s_i)$: the probability of going from state $i$ to state $j$; collected in the matrix $A$.
The Markov model produces
a state sequence $Q = \{q_1, \dots, q_N\}$, $q_n \in S$, over time $1 \le n \le N$.
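As a concrete illustration (not from the slides), a minimal Python sketch of sampling a state sequence from a Markov model; the states and all numbers in `pi` and `A` are invented for the example.

```python
import numpy as np

# Hypothetical 2-state Markov model; all numbers are illustrative only.
states = ["s1", "s2"]
pi = np.array([0.6, 0.4])           # prior probabilities P(q1 = s_i)
A = np.array([[0.7, 0.3],           # a_ij = P(q_{n+1} = s_j | q_n = s_i)
              [0.4, 0.6]])

rng = np.random.default_rng(0)
N = 5
q = [rng.choice(2, p=pi)]                # draw q1 from pi
for n in range(1, N):
    q.append(rng.choice(2, p=A[q[-1]]))  # draw q_{n+1} from row q_n of A
print([states[i] for i in q])
```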
3. Hidden Markov Models
Additionally, for a Hidden Markov model we have
Emission probabilities:
for continuous-valued observations $x_n \in \mathbb{R}^D$, a set of functions $b_i(x_n) = p(x_n \mid s_i)$;
for discrete observations $x_n \in \{v_1, \dots, v_K\}$: $b_{i,k} = P(x_n = v_k \mid q_n = s_i)$,
the probability of observing $x_n = v_k$ if the system is in state $s_i$; collected in the matrix $B$.
Observation sequence:
$X = \{x_1, x_2, \dots, x_N\}$
The HMM parameters (for a fixed number of states $N_s$) thus are
$\Theta = (A, B, \pi)$
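Extending the previous sketch to a hidden Markov model with a discrete emission matrix `B`; again, all numbers are hypothetical placeholders.

```python
import numpy as np

# Hypothetical discrete-emission HMM, Theta = (A, B, pi); numbers illustrative only.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # b_{i,k} = P(x_n = v_k | q_n = s_i)
              [0.2, 0.8]])

rng = np.random.default_rng(1)
N = 5
q = [rng.choice(2, p=pi)]
x = [rng.choice(2, p=B[q[0]])]               # x1 emitted from state q1
for n in range(1, N):
    q.append(rng.choice(2, p=A[q[-1]]))      # hidden state transition
    x.append(rng.choice(2, p=B[q[-1]]))      # x_n depends only on q_n
print("hidden:", q, "observed:", x)
```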
4. Trellis Diagram
A trellis diagram can be used to visualize likelihood calculations of HMMs.
5. Trellis Example
Example for Trellis
diagram:
Joint likelihood for
observed sequence X and
state sequence (path) Q:
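A minimal sketch of evaluating this joint likelihood for a given path, assuming the discrete-emission case from slide 3; the parameter values are the same invented placeholders as above.

```python
import numpy as np

# Illustrative parameters (same hypothetical 2-state HMM as above).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])

def joint_likelihood(Q, X, pi, A, B):
    """P(X, Q | Theta): prior and first emission, then transition * emission per step."""
    p = pi[Q[0]] * B[Q[0], X[0]]
    for n in range(1, len(Q)):
        p *= A[Q[n - 1], Q[n]] * B[Q[n], X[n]]
    return p

print(joint_likelihood([0, 0, 1], [0, 1, 1], pi, A, B))  # ~0.00907
```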
6. Hidden Markov Models Problem
Given a model $\Theta$, what is the hidden state sequence $Q$ that best explains an observation sequence $X$?
$Q^* = \arg\max_Q P(X, Q \mid \Theta) = \;?$
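In principle $Q^*$ can be found by scoring all $N_s^N$ possible paths; the sketch below does exactly that, which quickly becomes infeasible for long sequences and motivates the Viterbi algorithm on the next slides. The function name is illustrative, not from the slides.

```python
from itertools import product
import numpy as np

# Naive decoding by exhaustive search over all N_s^N paths.
# Only feasible for tiny N; Viterbi (next slides) needs only O(N * N_s^2).
def brute_force_decode(X, pi, A, B):
    Ns, best_p, best_Q = len(pi), -1.0, None
    for Q in product(range(Ns), repeat=len(X)):
        p = pi[Q[0]] * B[Q[0], X[0]]             # P(X, Q | Theta), term by term
        for n in range(1, len(X)):
            p *= A[Q[n - 1], Q[n]] * B[Q[n], X[n]]
        if p > best_p:
            best_p, best_Q = p, list(Q)
    return best_Q, best_p

# e.g. brute_force_decode([0, 1, 1], pi, A, B) with the placeholder matrices above
```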
7. Viterbi Algorithm
For an HMM with $N_s$ states:
1. Initialization:
$\delta_1(i) = \pi_i \cdot b_i(x_1), \quad i = 1 \dots N_s$
$\psi_1(i) = 0$
where $\pi_i$ is the prior probability of being in state $s_i$ at time $n = 1$.
2. Recursion: for $n > 1$ and all $j = 1 \dots N_s$
$\delta_n(j) = \max_i \left[ \delta_{n-1}(i) \, a_{ij} \right] \cdot b_j(x_n)$
$\psi_n(j) = \arg\max_i \left[ \delta_{n-1}(i) \, a_{ij} \right]$
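A one-step sketch of this recursion in NumPy, under the discrete-emission notation above; `viterbi_step` is an illustrative name, not from the slides.

```python
import numpy as np

def viterbi_step(delta_prev, A, B, x_n):
    """One Viterbi recursion step:
    delta_n(j) = max_i [delta_{n-1}(i) * a_ij] * b_j(x_n)
    psi_n(j)   = argmax_i [delta_{n-1}(i) * a_ij]
    """
    scores = delta_prev[:, None] * A      # scores[i, j] = delta_{n-1}(i) * a_ij
    psi_n = scores.argmax(axis=0)         # best predecessor i for each state j
    delta_n = scores.max(axis=0) * B[:, x_n]
    return delta_n, psi_n
```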
8. Viterbi Algorithm
3. Termination:
Find the best likelihood when the end of the observation sequence $n = N$ is reached:
$P^* = \max_i \delta_N(i), \qquad q_N^* = \arg\max_i \delta_N(i)$
4. Backtracking of the optimal state sequence:
$Q^* = \{q_1^*, \dots, q_N^*\}$
$q_n^* = \psi_{n+1}(q_{n+1}^*), \quad n = N-1, N-2, \dots, 1$
Read the best sequence of states from the $\psi_n$ vectors.
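Putting the four steps together, a compact sketch of the full algorithm for a discrete-emission HMM. It works in plain probabilities to match the slides; in practice log-probabilities are preferred to avoid numerical underflow on long sequences.

```python
import numpy as np

def viterbi(X, pi, A, B):
    """Most probable state path Q* = argmax_Q P(X, Q | Theta)."""
    N, Ns = len(X), len(pi)
    delta = np.zeros((N, Ns))
    psi = np.zeros((N, Ns), dtype=int)
    delta[0] = pi * B[:, X[0]]                    # 1. initialization
    for n in range(1, N):                         # 2. recursion
        scores = delta[n - 1][:, None] * A
        psi[n] = scores.argmax(axis=0)
        delta[n] = scores.max(axis=0) * B[:, X[n]]
    Q = [int(delta[-1].argmax())]                 # 3. termination: q_N*
    for n in range(N - 2, -1, -1):                # 4. backtracking via psi
        Q.append(int(psi[n + 1][Q[-1]]))
    return Q[::-1], float(delta[-1].max())
```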
9. Viterbi Algorithm / Example
For our weather HMM $\Theta$, find the most probable hidden weather sequence for the given observation sequence.
11. Viterbi Algorithm / Example
2. Recursion (n = 2):
We calculate the likelihood of getting to state "sunny" from all 3 possible predecessor states, and choose the most likely one to go on with.
The likelihood is stored in $\delta_2$, the most likely predecessor in $\psi_2$.
The same procedure is executed with states "rainy" and "foggy"; a numeric sketch follows below.
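The slide's actual numbers were in a figure not reproduced here; the following sketch performs the same $\delta_2$(sunny) computation on invented placeholder values for the three weather states.

```python
import numpy as np

# Assumed placeholder values (NOT the tutorial's actual numbers).
states = ["sunny", "rainy", "foggy"]
delta_1 = np.array([0.24, 0.02, 0.06])      # delta_1(i) from the init step
a_to_sunny = np.array([0.8, 0.2, 0.2])      # a_{i,sunny} for i = 1..3
b_sunny_x2 = 0.1                            # b_sunny(x_2)

scores = delta_1 * a_to_sunny               # delta_1(i) * a_{i,sunny}
psi_2_sunny = int(scores.argmax())          # most likely predecessor
delta_2_sunny = scores.max() * b_sunny_x2
print(states[psi_2_sunny], delta_2_sunny)   # 'sunny' 0.0192 with these numbers
```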
15. Viterbi Algorithm / Example
3. Termination
The globally most likely path is determined, starting by looking for the last state of the most likely sequence.
4. Backtracking
The best sequence of states can be read from the $\psi$ vectors:
n = N − 1 = 2: $q_2^* = \psi_3(q_3^*)$
n = N − 2 = 1: $q_1^* = \psi_2(q_2^*)$
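The same two backtracking reads, sketched on invented $\psi$ vectors for $N = 3$ (the slide's real values are in a figure not reproduced here).

```python
import numpy as np

# Hypothetical psi vectors for N = 3 (contents invented for illustration).
psi = {2: np.array([0, 0, 2]), 3: np.array([0, 1, 0])}  # psi_n(j) = best predecessor
q3 = 0                   # 3. termination: q_3* = argmax_i delta_3(i) (assumed)
q2 = int(psi[3][q3])     # n = N - 1 = 2: q_2* = psi_3(q_3*)
q1 = int(psi[2][q2])     # n = N - 2 = 1: q_1* = psi_2(q_2*)
print([q1, q2, q3])
```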
16. Viterbi Algorithm / Example
The most likely weather sequence is:
Backtracking:
18. REFERENCES
- Hidden Markov Models: A Tutorial for the Course Computational Intelligence, Barbara Resch.
- Hidden Markov Models, Speech Communication 2, SS 2004, Erhard Rank & Franz Pernkopf.