HIDDEN MARKOV MODEL AND ITS APPLICATIONS


    1. Hidden Markov Models
       Advanced Biometric, Mevlana University, Konya
       Presented by Muzammil Abdulrahman, 2013
    2. HMM Motivation
       • The real world has structures and processes which have (or produce) observable outputs, usually sequential (the process unfolds over time).
       • We cannot see the events producing the outputs. Example: speech signals.
       • Problem: how to construct a model of the structure or process given only the observations.
    3. Markov Model
       • The future is independent of the past given the present.
       • Let X_1, X_2, ..., X_n be discrete random variables over states; the Markov chain is then X_1 -> X_2 -> X_3 -> ... -> X_n with

             P(X_t | X_1, X_2, ..., X_{t-1}) = P(X_t | X_{t-1})
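    Under this property, the chain-rule factorization of the joint distribution simplifies; the following short derivation (not on the original slide) makes the simplification explicit:

        p(X_1, ..., X_n) = p(X_1) ∏_{t=2}^{n} p(X_t | X_1, ..., X_{t-1})   (chain rule)
                         = p(X_1) ∏_{t=2}^{n} p(X_t | X_{t-1})             (Markov property)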
    4. Markov Model Example
       • Weather: once each day the weather is observed.
       • State 1: Rainy, State 2: Cloudy, State 3: Sunny. Each state corresponds to a physically observable event.
       • State transition matrix (rows: today's weather, columns: tomorrow's):

                      Rainy   Cloudy   Sunny
             Rainy     0.4     0.3      0.3
             Cloudy    0.2     0.6      0.2
             Sunny     0.1     0.1      0.8

       • What is the probability that the weather for the next 7 days will be sun, sun, rain, rain, sun, cloudy, sun? (See the sketch below.)
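    The question is answered by multiplying the transition probabilities along the sequence. A minimal Python sketch (not from the slides; the starting state Sunny is an assumption, since the slide does not say what today's weather is):

        # Transition matrix from the slide: rows = today, columns = tomorrow.
        A = {
            "Rainy":  {"Rainy": 0.4, "Cloudy": 0.3, "Sunny": 0.3},
            "Cloudy": {"Rainy": 0.2, "Cloudy": 0.6, "Sunny": 0.2},
            "Sunny":  {"Rainy": 0.1, "Cloudy": 0.1, "Sunny": 0.8},
        }

        def sequence_probability(start, sequence):
            """Probability of seeing `sequence` given the chain starts in `start`."""
            prob, current = 1.0, start
            for nxt in sequence:
                prob *= A[current][nxt]
                current = nxt
            return prob

        week = ["Sunny", "Sunny", "Rainy", "Rainy", "Sunny", "Cloudy", "Sunny"]
        print(sequence_probability("Sunny", week))
        # 0.8 * 0.8 * 0.1 * 0.4 * 0.3 * 0.1 * 0.2 ≈ 1.54e-4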
    5. MM
       • Then what is the problem with the Markov model?
       • Some information will be missing. Why? Because we cannot expect to perfectly observe the complete state of the system.
    6. HMM
       • The missing-information problem is addressed by the Hidden Markov Model (HMM).
       • It is a sequential model.
       • Let Z_1, Z_2, ..., Z_n be hidden random variables and X_1, X_2, ..., X_n the observed states.
       • [Figure: the trellis diagram of the HMM, a hidden chain Z_1 -> Z_2 -> Z_3 -> ... -> Z_n in which each Z_t emits an observation X_t.]
    7. HMM
       • The joint probability of the above variables is

             p(X_1, ..., X_n, Z_1, ..., Z_n) = p(Z_1) p(X_1 | Z_1) ∏_{k=2}^{n} p(Z_k | Z_{k-1}) p(X_k | Z_k)
    8. HMM Parameters
       • Transition probabilities
       • Emission probabilities
       • Initial distribution
       • [Figure: the X's are the hidden states, the Y's are the observed symbols, the a's are the transition probabilities, and the b's are the emission probabilities.]
       • The initial distribution is sometimes assumed to put probability 1 on a single start state. (A toy parameterization follows below.)
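    A minimal sketch (toy parameters, not from the slides) tying the three parameter sets to the joint-probability formula on the previous slide; the weather/umbrella names are illustrative assumptions:

        # Toy HMM: hidden weather states, observed umbrella usage (assumed example).
        initial = {"Rainy": 0.5, "Sunny": 0.5}                        # initial distribution
        transition = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},          # transition probs (a's)
                      "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
        emission = {"Rainy": {"umbrella": 0.9, "no umbrella": 0.1},   # emission probs (b's)
                    "Sunny": {"umbrella": 0.2, "no umbrella": 0.8}}

        def joint_probability(hidden, observed):
            """p(X_1..X_n, Z_1..Z_n) for one hidden path Z and one observation sequence X."""
            prob = initial[hidden[0]] * emission[hidden[0]][observed[0]]
            for k in range(1, len(hidden)):
                prob *= transition[hidden[k - 1]][hidden[k]] * emission[hidden[k]][observed[k]]
            return prob

        print(joint_probability(["Rainy", "Rainy", "Sunny"],
                                ["umbrella", "umbrella", "no umbrella"]))  # ≈ 0.068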
    9. HMM Examples
       • Typed word recognition, assuming all characters are separated.
       • A character recognizer outputs the probability of an image being a particular character, P(image | character).
       • [Figure: each hidden state is a character (a, b, c, ..., z) and the observation is the character image, with example likelihoods such as P(image | 'a') = 0.5, P(image | 'b') = 0.03, P(image | 'c') = 0.005, ..., P(image | 'z') = 0.31.]
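    Given such likelihoods, the recognizer can simply pick the highest-scoring character. A minimal sketch (not from the slides; a uniform prior over characters is assumed, so the Bayes decision reduces to maximum likelihood):

        # Likelihoods P(image | character) as in the figure (remaining letters omitted).
        likelihoods = {"a": 0.5, "b": 0.03, "c": 0.005, "z": 0.31}
        best = max(likelihoods, key=likelihoods.get)
        print(best)  # 'a'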
    10. HMM Examples
       • Coin toss: a heads/tails sequence produced with 2 coins.
       • You are in a room with a wall. A person behind the wall flips a coin and tells you the result.
       • Coin selection and tossing are hidden: you cannot observe the events, only their output (heads, tails).
       • The problem is then to build a model that explains the observed sequence of heads and tails.
    11. HMM Uses
       • Speech recognition
       • Text processing
       • Gesture classification
       • Bioinformatics
    12. HMM Algorithms
       • They are used to perform inference on the hidden Zs given the sequence of observations X_1, X_2, ..., X_n.
       • The following algorithms are used with HMMs:
         • Forward
         • Backward
         • Viterbi (a sketch follows below)
       • The parameters of the HMM (transition, emission, and initial distribution) are estimated with the Baum-Welch algorithm.
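    A minimal Viterbi sketch (not from the slides), reusing the toy weather HMM assumed on the parameters slide; it recovers the most likely hidden state path for an observation sequence by dynamic programming:

        # Toy parameters (assumed example, as before).
        states = ["Rainy", "Sunny"]
        initial = {"Rainy": 0.5, "Sunny": 0.5}
        transition = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
                      "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
        emission = {"Rainy": {"umbrella": 0.9, "no umbrella": 0.1},
                    "Sunny": {"umbrella": 0.2, "no umbrella": 0.8}}

        def viterbi(observations):
            """Most likely hidden state path for `observations`."""
            # best[t][s]: probability of the best path that ends in state s at time t
            best = [{s: initial[s] * emission[s][observations[0]] for s in states}]
            back = [{}]
            for t in range(1, len(observations)):
                best.append({})
                back.append({})
                for s in states:
                    prev = max(states, key=lambda r: best[t - 1][r] * transition[r][s])
                    best[t][s] = best[t - 1][prev] * transition[prev][s] * emission[s][observations[t]]
                    back[t][s] = prev
            # Backtrack from the most probable final state.
            path = [max(states, key=lambda s: best[-1][s])]
            for t in range(len(observations) - 1, 0, -1):
                path.append(back[t][path[-1]])
            return list(reversed(path))

        print(viterbi(["umbrella", "umbrella", "no umbrella"]))  # ['Rainy', 'Rainy', 'Sunny']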
    13. Thank You
