Hidden Markov Chain and Bayes Belief Networks: Doctor Consortium
I made this presentation myself; I hope it is helpful for you.

Presentation Transcript

  • Graphical Models of Probability
    • Graphical models use directed or undirected graphs over a set of random variables to specify variable dependencies explicitly; they allow less restrictive independence assumptions while limiting the number of parameters that must be estimated.
    • Bayesian Networks: directed acyclic graphs that indicate causal structure.
    • Markov Networks: undirected graphs that capture general dependencies.
  • Hidden Markov Model (Yueshen Xu, Zhejiang Univ CCNT)
  • Overview
    • Markov Chain
    • HMM
    • Three Core Problems and Algorithms
    • Application
  • Markov Chain Instance
    We can regard the weather as three states: state 1: Rain; state 2: Cloudy; state 3: Sun. With long-term observation we can estimate the transition matrix (see the simulation sketch below):

        Tomorrow:  Rain  Cloudy  Sun
    Today Rain     0.4   0.3     0.3
          Cloudy   0.2   0.6     0.2
          Sun      0.1   0.1     0.8
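A quick illustration (added here, not part of the original slides): a minimal Python sketch that simulates this weather chain, assuming the three states and the transition matrix above.

```python
import random

# States and the transition matrix from the slide (each row sums to 1).
states = ["Rain", "Cloudy", "Sun"]
P = [
    [0.4, 0.3, 0.3],  # from Rain
    [0.2, 0.6, 0.2],  # from Cloudy
    [0.1, 0.1, 0.8],  # from Sun
]

def simulate(start, days):
    """Walk the chain: each day's weather depends only on the current day's."""
    i = states.index(start)
    seq = [start]
    for _ in range(days):
        i = random.choices(range(len(states)), weights=P[i])[0]
        seq.append(states[i])
    return seq

print(simulate("Sun", 7))  # e.g. ['Sun', 'Sun', 'Cloudy', ...]
```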
  • Definition
    One-step transition probability: P(q_{t+1} = S_j | q_t = S_i, q_{t-1}, ..., q_1) = P(q_{t+1} = S_j | q_t = S_i). That is to say, the evolution of the stochastic process relies only on the current state and has nothing to do with earlier states. We call this the Markov property, and such a process is regarded as a Markov process. State space: {S_1, S_2, ..., S_N}; observation sequence: q_1, q_2, ..., q_T.
  • Keystone
    State transition matrix A = [a_ij], where a_ij = P(q_{t+1} = S_j | q_t = S_i), with a_ij ≥ 0 and Σ_j a_ij = 1 for every row.
    Initial state probability vector π = [π_i], where π_i = P(q_1 = S_i).
  • HMM
    • An HMM is a double stochastic process consisting of two parallel parts:
    • Markov chain: describes the transitions of the states, which are unobservable, by means of the transition probability matrix.
    • Common stochastic process: describes how each state emits the observable events.
    [Diagram: the Markov chain (π, A) generates the unobservable state sequence q_1, q_2, ..., q_T; the stochastic process (B) maps it to the observable sequence o_1, o_2, ..., o_T. Core feature: the states are hidden, the observations are visible.]
  • Example: a three-state process (S1, S2, S3) where each transition carries an emission distribution over {a, b}:
    a11 = 0.3 (a: 0.8, b: 0.2); a12 = 0.5 (a: 1, b: 0); a13 = 0.2 (a: 0, b: 1); a22 = 0.4 (a: 0.3, b: 0.7); a23 = 0.6 (a: 0.5, b: 0.5).
    What is the probability of producing the sequence "abb" with this stochastic process?
  • Instance 1: S1 -> S1 -> S2 -> S3: 0.3 × 0.8 × 0.5 × 1.0 × 0.6 × 0.5 = 0.036
  • Instance 2: S1 -> S2 -> S2 -> S3: 0.5 × 1.0 × 0.4 × 0.3 × 0.6 × 0.5 = 0.018
  • Instance 3: S1 -> S1 -> S1 -> S3: 0.3 × 0.8 × 0.3 × 0.8 × 0.2 × 1.0 = 0.01152
    Therefore the total probability is 0.036 + 0.018 + 0.01152 = 0.06552.
    We only observe "abb"; we do not know which state path S? -> S? -> S? produced it. That is the point. (The sum-over-paths computation is reproduced in the sketch below.)
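The computation can be reproduced with a small sketch (mine, not the slides'); it hard-codes the per-path transition/emission factors exactly as the three instances list them.

```python
from math import prod

# Each path that can produce "abb", with the factors the slides multiply.
paths = {
    "S1->S1->S2->S3": [0.3, 0.8, 0.5, 1.0, 0.6, 0.5],
    "S1->S2->S2->S3": [0.5, 1.0, 0.4, 0.3, 0.6, 0.5],
    "S1->S1->S1->S3": [0.3, 0.8, 0.3, 0.8, 0.2, 1.0],
}

# P("abb") is the sum over all state paths that could have produced it.
for name, factors in paths.items():
    print(f"{name}: {prod(factors):.5f}")
print("total:", sum(prod(f) for f in paths.values()))  # 0.06552
```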
  • Description
    • An HMM can be identified by the parameters below:
    • N: the number of states
    • M: the number of observable events for each state
    • A: the state transition matrix
    • B: the emission probability distribution of the observable events for each state
    • π: the initial state probability distribution
    • We generally record the model as λ = (N, M, A, B, π), abbreviated λ = (A, B, π); a concrete container sketch follows below.
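To make the notation concrete, here is a hypothetical container for λ = (A, B, π) in Python (the names are mine, not the slides'); the later sketches reuse it.

```python
from dataclasses import dataclass

@dataclass
class HMM:
    A: list    # N x N transitions: A[i][j] = P(q_{t+1} = S_j | q_t = S_i)
    B: list    # N x M emissions:   B[i][k] = P(o_t = v_k | q_t = S_i)
    pi: list   # length-N initial distribution: pi[i] = P(q_1 = S_i)

    @property
    def N(self):          # number of states
        return len(self.pi)

    @property
    def M(self):          # number of observable events per state
        return len(self.B[0])
```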
  • Three Core Problems
    • Evaluation:
    • Given the observation sequence O and the model λ, how do we calculate P(O | λ)?
    • Optimization:
    • Building on problem 1: how do we choose a state sequence Q so that the observation sequence O is explained most reasonably? (We know O, but we do not know Q.)
    • Training:
    • Building on problem 1: how do we adjust the parameters of the model λ to maximize P(O | λ)?
  • Solution
    • There is no need to expound these algorithms here, since our focus is the application context.
    • Evaluation: dynamic programming
    • Forward
    • Backward
    • Optimization: dynamic programming
    • Viterbi
    • Training: iterative
    • Baum-Welch & maximum likelihood estimation
    • You can think these methods over and derive them after the workshop; a sketch of the forward and Viterbi recursions follows below.
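For reference, a compact sketch of the forward (evaluation) and Viterbi (optimization) recursions; it reuses the hypothetical HMM container above and assumes integer-coded observations. This is the textbook formulation, not code from the slides; the example model at the bottom (its B matrix in particular) is invented for illustration.

```python
def forward(m: HMM, obs: list) -> float:
    """Evaluation: P(O | lambda) via the forward recursion."""
    alpha = [m.pi[i] * m.B[i][obs[0]] for i in range(m.N)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * m.A[i][j] for i in range(m.N)) * m.B[j][o]
                 for j in range(m.N)]
    return sum(alpha)

def viterbi(m: HMM, obs: list) -> list:
    """Optimization: the most likely state sequence for O."""
    delta = [m.pi[i] * m.B[i][obs[0]] for i in range(m.N)]
    back = []
    for o in obs[1:]:
        best = [max((delta[i] * m.A[i][j], i) for i in range(m.N))
                for j in range(m.N)]
        back.append([i for _, i in best])
        delta = [p * m.B[j][o] for j, (p, _) in enumerate(best)]
    # Trace the best final state back through the stored pointers.
    q = [max(range(m.N), key=lambda j: delta[j])]
    for ptrs in reversed(back):
        q.append(ptrs[q[-1]])
    return q[::-1]

# Hypothetical example: the weather chain as hidden states, two symbols 0/1.
m = HMM(A=[[0.4, 0.3, 0.3], [0.2, 0.6, 0.2], [0.1, 0.1, 0.8]],
        B=[[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]],
        pi=[1 / 3, 1 / 3, 1 / 3])
print(forward(m, [0, 1, 1]), viterbi(m, [0, 1, 1]))
```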
  • Application Context
    • Just think it over:
    • the features of HMMs:
    • which kinds of problems can they describe and model?
    • Two stochastic sequences:
    • one relies on the other, or the two are related;
    • one can be "seen", but the other cannot.
    • Just think about the three core problems.
    • ……
    • We can draw a conclusion: use one sequence to deduce and predict the other, or "find out who is behind": the "iceberg" problem.
  • Application Context (1): Voice Recognition
    • Statistical Description
    • The characteristic feature sequence of the voice, obtained by sampling: T = t_1, t_2, ..., t_n
    • The word sequence W(n): W_1, W_2, ..., W_n
    • Therefore, what we care about is P(W(n) | T).
    • Formal Description
    • What we have to solve is: k = arg max { P(W(n) | T) } (expanded via Bayes' theorem below).
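By Bayes' theorem this decodes through an acoustic model and a language model; the standard expansion (not spelled out on the slide) is:

```latex
W^{*} = \arg\max_{W} P(W \mid T)
      = \arg\max_{W} \frac{P(T \mid W)\, P(W)}{P(T)}
      = \arg\max_{W} P(T \mid W)\, P(W)
```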
  • Application Context (1): Voice Recognition
    [Recognition framework diagram: speech database → feature extraction (waveform → features) → Baum-Welch re-estimation of the HMMs λ1, λ2, …, λ7; loop until converged, then end.]
  • Application Context (2): Text Information Extraction
    • Figure out the HMM model:
    • Q1: What is the state, and what is the observation event?
    • Q2: How do we figure out the parameters, such as a_ij?
    • State: what you want to extract.
    • Observation event: a text block, each word, etc.
    • The parameters are estimated from training samples, as sketched below.
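When the training samples carry labeled state sequences (the supervised case), estimating a_ij reduces to counting transitions; a minimal sketch with hypothetical labels:

```python
from collections import Counter

# Hypothetical labeled training sequences: one state label per text block.
sequences = [
    ["title", "author", "author", "email", "abstract"],
    ["title", "author", "email", "abstract", "abstract"],
]

bigrams, outgoing = Counter(), Counter()
for seq in sequences:
    for s, t in zip(seq, seq[1:]):
        bigrams[(s, t)] += 1   # count(i -> j)
        outgoing[s] += 1       # count(i -> anything)

def a(i, j):
    """MLE transition estimate a_ij = count(i -> j) / count(i -> *)."""
    return bigrams[(i, j)] / outgoing[i] if outgoing[i] else 0.0

print(a("author", "email"))  # 2/3 with the toy data above
```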
  • Application Context (2): Text Information Extraction
    [Extraction framework diagram: training samples → HMM; a document is partitioned and decoded against the state list (e.g., country, state, city, street; or title, author, email, abstract) to produce the extracted sequence.]
  • Application Context (3): Other Fields
    • Face Recognition
    • POS tagging
    • Web Data Extraction
    • Bioinformatics
    • Network intrusion detection
    • Handwriting recognition
    • Document Categorization
    • Multiple Sequence Alignment
    Which field are you interested in?
  • Bayes Belief Network (Yueshen Xu)
  • Overview
    • Bayes Theorem
    • Naïve Bayes Theorem
    • Bayes Belief Network
    • Application
  • Bayes Theorem
    • The basic Bayes formula: the basis of everything here, but vital. It inverts the conditioning, obtaining the posterior probability from the prior probability and the likelihood:
    P(A_i | B) = P(B | A_i) P(A_i) / P(B), where P(B) = Σ_j P(B | A_j) P(A_j) (the complete probability formula).
  • Naïve Bayes Theorem
    • The naive Bayes model is a simple probabilistic model based on applying Bayes' theorem with strong (conditional) independence assumptions.
    [Diagram: class node C with children F_1, F_2, ..., F_n; naïve Bayes is a simple Bayes net.]
    • The derivation combines the chain rule with conditional independence, as shown below.
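The derivation the slide points at (Bayes rule, the chain rule, then the conditional-independence assumption) in standard notation; this is the textbook expansion, not copied from the slide image:

```latex
P(C \mid F_1, \dots, F_n)
  \propto P(C)\, P(F_1, \dots, F_n \mid C)                    % Bayes rule
  = P(C) \prod_{i=1}^{n} P(F_i \mid C, F_1, \dots, F_{i-1})   % chain rule
  = P(C) \prod_{i=1}^{n} P(F_i \mid C)                        % conditional independence
```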
  • Bayes Belief Network: Graph Structure
    • Directed Acyclic Graph (DAG)
    • Nodes are random variables
    • Edges indicate causal influences
    [Example network: Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls. Nodes are random variables; edges define the parent/descendant relationships.]
  • Bayes Belief Network: Conditional Probability Table
    • Each node has a conditional probability table (CPT) that gives the probability of each of its values given every possible combination of values for its parents.
    • Roots (sources) of the DAG that have no parents are given prior probabilities.
    [CPTs for the Burglary / Earthquake / Alarm / JohnCalls / MaryCalls network:]

        P(B) = .001        P(E) = .002

        B  E  | P(A)         A | P(J)        A | P(M)
        T  T  | .95          T | .90         T | .70
        T  F  | .94          F | .05         F | .01
        F  T  | .29
        F  F  | .001
  • Bayes Belief Network: Joint Distributions
    • A Bayesian network implicitly defines a joint distribution via conditional independence: P(x_1, ..., x_n) = Π_i P(x_i | Parents(X_i)).
    • Example: P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E) = 0.90 × 0.70 × 0.001 × 0.999 × 0.998 ≈ 0.00062.
    • Therefore an inefficient approach to inference is:
      • 1) Compute the joint distribution using this equation.
      • 2) Compute any desired conditional probability from the joint distribution.
    (A sketch of this evaluation follows below.)
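A small sketch (mine, not from the slides) that encodes the CPTs above and evaluates the joint by multiplying each node's entry given its parents:

```python
# CPTs from the slide; lookups give P(node takes the stated value).
P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(A=T | B, E)
P_J = {True: 0.90, False: 0.05}                     # P(J=T | A)
P_M = {True: 0.70, False: 0.01}                     # P(M=T | A)

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) as the product of per-node CPT entries."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * pa * pj * pm

# The worked example: both call, alarm on, no burglary, no earthquake.
print(joint(False, False, True, True, True))  # ~0.00062
```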
  • Conditional Independence & D-separation
    • D-separation:
    • let X, Y, and Z be three sets of nodes;
    • if X and Y are d-separated by Z, then X and Y are conditionally independent given Z.
    • A is d-separated from B given C if every undirected path between them is blocked by C.
    • Path blocking:
    • three cases that expand on the three basic independence structures (see below).
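For reference (standard material, not from the slide), the three basic structures behind path blocking:

```latex
\text{chain: }    A \rightarrow C \rightarrow B \quad (\text{blocked when } C \text{ is observed})
\text{fork: }     A \leftarrow C \rightarrow B  \quad (\text{blocked when } C \text{ is observed})
\text{collider: } A \rightarrow C \leftarrow B  \quad (\text{blocked unless } C \text{ or a descendant of } C \text{ is observed})
```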
  • Application: Simple Document Classification (1)
    • Step 1: Assume for the moment that there are only two mutually exclusive classes, S and ¬S (e.g., spam and not spam), such that every element (email) is in either one or the other; that is to say, P(S | D) + P(¬S | D) = 1 for every document D.
    • Step 2: What we care about is the posterior probability of each class given the document's words w_i: P(S | D) ∝ P(S) Π_i P(w_i | S), and likewise for ¬S.
  • Application: Simple Document Classification (2)
    • Step 3: Dividing one posterior by the other gives a ratio that can be re-factored as: P(S | D) / P(¬S | D) = (P(S) / P(¬S)) Π_i P(w_i | S) / P(w_i | ¬S).
    • Step 4: Take the logarithm of all these ratios to reduce the computation: classify as spam when ln[P(S | D) / P(¬S | D)] > 0, otherwise not. The word ratios are learned from known (labeled) training samples; a sketch follows below.
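A minimal sketch of this log-ratio decision rule, with hypothetical word probabilities (in practice they are estimated from labeled training mail):

```python
import math

# Hypothetical per-word likelihoods learned from labeled training samples.
p_word_spam = {"free": 0.30, "meeting": 0.01, "winner": 0.20}
p_word_ham  = {"free": 0.03, "meeting": 0.20, "winner": 0.001}
p_spam, p_ham = 0.5, 0.5  # class priors

def log_ratio(words):
    """ln[P(S|D) / P(not-S|D)]: positive means classify as spam."""
    score = math.log(p_spam / p_ham)
    for w in words:
        if w in p_word_spam:
            score += math.log(p_word_spam[w] / p_word_ham[w])
    return score

print(log_ratio(["free", "winner"]))  # > 0 -> spam
print(log_ratio(["meeting"]))         # < 0 -> not spam
```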
  • Application: Overall
    • Medical diagnosis
    • The Pathfinder system outperforms leading experts in the diagnosis of lymph-node diseases.
    • Microsoft applications
    • Problem diagnosis: printer problems
    • Recognizing user intents for HCI
    • Text categorization and spam filtering
    • Student modeling for intelligent tutoring systems.
    • Biochemical Data Analysis
    • Predicting mutagenicity
    • So many…
    Which field are you interested in?