Implementation of Variational Inference for Non-Parametric Hidden Markov Models

Description of a Python implementation of fast variational inference on HMMs with unbounded numbers of states, using the stick-breaking construction.


Slide 1: Probabilistic Programming of Non-Parametric Bayesian Dynamic Discrete State Models
University of Southampton, Southampton, UK
14th Oct 2013
James McInerney (jem1c10@ecs.soton.ac.uk)
Slide 2: Applications
Across domains:
– Exploration (visual summary of a large amount of data; answer "what if?" questions by modifying parameters)
– Inference (understand the hidden structure of the data; fill in missing data)
– Prediction
Slide 3: Discrete State Models
– Each observation is explained by a latent state
  - e.g., a GPS observation at (50.931157, -1.401897) is explained by me being at "home"
– Mixture model
  - e.g., my next location is independent of previous locations
– Hidden Markov model
  - e.g., my next location depends on my previous location (first-order)
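The mixture-model/HMM distinction above can be sketched generatively. Everything in this sketch (the state means, mixture weights, and transition matrix) is invented for illustration and is not taken from np-hmm:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 3    # number of latent states (e.g., "home", "work", "gym")
T = 100  # number of observations

# Hypothetical 2-D state means (GPS-like coordinates, invented)
means = np.array([[50.93, -1.40], [50.91, -1.42], [50.95, -1.38]])

# Mixture model: each latent state is drawn i.i.d. from weights pi,
# independent of the previous state
pi = np.array([0.5, 0.3, 0.2])
z_mix = rng.choice(K, size=T, p=pi)

# First-order HMM: each latent state depends on the previous one
# through a transition matrix A (rows sum to 1)
A = np.array([[0.90, 0.05, 0.05],
              [0.10, 0.80, 0.10],
              [0.10, 0.10, 0.80]])
z_hmm = np.empty(T, dtype=int)
z_hmm[0] = rng.choice(K, p=pi)
for t in range(1, T):
    z_hmm[t] = rng.choice(K, p=A[z_hmm[t - 1]])

# Observations: noisy Gaussian emissions around the current state's mean
X = means[z_hmm] + 0.01 * rng.standard_normal((T, 2))
```

The only difference between the two generative processes is the line that draws the next state: i.i.d. weights for the mixture, a row of the transition matrix for the HMM.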
Slides 4–5: Non-parametric Bayes
What number of states to specify?
– Traditional solution: model selection/averaging
– Have to consider many different numbers of states: slow
– More elegant: the Dirichlet process (non-parametric Bayes)
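The Dirichlet process alternative can be illustrated with the truncated stick-breaking construction mentioned in the description: state weights are built by repeatedly breaking off Beta-distributed fractions of a unit-length stick, so only a data-driven number of states receive appreciable mass. A small numpy sketch (the truncation level K and concentration alpha are illustrative):

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights for a Dirichlet process.

    Draw v_k ~ Beta(1, alpha); the k-th weight is the fraction v_k of
    whatever stick remains after the first k-1 breaks:
        w_k = v_k * prod_{j<k} (1 - v_j)
    Larger alpha spreads mass over more states; smaller alpha
    concentrates it on a few.
    """
    v = rng.beta(1.0, alpha, size=K)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

rng = np.random.default_rng(0)
weights = stick_breaking(alpha=1.0, K=20, rng=rng)
```

With a finite truncation K the weights sum to slightly less than 1; variational inference over this truncated construction is what makes the "unbounded number of states" tractable in practice.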
Slide 6: Example Discrete State Models
Example problems potentially solvable with NP-HMMs:
– You have appliance usage data for a home and want to predict if and when appliances will be used the next day
– You have location data for teams in AtomicOrchid and want to find team assignments based on proximity (dealing with GPS noise and ephemeral proximity)
– You have user activity on a website or application and want to infer users' state of mind and predict future actions
Slide 7: Probabilistic Programming
– Specify the model using a domain-specific language
– Benefit: run inference on unknown model parameters at the click of a button
– E.g., Infer.NET, Church, Stan, Alchemy
Slide 8: Probabilistic Programming
Limitations (of Infer.NET):
– Does not handle Dirichlet process models (non-parametric Bayes for discrete states)
– Very limited handling of HMMs
Slide 9: My Limited, Small-Scale Answer
– Probabilistic programming for non-parametric discrete state models (HMMs, mixture models) in Python
– The user can attach any number of sensors to the data:
  - Multivariate Gaussian sensor (any number of dimensions)
  - Discrete sensor
  - von Mises sensor (periodic data)
  - Mixture of Gaussians
– Implemented using a variational approximation (= fast)
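When several sensors are attached, the natural modelling assumption is that each sensor's emissions are conditionally independent given the latent state, so the per-state log-likelihoods of the sensors simply add. A minimal numpy sketch of that assumption (not necessarily how np-hmm combines sensors internally); the two random matrices stand in for the (N, K) outputs of two real sensors:

```python
import numpy as np

# Two stand-in (N, K) unnormalised log-likelihood matrices, as two
# sensors would produce for N data points and K states
loglik_gaussian = np.random.default_rng(0).standard_normal((6, 4))
loglik_discrete = np.random.default_rng(1).standard_normal((6, 4))

# Conditional independence given the state => log-likelihoods add
loglik_total = loglik_gaussian + loglik_discrete

# Normalised responsibilities for a mixture-model E-step (an HMM would
# additionally fold in transition structure via forward-backward)
log_shift = loglik_total - loglik_total.max(axis=1, keepdims=True)
resp = np.exp(log_shift)
resp /= resp.sum(axis=1, keepdims=True)
```

Subtracting the row-wise maximum before exponentiating is the usual log-sum-exp trick to avoid underflow when likelihoods are very small.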
Slide 10: Define Your Own Sensor
Implement two methods:

    class Sensor(object):
        def __init__(self, K, hyperparams):
            self._K = K  # truncation parameter
            self._hyperparams = hyperparams

        def loglik(self, X):
            """Given the data set X, return the likelihood of each of
            the N data points as an (N, K) matrix of unnormalised
            log-likelihoods, one entry per component and data point."""

        def m(self, X, exp_z):
            """Given the expected value of z, calculate the variational
            parameters of each component (w.r.t. this sensor)."""
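To make the interface concrete, here is a hypothetical minimal discrete-emission sensor in that shape. It is a sketch only: `loglik` uses point estimates of the emission probabilities to stay dependency-free, whereas a proper variational treatment would use expected log-probabilities (digamma terms) under the Dirichlet posterior, and the exact signatures np-hmm expects may differ:

```python
import numpy as np

class ToyDiscreteSensor:
    """Hypothetical discrete sensor following the two-method interface.

    Observations are integer symbols in {0, ..., V-1}; each of the K
    components has a Dirichlet-distributed emission distribution.
    """

    def __init__(self, K, V, alpha0=1.0):
        self._K = K                       # truncation parameter
        self._V = V                       # number of discrete symbols
        self._alpha0 = alpha0             # symmetric Dirichlet prior
        self._alpha = np.full((K, V), alpha0)  # variational parameters

    def loglik(self, X):
        # Point-estimate emission probabilities (sketch; the real
        # variational update would use digamma expectations instead)
        probs = self._alpha / self._alpha.sum(axis=1, keepdims=True)
        # Index by symbol, transpose to the (N, K) layout the slide asks for
        return np.log(probs)[:, X].T

    def m(self, X, exp_z):
        # Variational M-step: prior plus expected counts per symbol,
        # where exp_z is the (N, K) matrix of state responsibilities
        counts = np.zeros((self._K, self._V))
        for v in range(self._V):
            counts[:, v] = exp_z[X == v].sum(axis=0)
        self._alpha = self._alpha0 + counts
```

A usage round-trip: call `loglik` to score data under the current parameters, run the E-step elsewhere to get `exp_z`, then call `m` to refresh the sensor's variational parameters.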
Slide 11: Live Demo
Four lines of code run inference on a custom, multi-modal, non-parametric HMM:

    K = <truncation parameter>
    gSensor = sensors.MVGaussianSensor(K, XDim)
    dSensor = sensors.DiscreteSensor(K)
    exp_z, _, exp_a, Zmax = general_inf.infer(N, [X, Y], K, [gSensor, dSensor])
Slide 12: Download
git clone https://github.com/jamesmcinerney/np-hmm.git
(alpha version)
