# Computer Vision: Models, Learning and Inference. Chapter 19: Temporal Models

1. Computer vision: models, learning and inference. Chapter 19: Temporal models. Please send errata to s.prince@cs.ucl.ac.uk
2. Goal: to track object state from frame to frame in a video. Difficulties:
   • Clutter (data association)
   • One image may not be enough to fully define the state
   • The relationship between frames may be complicated
3. Structure:
   • Temporal models
   • Kalman filter
   • Extended Kalman filter
   • Unscented Kalman filter
   • Particle filters
   • Applications

   Computer vision: models, learning and inference. ©2011 Simon J.D. Prince
4. Temporal models:
   • Consider an evolving system
   • Represented by an unknown vector w; this is termed the state
   • Examples: 2D position of a tracked object in the image; 3D pose of a tracked object in the world; joint positions of an articulated model
   • Our goal: to compute the marginal posterior distribution over w at time t
5. Estimating state. Two contributions to estimating the state:
   1. A set of measurements x_t, which provide information about the state w_t at time t. This is a generative model: the measurements are derived from the state using a known probability relation Pr(x_t | w_1...w_T).
   2. A time series model, which says something about the expected way the system will evolve, e.g. Pr(w_t | w_1...w_{t-1}, w_{t+1}...w_T).
6. Assumptions:
   • Only the immediate past matters (Markov): the probability of the state at time t is conditionally independent of the states at times 1...t-2 given the state at time t-1.
   • Measurements depend only on the current state: the likelihood of the measurements at time t is conditionally independent of all other measurements and of the states at times 1...t-1, t+1...T given the state at time t.
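These two conditional-independence assumptions can be written out (a reconstruction; the formulas are images in the original slides) as:

```latex
Pr(w_t \mid w_1 \dots w_{t-1}) = Pr(w_t \mid w_{t-1})
\qquad
Pr(x_t \mid w_1 \dots w_T,\, x_1 \dots x_{t-1},\, x_{t+1} \dots x_T) = Pr(x_t \mid w_t)
```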
7. Graphical model relating the world states w_1...w_T and the measurements x_1...x_T (figure in original slides).
8. Recursive estimation: the posterior is built up step by step at times 1, 2, ..., t, with the prior at each step coming from the temporal model (figure in original slides).
9. Computing the prior (time evolution). At each time step, the prior is obtained from the Chapman-Kolmogorov equation, which combines the temporal model with the posterior at time t-1 to give the prior at time t.
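The Chapman-Kolmogorov relation referred to here (reconstructed; the equation is an image in the original slides) is:

```latex
\underbrace{Pr(w_t \mid x_1 \dots x_{t-1})}_{\text{prior at time } t}
= \int
\underbrace{Pr(w_t \mid w_{t-1})}_{\text{temporal model}}\;
\underbrace{Pr(w_{t-1} \mid x_1 \dots x_{t-1})}_{\text{posterior at time } t-1}
\, dw_{t-1}
```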
10. Summary. Alternate between:
   • Temporal evolution (temporal model)
   • Measurement update (measurement model)
11. Structure: temporal models • Kalman filter • extended Kalman filter • unscented Kalman filter • particle filters • applications
12. Kalman filter. The Kalman filter is just a special case of this type of recursive estimation procedure. The temporal model and measurement model are carefully chosen so that if the posterior at time t-1 was Gaussian, then:
   • the prior at time t will be Gaussian
   • the posterior at time t will be Gaussian
   The Kalman filter equations are rules for updating the means and covariances of these Gaussians.
13. The Kalman filter: previous time step → prediction; measurement likelihood → combination (figure in original slides).
14. Kalman filter definition:
   • Time evolution equation: state transition matrix plus additive Gaussian noise
   • Measurement equation: relates state and measurement, plus additive Gaussian noise
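The two defining equations appear as images in the original slides; a hedged reconstruction in the book's notation, where Ψ is the state transition matrix, Φ relates state and measurement, and ε_p, ε_m are additive Gaussian noise with covariances Σ_p, Σ_m:

```latex
w_t = \mu_p + \Psi w_{t-1} + \epsilon_p, \qquad \epsilon_p \sim \mathrm{Norm}[\,0, \Sigma_p\,]\\
x_t = \mu_m + \Phi w_t + \epsilon_m, \qquad \epsilon_m \sim \mathrm{Norm}[\,0, \Sigma_m\,]
```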
16. Temporal evolution (equations in original slides).
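Assuming the linear-Gaussian temporal model w_t = μ_p + Ψw_{t-1} + ε_p with noise covariance Σ_p (a reconstruction; the originals are images), the prediction step maps the posterior mean and covariance (μ_{t-1}, Σ_{t-1}) to the prior at time t:

```latex
\mu_+ = \mu_p + \Psi\,\mu_{t-1}, \qquad
\Sigma_+ = \Sigma_p + \Psi\,\Sigma_{t-1}\,\Psi^T
```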
17. Measurement incorporation (equations in original slides).
18. Kalman filter. This is not the usual way these equations are presented. Part of the reason is the size of the inverses: the matrix involved is usually landscape, so the product with its transpose is large and inefficient to invert. Define the Kalman gain (equation in original slides).
19. Mean term, simplified using matrix inversion relations (derivation in original slides).
20. Covariance term, simplified using matrix inversion relations (derivation in original slides).
21. Final Kalman filter equations:
   • Innovation: the difference between the actual and predicted measurements
   • Posterior variance: the prior variance minus a term due to information from the measurement
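A hedged reconstruction of the final equations (images in the original slides), using the linear-Gaussian model x_t = μ_m + Φw_t + ε_m and the predicted mean and covariance μ_+, Σ_+:

```latex
K = \Sigma_+\,\Phi^T\left(\Sigma_m + \Phi\,\Sigma_+\,\Phi^T\right)^{-1}\\
\mu_t = \mu_+ + K\,\underbrace{(x_t - \mu_m - \Phi\,\mu_+)}_{\text{innovation}}\\
\Sigma_t = (I - K\Phi)\,\Sigma_+
```

Note that the matrix inverted in the gain K has the dimension of the measurement, which is the efficiency gain mentioned on slide 18.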
22. Kalman filter summary: time evolution equation; measurement equation; inference (equations in original slides).
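The predict/update recursion summarized above can be sketched in NumPy. This is a minimal illustration, not the book's code; the variable names (Psi, Phi, mu_p, mu_m, Sigma_p, Sigma_m) follow the linear-Gaussian model reconstructed earlier, and the scalar tracking example at the end is invented:

```python
import numpy as np

def kalman_predict(mu, Sigma, Psi, mu_p, Sigma_p):
    """Temporal evolution: push the posterior through the linear dynamics."""
    mu_plus = mu_p + Psi @ mu
    Sigma_plus = Sigma_p + Psi @ Sigma @ Psi.T
    return mu_plus, Sigma_plus

def kalman_update(mu_plus, Sigma_plus, x, Phi, mu_m, Sigma_m):
    """Measurement incorporation via the Kalman gain."""
    S = Sigma_m + Phi @ Sigma_plus @ Phi.T        # innovation covariance
    K = Sigma_plus @ Phi.T @ np.linalg.inv(S)     # Kalman gain
    innovation = x - mu_m - Phi @ mu_plus          # actual minus predicted measurement
    mu = mu_plus + K @ innovation
    Sigma = (np.eye(len(mu_plus)) - K @ Phi) @ Sigma_plus
    return mu, Sigma

# Example: track a scalar position with identity dynamics and direct measurement.
mu, Sigma = np.zeros(1), np.eye(1)
Psi = Phi = np.eye(1)
mu_p = mu_m = np.zeros(1)
Sigma_p, Sigma_m = 0.1 * np.eye(1), 0.5 * np.eye(1)
for x in [np.array([1.0]), np.array([1.2]), np.array([0.9])]:
    mu, Sigma = kalman_predict(mu, Sigma, Psi, mu_p, Sigma_p)
    mu, Sigma = kalman_update(mu, Sigma, x, Phi, mu_m, Sigma_m)
```

After the three measurements near 1.0, the posterior mean moves from 0 toward 1 and the posterior variance shrinks below its initial value, as the filter balances the prior against the incoming evidence.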
23. Kalman filter example 1 (figure in original slides).
24. Kalman filter example 2, alternating between prediction and measurement (figure in original slides).
25. Smoothing.
   • Estimates depend only on measurements up to the current point in time. Sometimes we want to estimate the state based on future measurements as well.
   • Fixed-lag smoother: an on-line scheme in which the optimal estimate for the state at time t − τ is calculated based on measurements up to time t, where τ is the time lag; i.e. we wish to calculate Pr(w_{t−τ} | x_1...x_t).
   • Fixed-interval smoother: we have a fixed time interval of measurements and want to calculate the optimal state estimate based on all of them. In other words, instead of calculating Pr(w_t | x_1...x_t) we now estimate Pr(w_t | x_1...x_T), where T is the total length of the interval.
26. Fixed-lag smoother: state evolution equation; estimate delayed by the lag; measurement equation (equations in original slides).
27. Fixed-lag Kalman smoothing (figure in original slides).
28. Fixed-interval smoothing: a backward set of recursions (equations in original slides). Equivalent to belief propagation / the forward-backward algorithm.
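The backward recursion here takes the standard Rauch-Tung-Striebel form; a hedged reconstruction (the equations are images in the original slides), where (μ_t, Σ_t) are the filtered moments, (μ_{+,t+1}, Σ_{+,t+1}) the predicted moments, and hats denote smoothed estimates:

```latex
\hat{\mu}_t = \mu_t + C_t\left(\hat{\mu}_{t+1} - \mu_{+,t+1}\right), \qquad
\hat{\Sigma}_t = \Sigma_t + C_t\left(\hat{\Sigma}_{t+1} - \Sigma_{+,t+1}\right)C_t^T\\
\text{where}\quad C_t = \Sigma_t\,\Psi^T\,\Sigma_{+,t+1}^{-1}
```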
29. Temporal models (figure in original slides).
30. Problems with the Kalman filter:
   • Requires linear temporal and measurement equations
   • Represents the result as a normal distribution: what if the posterior is genuinely multi-modal?
31. Structure: temporal models • Kalman filter • extended Kalman filter • unscented Kalman filter • particle filters • applications
32. Roadmap (figure in original slides).
33. Extended Kalman filter. Allows non-linear measurement and temporal equations. Key idea: take a Taylor expansion and treat the system as locally linear.
34. Jacobians. The linearization is based on the Jacobian matrices of derivatives (equations in original slides).
35. Extended Kalman filter equations (in original slides).
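A hedged reconstruction of the EKF equations (images in the original slides), assuming a non-linear model w_t = f[w_{t−1}] + ε_p, x_t = g[w_t] + ε_m, with Jacobians Ψ̃ = ∂f/∂w evaluated at μ_{t−1} and Φ̃ = ∂g/∂w evaluated at μ_+:

```latex
\mu_+ = \mathbf{f}[\mu_{t-1}], \qquad
\Sigma_+ = \Sigma_p + \tilde{\Psi}\,\Sigma_{t-1}\,\tilde{\Psi}^T\\
K = \Sigma_+\,\tilde{\Phi}^T\left(\Sigma_m + \tilde{\Phi}\,\Sigma_+\,\tilde{\Phi}^T\right)^{-1}\\
\mu_t = \mu_+ + K\left(x_t - \mathbf{g}[\mu_+]\right), \qquad
\Sigma_t = (I - K\tilde{\Phi})\,\Sigma_+
```

These are the linear Kalman equations with the transition and measurement matrices replaced by the local Jacobians, and the means propagated through the full non-linear functions.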
36. Extended Kalman filter (figure in original slides).
37. Problems with the EKF (figure in original slides).
38. Structure: temporal models • Kalman filter • extended Kalman filter • unscented Kalman filter • particle filters • applications
39. Unscented Kalman filter. Key ideas:
   • Approximate the distribution as a sum of weighted particles with the correct mean and covariance
   • Pass the particles through the non-linear function
   • Compute the mean and covariance of the transformed variables
40. Unscented Kalman filter: approximate the distribution with particles, chosen so that their mean and covariance match (equations in original slides).
41. One possible scheme for choosing the particles (equations in original slides).
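One common sigma-point scheme (not necessarily the exact one shown on the slide) places 2D+1 deterministic particles around the mean using a matrix square root of the covariance; `kappa` is a free spreading parameter. A minimal NumPy sketch of the unscented transform, with an invented linear test function:

```python
import numpy as np

def sigma_points(mu, Sigma, kappa=1.0):
    """Deterministic particles whose weighted mean/covariance match (mu, Sigma)."""
    D = len(mu)
    L = np.linalg.cholesky((D + kappa) * Sigma)   # matrix square root, columns L[:, j]
    pts = [mu] + [mu + L[:, j] for j in range(D)] + [mu - L[:, j] for j in range(D)]
    w0 = kappa / (D + kappa)
    wj = 1.0 / (2 * (D + kappa))
    return np.array(pts), np.array([w0] + [wj] * (2 * D))

def unscented_transform(mu, Sigma, f, kappa=1.0):
    """Propagate a mean and covariance through a non-linear function f."""
    pts, w = sigma_points(mu, Sigma, kappa)
    fp = np.array([f(p) for p in pts])            # pass particles through f
    mu_new = w @ fp                               # weighted mean of transformed points
    diff = fp - mu_new
    Sigma_new = (w[:, None] * diff).T @ diff      # weighted covariance
    return mu_new, Sigma_new

# Sanity check: for a linear f the transform reproduces the exact result A mu, A Sigma A^T.
A = np.array([[2.0, 0.0], [0.0, 3.0]])
mu, Sigma = np.array([1.0, -1.0]), np.eye(2)
m, S = unscented_transform(mu, Sigma, lambda x: A @ x)
```

For genuinely non-linear f, the transform captures the first two moments of the output without computing any Jacobians, which is the UKF's advantage over the EKF.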
42. Reconstitution (equations in original slides).
43. Unscented Kalman filter (figure in original slides).
44. Measurement incorporation. Measurement incorporation works in a similar way: approximate the predicted distribution by a set of particles, chosen so that the mean and covariance are the same.
45. Measurement incorporation. Pass the particles through the measurement equation and recompute the mean and variance; the measurement update equations follow, with the Kalman gain now computed from the particles (equations in original slides).
46. Problems with the UKF (figure in original slides).
47. Structure: temporal models • Kalman filter • extended Kalman filter • unscented Kalman filter • particle filters • applications
48. Particle filters. Key idea: represent the probability distribution as a set of weighted particles. Advantages and disadvantages:
   + Can represent non-Gaussian, multimodal densities
   + No need for data association
   - Expensive
49. Condensation algorithm, stage 1: resample from the weighted particles according to their weights to get unweighted particles.
50. Condensation algorithm, stage 2: pass the unweighted samples through the temporal model and add noise.
51. Condensation algorithm, stage 3: weight the samples by the measurement density.
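The three stages above can be sketched as one step of a particle filter. This is a minimal 1-D illustration: the identity dynamics, the noise levels, and the Gaussian measurement density are illustrative assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def condensation_step(particles, weights, x_meas,
                      dynamics_noise=0.1, meas_noise=0.5):
    n = len(particles)
    # Stage 1: resample according to weight -> unweighted particles.
    idx = rng.choice(n, size=n, p=weights)
    resampled = particles[idx]
    # Stage 2: pass through the temporal model (here: identity) and add noise.
    predicted = resampled + dynamics_noise * rng.standard_normal(n)
    # Stage 3: weight samples by the measurement density (here: Gaussian).
    new_weights = np.exp(-0.5 * ((x_meas - predicted) / meas_noise) ** 2)
    new_weights /= new_weights.sum()
    return predicted, new_weights

# Track a stationary target observed near 2.0, starting from a diffuse particle set.
particles = rng.uniform(-5, 5, size=500)
weights = np.ones(500) / 500
for x in [2.1, 1.9, 2.0, 2.05]:
    particles, weights = condensation_step(particles, weights, x)
estimate = float(weights @ particles)
```

After a few measurements the weighted particle cloud collapses around the true position; unlike the Kalman variants, nothing here requires the posterior to stay Gaussian or unimodal.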
52. Data association (figure in original slides).
53. Structure: temporal models • Kalman filter • extended Kalman filter • unscented Kalman filter • particle filters • applications
54. Tracking pedestrians (figure in original slides).
55. Tracking a contour in clutter (figure in original slides).
56. Simultaneous localization and mapping (figure in original slides).