MAP Estimation Algorithms in Computer Vision - Part I
M. Pawan Kumar, University of Oxford
Pushmeet Kohli, Microsoft Research
Aim of the Tutorial
A Vision Application Binary Image Segmentation How? A cost function models our knowledge about natural images; optimize the cost function to obtain the segmentation
A Vision Application Object - white, Background - green/grey Graph G = (V,E) Each vertex corresponds to a pixel Edges define a 4-neighbourhood  grid  graph Assign a label to each vertex from L = {obj,bkg} Binary Image Segmentation
A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Per Vertex Cost Cost of label ‘obj’ low Cost of label ‘bkg’ high Object - white, Background - green/grey Binary Image Segmentation
A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Cost of label ‘obj’ high Cost of label ‘bkg’ low Per Vertex Cost UNARY COST Object - white, Background - green/grey Binary Image Segmentation
A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Per Edge Cost Cost of same label low Cost of different labels high Object - white, Background - green/grey Binary Image Segmentation
A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Cost of same label high Cost of different labels low Per Edge Cost PAIRWISE COST Object - white, Background - green/grey Binary Image Segmentation
A Vision Application Graph G = (V,E) Problem: Find the labelling with minimum cost f* Object - white, Background - green/grey Binary Image Segmentation
A Vision Application Graph G = (V,E) Problem: Find the labelling with minimum cost f* Binary Image Segmentation
Another Vision Application Object Detection using Parts-based Models How ? Once again, by defining a good cost function
Another Vision Application Object Detection using Parts-based Models Graph G = (V,E) Each vertex corresponds to a part - ‘Head’, ‘Torso’, ‘Legs’ Edges define a TREE Assign a label to each vertex from L = {positions}
Another Vision Application Object Detection using Parts-based Models Graph G = (V,E) Cost of a labelling f : V → L Unary cost: How well does part match image patch? Pairwise cost: Encourages valid configurations Find best labelling f*
Yet Another Vision Application Stereo Correspondence Disparity Map How ? Minimizing a cost function
Yet Another Vision Application Stereo Correspondence Graph G = (V,E) Vertex corresponds to a pixel Edges define grid graph L = {disparities}
Yet Another Vision Application Stereo Correspondence Cost of labelling f : Unary cost + Pairwise Cost Find minimum cost f*
The General Problem Graph G = (V, E) Discrete label set L = {1,2,…,h} Assign a label to each vertex: f : V → L Cost of a labelling Q(f) = Unary Cost + Pairwise Cost Find f* = arg min Q(f)
Outline
Energy Function Random Variables V = {V_a, V_b, …} Labels L = {l_0, l_1, …} Data D Labelling f : {a, b, …} → {0, 1, …}
Energy Function Unary Potential: Q(f) = ∑_a θ_a;f(a) Easy to minimize. Example values (label l_0, label l_1): θ_a = (5, 2), θ_b = (2, 4), θ_c = (3, 6), θ_d = (7, 3) Neighbourhood
Energy Function Neighbourhood: (a,b) ∈ E iff V_a and V_b are neighbours; here E = { (a,b), (b,c), (c,d) }
Energy Function Pairwise Potential: Q(f) = ∑_a θ_a;f(a) + ∑_(a,b) θ_ab;f(a)f(b) Example values (entry (i,k) = θ_;ik): θ_ab = [0 1; 1 0], θ_bc = [1 3; 2 0], θ_cd = [0 1; 4 1]
Energy Function Parameter θ: Q(f; θ) = ∑_a θ_a;f(a) + ∑_(a,b) θ_ab;f(a)f(b)
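In code, the energy is a direct sum over vertices and edges. A minimal sketch in Python for the slides' four-variable chain; the potential values are reconstructed from the exhaustive table of all 16 labellings shown later, and the variable and function names are ours:

```python
# Energy Q(f; theta) = sum_a theta_a;f(a) + sum_(a,b) theta_ab;f(a)f(b)
# for the slides' four-variable chain (values reconstructed from the
# exhaustive labelling table; names are ours, not the slides').
UNARY = {'a': [5, 2], 'b': [2, 4], 'c': [3, 6], 'd': [7, 3]}
PAIRWISE = {('a', 'b'): [[0, 1], [1, 0]],
            ('b', 'c'): [[1, 3], [2, 0]],
            ('c', 'd'): [[0, 1], [4, 1]]}

def energy(f):
    """f maps each vertex name to a label in {0, 1}."""
    unary = sum(UNARY[a][f[a]] for a in UNARY)
    pairwise = sum(PAIRWISE[a, b][f[a]][f[b]] for (a, b) in PAIRWISE)
    return unary + pairwise

print(energy({'a': 1, 'b': 0, 'c': 0, 'd': 1}))  # 13, the slides' q*
```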
Outline
MAP Estimation Q(f; θ) = ∑_a θ_a;f(a) + ∑_(a,b) θ_ab;f(a)f(b)
MAP Estimation For f = {1, 0, 0, 1}: Q(f; θ) = 2 + 1 + 2 + 1 + 3 + 1 + 3 = 13
MAP Estimation For f = {0, 1, 1, 0}: Q(f; θ) = 5 + 1 + 4 + 0 + 6 + 4 + 7 = 27
MAP Estimation f* = arg min_f Q(f; θ) q* = min_f Q(f; θ) = Q(f*; θ)
MAP Estimation 16 possible labellings; f* = {1, 0, 0, 1}, q* = 13

f(a) f(b) f(c) f(d)   Q(f;θ)
0    0    0    0      18
0    0    0    1      15
0    0    1    0      27
0    0    1    1      20
0    1    0    0      22
0    1    0    1      19
0    1    1    0      27
0    1    1    1      20
1    0    0    0      16
1    0    0    1      13  ← f*
1    0    1    0      25
1    0    1    1      18
1    1    0    0      18
1    1    0    1      15
1    1    1    0      23
1    1    1    1      16
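The table above can be regenerated by brute force, which is exactly what becomes infeasible at scale. A sketch under the same reconstructed potentials (names are ours):

```python
from itertools import product

# Brute-force MAP estimation: enumerate all |L|^|V| = 2^4 labellings of
# the slides' chain and keep the cheapest one.
UNARY = {'a': [5, 2], 'b': [2, 4], 'c': [3, 6], 'd': [7, 3]}
PAIRWISE = {('a', 'b'): [[0, 1], [1, 0]],
            ('b', 'c'): [[1, 3], [2, 0]],
            ('c', 'd'): [[0, 1], [4, 1]]}

def energy(f):
    return (sum(UNARY[a][f[a]] for a in UNARY) +
            sum(PAIRWISE[a, b][f[a]][f[b]] for (a, b) in PAIRWISE))

labellings = [dict(zip('abcd', lab)) for lab in product((0, 1), repeat=4)]
f_star = min(labellings, key=energy)
q_star = energy(f_star)
print(f_star, q_star)  # {'a': 1, 'b': 0, 'c': 0, 'd': 1} 13
```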
Computational Complexity Segmentation: 2^|V| labellings |V| = number of pixels ≈ 320 × 480 = 153600
Computational Complexity Detection: |L|^|V| labellings |L| = number of candidate positions ≈ 153600
Computational Complexity Stereo: |L|^|V| labellings Can we do better than brute-force? MAP Estimation is NP-hard !!
Computational Complexity Exact algorithms do exist for special cases Good approximate algorithms for the general case But first … two important definitions
Outline
Min-Marginals Min-marginal: q_a;i = min_f Q(f; θ) such that f(a) = i Not a marginal (no summation)
Min-Marginals Over the 16 possible labellings: q_a;0 = 15 (best labelling with f(a) = 0)
Min-Marginals q_a;1 = 13 (best labelling with f(a) = 1)
Min-Marginals and MAP V_a has to take one label, so min_f Q(f; θ) = min_i ( min_f Q(f; θ) s.t. f(a) = i ) = min_i q_a;i
Summary Energy Function: Q(f; θ) = ∑_a θ_a;f(a) + ∑_(a,b) θ_ab;f(a)f(b) MAP Estimation: f* = arg min Q(f; θ) Min-marginals: q_a;i = min Q(f; θ) s.t. f(a) = i
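Min-marginals can be checked by exhaustive enumeration, an O(|L|^|V|) sanity check rather than an algorithm. A sketch with the reconstructed example potentials (names are ours):

```python
from itertools import product

# Brute-force min-marginal q_a;i = min over labellings f with f(a) = i.
UNARY = {'a': [5, 2], 'b': [2, 4], 'c': [3, 6], 'd': [7, 3]}
PAIRWISE = {('a', 'b'): [[0, 1], [1, 0]],
            ('b', 'c'): [[1, 3], [2, 0]],
            ('c', 'd'): [[0, 1], [4, 1]]}

def energy(f):
    return (sum(UNARY[a][f[a]] for a in UNARY) +
            sum(PAIRWISE[a, b][f[a]][f[b]] for (a, b) in PAIRWISE))

def min_marginal(var, label):
    return min(energy(dict(zip('abcd', lab)))
               for lab in product((0, 1), repeat=4)
               if lab['abcd'.index(var)] == label)

print(min_marginal('a', 0), min_marginal('a', 1))  # 15 13
```

Minimizing over the label recovers the MAP value, exactly as the "Min-Marginals and MAP" slide states.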
Outline
Reparameterization Add a constant to all θ_a;i; subtract that constant from all θ_b;k. (Two-variable example: θ_a = (5,2), θ_b = (2,4), θ_ab = [0 1; 1 0]; Q(0,0) = 7, Q(0,1) = 10, Q(1,0) = 5, Q(1,1) = 6)
Reparameterization With constant 2: every labelling's cost changes by +2 - 2 = 0, so Q(f; θ') = Q(f; θ)
Reparameterization Add a constant to one θ_b;k; subtract that constant from θ_ab;ik for all i
Reparameterization With constant 3 on θ_b;1: every labelling with f(b) = 1 changes by +3 - 3 = 0 and the rest are untouched, so Q(f; θ') = Q(f; θ)
Reparameterization In general, for any messages M_ab;k and M_ba;i: θ'_a;i = θ_a;i + M_ba;i θ'_b;k = θ_b;k + M_ab;k θ'_ab;ik = θ_ab;ik - M_ab;k - M_ba;i ⇒ Q(f; θ') = Q(f; θ)
Reparameterization θ' is a reparameterization of θ, written θ' ≡ θ, iff Q(f; θ') = Q(f; θ) for all f. Equivalently: θ'_a;i = θ_a;i + M_ba;i, θ'_b;k = θ_b;k + M_ab;k, θ'_ab;ik = θ_ab;ik - M_ab;k - M_ba;i. Kolmogorov, PAMI, 2006
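The identity Q(f; θ') = Q(f; θ) is easy to verify exhaustively. A sketch on the slides' two-variable example, using the message M_ab;k = min_i { θ_a;i + θ_ab;ik } (the choice belief propagation makes next):

```python
from itertools import product

# Reparameterize edge (a,b) of the two-variable example and check that
# every labelling keeps its energy. theta_a = (5,2), theta_b = (2,4).
theta_a, theta_b = [5, 2], [2, 4]
theta_ab = [[0, 1], [1, 0]]

M = [min(theta_a[i] + theta_ab[i][k] for i in range(2)) for k in range(2)]
theta_b_new = [theta_b[k] + M[k] for k in range(2)]
theta_ab_new = [[theta_ab[i][k] - M[k] for k in range(2)] for i in range(2)]

def Q(f, ua, ub, pab):
    return ua[f[0]] + ub[f[1]] + pab[f[0]][f[1]]

for f in product((0, 1), repeat=2):
    assert Q(f, theta_a, theta_b, theta_ab) == Q(f, theta_a, theta_b_new, theta_ab_new)
print(theta_b_new)  # [5, 6]: exactly the min-marginals q_b;k
```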
Recap MAP Estimation: f* = arg min Q(f; θ), Q(f; θ) = ∑_a θ_a;f(a) + ∑_(a,b) θ_ab;f(a)f(b) Min-marginals: q_a;i = min Q(f; θ) s.t. f(a) = i Reparameterization: θ' ≡ θ iff Q(f; θ') = Q(f; θ) for all f
Outline
Belief Propagation
Two Variables Reparameterize the edge (a,b): add a constant to one θ_b;k, subtract it from θ_ab;ik for all i. Choose the right constant so that θ'_b;k = q_b;k
Two Variables M_ab;0 = min { θ_a;0 + θ_ab;00, θ_a;1 + θ_ab;10 } = min { 5 + 0, 2 + 1 } = 3
Two Variables θ'_b;0 = 2 + 3 = 5; the edge entries become θ'_ab;00 = -3, θ'_ab;10 = -2
Two Variables The minimum came from f(a) = 1; potentials along that (red) path add up to 0, so θ'_b;0 = q_b;0
Two Variables M_ab;1 = min { θ_a;0 + θ_ab;01, θ_a;1 + θ_ab;11 } = min { 5 + 1, 2 + 0 } = 2
Two Variables θ'_b;1 = 4 + 2 = 6 = q_b;1 (minimizer f(a) = 1). Minimum of min-marginals = MAP estimate
Two Variables f*(b) = 0; following the stored path, f*(a) = 1
Two Variables We get all the min-marginals of V_b
Recap We only need to know two sets of equations. General form of reparameterization: θ'_a;i = θ_a;i + M_ba;i, θ'_b;k = θ_b;k + M_ab;k, θ'_ab;ik = θ_ab;ik - M_ab;k - M_ba;i. Reparameterization of (a,b) in belief propagation: M_ab;k = min_i { θ_a;i + θ_ab;ik }, M_ba;i = 0
Three Variables Chain V_a - V_b - V_c, labels l_0, l_1. Reparameterize the edge (a,b) as before: θ'_b;k = q_b;k, recording the minimizing label of V_a for each k
Three Variables Then reparameterize the edge (b,c) as before: potentials along the red path add up to 0, so θ'_c;0 = q_c;0 and θ'_c;1 = q_c;1, recording the minimizing label of V_b for each k
Three Variables Read off f*(c) from the smaller min-marginal, then backtrack: f*(b), then f*(a). Generalizes to any length chain
Three Variables This is just dynamic programming
Why Dynamic Programming? 3 variables = 2 variables + book-keeping; n variables = (n-1) variables + book-keeping. Start from left, go to right. Reparameterize current edge (a,b): M_ab;k = min_i { θ_a;i + θ_ab;ik }; θ'_b;k = θ_b;k + M_ab;k; θ'_ab;ik = θ_ab;ik - M_ab;k. Repeat
Why Dynamic Programming? The quantities M_ab;k are messages, and the procedure is message passing. Why stop at dynamic programming?
Three Variables Now reparameterize the edge (c,b) as before: θ'_b;i = q_b;i
Three Variables Then reparameterize the edge (b,a) as before: θ'_a;i = q_a;i
Three Variables Forward Pass → Backward Pass: all min-marginals are computed
Belief Propagation on Chains Start from left, go to right. Reparameterize current edge (a,b): M_ab;k = min_i { θ_a;i + θ_ab;ik }; θ'_b;k = θ_b;k + M_ab;k; θ'_ab;ik = θ_ab;ik - M_ab;k. Repeat till the end of the chain. Then start from right, go to left, and repeat till the end of the chain
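The whole procedure fits in a few lines. A sketch of the forward and backward passes on the slides' four-variable chain (potentials reconstructed from the exhaustive table; names are ours):

```python
# Belief propagation on a chain: one left-to-right and one right-to-left
# pass of messages yield every min-marginal.
UNARY = [[5, 2], [2, 4], [3, 6], [7, 3]]                         # a, b, c, d
EDGES = [[[0, 1], [1, 0]], [[1, 3], [2, 0]], [[0, 1], [4, 1]]]   # ab, bc, cd
n, L = len(UNARY), 2

fwd = [[0, 0] for _ in range(n)]   # fwd[v][k]: message reaching v from the left
bwd = [[0, 0] for _ in range(n)]   # bwd[v][i]: message reaching v from the right
for v in range(1, n):              # forward pass
    fwd[v] = [min(UNARY[v-1][i] + fwd[v-1][i] + EDGES[v-1][i][k]
                  for i in range(L)) for k in range(L)]
for v in range(n - 2, -1, -1):     # backward pass
    bwd[v] = [min(UNARY[v+1][k] + bwd[v+1][k] + EDGES[v][i][k]
                  for k in range(L)) for i in range(L)]

q = [[UNARY[v][i] + fwd[v][i] + bwd[v][i] for i in range(L)] for v in range(n)]
print(q[0])       # [15, 13]: the min-marginals of V_a
print(min(q[0]))  # 13 = q*, the MAP value
```

Backtracking the minimizers recovers f* = {1, 0, 0, 1}, matching the brute-force table.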
Belief Propagation on Chains Won’t need this, but good to know
Computational Complexity Time per iteration: O(|E||L|^2) Memory for messages: O(|E||L|)
Belief Propagation on Trees Forward Pass: Leaf → Root. Backward Pass: Root → Leaf. All min-marginals are computed
Outline
Belief Propagation on Cycles Where do we start? Arbitrarily: reparameterize (a,b)
Belief Propagation on Cycles Continue around the cycle, reparameterizing (b,c), (c,d), (d,a); after each step the potentials along the red path add up to 0
Belief Propagation on Cycles Subtracting the original unaries -θ_a;0, -θ_a;1 (counted twice on going around): θ'_a;0 - θ_a;0 = q_a;0 and θ'_a;1 - θ_a;1 = q_a;1
Belief Propagation on Cycles Pick the minimum min-marginal. Follow the red path
Belief Propagation on Cycles Repeating the pass, equality is only guaranteed for one label: θ'_a;1 - θ_a;1 = q_a;1, while θ'_a;0 - θ_a;0 only bounds q_a;0
Belief Propagation on Cycles If θ'_a;0 - θ_a;0 ≤ q_a;0: problem solved
Belief Propagation on Cycles If θ'_a;0 - θ_a;0 ≥ q_a;0: problem not solved
Belief Propagation on Cycles Reparameterize (a,b) again
Belief Propagation on Cycles But doesn’t this overcount some potentials? Yes, but we will do it anyway
Belief Propagation on Cycles Keep reparameterizing edges in some order. Hope for convergence and a good solution
Belief Propagation Time per iteration: O(|E||L|^2) Memory for messages: O(|E||L|)
Outline
Computational Issues of BP Complexity per iteration: O(|E||L|^2). For special pairwise potentials θ_ab;ik = w_ab d(|i-k|), with d a Potts, truncated linear, or truncated quadratic distance, this drops to O(|E||L|). Felzenszwalb & Huttenlocher, 2004
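For the Potts case the speed-up is especially simple: a message either "stays" at label k or "switches" from the cheapest label and pays w once. An illustrative sketch of the idea (not the authors' code; the truncated linear and quadratic cases use distance transforms instead):

```python
# Potts pairwise potential: cost 0 for equal labels, w otherwise.
# M_ab;k = min_i { h_i + w * [i != k] } needs only the overall minimum
# of h, so each message costs O(|L|) instead of O(|L|^2).
def potts_message_fast(h, w):
    best = min(h)
    return [min(hk, best + w) for hk in h]

def potts_message_slow(h, w):  # generic O(|L|^2) message, for comparison
    L = len(h)
    return [min(h[i] + (0 if i == k else w) for i in range(L))
            for k in range(L)]

h = [5, 2, 9, 4, 11]  # h_i = theta_a;i plus any incoming messages
assert potts_message_fast(h, 3) == potts_message_slow(h, 3)
print(potts_message_fast(h, 3))  # [5, 2, 5, 4, 5]
```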
Computational Issues of BP Memory requirements: O(|E||L|), half of original BP (Kolmogorov, 2006). Some approximations exist (Yu, Lin, Super and Tan, 2007; Lasserre, Kannan and Winn, 2007), but memory still remains an issue
Computational Issues of BP Order of reparameterization: randomly; in some fixed order; or the one that results in maximum change, residual belief propagation (Elidan et al., 2006)
Theoretical Properties of BP Exact for trees (Pearl, 1988). What about a general random field? Run BP and assume it converges
Theoretical Properties of BP Then: choose the variables of any tree and change their labels; the value of the energy does not decrease
Theoretical Properties of BP Likewise: choose the variables of any cycle and change their labels; the value of the energy does not decrease
Theoretical Properties of BP For cycles, if BP converges then the solution is the exact MAP (Weiss and Freeman, 2001)
Results Object Detection (Felzenszwalb and Huttenlocher, 2004) Labels: poses of parts. Unary potentials: fraction of foreground pixels. Pairwise potentials: favour valid configurations
Results Object Detection Felzenszwalb and Huttenlocher, 2004
Results Binary Segmentation (Szeliski et al., 2008) Labels: {foreground, background}. Unary potentials: -log(likelihood) using learnt fg/bg models. Pairwise potentials: 0 if same labels, 1 - exp(|D_a - D_b|) if different labels
Results Binary Segmentation: result of belief propagation
Results Binary Segmentation: global optimum
Results Stereo Correspondence (Szeliski et al., 2008) Labels: {disparities}. Unary potentials: similarity of pixel colours. Pairwise potentials: 0 if same labels, 1 - exp(|D_a - D_b|) if different labels
Results Stereo Correspondence: result of belief propagation
Results Stereo Correspondence: global optimum
Summary of BP Exact for chains Exact for trees Approximate MAP for general cases Not even convergence guaranteed So can we do something better?
Outline
TRW Message Passing We will look at the most general MAP estimation: not just trees, and no assumption on the potentials
Things to Remember
Mathematical Optimization min g_0(x) subject to g_i(x) ≤ 0, i = 1, …, N. x* = arg min: optimal solution; g_0(x*): optimal value
Integer Programming min g_0(x) subject to g_i(x) ≤ 0, i = 1, …, N, and x_k ∈ Z. x* = arg min: optimal solution; g_0(x*): optimal value
Feasible Region Generally NP-hard to optimize
Linear Programming min c^T x subject to Ax ≤ b. x* = arg min: optimal solution; c^T x*: optimal value. Polynomial-time solution
Feasible Region Polynomial-time Solution
Feasible Region Optimal solution lies on a vertex (obj func linear)
Outline
Integer Programming Formulation Unary potentials θ_a;0 = 5, θ_a;1 = 2, θ_b;0 = 2, θ_b;1 = 4. The labelling f(a) = 1, f(b) = 0 corresponds to y_a;0 = 0, y_a;1 = 1, y_b;0 = 1, y_b;1 = 0. Any f(.) has equivalent boolean variables y_a;i
Integer Programming Formulation Find the optimal variables y_a;i
Integer Programming Formulation Sum of unary potentials: ∑_a ∑_i θ_a;i y_a;i, with y_a;i ∈ {0,1} for all V_a, l_i and ∑_i y_a;i = 1 for all V_a
Integer Programming Formulation Pairwise potentials θ_ab;00 = 0, θ_ab;01 = 1, θ_ab;10 = 1, θ_ab;11 = 0. Sum of pairwise potentials: ∑_(a,b) ∑_ik θ_ab;ik y_a;i y_b;k
Integer Programming Formulation Introduce y_ab;ik = y_a;i y_b;k; the sum of pairwise potentials becomes ∑_(a,b) ∑_ik θ_ab;ik y_ab;ik
Integer Programming Formulation min ∑_a ∑_i θ_a;i y_a;i + ∑_(a,b) ∑_ik θ_ab;ik y_ab;ik subject to y_a;i ∈ {0,1}, ∑_i y_a;i = 1, y_ab;ik = y_a;i y_b;k
Integer Programming Formulation In vector form: min θ^T y, where θ = [ … θ_a;i … ; … θ_ab;ik … ] and y = [ … y_a;i … ; … y_ab;ik … ]
One variable, two labels y_a;0 ∈ {0,1}, y_a;1 ∈ {0,1}, y_a;0 + y_a;1 = 1. y = [ y_a;0 y_a;1 ], θ = [ θ_a;0 θ_a;1 ]
Two variables, two labels y = [ y_a;0 y_a;1 y_b;0 y_b;1 y_ab;00 y_ab;01 y_ab;10 y_ab;11 ] y_a;0, y_a;1 ∈ {0,1}, y_a;0 + y_a;1 = 1; y_b;0, y_b;1 ∈ {0,1}, y_b;0 + y_b;1 = 1; y_ab;00 = y_a;0 y_b;0, y_ab;01 = y_a;0 y_b;1, y_ab;10 = y_a;1 y_b;0, y_ab;11 = y_a;1 y_b;1
In General Marginal Polytope
In General y ∈ {0,1}^(|V||L| + |E||L|^2) Number of constraints: |V||L| + |V| + |E||L|^2
Integer Programming Formulation min   T y y a;i    {0,1} ∑ i  y a;i  = 1 y ab;ik  =   y a;i  y b;k    = [ …   a;i  …. ; …   ab;ik  ….] y  = [ … y a;i  …. ; … y ab;ik  ….]
Integer Programming Formulation min   T y y a;i    {0,1} ∑ i  y a;i  = 1 y ab;ik  =   y a;i  y b;k Solve to obtain MAP labelling y*
Integer Programming Formulation min   T y y a;i    {0,1} ∑ i  y a;i  = 1 y ab;ik  =   y a;i  y b;k But we can’t solve it in general
Outline
Linear Programming Relaxation min θ^T y, y_a;i ∈ {0,1}, ∑_i y_a;i = 1, y_ab;ik = y_a;i y_b;k. Two reasons why we can’t solve this
Linear Programming Relaxation Relax the integrality: y_a;i ∈ [0,1]. One reason why we can’t solve this
Linear Programming Relaxation Sum the quadratic constraint over k: ∑_k y_ab;ik = ∑_k y_a;i y_b;k = y_a;i ∑_k y_b;k = y_a;i, since ∑_k y_b;k = 1
Linear Programming Relaxation Keep only the marginalization constraint ∑_k y_ab;ik = y_a;i. No reason why we can’t solve this* (*up to memory requirements and time complexity)
One variable, two labels Integer: y_a;0 ∈ {0,1}, y_a;1 ∈ {0,1}, y_a;0 + y_a;1 = 1
One variable, two labels Relaxed: y_a;0 ∈ [0,1], y_a;1 ∈ [0,1], y_a;0 + y_a;1 = 1
Two variables, two labels Integer: y_a;i, y_b;k ∈ {0,1}, y_a;0 + y_a;1 = 1, y_b;0 + y_b;1 = 1, y_ab;ik = y_a;i y_b;k
Two variables, two labels Relaxed: y_a;i, y_b;k ∈ [0,1], same normalization, and the products replaced by marginalization: y_ab;00 + y_ab;01 = y_a;0, y_ab;10 + y_ab;11 = y_a;1
In General Marginal Polytope Local Polytope
In General y ∈ [0,1]^(|V||L| + |E||L|^2) Number of constraints: |V||L| + |V| + |E||L|
Linear Programming Relaxation min θ^T y, y_a;i ∈ [0,1], ∑_i y_a;i = 1, ∑_k y_ab;ik = y_a;i. No reason why we can’t solve this
Linear Programming Relaxation Extensively studied Optimization Schlesinger, 1976 Koster, van Hoesel and Kolen, 1998 Theory Chekuri et al, 2001 Archer et al, 2004 Machine Learning Wainwright et al., 2001
Linear Programming Relaxation Many interesting properties: globally optimal for trees (Wainwright et al., 2001). But we are interested in NP-hard cases
Linear Programming Relaxation Many interesting properties: integrality gap (Manokaran et al., 2008)
Linear Programming Relaxation Many interesting properties: the dual. Optimal value of dual = optimal value of primal, and the dual is easier to solve
Dual of the LP Relaxation (Wainwright et al., 2001) Start from the LP: min θ^T y, y_a;i ∈ [0,1], ∑_i y_a;i = 1, ∑_k y_ab;ik = y_a;i
Dual of the LP Relaxation Decompose the grid into trees with parameters θ^1, …, θ^6 and weights ρ_i ≥ 0 such that ∑_i ρ_i θ^i = θ
Dual of the LP Relaxation Each tree is easy: q*(θ^i) = min_f Q(f; θ^i). Dual of the LP: max ∑_i ρ_i q*(θ^i) subject to ∑_i ρ_i θ^i = θ, ρ_i ≥ 0
Dual of the LP Relaxation Relax the equality to a reparameterization constraint: max ∑_i ρ_i q*(θ^i) subject to ∑_i ρ_i θ^i ≡ θ
Dual of the LP Relaxation I can easily compute q*(θ^i); I can easily maintain the reparameterization constraint. So can I easily solve the dual?
Outline
TRW Message Passing Kolmogorov, 2006 V a V b V c V d V e V f V g V h V i V a V b V c V d V e V f V g V h V i  1  2  3  1  2  3  4  5  6  4  5  6     i  i           i   q*(  i ) Pick a variable V a
TRW Message Passing Kolmogorov, 2006     i  i           i   q*(  i ) V c V b V a  1 c;0  1 c;1  1 b;0  1 b;1  1 a;0  1 a;1 V a V d V g  4 a;0  4 a;1  4 d;0  4 d;1  4 g;0  4 g;1
TRW Message Passing Kolmogorov, 2006  1  1  +    4  4  +   rest        1   q*(  1 ) +   4   q*(  4 ) +  K V c V b V a V a V d V g Reparameterize to obtain min-marginals of V a  1 c;0  1 c;1  1 b;0  1 b;1  1 a;0  1 a;1  4 a;0  4 a;1  4 d;0  4 d;1  4 g;0  4 g;1
TRW Message Passing Kolmogorov, 2006  1  ’ 1  +    4  ’ 4  +   rest   V c V b V a  ’ 1 c;0  ’ 1 c;1  ’ 1 b;0  ’ 1 b;1  ’ 1 a;0  ’ 1 a;1 V a V d V g  ’ 4 a;0  ’ 4 a;1  ’ 4 d;0  ’ 4 d;1  ’ 4 g;0  ’ 4 g;1 One pass of Belief Propagation  1   q*(  ’ 1 ) +   4   q*(  ’ 4 ) +  K
TRW Message Passing Kolmogorov, 2006  1  ’ 1  +    4  ’ 4  +   rest       V c V b V a V a V d V g Remain the same  1   q*(  ’ 1 ) +   4   q*(  ’ 4 ) +  K  ’ 1 c;0  ’ 1 c;1  ’ 1 b;0  ’ 1 b;1  ’ 1 a;0  ’ 1 a;1  ’ 4 a;0  ’ 4 a;1  ’ 4 d;0  ’ 4 d;1  ’ 4 g;0  ’ 4 g;1
TRW Message Passing Kolmogorov, 2006  1  ’ 1  +    4  ’ 4  +   rest        1   min{  ’ 1 a;0 ,  ’ 1 a;1 } +   4   min{  ’ 4 a;0 ,  ’ 4 a;1 } +  K V c V b V a V a V d V g  ’ 1 c;0  ’ 1 c;1  ’ 1 b;0  ’ 1 b;1  ’ 1 a;0  ’ 1 a;1  ’ 4 a;0  ’ 4 a;1  ’ 4 d;0  ’ 4 d;1  ’ 4 g;0  ’ 4 g;1
TRW Message Passing Kolmogorov, 2006  1  ’ 1  +    4  ’ 4  +   rest       V c V b V a V a V d V g Compute weighted average of min-marginals of V a  ’ 1 c;0  ’ 1 c;1  ’ 1 b;0  ’ 1 b;1  ’ 1 a;0  ’ 1 a;1  ’ 4 a;0  ’ 4 a;1  ’ 4 d;0  ’ 4 d;1  ’ 4 g;0  ’ 4 g;1  1   min{  ’ 1 a;0 ,  ’ 1 a;1 } +   4   min{  ’ 4 a;0 ,  ’ 4 a;1 } +  K
TRW Message Passing Kolmogorov, 2006  1  ’ 1  +    4  ’ 4  +   rest       V c V b V a V a V d V g  ’’ a;0  =   1  ’ 1 a;0 +   4  ’ 4 a;0  1  +   4  ’’ a;1  =   1  ’ 1 a;1 +   4  ’ 4 a;1  1  +   4  ’ 1 c;0  ’ 1 c;1  ’ 1 b;0  ’ 1 b;1  ’ 1 a;0  ’ 1 a;1  ’ 4 a;0  ’ 4 a;1  ’ 4 d;0  ’ 4 d;1  ’ 4 g;0  ’ 4 g;1  1   min{  ’ 1 a;0 ,  ’ 1 a;1 } +   4   min{  ’ 4 a;0 ,  ’ 4 a;1 } +  K
TRW Message Passing (Kolmogorov, 2006). Replace the unaries of Va in both trees by the averages θ''a;0, θ''a;1, giving ρ1 θ''1 + ρ4 θ''4 + θrest.
TRW Message Passing (Kolmogorov, 2006). The averaging preserves the reparameterization constraint: ρ1 θ''1 + ρ4 θ''4 + θrest = θ.
TRW Message Passing (Kolmogorov, 2006). The new objective value is ρ1 min{θ''a;0, θ''a;1} + ρ4 min{θ''a;0, θ''a;1} + K.
TRW Message Passing (Kolmogorov, 2006). This equals (ρ1 + ρ4) min{θ''a;0, θ''a;1} + K.
TRW Message Passing (Kolmogorov, 2006). Since min{p1 + p2, q1 + q2} ≥ min{p1, q1} + min{p2, q2}, the new value (ρ1 + ρ4) min{θ''a;0, θ''a;1} + K is at least the old value ρ1 min{θ'1a;0, θ'1a;1} + ρ4 min{θ'4a;0, θ'4a;1} + K.
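The inequality on this slide is what guarantees monotonicity of the update; it is easy to sanity-check numerically:

```python
import random

# min of sums can never fall below the sum of mins:
# min{p1+p2, q1+q2} >= min{p1, q1} + min{p2, q2}
random.seed(0)
for _ in range(10000):
    p1, p2, q1, q2 = (random.uniform(-10, 10) for _ in range(4))
    assert min(p1 + p2, q1 + q2) >= min(p1, q1) + min(p2, q2)
print("holds on all samples")
```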
TRW Message Passing (Kolmogorov, 2006). The objective function increases or remains constant at every step.
TRW Message Passing (Kolmogorov, 2006). Algorithm: initialize the θi, taking care of the reparameterization constraint; REPEAT: choose a random variable Va; compute the min-marginals of Va for all trees; node-average the min-marginals. Can also do edge-averaging.
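The loop above can be sketched end-to-end on a toy cycle split into three single-edge "trees" (the numbers are those of Example 1 of these slides; `reparam` is one belief-propagation pass toward a node, `average` is the node-averaging step; the data layout is my own, not the tutorial's code):

```python
import copy

# Toy problem: a cycle over Va, Vb, Vc split into three single-edge trees,
# each with weight rho_i = 1 (numbers taken from Example 1 of these slides).
# Tree = [node_u, node_v, unary_u, unary_v, pairwise[label_u][label_v]].
trees = [
    ['a', 'b', [2, 5], [4, 2], [[0, 1], [1, 0]]],
    ['b', 'c', [4, 2], [6, 3], [[0, 2], [3, 1]]],
    ['c', 'a', [6, 3], [6, 4], [[1, 4], [1, 0]]],
]
orig = copy.deepcopy(trees)

def tree_min(t):
    u, v, tu, tv, pw = t
    return min(tu[i] + tv[j] + pw[i][j] for i in (0, 1) for j in (0, 1))

def dual():
    return sum(tree_min(t) for t in trees)

def reparam(t, node):
    # One BP pass: make `node`'s unaries hold its min-marginals in tree t.
    u, v, tu, tv, pw = t
    if node == u:
        for i in (0, 1):
            m = min(pw[i][j] + tv[j] for j in (0, 1))
            tu[i] += m
            pw[i][0] -= m
            pw[i][1] -= m
    else:
        for j in (0, 1):
            m = min(pw[i][j] + tu[i] for i in (0, 1))
            tv[j] += m
            pw[0][j] -= m
            pw[1][j] -= m

def average(node):
    # Node-averaging step of TRW-S (all tree weights equal here).
    mine = [t for t in trees if node in (t[0], t[1])]
    for t in mine:
        reparam(t, node)
    unaries = [t[2] if t[0] == node else t[3] for t in mine]
    avg = [sum(u[k] for u in unaries) / len(unaries) for k in (0, 1)]
    for u in unaries:
        u[:] = avg

d0 = dual()       # 18 = 5 + 6 + 7
average('a')
d1 = dual()       # the dual increased to 20

# Brute force over the 8 labellings of the cycle, using the original
# potentials: the dual can never exceed this minimum energy.
best = min(sum(tu[f[u]] + tv[f[v]] + pw[f[u]][f[v]]
               for u, v, tu, tv, pw in orig)
           for f in ({'a': la, 'b': lb, 'c': lc}
                     for la in (0, 1) for lb in (0, 1) for lc in (0, 1)))
print(d0, d1, best)   # the dual is tight on this instance
```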
Example 1. Labels {l0, l1}; tree weights ρ1 = ρ2 = ρ3 = 1.
Tree 1, edge Va–Vb: θa = (2, 5), θb = (4, 2), pairwise [[0, 1], [1, 0]]; minimum energy 5.
Tree 2, edge Vb–Vc: θb = (4, 2), θc = (6, 3), pairwise [[0, 2], [3, 1]]; minimum energy 6.
Tree 3, edge Vc–Va: θc = (6, 3), θa = (6, 4), pairwise [[1, 4], [1, 0]]; minimum energy 7.
Dual value = 5 + 6 + 7 = 18. Pick variable Va. Reparameterize.
Example 1. After reparameterizing Va:
Tree 1: θa = (5, 7), θb = (4, 2), pairwise [[-3, -2], [-1, -2]].
Tree 2: unchanged.
Tree 3: θc = (6, 3), θa = (10, 7), pairwise [[-3, 1], [-3, -3]].
Dual value still 5 + 6 + 7 = 18. Average the min-marginals of Va.
Example 1. After averaging, Va's unaries in trees 1 and 3 both become (7.5, 7):
Tree 1: θa = (7.5, 7), θb = (4, 2), pairwise [[-3, -2], [-1, -2]]; minimum energy 7.
Tree 2: unchanged; minimum energy 6.
Tree 3: θc = (6, 3), θa = (7.5, 7), pairwise [[-3, 1], [-3, -3]]; minimum energy 7.
Dual value = 7 + 6 + 7 = 20. Pick variable Vb. Reparameterize.
Example 1. After reparameterizing Vb:
Tree 1: θa = (7.5, 7), θb = (8.5, 7), pairwise [[-7.5, -7], [-5.5, -7]].
Tree 2: θb = (9, 6), θc = (6, 3), pairwise [[-5, -3], [-1, -3]].
Tree 3: unchanged.
Dual value still 7 + 6 + 7 = 20. Average the min-marginals of Vb.
Example 1. After averaging, Vb's unaries in trees 1 and 2 both become (8.75, 6.5). Dual value = 6.5 + 6.5 + 7 = 20: the value of the dual does not increase.
Example 1. Maybe it will increase for Vc? NO.
Example 1. Strong tree agreement: f1(a) = 0, f1(b) = 0; f2(b) = 0, f2(c) = 0; f3(c) = 0, f3(a) = 0. All trees agree on a single labelling, so this is the exact MAP estimate.
Example 2. Labels {l0, l1}; tree weights ρ1 = ρ2 = ρ3 = 1.
Tree 1, edge Va–Vb: θa = (2, 5), θb = (2, 2), pairwise [[0, 1], [1, 0]]; minimum energy 4.
Tree 2, edge Vb–Vc: θb = (0, 0), θc = (0, 0), pairwise [[1, 0], [0, 1]]; minimum energy 0.
Tree 3, edge Vc–Va: θc = (0, 3), θa = (4, 8), pairwise [[0, 1], [1, 0]]; minimum energy 4.
Dual value = 4 + 0 + 4 = 8. Pick variable Va. Reparameterize.
Example 2. After reparameterizing Va:
Tree 1: θa = (4, 7), θb = (2, 2), pairwise [[-2, -1], [-1, -2]].
Tree 2: unchanged.
Tree 3: θc = (0, 3), θa = (4, 9), pairwise [[0, 0], [1, -1]].
Dual value still 4 + 0 + 4 = 8. Average the min-marginals of Va.
Example 2. After averaging, Va's unaries in trees 1 and 3 both become (4, 8). Dual value still 4 + 0 + 4 = 8: the value of the dual does not increase.
Example 2. Maybe it will increase for Vb or Vc? NO.
Example 2. Weak tree agreement: f1(a) = 1, f1(b) = 1; f2(b) = 1, f2(c) = 0 or f2(b) = 0, f2(c) = 1; f3(c) = 1, f3(a) = 1. The trees do not agree on a single labelling, so this is not the exact MAP estimate.
Example 2. Weak tree agreement: this is a convergence point of TRW.
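At this weak-agreement fixed point the dual value (4 + 0 + 4 = 8) sits strictly below the true minimum energy, which a brute-force check over the eight labellings confirms (potentials read off Example 2; the value 8 is the converged dual from the slides):

```python
# Example 2 potentials: [node_u, node_v, unary_u, unary_v, pairwise[u][v]].
trees = [
    ['a', 'b', [2, 5], [2, 2], [[0, 1], [1, 0]]],
    ['b', 'c', [0, 0], [0, 0], [[1, 0], [0, 1]]],
    ['c', 'a', [0, 3], [4, 8], [[0, 1], [1, 0]]],
]

def energy(f):
    # Total energy of a labelling f, summed over the three edge trees.
    return sum(tu[f[u]] + tv[f[v]] + pw[f[u]][f[v]]
               for u, v, tu, tv, pw in trees)

best = min(energy({'a': la, 'b': lb, 'c': lc})
           for la in (0, 1) for lb in (0, 1) for lc in (0, 1))
print(best)  # 9

converged_dual = 8  # 4 + 0 + 4, taken from the slides
assert converged_dual < best  # weak tree agreement leaves a duality gap
```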
Obtaining the Labelling. TRW only solves the dual; what about primal solutions? Figure: grid Va, Vb, Vc, Vd, Ve, Vf, Vg, Vh, Vi. Using θ' = Σi ρi θi, fix the label of Va.
Obtaining the Labelling. Then fix the label of Vb, and continue in some fixed order (Meltzer et al., 2006).
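A minimal sketch of this order-based rounding; the graph, potentials, and function names here are hypothetical illustrations, not the tutorial's code:

```python
def extract_labels(order, unary, pairwise, neighbours):
    """Fix labels one node at a time, conditioning each choice on the
    neighbours already labelled (in the spirit of Meltzer et al., 2006).
    pairwise[(a, b)][i][j] is the cost of a taking i while b takes j."""
    f = {}
    for a in order:
        f[a] = min(range(len(unary[a])),
                   key=lambda i: unary[a][i]
                   + sum(pairwise[(a, b)][i][f[b]]
                         for b in neighbours[a] if b in f))
    return f

# Tiny two-node example: once Va is fixed to 0, the smoothness cost
# pulls Vb to 0 as well despite its unary preference.
unary = {'a': [0, 2], 'b': [2, 0]}
pw_ab = [[0, 5], [5, 0]]
pairwise = {('a', 'b'): pw_ab,
            ('b', 'a'): [[pw_ab[j][i] for j in (0, 1)] for i in (0, 1)]}
neighbours = {'a': ['b'], 'b': ['a']}
print(extract_labels(['a', 'b'], unary, pairwise, neighbours))  # {'a': 0, 'b': 0}
```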
Outline
Computational Issues of TRW. The basic component is Belief Propagation (Felzenszwalb & Huttenlocher, 2004); further computational improvements appear in Kolmogorov, 2006.
Theoretical Properties of TRW. Convergence is guaranteed (Kolmogorov, 2006). Strong tree agreement implies the exact MAP estimate (Wainwright et al., 2001). Optimal for two-label submodular problems (Kolmogorov and Wainwright, 2005), i.e. when θab;00 + θab;11 ≤ θab;01 + θab;10.
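The submodularity condition on this slide is easy to test for any 2×2 pairwise term:

```python
def is_submodular(pw):
    """A binary pairwise term is submodular iff
    theta_ab;00 + theta_ab;11 <= theta_ab;01 + theta_ab;10."""
    return pw[0][0] + pw[1][1] <= pw[0][1] + pw[1][0]

print(is_submodular([[0, 1], [1, 0]]))  # True: Potts-style cut cost
print(is_submodular([[1, 0], [0, 1]]))  # False: rewards disagreement
```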
Results: Binary Segmentation (Szeliski et al., 2008). Labels: {foreground, background}. Unary potentials: -log(likelihood) under learnt fg/bg models. Pairwise potentials: 0 if the two labels are the same; 1 - exp(|Da - Db|) if they differ.
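The unary term on this slide is the negative log-likelihood under the learnt colour models. A minimal sketch, treating the background likelihood as the complement of the foreground one for brevity (the probability values are illustrative, not from the benchmark):

```python
import math

def seg_unary(p_fg, eps=1e-9):
    """Unary costs (cost_fg, cost_bg) for one pixel given its foreground
    likelihood p_fg: -log of a high likelihood gives a low cost."""
    p_fg = min(max(p_fg, eps), 1 - eps)  # guard against log(0)
    return (-math.log(p_fg), -math.log(1 - p_fg))

cost_fg, cost_bg = seg_unary(0.9)  # a pixel the fg model likes
print(cost_fg < cost_bg)           # True: labelling it 'foreground' is cheaper
```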
Results: Binary Segmentation (Szeliski et al., 2008), same potentials as above. Figure: segmentation obtained with TRW.
Results: Binary Segmentation (Szeliski et al., 2008), same potentials as above. Figure: segmentation obtained with Belief Propagation.
Results: Stereo Correspondence (Szeliski et al., 2008). Labels: {disparities}. Unary potentials: similarity of pixel colours. Pairwise potentials: 0 if the two labels are the same; 1 - exp(|Da - Db|) if they differ.
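The stereo unaries measure colour (dis)similarity between a pixel and its disparity-shifted match. A minimal sketch on grayscale rows; the function name, out-of-bounds cost, and array contents are illustrative assumptions:

```python
def stereo_unary(left_row, right_row, x, disparities, oob_cost=255):
    """Data cost of pixel x in the left image for each candidate disparity:
    absolute intensity difference to the shifted pixel in the right image."""
    costs = []
    for d in disparities:
        xr = x - d
        costs.append(abs(left_row[x] - right_row[xr])
                     if 0 <= xr < len(right_row) else oob_cost)
    return costs

left = [10, 20, 30, 40]
right = [10, 20, 30, 40]   # identical rows: the true disparity is 0
print(stereo_unary(left, right, 2, [0, 1, 2]))  # [0, 10, 20]
```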
Results: Stereo Correspondence (Szeliski et al., 2008), same potentials as above. Figure: disparity map obtained with TRW.
Results: Stereo Correspondence (Szeliski et al., 2008), same potentials as above. Figure: disparity map obtained with Belief Propagation.
Results: Non-submodular problems (Kolmogorov, 2006). 30x30 grid, K = 50. Figure: BP outperforms TRW-S.
Summary
Related New(er) Work. Solving the dual: Globerson and Jaakkola, 2007; Komodakis, Paragios and Tziritas, 2007; Weiss et al., 2006; Schlesinger and Giginyak, 2007. Solving the primal: Ravikumar, Agarwal and Wainwright, 2008.
Related New(er) Work. More accurate relaxations: Sontag and Jaakkola, 2007; Komodakis and Paragios, 2008; Kumar, Kolmogorov and Torr, 2007; Werner, 2008; Sontag et al., 2008; Kumar and Torr, 2008.
Questions on Part I? Code + standard data: http://vision.middlebury.edu/MRF

More Related Content

What's hot

Bayesian Inference and Uncertainty Quantification for Inverse Problems
Bayesian Inference and Uncertainty Quantification for Inverse ProblemsBayesian Inference and Uncertainty Quantification for Inverse Problems
Bayesian Inference and Uncertainty Quantification for Inverse ProblemsMatt Moores
 
R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...
R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...
R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...Matt Moores
 
Characterizing the Distortion of Some Simple Euclidean Embeddings
Characterizing the Distortion of Some Simple Euclidean EmbeddingsCharacterizing the Distortion of Some Simple Euclidean Embeddings
Characterizing the Distortion of Some Simple Euclidean EmbeddingsDon Sheehy
 
A new practical algorithm for volume estimation using annealing of convex bodies
A new practical algorithm for volume estimation using annealing of convex bodiesA new practical algorithm for volume estimation using annealing of convex bodies
A new practical algorithm for volume estimation using annealing of convex bodiesVissarion Fisikopoulos
 
Flow Network Talk
Flow Network TalkFlow Network Talk
Flow Network TalkImane Haf
 
KARNAUGH MAP(K-MAP)
KARNAUGH MAP(K-MAP)KARNAUGH MAP(K-MAP)
KARNAUGH MAP(K-MAP)mihir jain
 
CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...
CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...
CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...CHENNAKESAVA KADAPA
 
An efficient algorithm for the computation of Bernoulli numbers
 An efficient algorithm for the computation of Bernoulli numbers An efficient algorithm for the computation of Bernoulli numbers
An efficient algorithm for the computation of Bernoulli numbersXequeMateShannon
 
Prim Algorithm and kruskal algorithm
Prim Algorithm and kruskal algorithmPrim Algorithm and kruskal algorithm
Prim Algorithm and kruskal algorithmAcad
 
B.tech admission in india
B.tech admission in indiaB.tech admission in india
B.tech admission in indiaEdhole.com
 
On Resolution Proofs for Combinational Equivalence
On Resolution Proofs for Combinational EquivalenceOn Resolution Proofs for Combinational Equivalence
On Resolution Proofs for Combinational Equivalencesatrajit
 
Reducing Structural Bias in Technology Mapping
Reducing Structural Bias in Technology MappingReducing Structural Bias in Technology Mapping
Reducing Structural Bias in Technology Mappingsatrajit
 

What's hot (20)

Karnaugh maps
Karnaugh mapsKarnaugh maps
Karnaugh maps
 
Bayesian Inference and Uncertainty Quantification for Inverse Problems
Bayesian Inference and Uncertainty Quantification for Inverse ProblemsBayesian Inference and Uncertainty Quantification for Inverse Problems
Bayesian Inference and Uncertainty Quantification for Inverse Problems
 
R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...
R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...
R package 'bayesImageS': a case study in Bayesian computation using Rcpp and ...
 
Graph
GraphGraph
Graph
 
Graphs
GraphsGraphs
Graphs
 
Characterizing the Distortion of Some Simple Euclidean Embeddings
Characterizing the Distortion of Some Simple Euclidean EmbeddingsCharacterizing the Distortion of Some Simple Euclidean Embeddings
Characterizing the Distortion of Some Simple Euclidean Embeddings
 
A new practical algorithm for volume estimation using annealing of convex bodies
A new practical algorithm for volume estimation using annealing of convex bodiesA new practical algorithm for volume estimation using annealing of convex bodies
A new practical algorithm for volume estimation using annealing of convex bodies
 
Temporal graph
Temporal graphTemporal graph
Temporal graph
 
kmaps
 kmaps kmaps
kmaps
 
Flow Network Talk
Flow Network TalkFlow Network Talk
Flow Network Talk
 
KARNAUGH MAP(K-MAP)
KARNAUGH MAP(K-MAP)KARNAUGH MAP(K-MAP)
KARNAUGH MAP(K-MAP)
 
CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...
CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...
CutFEM on hierarchical B-Spline Cartesian grids with applications to fluid-st...
 
An efficient algorithm for the computation of Bernoulli numbers
 An efficient algorithm for the computation of Bernoulli numbers An efficient algorithm for the computation of Bernoulli numbers
An efficient algorithm for the computation of Bernoulli numbers
 
Prim Algorithm and kruskal algorithm
Prim Algorithm and kruskal algorithmPrim Algorithm and kruskal algorithm
Prim Algorithm and kruskal algorithm
 
Chap8 new
Chap8 newChap8 new
Chap8 new
 
B.tech admission in india
B.tech admission in indiaB.tech admission in india
B.tech admission in india
 
KMAP
KMAPKMAP
KMAP
 
On Resolution Proofs for Combinational Equivalence
On Resolution Proofs for Combinational EquivalenceOn Resolution Proofs for Combinational Equivalence
On Resolution Proofs for Combinational Equivalence
 
Approximation Algorithms
Approximation AlgorithmsApproximation Algorithms
Approximation Algorithms
 
Reducing Structural Bias in Technology Mapping
Reducing Structural Bias in Technology MappingReducing Structural Bias in Technology Mapping
Reducing Structural Bias in Technology Mapping
 

Similar to ECCV2008: MAP Estimation Algorithms in Computer Vision - Part 1

Running Free with the Monads
Running Free with the MonadsRunning Free with the Monads
Running Free with the Monadskenbot
 
6161103 2.9 dot product
6161103 2.9 dot product6161103 2.9 dot product
6161103 2.9 dot productetcenterrbru
 
The International Journal of Engineering and Science (IJES)
The International Journal of Engineering and Science (IJES)The International Journal of Engineering and Science (IJES)
The International Journal of Engineering and Science (IJES)theijes
 
Mathematical preliminaries in Automata
Mathematical preliminaries in AutomataMathematical preliminaries in Automata
Mathematical preliminaries in AutomataMobeen Mustafa
 
4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...
4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...
4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...venkatapranaykumarGa
 
FPGA based BCH Decoder
FPGA based BCH DecoderFPGA based BCH Decoder
FPGA based BCH Decoderijsrd.com
 
Raices de ecuaciones
Raices de ecuacionesRaices de ecuaciones
Raices de ecuacionesNatalia
 
Raices de ecuaciones
Raices de ecuacionesRaices de ecuaciones
Raices de ecuacionesNatalia
 
RxJava In Baby Steps
RxJava In Baby StepsRxJava In Baby Steps
RxJava In Baby StepsAnnyce Davis
 
Digital Communication Exam Help
Digital Communication Exam HelpDigital Communication Exam Help
Digital Communication Exam HelpLive Exam Helper
 
Directed Acyclic Graph
Directed Acyclic Graph Directed Acyclic Graph
Directed Acyclic Graph AJAL A J
 
F4 c1 functions__new__1_
F4 c1 functions__new__1_F4 c1 functions__new__1_
F4 c1 functions__new__1_Shantipa
 
1.  Write an equation in standard form of the parabola that has th.docx
1.  Write an equation in standard form of the parabola that has th.docx1.  Write an equation in standard form of the parabola that has th.docx
1.  Write an equation in standard form of the parabola that has th.docxKiyokoSlagleis
 
Distance vector routing algorithm
Distance vector routing algorithmDistance vector routing algorithm
Distance vector routing algorithmGaurav Rawat
 
20130108 ast signature based boolean matching in the presence of don’t cares_...
20130108 ast signature based boolean matching in the presence of don’t cares_...20130108 ast signature based boolean matching in the presence of don’t cares_...
20130108 ast signature based boolean matching in the presence of don’t cares_...Hanson Chi
 
07122555 0937185627
07122555 093718562707122555 0937185627
07122555 0937185627faisupak
 

Similar to ECCV2008: MAP Estimation Algorithms in Computer Vision - Part 1 (20)

Running Free with the Monads
Running Free with the MonadsRunning Free with the Monads
Running Free with the Monads
 
Função afim resumo teórico e exercícios - celso brasil
Função afim   resumo teórico e exercícios - celso brasilFunção afim   resumo teórico e exercícios - celso brasil
Função afim resumo teórico e exercícios - celso brasil
 
6161103 2.9 dot product
6161103 2.9 dot product6161103 2.9 dot product
6161103 2.9 dot product
 
The International Journal of Engineering and Science (IJES)
The International Journal of Engineering and Science (IJES)The International Journal of Engineering and Science (IJES)
The International Journal of Engineering and Science (IJES)
 
Mathematical preliminaries in Automata
Mathematical preliminaries in AutomataMathematical preliminaries in Automata
Mathematical preliminaries in Automata
 
4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...
4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...
4-Regular expression to Deterministic Finite Automata (Direct method)-05-05-2...
 
FPGA based BCH Decoder
FPGA based BCH DecoderFPGA based BCH Decoder
FPGA based BCH Decoder
 
Raices de ecuaciones
Raices de ecuacionesRaices de ecuaciones
Raices de ecuaciones
 
Raices de ecuaciones
Raices de ecuacionesRaices de ecuaciones
Raices de ecuaciones
 
RxJava In Baby Steps
RxJava In Baby StepsRxJava In Baby Steps
RxJava In Baby Steps
 
Digital Communication Exam Help
Digital Communication Exam HelpDigital Communication Exam Help
Digital Communication Exam Help
 
Directed Acyclic Graph
Directed Acyclic Graph Directed Acyclic Graph
Directed Acyclic Graph
 
CH04.ppt
CH04.pptCH04.ppt
CH04.ppt
 
F4 c1 functions__new__1_
F4 c1 functions__new__1_F4 c1 functions__new__1_
F4 c1 functions__new__1_
 
1.  Write an equation in standard form of the parabola that has th.docx
1.  Write an equation in standard form of the parabola that has th.docx1.  Write an equation in standard form of the parabola that has th.docx
1.  Write an equation in standard form of the parabola that has th.docx
 
Chapter 2
Chapter 2Chapter 2
Chapter 2
 
Distance vector routing algorithm
Distance vector routing algorithmDistance vector routing algorithm
Distance vector routing algorithm
 
20130108 ast signature based boolean matching in the presence of don’t cares_...
20130108 ast signature based boolean matching in the presence of don’t cares_...20130108 ast signature based boolean matching in the presence of don’t cares_...
20130108 ast signature based boolean matching in the presence of don’t cares_...
 
number-theory
number-theorynumber-theory
number-theory
 
07122555 0937185627
07122555 093718562707122555 0937185627
07122555 0937185627
 

More from zukun

My lyn tutorial 2009
My lyn tutorial 2009My lyn tutorial 2009
My lyn tutorial 2009zukun
 
ETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCVETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCVzukun
 
ETHZ CV2012: Information
ETHZ CV2012: InformationETHZ CV2012: Information
ETHZ CV2012: Informationzukun
 
Siwei lyu: natural image statistics
Siwei lyu: natural image statisticsSiwei lyu: natural image statistics
Siwei lyu: natural image statisticszukun
 
Lecture9 camera calibration
Lecture9 camera calibrationLecture9 camera calibration
Lecture9 camera calibrationzukun
 
Brunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer visionBrunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer visionzukun
 
Modern features-part-4-evaluation
Modern features-part-4-evaluationModern features-part-4-evaluation
Modern features-part-4-evaluationzukun
 
Modern features-part-3-software
Modern features-part-3-softwareModern features-part-3-software
Modern features-part-3-softwarezukun
 
Modern features-part-2-descriptors
Modern features-part-2-descriptorsModern features-part-2-descriptors
Modern features-part-2-descriptorszukun
 
Modern features-part-1-detectors
Modern features-part-1-detectorsModern features-part-1-detectors
Modern features-part-1-detectorszukun
 
Modern features-part-0-intro
Modern features-part-0-introModern features-part-0-intro
Modern features-part-0-introzukun
 
Lecture 02 internet video search
Lecture 02 internet video searchLecture 02 internet video search
Lecture 02 internet video searchzukun
 
Lecture 01 internet video search
Lecture 01 internet video searchLecture 01 internet video search
Lecture 01 internet video searchzukun
 
Lecture 03 internet video search
Lecture 03 internet video searchLecture 03 internet video search
Lecture 03 internet video searchzukun
 
Icml2012 tutorial representation_learning
Icml2012 tutorial representation_learningIcml2012 tutorial representation_learning
Icml2012 tutorial representation_learningzukun
 
Advances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer visionAdvances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer visionzukun
 
Gephi tutorial: quick start
Gephi tutorial: quick startGephi tutorial: quick start
Gephi tutorial: quick startzukun
 
EM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysisEM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysiszukun
 
Object recognition with pictorial structures
Object recognition with pictorial structuresObject recognition with pictorial structures
Object recognition with pictorial structureszukun
 
Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities zukun
 

More from zukun (20)

My lyn tutorial 2009
My lyn tutorial 2009My lyn tutorial 2009
My lyn tutorial 2009
 
ETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCVETHZ CV2012: Tutorial openCV
ETHZ CV2012: Tutorial openCV
 
ETHZ CV2012: Information
ETHZ CV2012: InformationETHZ CV2012: Information
ETHZ CV2012: Information
 
Siwei lyu: natural image statistics
Siwei lyu: natural image statisticsSiwei lyu: natural image statistics
Siwei lyu: natural image statistics
 
Lecture9 camera calibration
Lecture9 camera calibrationLecture9 camera calibration
Lecture9 camera calibration
 
Brunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer visionBrunelli 2008: template matching techniques in computer vision
Brunelli 2008: template matching techniques in computer vision
 
Modern features-part-4-evaluation
Modern features-part-4-evaluationModern features-part-4-evaluation
Modern features-part-4-evaluation
 
Modern features-part-3-software
Modern features-part-3-softwareModern features-part-3-software
Modern features-part-3-software
 
Modern features-part-2-descriptors
Modern features-part-2-descriptorsModern features-part-2-descriptors
Modern features-part-2-descriptors
 
Modern features-part-1-detectors
Modern features-part-1-detectorsModern features-part-1-detectors
Modern features-part-1-detectors
 
Modern features-part-0-intro
Modern features-part-0-introModern features-part-0-intro
Modern features-part-0-intro
 
Lecture 02 internet video search
Lecture 02 internet video searchLecture 02 internet video search
Lecture 02 internet video search
 
Lecture 01 internet video search
Lecture 01 internet video searchLecture 01 internet video search
Lecture 01 internet video search
 
Lecture 03 internet video search
Lecture 03 internet video searchLecture 03 internet video search
Lecture 03 internet video search
 
Icml2012 tutorial representation_learning
Icml2012 tutorial representation_learningIcml2012 tutorial representation_learning
Icml2012 tutorial representation_learning
 
Advances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer visionAdvances in discrete energy minimisation for computer vision
Advances in discrete energy minimisation for computer vision
 
Gephi tutorial: quick start
Gephi tutorial: quick startGephi tutorial: quick start
Gephi tutorial: quick start
 
EM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysisEM algorithm and its application in probabilistic latent semantic analysis
EM algorithm and its application in probabilistic latent semantic analysis
 
Object recognition with pictorial structures
Object recognition with pictorial structuresObject recognition with pictorial structures
Object recognition with pictorial structures
 
Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities Iccv2011 learning spatiotemporal graphs of human activities
Iccv2011 learning spatiotemporal graphs of human activities
 

Recently uploaded

How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Science lesson Moon for 4th quarter lesson
Science lesson Moon for 4th quarter lessonScience lesson Moon for 4th quarter lesson
Science lesson Moon for 4th quarter lessonJericReyAuditor
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfakmcokerachita
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptxENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptxAnaBeatriceAblay2
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application ) Sakshi Ghasle
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,Virag Sontakke
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Celine George
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 

Recently uploaded (20)

How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Computed Fields and api Depends in the Odoo 17

ECCV2008: MAP Estimation Algorithms in Computer Vision - Part 1

  • 1. MAP Estimation Algorithms in Computer Vision - Part I. M. Pawan Kumar, University of Oxford; Pushmeet Kohli, Microsoft Research
  • 2.
  • 3. A Vision Application Binary Image Segmentation How ? Cost function Models our knowledge about natural images Optimize cost function to obtain the segmentation
  • 4. A Vision Application Object - white, Background - green/grey Graph G = (V,E) Each vertex corresponds to a pixel Edges define a 4-neighbourhood grid graph Assign a label to each vertex from L = {obj,bkg} Binary Image Segmentation
  • 5. A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Per Vertex Cost Cost of label ‘obj’ low Cost of label ‘bkg’ high Object - white, Background - green/grey Binary Image Segmentation
  • 6. A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Cost of label ‘obj’ high Cost of label ‘bkg’ low Per Vertex Cost UNARY COST Object - white, Background - green/grey Binary Image Segmentation
  • 7. A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Per Edge Cost Cost of same label low Cost of different labels high Object - white, Background - green/grey Binary Image Segmentation
  • 8. A Vision Application Graph G = (V,E) Cost of a labelling f : V → L Cost of same label high Cost of different labels low Per Edge Cost PAIRWISE COST Object - white, Background - green/grey Binary Image Segmentation
  • 9. A Vision Application Graph G = (V,E) Problem: Find the labelling with minimum cost f* Object - white, Background - green/grey Binary Image Segmentation
  • 10. A Vision Application Graph G = (V,E) Problem: Find the labelling with minimum cost f* Binary Image Segmentation
  • 11. Another Vision Application Object Detection using Parts-based Models How ? Once again, by defining a good cost function
  • 12. Another Vision Application H T L1 Each vertex corresponds to a part - ‘Head’, ‘Torso’, ‘Legs’ 1 Edges define a TREE Assign a label to each vertex from L = {positions} Graph G = (V,E) L2 L3 L4 Object Detection using Parts-based Models
  • 13. Another Vision Application 2 Each vertex corresponds to a part - ‘Head’, ‘Torso’, ‘Legs’ Assign a label to each vertex from L = {positions} Graph G = (V,E) Edges define a TREE H T L1 L2 L3 L4 Object Detection using Parts-based Models
  • 14. Another Vision Application 3 Each vertex corresponds to a part - ‘Head’, ‘Torso’, ‘Legs’ Assign a label to each vertex from L = {positions} Graph G = (V,E) Edges define a TREE H T L1 L2 L3 L4 Object Detection using Parts-based Models
  • 15. Another Vision Application Cost of a labelling f : V → L Unary cost : How well does part match image patch? Pairwise cost : Encourages valid configurations Find best labelling f* Graph G = (V,E) 3 H T L1 L2 L3 L4 Object Detection using Parts-based Models
  • 16. Another Vision Application Cost of a labelling f : V → L Unary cost : How well does part match image patch? Pairwise cost : Encourages valid configurations Find best labelling f* Graph G = (V,E) 3 H T L1 L2 L3 L4 Object Detection using Parts-based Models
  • 17. Yet Another Vision Application Stereo Correspondence Disparity Map How ? Minimizing a cost function
  • 18. Yet Another Vision Application Stereo Correspondence Graph G = (V,E) Vertex corresponds to a pixel Edges define grid graph L = {disparities}
  • 19. Yet Another Vision Application Stereo Correspondence Cost of labelling f : Unary cost + Pairwise Cost Find minimum cost f*
  • 20. The General Problem b a e d c f Graph G = ( V, E ) Discrete label set L = {1,2,…,h} Assign a label to each vertex f: V → L 1 1 2 2 2 3 Cost of a labelling Q(f) Unary Cost Pairwise Cost Find f* = arg min Q(f)
  • 21.
  • 22. Energy Function V a V b V c V d Label l 0 Label l 1 D a D b D c D d Random Variables V = {V a , V b , ….} Labels L = {l 0 , l 1 , ….} Data D Labelling f: {a, b, …. } → {0,1, …}
  • 23. Energy Function V a V b V c V d D a D b D c D d Q(f) = ∑ a θ a;f(a) Unary Potential 2 5 4 2 6 3 3 7 Label l 0 Label l 1 Easy to minimize Neighbourhood
  • 24. Energy Function V a V b V c V d D a D b D c D d E : (a,b) ∈ E iff V a and V b are neighbours E = { (a,b) , (b,c) , (c,d) } 2 5 4 2 6 3 3 7 Label l 0 Label l 1
  • 25. Energy Function V a V b V c V d D a D b D c D d + ∑ (a,b) θ ab;f(a)f(b) Pairwise Potential 0 1 1 0 0 2 1 1 4 1 0 3 2 5 4 2 6 3 3 7 Label l 0 Label l 1 Q(f) = ∑ a θ a;f(a)
  • 26. Energy Function V a V b V c V d D a D b D c D d 0 1 1 0 0 2 1 1 4 1 0 3 Parameter 2 5 4 2 6 3 3 7 Label l 0 Label l 1 + ∑ (a,b) θ ab;f(a)f(b) Q(f; θ) = ∑ a θ a;f(a)
  • 27.
  • 28. MAP Estimation V a V b V c V d 2 5 4 2 6 3 3 7 0 1 1 0 0 2 1 1 4 1 0 3 Q(f; θ) = ∑ a θ a;f(a) + ∑ (a,b) θ ab;f(a)f(b) Label l 0 Label l 1
  • 29. MAP Estimation V a V b V c V d 2 5 4 2 6 3 3 7 0 1 1 0 0 2 1 1 4 1 0 3 Q(f; θ) = ∑ a θ a;f(a) + ∑ (a,b) θ ab;f(a)f(b) 2 + 1 + 2 + 1 + 3 + 1 + 3 = 13 Label l 0 Label l 1
  • 30. MAP Estimation V a V b V c V d 2 5 4 2 6 3 3 7 0 1 1 0 0 2 1 1 4 1 0 3 Q(f; θ) = ∑ a θ a;f(a) + ∑ (a,b) θ ab;f(a)f(b) Label l 0 Label l 1
  • 31. MAP Estimation V a V b V c V d 2 5 4 2 6 3 3 7 0 1 1 0 0 2 1 1 4 1 0 3 Q(f; θ) = ∑ a θ a;f(a) + ∑ (a,b) θ ab;f(a)f(b) 5 + 1 + 4 + 0 + 6 + 4 + 7 = 27 Label l 0 Label l 1
  • 32. MAP Estimation V a V b V c V d 2 5 4 2 6 3 3 7 0 1 1 0 0 2 1 1 4 1 0 3 Q(f; θ) = ∑ a θ a;f(a) + ∑ (a,b) θ ab;f(a)f(b) f* = arg min Q(f; θ) q* = min Q(f; θ) = Q(f*; θ) Label l 0 Label l 1
  • 33. MAP Estimation 16 possible labellings f* = {1, 0, 0, 1} q* = 13 20 1 1 1 0 27 0 1 1 0 19 1 0 1 0 22 0 0 1 0 20 1 1 0 0 27 0 1 0 0 15 1 0 0 0 18 0 0 0 0 Q(f; θ) f(d) f(c) f(b) f(a) 16 1 1 1 1 23 0 1 1 1 15 1 0 1 1 18 0 0 1 1 18 1 1 0 1 25 0 1 0 1 13 1 0 0 1 16 0 0 0 1 Q(f; θ) f(d) f(c) f(b) f(a)
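The 16-labelling enumeration above can be reproduced with a short brute-force sketch. The unary and pairwise values below are reconstructed from the slides' tables; the dictionary layout and variable names are ours:

```python
from itertools import product

# Unary potentials theta_{v;label} for the chain a-b-c-d (labels 0 and 1),
# reconstructed from the slides' running example.
unary = {'a': (5, 2), 'b': (2, 4), 'c': (3, 6), 'd': (7, 3)}
# Pairwise potentials theta_{uv;ik} on the edges (a,b), (b,c), (c,d),
# indexed as pairwise[(u, v)][i][k].
pairwise = {
    ('a', 'b'): ((0, 1), (1, 0)),
    ('b', 'c'): ((1, 3), (2, 0)),
    ('c', 'd'): ((0, 1), (4, 1)),
}
nodes = ['a', 'b', 'c', 'd']

def energy(f):
    """Q(f; theta) = sum of unary potentials + sum of pairwise potentials."""
    q = sum(unary[v][f[v]] for v in nodes)
    q += sum(pairwise[(u, v)][f[u]][f[v]] for (u, v) in pairwise)
    return q

# Enumerate all |L|^|V| = 2^4 = 16 labellings and take the minimum.
labellings = [dict(zip(nodes, ls)) for ls in product((0, 1), repeat=4)]
f_star = min(labellings, key=energy)
print([f_star[v] for v in nodes], energy(f_star))  # [1, 0, 0, 1] 13
```

This recovers f* = {1, 0, 0, 1} with q* = 13, matching the table on the slide; the next slides explain why this exhaustive search does not scale.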
  • 34. Computational Complexity Segmentation 2^|V| |V| = number of pixels ≈ 320 * 480 = 153600
  • 35. Computational Complexity |L| = number of pixels ≈ 153600 Detection |L|^|V|
  • 36. Computational Complexity |V| = number of pixels ≈ 153600 Stereo |L|^|V| Can we do better than brute-force? MAP Estimation is NP-hard !!
  • 37. Computational Complexity |V| = number of pixels ≈ 153600 Stereo |L|^|V| Exact algorithms do exist for special cases Good approximate algorithms for general case But first … two important definitions
  • 38.
  • 39. Min-Marginals V a V b V c V d 2 5 4 2 6 3 3 7 0 1 1 0 0 2 1 1 4 1 0 3 Min-marginal q a;i = min Q(f; θ) such that f(a) = i Label l 0 Label l 1 Not a marginal (no summation)
  • 40. Min-Marginals 16 possible labellings q a;0 = 15 20 1 1 1 0 27 0 1 1 0 19 1 0 1 0 22 0 0 1 0 20 1 1 0 0 27 0 1 0 0 15 1 0 0 0 18 0 0 0 0 Q(f; θ) f(d) f(c) f(b) f(a) 16 1 1 1 1 23 0 1 1 1 15 1 0 1 1 18 0 0 1 1 18 1 1 0 1 25 0 1 0 1 13 1 0 0 1 16 0 0 0 1 Q(f; θ) f(d) f(c) f(b) f(a)
  • 41. Min-Marginals 16 possible labellings q a;1 = 13 16 1 1 1 1 23 0 1 1 1 15 1 0 1 1 18 0 0 1 1 18 1 1 0 1 25 0 1 0 1 13 1 0 0 1 16 0 0 0 1 Q(f; θ) f(d) f(c) f(b) f(a) 20 1 1 1 0 27 0 1 1 0 19 1 0 1 0 22 0 0 1 0 20 1 1 0 0 27 0 1 0 0 15 1 0 0 0 18 0 0 0 0 Q(f; θ) f(d) f(c) f(b) f(a)
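The min-marginal definition can be checked directly by restricted enumeration. A minimal sketch on the same running example (potentials reconstructed from the slides' tables, names ours):

```python
from itertools import product

# Running example from the slides: chain a-b-c-d, two labels per variable.
unary = {'a': (5, 2), 'b': (2, 4), 'c': (3, 6), 'd': (7, 3)}
pairwise = {('a', 'b'): ((0, 1), (1, 0)),
            ('b', 'c'): ((1, 3), (2, 0)),
            ('c', 'd'): ((0, 1), (4, 1))}
nodes = ['a', 'b', 'c', 'd']

def energy(f):
    return (sum(unary[v][f[v]] for v in nodes)
            + sum(pairwise[e][f[e[0]]][f[e[1]]] for e in pairwise))

def min_marginal(v, i):
    """q_{v;i} = minimum of Q(f; theta) over labellings f with f(v) = i."""
    return min(energy(dict(zip(nodes, ls)))
               for ls in product((0, 1), repeat=4)
               if dict(zip(nodes, ls))[v] == i)

print(min_marginal('a', 0), min_marginal('a', 1))  # 15 13
```

Note the minimum over the min-marginals of any one variable, here min(15, 13) = 13, is the MAP energy q* itself.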
  • 42.
  • 43. Summary MAP Estimation f* = arg min Q(f; θ) Q(f; θ) = ∑ a θ a;f(a) + ∑ (a,b) θ ab;f(a)f(b) Min-marginals q a;i = min Q(f; θ) s.t. f(a) = i Energy Function
  • 44.
  • 45. Reparameterization V a V b 2 5 4 2 0 1 1 0 2 + 2 + - 2 - 2 Add a constant to all θ a;i Subtract that constant from all θ b;k 6 1 1 5 0 1 10 1 0 7 0 0 Q(f; θ) f(b) f(a)
  • 46. Reparameterization Add a constant to all θ a;i Subtract that constant from all θ b;k Q(f; θ’) = Q(f; θ) V a V b 2 5 4 2 0 0 2 + 2 + - 2 - 2 1 1 6 + 2 - 2 1 1 5 + 2 - 2 0 1 10 + 2 - 2 1 0 7 + 2 - 2 0 0 Q(f; θ) f(b) f(a)
  • 47. Reparameterization V a V b 2 5 4 2 0 1 1 0 - 3 + 3 Add a constant to one θ b;k Subtract that constant from θ ab;ik for all ‘i’ - 3 6 1 1 5 0 1 10 1 0 7 0 0 Q(f; θ) f(b) f(a)
  • 48. Reparameterization V a V b 2 5 4 2 0 1 1 0 - 3 + 3 - 3 Q(f; θ’) = Q(f; θ) Add a constant to one θ b;k Subtract that constant from θ ab;ik for all ‘i’ 6 - 3 + 3 1 1 5 0 1 10 - 3 + 3 1 0 7 0 0 Q(f; θ) f(b) f(a)
  • 49. Reparameterization - 2 - 2 - 2 + 2 + 1 + 1 + 1 - 1 - 4 + 4 - 4 - 4 θ’ a;i = θ a;i + M ba;i θ’ b;k = θ b;k + M ab;k θ’ ab;ik = θ ab;ik - M ab;k - M ba;i Q(f; θ’) = Q(f; θ) V a V b 2 5 4 2 3 1 0 1 2 V a V b 2 5 4 2 3 1 1 0 1 V a V b 2 5 4 2 3 1 2 1 0
  • 50. Reparameterization Q(f; θ’) = Q(f; θ), for all f θ’ is a reparameterization of θ, iff θ’ ≡ θ Kolmogorov, PAMI, 2006 Equivalently θ’ a;i = θ a;i + M ba;i θ’ b;k = θ b;k + M ab;k θ’ ab;ik = θ ab;ik - M ab;k - M ba;i V a V b 2 5 4 2 0 0 2 + 2 + - 2 - 2 1 1
  • 51. Recap MAP Estimation f* = arg min Q(f; θ) Q(f; θ) = ∑ a θ a;f(a) + ∑ (a,b) θ ab;f(a)f(b) Min-marginals q a;i = min Q(f; θ) s.t. f(a) = i Q(f; θ’) = Q(f; θ), for all f θ’ ≡ θ Reparameterization
  • 52.
  • 53.
  • 54. Two Variables V a V b 2 5 2 1 0 V a V b 2 5 4 0 1 Choose the right constant θ’ b;k = q b;k Add a constant to one θ b;k Subtract that constant from θ ab;ik for all ‘i’
  • 55. Two Variables V a V b 2 5 2 1 0 V a V b 2 5 4 0 1 Choose the right constant θ’ b;k = q b;k M ab;0 = min { θ a;0 + θ ab;00 , θ a;1 + θ ab;10 } = min { 5 + 0 , 2 + 1 }
  • 56. Two Variables V a V b 2 5 5 -2 -3 V a V b 2 5 4 0 1 Choose the right constant θ’ b;k = q b;k
  • 57. Two Variables V a V b 2 5 5 -2 -3 V a V b 2 5 4 0 1 Choose the right constant θ’ b;k = q b;k f(a) = 1 θ’ b;0 = q b;0 Potentials along the red path add up to 0
  • 58. Two Variables V a V b 2 5 5 -2 -3 V a V b 2 5 4 0 1 Choose the right constant θ’ b;k = q b;k M ab;1 = min { θ a;0 + θ ab;01 , θ a;1 + θ ab;11 } = min { 5 + 1 , 2 + 0 }
  • 59. Two Variables V a V b 2 5 5 -2 -3 V a V b 2 5 6 -2 -1 Choose the right constant θ’ b;k = q b;k f(a) = 1 θ’ b;0 = q b;0 f(a) = 1 θ’ b;1 = q b;1 Minimum of min-marginals = MAP estimate
  • 60. Two Variables V a V b 2 5 5 -2 -3 V a V b 2 5 6 -2 -1 Choose the right constant θ’ b;k = q b;k f(a) = 1 θ’ b;0 = q b;0 f(a) = 1 θ’ b;1 = q b;1 f*(b) = 0 f*(a) = 1
  • 61. Two Variables V a V b 2 5 5 -2 -3 V a V b 2 5 6 -2 -1 Choose the right constant θ’ b;k = q b;k f(a) = 1 θ’ b;0 = q b;0 f(a) = 1 θ’ b;1 = q b;1 We get all the min-marginals of V b
  • 62. Recap We only need to know two sets of equations General form of Reparameterization θ’ a;i = θ a;i + M ba;i θ’ b;k = θ b;k + M ab;k θ’ ab;ik = θ ab;ik - M ab;k - M ba;i Reparameterization of (a,b) in Belief Propagation M ab;k = min i { θ a;i + θ ab;ik } M ba;i = 0
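The message computation in the recap can be sketched in a few lines on the slides' two-variable example (θ a = (5, 2), θ b = (2, 4), Potts-like pairwise); variable names are ours:

```python
# Two-variable example from the slides: theta_a = (5, 2), theta_b = (2, 4),
# pairwise theta_ab indexed as theta_ab[i][k].
theta_a = (5, 2)
theta_b = (2, 4)
theta_ab = ((0, 1), (1, 0))

# Message from a to b: M_{ab;k} = min_i { theta_{a;i} + theta_{ab;ik} }.
M = [min(theta_a[i] + theta_ab[i][k] for i in range(2)) for k in range(2)]

# Reparameterized unary of b holds its min-marginals:
# theta'_{b;k} = theta_{b;k} + M_{ab;k} = q_{b;k}.
q_b = [theta_b[k] + M[k] for k in range(2)]
print(M, q_b)  # [3, 2] [5, 6]

# The minimum of the min-marginals is the MAP energy.
print(min(q_b))  # 5
```

The values match the slides: M ab;0 = 3 and M ab;1 = 2 turn θ b = (2, 4) into the min-marginals (5, 6), and min(5, 6) = 5 is the MAP energy with f*(a) = 1, f*(b) = 0.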
  • 63. Three Variables V a V b 2 5 2 1 0 V c 4 6 0 1 0 1 3 2 3 Reparameterize the edge (a,b) as before l 0 l 1
  • 64. Three Variables V a V b 2 5 5 -3 V c 6 6 0 1 -2 3 Reparameterize the edge (a,b) as before f(a) = 1 f(a) = 1 -2 -1 2 3 l 0 l 1
  • 65. Three Variables V a V b 2 5 5 -3 V c 6 6 0 1 -2 3 Reparameterize the edge (a,b) as before f(a) = 1 f(a) = 1 Potentials along the red path add up to 0 -2 -1 2 3 l 0 l 1
  • 66. Three Variables V a V b 2 5 5 -3 V c 6 6 0 1 -2 3 Reparameterize the edge (b,c) as before f(a) = 1 f(a) = 1 Potentials along the red path add up to 0 -2 -1 2 3 l 0 l 1
  • 67. Three Variables V a V b 2 5 5 -3 V c 6 12 -6 -5 -2 9 Reparameterize the edge (b,c) as before f(a) = 1 f(a) = 1 Potentials along the red path add up to 0 f(b) = 1 f(b) = 0 -2 -1 -4 -3 l 0 l 1
  • 68. Three Variables V a V b 2 5 5 -3 V c 6 12 -6 -5 -2 9 Reparameterize the edge (b,c) as before f(a) = 1 f(a) = 1 Potentials along the red path add up to 0 f(b) = 1 f(b) = 0 q c;0 q c;1 -2 -1 -4 -3 l 0 l 1
  • 69. Three Variables V a V b 2 5 5 -3 V c 6 12 -6 -5 -2 9 f(a) = 1 f(a) = 1 f(b) = 1 f(b) = 0 q c;0 q c;1 f*(c) = 0 f*(b) = 0 f*(a) = 1 Generalizes to any length chain -2 -1 -4 -3 l 0 l 1
  • 70. Three Variables V a V b 2 5 5 -3 V c 6 12 -6 -5 -2 9 f(a) = 1 f(a) = 1 f(b) = 1 f(b) = 0 q c;0 q c;1 f*(c) = 0 f*(b) = 0 f*(a) = 1 Only Dynamic Programming -2 -1 -4 -3 l 0 l 1
  • 71. Why Dynamic Programming? 3 variables → 2 variables + book-keeping n variables → (n-1) variables + book-keeping Start from left, go to right Reparameterize current edge (a,b) M ab;k = min i { θ a;i + θ ab;ik } θ’ b;k = θ b;k + M ab;k θ’ ab;ik = θ ab;ik - M ab;k Repeat
  • 72. Why Dynamic Programming? Start from left, go to right Reparameterize current edge (a,b) M ab;k = min i { θ a;i + θ ab;ik } θ’ b;k = θ b;k + M ab;k θ’ ab;ik = θ ab;ik - M ab;k Repeat Messages Message Passing Why stop at dynamic programming?
  • 73. Three Variables V a V b 2 5 5 -3 V c 6 12 -6 -5 -2 9 Reparameterize the edge (c,b) as before -2 -1 -4 -3 l 0 l 1
  • 74. Three Variables V a V b 2 5 9 -3 V c 11 12 -11 -9 -2 9 Reparameterize the edge (c,b) as before -2 -1 -9 -7 θ’ b;i = q b;i l 0 l 1
  • 75. Three Variables V a V b 2 5 9 -3 V c 11 12 -11 -9 -2 9 Reparameterize the edge (b,a) as before -2 -1 -9 -7 l 0 l 1
  • 76. Three Variables V a V b 9 11 9 -9 V c 11 12 -11 -9 -9 9 Reparameterize the edge (b,a) as before -9 -7 -9 -7 θ’ a;i = q a;i l 0 l 1
  • 77. Three Variables V a V b 9 11 9 -9 V c 11 12 -11 -9 -9 9 Forward Pass   Backward Pass -9 -7 -9 -7 All min-marginals are computed l 0 l 1
  • 78. Belief Propagation on Chains Start from left, go to right Reparameterize current edge (a,b) M ab;k = min i { θ a;i + θ ab;ik } θ’ b;k = θ b;k + M ab;k θ’ ab;ik = θ ab;ik - M ab;k Repeat till the end of the chain Start from right, go to left Repeat till the end of the chain
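The forward-backward scheme above can be sketched as min-sum message passing on the running 4-variable chain (potentials reconstructed from the slides' tables; message bookkeeping is ours). Adding the incoming messages from both directions to each unary yields all min-marginals:

```python
# Min-sum belief propagation on the chain a-b-c-d with two labels,
# using the potentials of the slides' running example.
unary = [(5, 2), (2, 4), (3, 6), (7, 3)]                      # theta for a..d
pair = [((0, 1), (1, 0)), ((1, 3), (2, 0)), ((0, 1), (4, 1))]  # edges (a,b),(b,c),(c,d)

n, L = len(unary), 2
fwd = [[0] * L for _ in range(n)]   # message arriving at node v from the left
bwd = [[0] * L for _ in range(n)]   # message arriving at node v from the right

# Forward pass: reparameterize edges left to right.
for v in range(1, n):
    for k in range(L):
        fwd[v][k] = min(unary[v-1][i] + fwd[v-1][i] + pair[v-1][i][k]
                        for i in range(L))

# Backward pass: right to left.
for v in range(n - 2, -1, -1):
    for i in range(L):
        bwd[v][i] = min(unary[v+1][k] + bwd[v+1][k] + pair[v][i][k]
                        for k in range(L))

# Min-marginals: q_{v;i} = theta_{v;i} + left message + right message.
q = [[unary[v][i] + fwd[v][i] + bwd[v][i] for i in range(L)] for v in range(n)]
print(q[0])  # [15, 13] -> MAP energy 13, f*(a) = 1
```

The computed min-marginals agree with the brute-force table: q a = (15, 13), q b = (13, 15), q c = (13, 16), q d = (16, 13), so the MAP energy 13 can be read off at any node.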
  • 79.
  • 80.
  • 81. Belief Propagation on Trees V b V a Forward Pass: Leaf → Root All min-marginals are computed Backward Pass: Root → Leaf V c V d V e V g V h
  • 82.
  • 83. Belief Propagation on Cycles V a V b V d V c Where do we start? Arbitrarily θ a;0 θ a;1 θ b;0 θ b;1 θ d;0 θ d;1 θ c;0 θ c;1 Reparameterize (a,b)
  • 84. Belief Propagation on Cycles V a V b V d V c θ a;0 θ a;1 θ’ b;0 θ’ b;1 θ d;0 θ d;1 θ c;0 θ c;1 Potentials along the red path add up to 0
  • 85. Belief Propagation on Cycles V a V b V d V c θ a;0 θ a;1 θ’ b;0 θ’ b;1 θ d;0 θ d;1 θ’ c;0 θ’ c;1 Potentials along the red path add up to 0
  • 86. Belief Propagation on Cycles V a V b V d V c θ a;0 θ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Potentials along the red path add up to 0
  • 87. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Potentials along the red path add up to 0
  • 88. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Potentials along the red path add up to 0 - θ a;0 - θ a;1 θ’ a;0 - θ a;0 = q a;0 θ’ a;1 - θ a;1 = q a;1
  • 89. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Pick minimum min-marginal. Follow red path. - θ a;0 - θ a;1 θ’ a;0 - θ a;0 = q a;0 θ’ a;1 - θ a;1 = q a;1
  • 90. Belief Propagation on Cycles V a V b V d V c θ a;0 θ a;1 θ’ b;0 θ’ b;1 θ d;0 θ d;1 θ c;0 θ c;1 Potentials along the red path add up to 0
  • 91. Belief Propagation on Cycles V a V b V d V c θ a;0 θ a;1 θ’ b;0 θ’ b;1 θ d;0 θ d;1 θ’ c;0 θ’ c;1 Potentials along the red path add up to 0
  • 92. Belief Propagation on Cycles V a V b V d V c θ a;0 θ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Potentials along the red path add up to 0
  • 93. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Potentials along the red path add up to 0 - θ a;0 - θ a;1 θ’ a;1 - θ a;1 = q a;1 θ’ a;0 - θ a;0 ≤ q a;0
  • 94. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Problem Solved - θ a;0 - θ a;1 θ’ a;1 - θ a;1 = q a;1 θ’ a;0 - θ a;0 ≤ q a;0
  • 95. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Problem Not Solved - θ a;0 - θ a;1 θ’ a;1 - θ a;1 = q a;1 θ’ a;0 - θ a;0 ≥ q a;0
  • 96. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’ b;0 θ’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 - θ a;0 - θ a;1 Reparameterize (a,b) again
  • 97. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’’ b;0 θ’’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Reparameterize (a,b) again But doesn’t this overcount some potentials?
  • 98. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’’ b;0 θ’’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Reparameterize (a,b) again Yes. But we will do it anyway
  • 99. Belief Propagation on Cycles V a V b V d V c θ’ a;0 θ’ a;1 θ’’ b;0 θ’’ b;1 θ’ d;0 θ’ d;1 θ’ c;0 θ’ c;1 Keep reparameterizing edges in some order Hope for convergence and a good solution
  • 100.
  • 101.
  • 102. Computational Issues of BP Complexity per iteration O(|E||L|²) Special Pairwise Potentials θ ab;ik = w ab d(|i-k|) O(|E||L|) Felzenszwalb & Huttenlocher, 2004 i - k d Potts i - k d Truncated Linear i - k d Truncated Quadratic
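For the Potts case of the distance-based potentials above, the speed-up is easy to see: with θ ab;ik = w·[i ≠ k], the message M ab;k = min_i {h_i + w·[i ≠ k]} collapses to min(h_k, min_i h_i + w). A minimal sketch (test values are made up) comparing this O(|L|) form against the naive O(|L|²) computation:

```python
import random

def message_naive(h, w):
    """Naive O(|L|^2) min-sum message for a Potts pairwise w * [i != k].
    h_i stands for theta_{a;i} plus any incoming messages."""
    L = len(h)
    return [min(h[i] + (w if i != k else 0) for i in range(L))
            for k in range(L)]

def message_potts(h, w):
    """O(|L|) version: either stay on label k, or pay w once and take
    the globally cheapest label."""
    m = min(h) + w
    return [min(hk, m) for hk in h]

random.seed(0)
h = [random.uniform(0, 10) for _ in range(50)]
assert message_naive(h, 2.5) == message_potts(h, 2.5)
```

Felzenszwalb & Huttenlocher's distance-transform constructions give analogous linear-time messages for the truncated linear and truncated quadratic potentials; the Potts case above is just the simplest instance.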
  • 103. Computational Issues of BP Memory requirements O(|E||L|) Half of original BP Kolmogorov, 2006 Some approximations exist But memory still remains an issue Yu, Lin, Super and Tan, 2007 Lasserre, Kannan and Winn, 2007
  • 104. Computational Issues of BP Order of reparameterization Randomly Residual Belief Propagation In some fixed order The one that results in maximum change Elidan et al. , 2006
  • 105. Theoretical Properties of BP Exact for Trees Pearl, 1988 What about any general random field? Run BP. Assume it converges.
  • 106. Theoretical Properties of BP Exact for Trees Pearl, 1988 What about any general random field? Choose variables in a tree. Change their labels. Value of energy does not decrease
  • 107. Theoretical Properties of BP Exact for Trees Pearl, 1988 What about any general random field? Choose variables in a cycle. Change their labels. Value of energy does not decrease
  • 108. Theoretical Properties of BP Exact for Trees Pearl, 1988 What about any general random field? For cycles, if BP converges then exact MAP Weiss and Freeman, 2001
  • 109. Results Object Detection Felzenszwalb and Huttenlocher, 2004 Labels - Poses of parts Unary Potentials: Fraction of foreground pixels Pairwise Potentials: Favour Valid Configurations H T A1 A2 L1 L2
  • 110. Results Object Detection Felzenszwalb and Huttenlocher, 2004
  • 111. Results Binary Segmentation Szeliski et al. , 2008 Labels - {foreground, background} Unary Potentials: -log(likelihood) using learnt fg/bg models Pairwise Potentials: 0, if same labels 1 -  exp(|D a - D b |), if different labels
  • 112. Results Binary Segmentation Labels - {foreground, background} Unary Potentials: -log(likelihood) using learnt fg/bg models Szeliski et al. , 2008 Pairwise Potentials: 0, if same labels 1 -  exp(|D a - D b |), if different labels Belief Propagation
  • 113. Results Binary Segmentation Labels - {foreground, background} Unary Potentials: -log(likelihood) using learnt fg/bg models Szeliski et al. , 2008 Global optimum Pairwise Potentials: 0, if same labels 1 -  exp(|D a - D b |), if different labels
  • 114. Results Szeliski et al. , 2008 Labels - {disparities} Unary Potentials: Similarity of pixel colours Pairwise Potentials: 0, if same labels 1 -  exp(|D a - D b |), if different labels Stereo Correspondence
  • 115. Results Szeliski et al. , 2008 Labels - {disparities} Unary Potentials: Similarity of pixel colours Pairwise Potentials: 0, if same labels 1 -  exp(|D a - D b |), if different labels Belief Propagation Stereo Correspondence
  • 116. Results Szeliski et al. , 2008 Labels - {disparities} Unary Potentials: Similarity of pixel colours Global optimum Pairwise Potentials: 0, if same labels 1 -  exp(|D a - D b |), if different labels Stereo Correspondence
  • 117. Summary of BP Exact for chains Exact for trees Approximate MAP for general cases Not even convergence guaranteed So can we do something better?
  • 118.
  • 119.
  • 120.
  • 121.
  • 122.
  • 123. Feasible Region Generally NP-hard to optimize
  • 124.
  • 125.
  • 126.
  • 128. Feasible Region Optimal solution lies on a vertex (obj func linear)
  • 129.
  • 130. Integer Programming Formulation V a V b Label l 0 Label l 1 2 5 4 2 0 1 1 0 2 Unary Potentials θ a;0 = 5 θ a;1 = 2 θ b;0 = 2 θ b;1 = 4 Labelling f(a) = 1 f(b) = 0 y a;0 = 0 y a;1 = 1 y b;0 = 1 y b;1 = 0 Any f(.) has equivalent boolean variables y a;i
  • 131. Integer Programming Formulation V a V b 2 5 4 2 0 1 1 0 2 Unary Potentials θ a;0 = 5 θ a;1 = 2 θ b;0 = 2 θ b;1 = 4 Labelling f(a) = 1 f(b) = 0 y a;0 = 0 y a;1 = 1 y b;0 = 1 y b;1 = 0 Find the optimal variables y a;i Label l 0 Label l 1
  • 132. Integer Programming Formulation V a V b 2 5 4 2 0 1 1 0 2 Unary Potentials θ a;0 = 5 θ a;1 = 2 θ b;0 = 2 θ b;1 = 4 Sum of Unary Potentials ∑ a ∑ i θ a;i y a;i y a;i ∈ {0,1}, for all V a , l i ∑ i y a;i = 1, for all V a Label l 0 Label l 1
  • 133. Integer Programming Formulation V a V b 2 5 4 2 0 1 1 0 2 Pairwise Potentials θ ab;00 = 0 θ ab;10 = 1 θ ab;01 = 1 θ ab;11 = 0 Sum of Pairwise Potentials ∑ (a,b) ∑ ik θ ab;ik y a;i y b;k y a;i ∈ {0,1} ∑ i y a;i = 1 Label l 0 Label l 1
  • 134. Integer Programming Formulation V a V b 2 5 4 2 0 1 1 0 2 Pairwise Potentials θ ab;00 = 0 θ ab;10 = 1 θ ab;01 = 1 θ ab;11 = 0 Sum of Pairwise Potentials ∑ (a,b) ∑ ik θ ab;ik y ab;ik y a;i ∈ {0,1} ∑ i y a;i = 1 y ab;ik = y a;i y b;k Label l 0 Label l 1
  • 135. Integer Programming Formulation min ∑ a ∑ i θ a;i y a;i + ∑ (a,b) ∑ ik θ ab;ik y ab;ik y a;i ∈ {0,1} ∑ i y a;i = 1 y ab;ik = y a;i y b;k
  • 136. Integer Programming Formulation min θ T y y a;i ∈ {0,1} ∑ i y a;i = 1 y ab;ik = y a;i y b;k θ = [ … θ a;i … ; … θ ab;ik … ] y = [ … y a;i … ; … y ab;ik … ]
  • 137. One variable, two labels y a;0 ∈ {0,1} y a;1 ∈ {0,1} y a;0 + y a;1 = 1 y = [ y a;0 y a;1 ] θ = [ θ a;0 θ a;1 ] y a;0 y a;1
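The vectorized objective θᵀy on the next slide is just this sum written as an inner product. A minimal sketch checking that on the running 4-variable example (potentials reconstructed from the earlier tables; the ordering of the blocks in θ and y is our choice):

```python
# Build the indicator vector y for a labelling f and check that the inner
# product theta^T y reproduces Q(f; theta).
unary = {'a': (5, 2), 'b': (2, 4), 'c': (3, 6), 'd': (7, 3)}
pairwise = {('a', 'b'): ((0, 1), (1, 0)),
            ('b', 'c'): ((1, 3), (2, 0)),
            ('c', 'd'): ((0, 1), (4, 1))}
nodes = ['a', 'b', 'c', 'd']

f = {'a': 1, 'b': 0, 'c': 0, 'd': 1}   # the MAP labelling f* from earlier

theta, y = [], []
for v in nodes:                        # unary block: y_{a;i} = [f(a) = i]
    for i in (0, 1):
        theta.append(unary[v][i])
        y.append(1 if f[v] == i else 0)
for (u, v), t in pairwise.items():     # pairwise block: y_{ab;ik} = y_{a;i} y_{b;k}
    for i in (0, 1):
        for k in (0, 1):
            theta.append(t[i][k])
            y.append(1 if (f[u], f[v]) == (i, k) else 0)

print(sum(t * yi for t, yi in zip(theta, y)))  # 13
```

Exactly one y a;i is 1 per variable and one y ab;ik per edge, so θᵀy picks out the same terms as Q(f; θ) and evaluates to 13 here.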
  • 138.
  • 139. In General Marginal Polytope
  • 140.
  • 141. Integer Programming Formulation min θ T y y a;i ∈ {0,1} ∑ i y a;i = 1 y ab;ik = y a;i y b;k θ = [ … θ a;i … ; … θ ab;ik … ] y = [ … y a;i … ; … y ab;ik … ]
  • 142. Integer Programming Formulation min θ T y y a;i ∈ {0,1} ∑ i y a;i = 1 y ab;ik = y a;i y b;k Solve to obtain MAP labelling y*
  • 143. Integer Programming Formulation min θ T y y a;i ∈ {0,1} ∑ i y a;i = 1 y ab;ik = y a;i y b;k But we can’t solve it in general
  • 144.
  • 145. Linear Programming Relaxation min θ T y y a;i ∈ {0,1} ∑ i y a;i = 1 y ab;ik = y a;i y b;k Two reasons why we can’t solve this
  • 146. Linear Programming Relaxation min θ T y y a;i ∈ [0,1] ∑ i y a;i = 1 y ab;ik = y a;i y b;k One reason why we can’t solve this
  • 147. Linear Programming Relaxation min θ T y y a;i ∈ [0,1] ∑ i y a;i = 1 ∑ k y ab;ik = ∑ k y a;i y b;k One reason why we can’t solve this
  • 148. Linear Programming Relaxation min θ T y y a;i ∈ [0,1] ∑ i y a;i = 1 ∑ k y ab;ik = y a;i ∑ k y b;k , with ∑ k y b;k = 1 One reason why we can’t solve this
  • 149. Linear Programming Relaxation min θ T y y a;i ∈ [0,1] ∑ i y a;i = 1 ∑ k y ab;ik = y a;i One reason why we can’t solve this
  • 150. Linear Programming Relaxation min θ T y y a;i ∈ [0,1] ∑ i y a;i = 1 ∑ k y ab;ik = y a;i No reason why we can’t solve this * * memory requirements, time complexity
  • 151. One variable, two labels y a;0 ∈ {0,1} y a;1 ∈ {0,1} y a;0 + y a;1 = 1 y = [ y a;0 y a;1 ] θ = [ θ a;0 θ a;1 ] y a;0 y a;1
  • 152. One variable, two labels y a;0 ∈ [0,1] y a;1 ∈ [0,1] y a;0 + y a;1 = 1 y = [ y a;0 y a;1 ] θ = [ θ a;0 θ a;1 ] y a;0 y a;1
  • 153.
  • 154.
  • 155.
  • 156.
  • 157. In General Marginal Polytope Local Polytope
  • 158.
  • 159. Linear Programming Relaxation min θ T y y a;i ∈ [0,1] ∑ i y a;i = 1 ∑ k y ab;ik = y a;i No reason why we can’t solve this
  • 160. Linear Programming Relaxation Extensively studied Optimization Schlesinger, 1976 Koster, van Hoesel and Kolen, 1998 Theory Chekuri et al, 2001 Archer et al, 2004 Machine Learning Wainwright et al., 2001
  • 161.
  • 162.
  • 163.
  • 164. Dual of the LP Relaxation Wainwright et al., 2001 V a V b V c V d V e V f V g V h V i θ min θ T y y a;i ∈ [0,1] ∑ i y a;i = 1 ∑ k y ab;ik = y a;i
  • 165. Dual of the LP Relaxation Wainwright et al., 2001 V a V b V c V d V e V f V g V h V i θ V a V b V c V d V e V f V g V h V i V a V b V c V d V e V f V g V h V i ρ 1 ρ 2 ρ 3 ρ 4 ρ 5 ρ 6 θ 1 θ 2 θ 3 θ 4 θ 5 θ 6 ∑ i ρ i θ i = θ ρ i ≥ 0
  • 166. Dual of the LP Relaxation Wainwright et al., 2001 ρ 1 ρ 2 ρ 3 ρ 4 ρ 5 ρ 6 q*(θ 1 ) ∑ i ρ i θ i = θ q*(θ 2 ) q*(θ 3 ) q*(θ 4 ) q*(θ 5 ) q*(θ 6 ) max ∑ i ρ i q*(θ i ) Dual of LP V a V b V c V d V e V f V g V h V i V a V b V c V d V e V f V g V h V i V a V b V c V d V e V f V g V h V i ρ i ≥ 0
  • 167. Dual of the LP Relaxation Wainwright et al., 2001 ρ 1 ρ 2 ρ 3 ρ 4 ρ 5 ρ 6 q*(θ 1 ) ∑ i ρ i θ i ≡ θ q*(θ 2 ) q*(θ 3 ) q*(θ 4 ) q*(θ 5 ) q*(θ 6 ) Dual of LP V a V b V c V d V e V f V g V h V i V a V b V c V d V e V f V g V h V i V a V b V c V d V e V f V g V h V i ρ i ≥ 0 max ∑ i ρ i q*(θ i )
  • 168. Dual of the LP Relaxation Wainwright et al., 2001 ∑ i ρ i θ i ≡ θ max ∑ i ρ i q*(θ i ) I can easily compute q*(θ i ) I can easily maintain reparam constraint So can I easily solve the dual?
  • 169.
  • 170. TRW Message Passing Kolmogorov, 2006 V a V b V c V d V e V f V g V h V i V a V b V c V d V e V f V g V h V i ρ 1 ρ 2 ρ 3 ρ 4 ρ 5 ρ 6 θ 1 θ 2 θ 3 θ 4 θ 5 θ 6 ∑ i ρ i θ i ≡ θ ∑ i ρ i q*(θ i ) Pick a variable V a
  • 171. TRW Message Passing Kolmogorov, 2006 ∑ i ρ i θ i ≡ θ ∑ i ρ i q*(θ i ) V c V b V a θ 1 c;0 θ 1 c;1 θ 1 b;0 θ 1 b;1 θ 1 a;0 θ 1 a;1 V a V d V g θ 4 a;0 θ 4 a;1 θ 4 d;0 θ 4 d;1 θ 4 g;0 θ 4 g;1
  • 172. TRW Message Passing Kolmogorov, 2006 ρ 1 θ 1 + ρ 4 θ 4 + θ rest ≡ θ ρ 1 q*(θ 1 ) + ρ 4 q*(θ 4 ) + K V c V b V a V a V d V g Reparameterize to obtain min-marginals of V a θ 1 c;0 θ 1 c;1 θ 1 b;0 θ 1 b;1 θ 1 a;0 θ 1 a;1 θ 4 a;0 θ 4 a;1 θ 4 d;0 θ 4 d;1 θ 4 g;0 θ 4 g;1
  • 173. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’ 1 + ρ 4 θ’ 4 + θ rest V c V b V a θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’ 1 a;0 θ’ 1 a;1 V a V d V g θ’ 4 a;0 θ’ 4 a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1 One pass of Belief Propagation ρ 1 q*(θ’ 1 ) + ρ 4 q*(θ’ 4 ) + K
  • 174. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’ 1 + ρ 4 θ’ 4 + θ rest ≡ θ V c V b V a V a V d V g Remain the same ρ 1 q*(θ’ 1 ) + ρ 4 q*(θ’ 4 ) + K θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’ 1 a;0 θ’ 1 a;1 θ’ 4 a;0 θ’ 4 a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1
  • 175. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’ 1 + ρ 4 θ’ 4 + θ rest ≡ θ ρ 1 min{θ’ 1 a;0 , θ’ 1 a;1 } + ρ 4 min{θ’ 4 a;0 , θ’ 4 a;1 } + K V c V b V a V a V d V g θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’ 1 a;0 θ’ 1 a;1 θ’ 4 a;0 θ’ 4 a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1
  • 176. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’ 1 + ρ 4 θ’ 4 + θ rest ≡ θ V c V b V a V a V d V g Compute weighted average of min-marginals of V a θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’ 1 a;0 θ’ 1 a;1 θ’ 4 a;0 θ’ 4 a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1 ρ 1 min{θ’ 1 a;0 , θ’ 1 a;1 } + ρ 4 min{θ’ 4 a;0 , θ’ 4 a;1 } + K
  • 177. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’ 1 + ρ 4 θ’ 4 + θ rest ≡ θ V c V b V a V a V d V g θ’’ a;0 = (ρ 1 θ’ 1 a;0 + ρ 4 θ’ 4 a;0 ) / (ρ 1 + ρ 4 ) θ’’ a;1 = (ρ 1 θ’ 1 a;1 + ρ 4 θ’ 4 a;1 ) / (ρ 1 + ρ 4 ) θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’ 1 a;0 θ’ 1 a;1 θ’ 4 a;0 θ’ 4 a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1 ρ 1 min{θ’ 1 a;0 , θ’ 1 a;1 } + ρ 4 min{θ’ 4 a;0 , θ’ 4 a;1 } + K
  • 178. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’’ 1 + ρ 4 θ’’ 4 + θ rest V c V b V a V a V d V g θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’’ a;0 θ’’ a;1 θ’’ a;0 θ’’ a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1 ρ 1 min{θ’ 1 a;0 , θ’ 1 a;1 } + ρ 4 min{θ’ 4 a;0 , θ’ 4 a;1 } + K θ’’ a;0 = (ρ 1 θ’ 1 a;0 + ρ 4 θ’ 4 a;0 ) / (ρ 1 + ρ 4 ) θ’’ a;1 = (ρ 1 θ’ 1 a;1 + ρ 4 θ’ 4 a;1 ) / (ρ 1 + ρ 4 )
  • 179. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’’ 1 + ρ 4 θ’’ 4 + θ rest ≡ θ V c V b V a V a V d V g θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’’ a;0 θ’’ a;1 θ’’ a;0 θ’’ a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1 ρ 1 min{θ’ 1 a;0 , θ’ 1 a;1 } + ρ 4 min{θ’ 4 a;0 , θ’ 4 a;1 } + K θ’’ a;0 = (ρ 1 θ’ 1 a;0 + ρ 4 θ’ 4 a;0 ) / (ρ 1 + ρ 4 ) θ’’ a;1 = (ρ 1 θ’ 1 a;1 + ρ 4 θ’ 4 a;1 ) / (ρ 1 + ρ 4 )
  • 180. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’’ 1 + ρ 4 θ’’ 4 + θ rest ≡ θ V c V b V a V a V d V g ρ 1 min{θ’’ a;0 , θ’’ a;1 } + ρ 4 min{θ’’ a;0 , θ’’ a;1 } + K θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’’ a;0 θ’’ a;1 θ’’ a;0 θ’’ a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1
  • 181. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’’ 1 + ρ 4 θ’’ 4 + θ rest ≡ θ V c V b V a V a V d V g (ρ 1 + ρ 4 ) min{θ’’ a;0 , θ’’ a;1 } + K θ’ 1 c;0 θ’ 1 c;1 θ’ 1 b;0 θ’ 1 b;1 θ’’ a;0 θ’’ a;1 θ’’ a;0 θ’’ a;1 θ’ 4 d;0 θ’ 4 d;1 θ’ 4 g;0 θ’ 4 g;1
  • 182. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’’ 1 + ρ 4 θ’’ 4 + θ rest ≡ θ V c V b V a V a V d V g (ρ 1 + ρ 4 ) min{θ’’ a;0 , θ’’ a;1 } + K min {p 1 +p 2 , q 1 +q 2 } ≥ min {p 1 , q 1 } + min {p 2 , q 2 }
  • 183. TRW Message Passing Kolmogorov, 2006 ρ 1 θ’’ 1 + ρ 4 θ’’ 4 + θ rest ≡ θ V c V b V a V a V d V g Objective function increases or remains constant (ρ 1 + ρ 4 ) min{θ’’ a;0 , θ’’ a;1 } + K
  • 184. TRW Message Passing Initialize θ i . Take care of reparam constraint Choose random variable V a Compute min-marginals of V a for all trees Node-average the min-marginals REPEAT Kolmogorov, 2006 Can also do edge-averaging
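The loop above can be sketched for a single averaging step. The example below uses two 2-node trees sharing V a with ρ 1 = ρ 4 = 1 and made-up potentials (everything here is our illustration, not the slides' Example 1); it checks the slides' claim that node-averaging cannot decrease the dual, the sum of tree minima:

```python
# One TRW-style node-averaging step on two 2-node trees that share V_a.

def reparam_tree(theta_a, theta_b, theta_ab):
    """Forward-backward pass on a 2-node tree V_a - V_b.  Returns the
    min-marginals q_a at V_a plus the reparameterized theta'_b and
    theta''_ab, which satisfy min_k(theta''_ab[i][k] + theta'_b[k]) = 0."""
    M = [min(theta_a[i] + theta_ab[i][k] for i in range(2)) for k in range(2)]
    tb = [theta_b[k] + M[k] for k in range(2)]
    tab = [[theta_ab[i][k] - M[k] for k in range(2)] for i in range(2)]
    Mb = [min(tb[k] + tab[i][k] for k in range(2)) for i in range(2)]
    qa = [theta_a[i] + Mb[i] for i in range(2)]
    tab = [[tab[i][k] - Mb[i] for k in range(2)] for i in range(2)]
    return qa, tb, tab

tree1 = ((5, 2), (2, 4), ((0, 1), (1, 0)))   # V_a - V_b
tree2 = ((1, 4), (3, 1), ((0, 2), (2, 0)))   # V_a - V_d

q1, tb1, tab1 = reparam_tree(*tree1)
q2, tb2, tab2 = reparam_tree(*tree2)
dual_before = min(q1) + min(q2)              # sum of tree minima

# Node-average the min-marginals of V_a across the two trees (rho = 1, 1).
avg = [(q1[i] + q2[i]) / 2 for i in range(2)]

# New dual: each tree's minimum energy with the averaged V_a potential.
dual_after = sum(
    min(avg[i] + min(tab[i][k] + tb[k] for k in range(2)) for i in range(2))
    for tb, tab in ((tb1, tab1), (tb2, tab2)))

print(dual_before, dual_after)  # 9 10.0
```

Here the dual rises from 9 to 10, the min{p1+p2, q1+q2} ≥ min{p1, q1} + min{p2, q2} inequality of the earlier slide in action; in general the step increases the bound or leaves it unchanged.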
  • 185. Example 1. [figure: 3-cycle over Va, Vb, Vc decomposed into the trees Va–Vb, Vb–Vc, Vc–Va with weights ρ₁ = ρ₂ = ρ₃ = 1; per-tree unary and pairwise potentials over the labels l0, l1; per-tree bounds 5, 6, 7] Pick variable Va. Reparameterize.
  • 186. Example 1. [figure: reparameterized potentials; per-tree bounds 5, 6, 7] Average the min-marginals of Va.
  • 187. Example 1. [figure: averaged potentials; per-tree bounds 7, 6, 7] Pick variable Vb. Reparameterize.
  • 188. Example 1. [figure: reparameterized potentials; per-tree bounds 7, 6, 7] Average the min-marginals of Vb.
  • 189. Example 1. [figure: averaged potentials; per-tree bounds 6.5, 6.5, 7] Value of the dual does not increase.
  • 190. Example 1. [figure: same potentials] Maybe it will increase for Vc? NO.
  • 191. Example 1. [figure: same potentials] Strong tree agreement: f1(a) = 0, f1(b) = 0; f2(b) = 0, f2(c) = 0; f3(c) = 0, f3(a) = 0. Exact MAP estimate.
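Strong tree agreement, as in Example 1 above, can be phrased as a simple consistency check: each tree reports an optimal labelling of its own variables, and we ask whether the shared variables receive the same label everywhere. This sketch uses the Example 1 labellings; the function name is mine.

```python
# optimal labelling reported by each tree (tree index -> {variable: label})
tree_optima = {
    1: {"a": 0, "b": 0},
    2: {"b": 0, "c": 0},
    3: {"c": 0, "a": 0},
}

def strongly_agree(optima):
    # merge the per-tree labellings; any conflict on a shared variable
    # means there is no strong tree agreement
    merged = {}
    for labelling in optima.values():
        for var, lab in labelling.items():
            if merged.setdefault(var, lab) != lab:
                return None  # disagreement on a shared variable
    return merged  # a consistent global labelling

print(strongly_agree(tree_optima))  # {'a': 0, 'b': 0, 'c': 0}
```

When the merge succeeds, the merged labelling is the exact MAP estimate, as the slide states for Example 1.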
  • 192. Example 2. [figure: 3-cycle over Va, Vb, Vc with tree weights ρ₁ = ρ₂ = ρ₃ = 1; per-tree unary and pairwise potentials over the labels l0, l1; per-tree bounds 4, 0, 4] Pick variable Va. Reparameterize.
  • 193. Example 2. [figure: reparameterized potentials; per-tree bounds 4, 0, 4] Average the min-marginals of Va.
  • 194. Example 2. [figure: averaged potentials; per-tree bounds 4, 0, 4] Value of the dual does not increase.
  • 195. Example 2. [figure: same potentials] Maybe it will increase for Vb or Vc? NO.
  • 196. Example 2. [figure: same potentials] Weak tree agreement: f1(a) = 1, f1(b) = 1; f2(b) = 1, f2(c) = 0 or f2(b) = 0, f2(c) = 1; f3(c) = 1, f3(a) = 1. Not an exact MAP estimate.
  • 197. Example 2. [figure: same potentials] Weak tree agreement: the convergence point of TRW.
  • 198. Obtaining the Labelling. TRW only solves the dual; how do we obtain primal solutions? [figure: 3×3 grid Va, Vb, Vc, Vd, Ve, Vf, Vg, Vh, Vi] Combine the tree potentials: θ' = Σi ρi θi. Fix the label of Va.
  • 199. Obtaining the Labelling. TRW only solves the dual; how do we obtain primal solutions? [figure: 3×3 grid] θ' = Σi ρi θi. Fix the label of Vb, and continue in some fixed order (Meltzer et al., 2006).
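The label-fixing procedure above can be sketched as greedy rounding over the combined potentials θ': visit the variables in a fixed order and give each the label minimising its unary cost plus pairwise costs to already-fixed neighbours. The grid, order, and potential values below are illustrative assumptions, not taken from the slides.

```python
def potts(i, j):
    # illustrative pairwise cost: 0 for equal labels, 1 otherwise
    return 0.0 if i == j else 1.0

order = ["a", "b", "c", "d"]                               # fixed order
edges = [("a", "b"), ("c", "d"), ("a", "c"), ("b", "d")]   # a 2x2 grid
unary = {"a": [0.0, 1.0], "b": [2.0, 0.0], "c": [1.0, 1.0], "d": [3.0, 0.0]}

def round_primal(order, edges, unary):
    f = {}  # the primal labelling being built up
    for v in order:
        def cost(i):
            # unary cost plus pairwise costs to already-fixed neighbours
            c = unary[v][i]
            for p, q in edges:
                if p == v and q in f:
                    c += potts(i, f[q])
                elif q == v and p in f:
                    c += potts(i, f[p])
            return c
        f[v] = min(range(2), key=cost)
    return f

print(round_primal(order, edges, unary))  # {'a': 0, 'b': 1, 'c': 0, 'd': 1}
```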
  • 203. Results: Binary Segmentation (Szeliski et al., 2008). Labels: {foreground, background}. Unary potentials: −log(likelihood) using learnt fg/bg models. Pairwise potentials: 0 if the labels are the same; 1 − λ exp(−|Da − Db|) if they differ.
  • 204. Results: Binary Segmentation (Szeliski et al., 2008). Same model as above; the TRW result is shown.
  • 205. Results: Binary Segmentation (Szeliski et al., 2008). Same model as above; the Belief Propagation result is shown.
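The segmentation potentials above can be sketched as follows. Note the caveats: the λ weight and the exact contrast form are my reconstruction of a garbled formula on the slide, and the foreground probability `p_fg` stands in for a learnt colour model, so treat all of these as assumptions.

```python
import math

LAM = 1.0  # assumed contrast weight

def unary(p_fg, label):
    # -log likelihood of the pixel under the learnt fg/bg model;
    # p_fg is the (hypothetical) foreground probability of the pixel
    p = p_fg if label == "fg" else 1.0 - p_fg
    return -math.log(max(p, 1e-9))

def pairwise(label_a, label_b, d_a, d_b):
    # 0 for equal labels; contrast-dependent cost otherwise,
    # using the reconstructed form 1 - lam * exp(-|Da - Db|)
    if label_a == label_b:
        return 0.0
    return 1.0 - LAM * math.exp(-abs(d_a - d_b))

print(pairwise("fg", "bg", 0.2, 0.9))
```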
  • 206. Results: Stereo Correspondence (Szeliski et al., 2008). Labels: {disparities}. Unary potentials: similarity of pixel colours. Pairwise potentials: 0 if the labels are the same; 1 − λ exp(−|Da − Db|) if they differ.
  • 207. Results: Stereo Correspondence (Szeliski et al., 2008). Same model as above; the TRW result is shown.
  • 208. Results: Stereo Correspondence (Szeliski et al., 2008). Same model as above; the Belief Propagation result is shown.
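The stereo unary above (similarity of pixel colours) can be sketched as an absolute difference between a left pixel and the right pixel it maps to under disparity d; the scan line values and the truncation constant are illustrative assumptions.

```python
# one grey-level scan line per image (illustrative values)
left  = [[10, 10, 80, 80]]
right = [[10, 80, 80, 10]]

def unary_stereo(x, y, d, trunc=20):
    # cost of assigning disparity d to left pixel (x, y): colour
    # difference with right pixel (x - d, y), truncated for robustness
    if x - d < 0:
        return trunc  # disparity maps outside the image
    return min(abs(left[y][x] - right[y][x - d]), trunc)

print(unary_stereo(2, 0, 1))  # left 80 matches right 80 -> 0
```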
  • 209. Results: Non-submodular problems (Kolmogorov, 2006). [figure: energy plots for a 30×30 grid, K 50] BP outperforms TRW-S.
  • 213. Questions on Part I? Code + standard data: http://vision.middlebury.edu/MRF
