Active Contours

Gabriel Peyré
CEREMADE, Université Paris-Dauphine

www.numerical-tours.com
Overview

 • Parametric Edge-based Active Contours
 • Implicit Edge-based Active Contours
 • Region-based Active Contours
Parametric Active Contours

Local minimum:   argmin_γ E(γ) = L(γ) + λ R(γ)
                 (L: data fidelity, R: regularization)

Boundary conditions:
 – Open curve:   γ(0) = x0 and γ(1) = x1.
 – Closed curve: γ(0) = γ(1).

Snakes energy (depends on the parameterization):

    L(γ) = ∫₀¹ W(γ(t)) ||γ'(t)|| dt,
    R(γ) = ∫₀¹ ||γ'(t)|| + µ ||γ''(t)|| dt

[Figures: image f, weight W(x), evolving curve γ]
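To make the energy concrete, here is a minimal discrete version of the snakes energy for a closed polygonal curve; the function name, the uniform weight, and the parameter values (λ = 1, µ = 0.1) are illustrative choices, not code from the course.

```python
import numpy as np

def snake_energy(gamma, W, lam=1.0, mu=0.1):
    """Discrete E(gamma) = L(gamma) + lam * R(gamma) for a closed polygon.

    L is the weighted length sum_i W(gamma(i)) * ||gamma(i+1) - gamma(i)||;
    R adds the length plus a finite-difference second-derivative penalty."""
    d1 = np.roll(gamma, -1, axis=0) - gamma                    # first differences
    d2 = np.roll(gamma, -1, axis=0) - 2 * gamma + np.roll(gamma, 1, axis=0)
    seg = np.linalg.norm(d1, axis=1)                           # segment lengths
    L = np.sum(W(gamma) * seg)                                 # data fidelity
    R = np.sum(seg) + mu * np.sum(np.linalg.norm(d2, axis=1))  # regularization
    return L + lam * R

# Usage: unit circle sampled at 100 points, uniform weight W = 1.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
E = snake_energy(circle, lambda x: np.ones(len(x)))
```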
Geodesic Active Contours

Geodesic active contours (intrinsic):

    E(γ) = L(γ) = ∫₀¹ W(γ(t)) ||γ'(t)|| dt

To avoid a vanishing weight, replace W by W + ε.

 – (local) minimum of the weighted length L.
 – local geodesic (not a minimal path).

[Figures: weight W(x), evolving curve γ]
Curve Evolution

Family of curves {γ_s(t)}_{s>0} minimizing E(γ_s).

Do not confuse:
 – t: abscissa along the curve.
 – s: artificial "time" of the evolution.

Local minimum of min_γ E(γ).

Minimization flow:   dγ_s/ds = −∇E(γ_s)

Warning: the set of curves is not a vector space.

Inner product at γ:   ⟨µ, ν⟩_γ = ∫₀¹ ⟨µ(t), ν(t)⟩ ||γ'(t)|| dt

 → Riemannian manifold of infinite dimension.

Numerical implementation:   γ^(k+1) = γ^(k) − τ_k ∇E(γ^(k))
Intrinsic Curve Evolutions

E(γ) only depends on the geometric curve {γ(t) : t ∈ [0,1]}.

Intrinsic energy E → evolution along the normal:

    dγ_s/ds (t) = β(γ_s(t), n_s(t), κ_s(t)) n_s(t)
                  (speed β, normal n_s)

Normal:      n_s(t) = γ_s'(t)^⊥ / ||γ_s'(t)||

Curvature:   κ_s(t) = ⟨n_s(t), γ_s''(t)⟩ / ||γ_s'(t)||²
Mean Curvature Motion

No data fidelity:   E(γ) = ∫₀¹ ||γ'(t)|| dt

 → curve-shortening flow:   dγ_s/ds = −∇E(γ_s)

First variation:

    E(γ + ν) = E(γ) + ∫₀¹ ⟨γ'(t)/||γ'(t)||, ν'(t)⟩ dt + o(||ν||)

Gradient:

    ∇E(γ) : t ⟼ − (1/||γ'(t)||) d/dt ( γ'(t)/||γ'(t)|| )

Mean-curvature motion:

    dγ_s/ds (t) = κ_s(t) n_s(t)

Speed:   β(x, n, κ) = κ
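A quick numerical check of the flow dγ_s/ds = κ_s n_s: one explicit Euler step on a sampled unit circle shrinks its radius by roughly the step size. The centered-difference scheme and the step size are our own choices, not course code.

```python
import numpy as np

def curvature_normal(gamma):
    """Return kappa(t) * n(t) for a closed polygon, by centered differences."""
    dt = 2 * np.pi / len(gamma)
    g1 = (np.roll(gamma, -1, axis=0) - np.roll(gamma, 1, axis=0)) / (2 * dt)
    g2 = (np.roll(gamma, -1, axis=0) - 2 * gamma + np.roll(gamma, 1, axis=0)) / dt**2
    speed = np.linalg.norm(g1, axis=1, keepdims=True)
    n = np.stack([-g1[:, 1], g1[:, 0]], axis=1) / speed   # rotated unit tangent
    kappa = np.sum(n * g2, axis=1, keepdims=True) / speed**2
    return kappa * n

# One Euler step of d gamma / ds = kappa * n on the unit circle:
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
gamma = np.stack([np.cos(t), np.sin(t)], axis=1)
gamma_next = gamma + 0.01 * curvature_normal(gamma)   # radius shrinks to ~0.99
```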
Discretization

Discretization:   γ = {γ(i)}_{i=0}^{N−1} ⊂ R², with γ(N) = γ(0).

    ⟨µ, ν⟩_γ = Σ_i ⟨µ(i), ν(i)⟩ ||γ(i) − γ(i+1)||

Discrete energy:   E(γ) = Σ_i ||γ(i) − γ(i+1)||

Gradient descent flow:   γ_{k+1} = γ_k − τ_k ∇E(γ_k)

Discrete gradient (with the metric factor applied pointwise):

    ∇E(γ) = (1/||Δ₊γ||) (Δ₋ ∘ N ∘ Δ₊)(γ)

    (Δ₊γ)(i) = γ(i+1) − γ(i)
    (Δ₋γ)(i) = γ(i−1) − γ(i)
    (N(v))(i) = v(i)/||v(i)||
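The discrete gradient fits in a few lines. As a sanity check, gradient descent on the discrete length shortens a circle (the discrete curve-shortening flow); step size and iteration count are arbitrary choices.

```python
import numpy as np

def grad_length(gamma):
    """Discrete gradient of E(gamma) = sum_i ||gamma(i) - gamma(i+1)||,
    divided pointwise by the edge length (the factor 1/||Delta gamma||)."""
    fwd = np.roll(gamma, -1, axis=0) - gamma      # (Delta+ gamma)(i)
    bwd = np.roll(gamma, 1, axis=0) - gamma       # (Delta- gamma)(i)
    unit = lambda v: v / np.linalg.norm(v, axis=1, keepdims=True)
    return -(unit(fwd) + unit(bwd)) / np.linalg.norm(fwd, axis=1, keepdims=True)

# Gradient descent flow gamma_{k+1} = gamma_k - tau_k * grad E(gamma_k):
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
gamma = np.stack([np.cos(t), np.sin(t)], axis=1)
for _ in range(10):
    gamma = gamma - 0.01 * grad_length(gamma)
perimeter = np.sum(np.linalg.norm(np.roll(gamma, -1, axis=0) - gamma, axis=1))
```

The initial perimeter is 2π ≈ 6.28; after ten descent steps the circle has strictly shrunk.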
Geodesic Active Contours Motion

Weighted length:

    E(γ) = L(γ) = ∫₀¹ W(γ(t)) ||γ'(t)|| dt

Evolution:

    dγ_s/ds (t) = β(γ_s(t), n_s(t), κ_s(t)) n_s(t)

    β(x, n, κ) = W(x) κ − ⟨∇W(x), n⟩

 – attraction toward areas where W is small.
 – finite differences discretization.

[Figures: weight W(x), evolution from γ_0 to γ_s]
Open vs. Closed Curves

[Figures: closed-curve evolution γ_0 → γ_s on the weight W(x); open-curve evolution with endpoints x0, x1 on the image f(x)]
Global Minimum with Fast Marching

Geodesic distance map:

    U_{x0}(x1) = min_{γ(0)=x0, γ(1)=x1} L(γ)

Global minimum:   U_{x0}(x1) = L(γ*)

Fast O(N log N) algorithm:
 – Compute U_{x0} with Fast Marching.
 – Solve the ODE (gradient descent on the distance map):

       γ'(t) = −∇U_{x0}(γ(t)),   γ(0) = x1

[Figures: image f, metric W(x), distance map U_{x0}(x), geodesic curve γ(t)]
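A sketch of the back-propagation step, assuming the distance map is already computed. Here U is the exact Euclidean distance (the W = 1 case, where the geodesic is a straight line), so Fast Marching itself is not needed; the grid size, start points, and step length are arbitrary.

```python
import numpy as np

n = 101
Y, X = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
x0 = np.array([20.0, 20.0])
U = np.sqrt((Y - x0[0])**2 + (X - x0[1])**2)   # distance map U_{x0} (W = 1)
gy, gx = np.gradient(U)                        # discrete nabla U

pos = np.array([80.0, 85.0])                   # start the curve at x1
path = [pos.copy()]
for _ in range(500):                           # Euler steps of gamma' = -nabla U
    i, j = int(round(pos[0])), int(round(pos[1]))
    g = np.array([gy[i, j], gx[i, j]])
    if np.linalg.norm(g) < 1e-8:               # reached the source x0
        break
    pos = pos - 0.5 * g / np.linalg.norm(g)    # unit-speed descent step
    path.append(pos.copy())
```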
Overview

 • Parametric Edge-based Active Contours
 • Implicit Edge-based Active Contours
 • Region-based Active Contours
Level Sets

Level-set curve representation:

    {γ_s(t) : t ∈ [0,1]} = {x ∈ R² : φ_s(x) = 0},
    with φ_s(x) < 0 inside the curve and φ_s(x) > 0 outside.

Example: circle of radius s:   φ_s(x) = ||x − x0|| − s
Example: square of radius s:   φ_s(x) = ||x − x0||_∞ − s

Union of domains:          φ_s = min(φ_s¹, φ_s²)
Intersection of domains:   φ_s = max(φ_s¹, φ_s²)

Popular choice: (signed) distance to the curve

    φ_s(x) = ± min_t ||γ_s(t) − x||

 → infinite number of mappings γ_s ⟼ φ_s.
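The set operations translate directly into pointwise min/max of the level-set functions; a small self-contained example (the grid, centers, and radii are arbitrary choices):

```python
import numpy as np

n = 100
Y, X = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n), indexing='ij')

def circle_phi(cx, cy, r):
    """phi(x) = ||x - x0|| - r: negative inside the circle, zero on it."""
    return np.sqrt((X - cx)**2 + (Y - cy)**2) - r

phi1 = circle_phi(-0.5, 0.0, 0.8)
phi2 = circle_phi(0.7, 0.0, 0.6)
phi_union = np.minimum(phi1, phi2)   # union of the two domains
phi_inter = np.maximum(phi1, phi2)   # intersection of the two domains
inside_union = phi_union < 0         # boolean mask of the union region
```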
Level Sets Evolution

Dictionary parametric → implicit:

    Position:    x = γ_s(t)

    Normal:      n_s(t) = − ∇φ_s(x) / ||∇φ_s(x)||

    Curvature:   κ_s(x) = div( ∇φ_s / ||∇φ_s|| )(x)

Evolution PDE:

    dγ_s/ds (t) = β(γ_s(t), n_s(t), κ_s(t)) n_s(t)

becomes

    dφ_s/ds (x) = ||∇φ_s(x)|| β( x, −∇φ_s(x)/||∇φ_s(x)||, div(∇φ_s/||∇φ_s||)(x) ).

 → All level sets evolve together.
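For the mean-curvature speed β = κ, the PDE reads dφ/ds = ||∇φ|| div(∇φ/||∇φ||). One explicit Euler step on a grid, with centered differences and a small ε guarding divisions; the scheme and step size are our own choices (a practical solver would also respect a CFL condition and re-initialize φ).

```python
import numpy as np

n, eps = 100, 1e-8
Y, X = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n), indexing='ij')
h = 4.0 / (n - 1)
phi = np.sqrt(X**2 + Y**2) - 1.0            # signed distance to the unit circle

gy, gx = np.gradient(phi, h)
norm = np.sqrt(gx**2 + gy**2) + eps         # ||nabla phi||
ky, _ = np.gradient(gy / norm, h)
_, kx = np.gradient(gx / norm, h)
kappa = kx + ky                             # div(nabla phi / ||nabla phi||)
phi_next = phi + 0.05 * norm * kappa        # one Euler step: the circle shrinks
```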
Proof

Evolution PDE:   dγ_s/ds (t) = β(γ_s(t), n_s(t), κ_s(t)) n_s(t)   (⋆)

Definition of level sets:   ∀ t, φ_s(γ_s(t)) = 0

Differentiating with respect to s:

    ⟨∇φ_s(x), ∂γ_s/∂s (t)⟩ + ∂φ_s/∂s (x) = 0   for x = γ_s(t)   (⋆⋆)

(⋆) + (⋆⋆):

    ∂φ_s/∂s (x) = −β(x, n_s(t), κ_s(t)) ⟨∇φ_s(x), n_s(t)⟩,
    with n_s(t) = −∇φ_s(x)/||∇φ_s(x)||.

For all x on the curve,

    dφ_s/ds (x) = ||∇φ_s(x)|| β( x, −∇φ_s(x)/||∇φ_s(x)||, div(∇φ_s/||∇φ_s||)(x) ).
Implicit Geodesic Active Contours

Evolution PDE:

    dφ_s/ds = ||∇φ_s|| div( W ∇φ_s / ||∇φ_s|| )

Comparison with explicit active contours:
 – : 2D instead of 1D equation.
 + : allows topology change.

Re-initialization: reset φ_s to the signed distance

    φ̃_s(x) = ± min_t ||γ_s(t) − x||

 → Eikonal equation:   ||∇φ̃_s|| = 1   with   φ̃_s(γ_s(t)) = 0.
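Re-initialization can be approximated with a Euclidean distance transform instead of solving the eikonal equation directly. This sketch assumes SciPy is available and uses `scipy.ndimage.distance_transform_edt` as a stand-in for a Fast Marching redistancing step; the grid and the distorted φ are toy choices.

```python
import numpy as np
from scipy import ndimage

n = 100
Y, X = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n), indexing='ij')
h = 4.0 / (n - 1)
phi = 3.0 * (np.sqrt(X**2 + Y**2) - 1.0)    # distorted: ||nabla phi|| = 3

inside = phi < 0
# pixel distances to the opposite region, scaled to coordinate units by h
d_out = ndimage.distance_transform_edt(~inside) * h   # 0 on the inside
d_in = ndimage.distance_transform_edt(inside) * h     # 0 on the outside
phi_reinit = d_out - d_in                   # signed distance: ||nabla phi|| ~ 1
```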
Multiple Fluids Dynamics

See Ron Fedkiw's homepage: http://physbam.stanford.edu/~fedkiw/

[Figures: level-set simulations of multiple gases and of a fluid/air interface]
Overview

 • Parametric Edge-based Active Contours
 • Implicit Edge-based Active Contours
 • Region-based Active Contours
Energy Depending on Region

Optimal segmentation [0,1]² = Ω ∪ Ω^c:

    min_Ω  L1(Ω) + L2(Ω^c) + λ R(Ω)
           (L1, L2: data fidelity; R(Ω) = |∂Ω|: regularization)

Chan-Vese binary model:   L1(Ω) = ∫_Ω |I(x) − c1|² dx

 → more general models.

Level-set implementation:   Ω = {x : φ(x) > 0}

Smoothed Heaviside:   H_ε(x) = 1/2 + (1/π) atan(x/ε)

    L1(Ω) ≈ L1^ε(φ) = ∫ H_ε(φ(x)) ||I(x) − c1||² dx

    R(Ω) ≈ R^ε(φ) = ∫ ||∇(H_ε ∘ φ)(x)|| dx
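The smoothed Heaviside and the two fidelity terms, evaluated on a synthetic binary image; the value of ε, the image, and the level-set function are toy choices for illustration.

```python
import numpy as np

def H_eps(x, eps=0.1):
    """Smoothed Heaviside H_eps(x) = 1/2 + (1/pi) * atan(x / eps)."""
    return 0.5 + np.arctan(x / eps) / np.pi

n = 64
Y, X = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing='ij')
I = (np.sqrt(X**2 + Y**2) < 0.5).astype(float)   # white disk on black
phi = 0.5 - np.sqrt(X**2 + Y**2)                 # > 0 exactly on the disk

c1, c2 = 1.0, 0.0
L1 = np.sum(H_eps(phi) * (I - c1)**2)            # fidelity inside Omega
L2 = np.sum(H_eps(-phi) * (I - c2)**2)           # fidelity outside Omega
```

With φ aligned to the object, swapping the two region weights would make both terms larger.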
Descent Schemes

For a given c = (c1, c2) ∈ R²:

    min_φ E_c(φ) = ∫ H_ε(φ(x)) ||I(x) − c1||²
                 + H_ε(−φ(x)) ||I(x) − c2||² + λ ||∇(H_ε ∘ φ)(x)|| dx

Descent with respect to φ:

    φ^(k+1) = φ^(k) − τ_k ∇E_c(φ^(k))

    ∇E_c(φ) = H_ε'(φ(x)) G(x)

    G(x) = ||I(x) − c1||² − ||I(x) − c2||² − λ div( ∇φ / ||∇φ|| )(x)

Limit ε → 0:   ∇E_c(φ) → δ_{φ=0}(x) ||∇φ(x)|| G(x)

 → Numerically, use ∇̃E_c(φ) = ||∇φ(x)|| G(x).
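One descent step with the "numerical" gradient ||∇φ|| G; the image, initialization, λ, and step size are toy choices, and the curvature term uses plain centered differences.

```python
import numpy as np

n, lam, eps = 64, 0.1, 1e-8
Y, X = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing='ij')
h = 2.0 / (n - 1)
I = (np.sqrt((X - 0.1)**2 + Y**2) < 0.4).astype(float)   # object to segment
phi = 0.3 - np.sqrt(X**2 + Y**2)                         # initial region, > 0 inside
c1, c2 = 1.0, 0.0

gy, gx = np.gradient(phi, h)
norm = np.sqrt(gx**2 + gy**2) + eps
ky, _ = np.gradient(gy / norm, h)
_, kx = np.gradient(gx / norm, h)
G = (I - c1)**2 - (I - c2)**2 - lam * (kx + ky)          # G(x) from the slide
phi_next = phi - 0.1 * norm * G                          # one descent step
```

On this example the step pushes φ up over most of the object, growing the region toward it.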
Update of c

Joint minimization:   min_{φ, c1, c2} E_c(φ)

Update of φ:   φ^(k+1) = φ^(k) − τ_k ∇E_{c^(k)}(φ^(k))

Update of (c1, c2):

    (c1^(k+1), c2^(k+1)) = argmin_{c1, c2} E_c(φ^(k))

    c1^(k+1) = c(φ^(k)),   c2^(k+1) = c(−φ^(k)),

    where c(φ) = ∫ I(x) H(φ(x)) dx / ∫ H(φ(x)) dx.
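The c-update is a closed form: each c_i is the average of I over its region. With the sharp Heaviside, and a φ whose positive region matches the object exactly, the averages are exact; the helper name is ours.

```python
import numpy as np

def region_mean(I, weight):
    """c(phi)-style average: integral of I * weight over integral of weight."""
    return np.sum(I * weight) / np.sum(weight)

n = 64
Y, X = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing='ij')
I = (np.sqrt(X**2 + Y**2) < 0.5).astype(float)
phi = 0.5 - np.sqrt(X**2 + Y**2)            # > 0 exactly where I = 1

H = (phi > 0).astype(float)                 # sharp Heaviside H(phi)
c1 = region_mean(I, H)                      # average of I inside  -> 1.0
c2 = region_mean(I, 1.0 - H)                # average of I outside -> 0.0
```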
Example of Evolution

Use a de-localized initialization φ^(0).

[Figures: evolution from the initialization φ^(0) to the final segmentation]
Conclusion

 – Curve evolution.
 – Energy minimization.

Parametric vs. level-set representation.

Dictionary to translate:
 – curve properties.
 – energy gradients.

Edge-based vs. region-based energies.

A T(1)-type theorem for entangled multilinear Calderon-Zygmund operators
 
Dsp U Lec09 Iir Filter Design
Dsp U   Lec09 Iir Filter DesignDsp U   Lec09 Iir Filter Design
Dsp U Lec09 Iir Filter Design
 
The multilayer perceptron
The multilayer perceptronThe multilayer perceptron
The multilayer perceptron
 
SSA slides
SSA slidesSSA slides
SSA slides
 
03 truncation errors
03 truncation errors03 truncation errors
03 truncation errors
 
Markov Tutorial CDC Shanghai 2009
Markov Tutorial CDC Shanghai 2009Markov Tutorial CDC Shanghai 2009
Markov Tutorial CDC Shanghai 2009
 
Logics of the laplace transform
Logics of the laplace transformLogics of the laplace transform
Logics of the laplace transform
 
Dsp3
Dsp3Dsp3
Dsp3
 
01 regras diferenciacao
01   regras diferenciacao01   regras diferenciacao
01 regras diferenciacao
 
Regras diferenciacao
Regras diferenciacaoRegras diferenciacao
Regras diferenciacao
 
Gibbs flow transport for Bayesian inference
Gibbs flow transport for Bayesian inferenceGibbs flow transport for Bayesian inference
Gibbs flow transport for Bayesian inference
 

More from Gabriel Peyré

Mesh Processing Course : Introduction
Mesh Processing Course : IntroductionMesh Processing Course : Introduction
Mesh Processing Course : IntroductionGabriel Peyré
 
Mesh Processing Course : Geodesic Sampling
Mesh Processing Course : Geodesic SamplingMesh Processing Course : Geodesic Sampling
Mesh Processing Course : Geodesic SamplingGabriel Peyré
 
Mesh Processing Course : Differential Calculus
Mesh Processing Course : Differential CalculusMesh Processing Course : Differential Calculus
Mesh Processing Course : Differential CalculusGabriel Peyré
 
Signal Processing Course : Theory for Sparse Recovery
Signal Processing Course : Theory for Sparse RecoverySignal Processing Course : Theory for Sparse Recovery
Signal Processing Course : Theory for Sparse RecoveryGabriel Peyré
 
Signal Processing Course : Presentation of the Course
Signal Processing Course : Presentation of the CourseSignal Processing Course : Presentation of the Course
Signal Processing Course : Presentation of the CourseGabriel Peyré
 
Signal Processing Course : Orthogonal Bases
Signal Processing Course : Orthogonal BasesSignal Processing Course : Orthogonal Bases
Signal Processing Course : Orthogonal BasesGabriel Peyré
 
Signal Processing Course : Fourier
Signal Processing Course : FourierSignal Processing Course : Fourier
Signal Processing Course : FourierGabriel Peyré
 
Signal Processing Course : Denoising
Signal Processing Course : DenoisingSignal Processing Course : Denoising
Signal Processing Course : DenoisingGabriel Peyré
 
Signal Processing Course : Compressed Sensing
Signal Processing Course : Compressed SensingSignal Processing Course : Compressed Sensing
Signal Processing Course : Compressed SensingGabriel Peyré
 
Signal Processing Course : Approximation
Signal Processing Course : ApproximationSignal Processing Course : Approximation
Signal Processing Course : ApproximationGabriel Peyré
 
Signal Processing Course : Wavelets
Signal Processing Course : WaveletsSignal Processing Course : Wavelets
Signal Processing Course : WaveletsGabriel Peyré
 
Sparsity and Compressed Sensing
Sparsity and Compressed SensingSparsity and Compressed Sensing
Sparsity and Compressed SensingGabriel Peyré
 
Optimal Transport in Imaging Sciences
Optimal Transport in Imaging SciencesOptimal Transport in Imaging Sciences
Optimal Transport in Imaging SciencesGabriel Peyré
 
An Introduction to Optimal Transport
An Introduction to Optimal TransportAn Introduction to Optimal Transport
An Introduction to Optimal TransportGabriel Peyré
 
A Review of Proximal Methods, with a New One
A Review of Proximal Methods, with a New OneA Review of Proximal Methods, with a New One
A Review of Proximal Methods, with a New OneGabriel Peyré
 

More from Gabriel Peyré (15)

Mesh Processing Course : Introduction
Mesh Processing Course : IntroductionMesh Processing Course : Introduction
Mesh Processing Course : Introduction
 
Mesh Processing Course : Geodesic Sampling
Mesh Processing Course : Geodesic SamplingMesh Processing Course : Geodesic Sampling
Mesh Processing Course : Geodesic Sampling
 
Mesh Processing Course : Differential Calculus
Mesh Processing Course : Differential CalculusMesh Processing Course : Differential Calculus
Mesh Processing Course : Differential Calculus
 
Signal Processing Course : Theory for Sparse Recovery
Signal Processing Course : Theory for Sparse RecoverySignal Processing Course : Theory for Sparse Recovery
Signal Processing Course : Theory for Sparse Recovery
 
Signal Processing Course : Presentation of the Course
Signal Processing Course : Presentation of the CourseSignal Processing Course : Presentation of the Course
Signal Processing Course : Presentation of the Course
 
Signal Processing Course : Orthogonal Bases
Signal Processing Course : Orthogonal BasesSignal Processing Course : Orthogonal Bases
Signal Processing Course : Orthogonal Bases
 
Signal Processing Course : Fourier
Signal Processing Course : FourierSignal Processing Course : Fourier
Signal Processing Course : Fourier
 
Signal Processing Course : Denoising
Signal Processing Course : DenoisingSignal Processing Course : Denoising
Signal Processing Course : Denoising
 
Signal Processing Course : Compressed Sensing
Signal Processing Course : Compressed SensingSignal Processing Course : Compressed Sensing
Signal Processing Course : Compressed Sensing
 
Signal Processing Course : Approximation
Signal Processing Course : ApproximationSignal Processing Course : Approximation
Signal Processing Course : Approximation
 
Signal Processing Course : Wavelets
Signal Processing Course : WaveletsSignal Processing Course : Wavelets
Signal Processing Course : Wavelets
 
Sparsity and Compressed Sensing
Sparsity and Compressed SensingSparsity and Compressed Sensing
Sparsity and Compressed Sensing
 
Optimal Transport in Imaging Sciences
Optimal Transport in Imaging SciencesOptimal Transport in Imaging Sciences
Optimal Transport in Imaging Sciences
 
An Introduction to Optimal Transport
An Introduction to Optimal TransportAn Introduction to Optimal Transport
An Introduction to Optimal Transport
 
A Review of Proximal Methods, with a New One
A Review of Proximal Methods, with a New OneA Review of Proximal Methods, with a New One
A Review of Proximal Methods, with a New One
 

Mesh Processing Course : Active Contours

  • 1. Active Contours www.numerical-tours.com Gabriel Peyré CEREMADE, Université Paris-Dauphine
  • 2. Overview • Parametric Edge-based Active Contours • Implicit Edge-based Active Contours • Region-based Active Contours
  • 3. Parametric Active Contours. Local minimum: argmin_γ E(γ) = L(γ) + λ R(γ), where L is the data fidelity and R the regularization. Boundary conditions: open curve, γ(0) = x0 and γ(1) = x1; closed curve, γ(0) = γ(1).
  • 4. Parametric Active Contours. Snakes energy (depends on the parameterization): L(γ) = ∫₀¹ W(γ(t)) ||γ'(t)|| dt, R(γ) = ∫₀¹ ||γ'(t)|| + µ ||γ''(t)|| dt. (Figures: image f, weight W(x), curve γ.)
  • 5. Geodesic Active Contours. Intrinsic energy: E(γ) = L(γ) = ∫₀¹ W(γ(t)) ||γ'(t)|| dt; replace W by W + ε to keep the weight positive. A (local) minimum of the weighted length L is a local geodesic (not a minimal path). (Figures: weight W(x), curve γ.)
  • 6. Curve Evolution. Family of curves {γ_s(t)}_{s>0} minimizing E(γ_s). Do not confuse: t is the abscissa along the curve, s is the artificial "time" of the evolution.
  • 7. Curve Evolution. Local minimum of min_γ E(γ). Minimization flow: dγ_s/ds = −∇E(γ_s).
  • 8. Curve Evolution. Warning: the set of curves is not a vector space. Inner product at γ: ⟨µ, ν⟩_γ = ∫₀¹ ⟨µ(t), ν(t)⟩ ||γ'(t)|| dt; this makes the space of curves a Riemannian manifold of infinite dimension.
  • 9. Curve Evolution. Numerical implementation: γ^(k+1) = γ^(k) − τ_k ∇E(γ^(k)).
  • 10. Intrinsic Curve Evolutions. E(γ) only depends on the set {γ(t) : t ∈ [0, 1]}. For an intrinsic energy E, the evolution is along the normal: dγ_s(t)/ds = β(γ_s(t), n_s(t), κ_s(t)) n_s(t), with speed β and normal n. Normal: n_s(t) = γ_s'(t)^⊥ / ||γ_s'(t)||. Curvature: κ_s(t) = ⟨n_s(t), γ_s''(t)⟩ / ||γ_s'(t)||².
  • 11. Mean Curvature Motion. No data fidelity: E(γ) = ∫₀¹ ||γ'(t)|| dt, and dγ_s/ds = −∇E(γ_s) is the curve-shortening flow. First variation: E(γ + ν) = E(γ) + ∫₀¹ ⟨γ'(t)/||γ'(t)||, ν'(t)⟩ dt + O(||ν||²), so that ∇E(γ) : t ↦ −(1/||γ'(t)||) d/dt (γ'(t)/||γ'(t)||). This gives the mean-curvature motion dγ_s(t)/ds = κ_s(t) n_s(t), i.e. the speed β(x, n, κ) = κ.
  • 12. Discretization. Discrete curve: γ = {γ(i)}_{i=0}^{N−1} ⊂ ℝ², with γ(N) = γ(0). Inner product: ⟨µ, ν⟩_γ = Σ_i ⟨µ(i), ν(i)⟩ ||γ(i) − γ(i+1)||.
  • 13. Discretization. Discrete energy: E(γ) = Σ_i ||γ(i) − γ(i+1)||.
  • 14. Discretization. Gradient descent flow: γ_{k+1} = γ_k − τ_k ∇E(γ_k).
  • 15. Discretization. Discrete gradient: ∇E(γ) = Δ̃(N(Δγ)), where (Δγ)(i) = γ(i+1) − γ(i) is the forward difference, (Δ̃γ)(i) = γ(i−1) − γ(i) the backward difference, and N(v)(i) = v(i)/||v(i)|| the normalization.
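The discrete gradient and descent flow of slides 12-15 can be sketched in a few lines of NumPy. This is an illustrative sketch, not the course's own code; the function names and the step size are arbitrary choices.

```python
import numpy as np

def grad_length(gamma):
    """Discrete gradient of E(gamma) = sum_i ||gamma(i) - gamma(i+1)||
    for a closed polygon: the composition of forward difference,
    normalization, and backward difference described on slide 15."""
    fwd = np.roll(gamma, -1, axis=0) - gamma                  # (Delta gamma)(i)
    unit = fwd / np.linalg.norm(fwd, axis=1, keepdims=True)   # N(Delta gamma)(i)
    return np.roll(unit, 1, axis=0) - unit                    # backward difference

def descent_step(gamma, tau):
    """One step of the flow gamma_{k+1} = gamma_k - tau * grad E(gamma_k)."""
    return gamma - tau * grad_length(gamma)

# Sanity check: a circle shrinks under this curve-shortening flow.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
gamma = np.stack([np.cos(t), np.sin(t)], axis=1)
for _ in range(50):
    gamma = descent_step(gamma, 0.01)
mean_radius = np.linalg.norm(gamma, axis=1).mean()  # strictly below the initial 1.0
```

The circle is the classical test case: under the curve-shortening flow every point moves inward along the normal with speed equal to the curvature.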
  • 16. Geodesic Active Contours Motion. Weighted length: E(γ) = L(γ) = ∫₀¹ W(γ(t)) ||γ'(t)|| dt. Evolution: dγ_s(t)/ds = β(γ_s(t), n_s(t), κ_s(t)) n_s(t) with β(x, n, κ) = W(x) κ − ⟨∇W(x), n⟩: attraction toward the areas where W is small. Finite differences discretization. (Figure: weight W(x).)
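The slides do not spell out how the weight W is built from the image; a common choice (an assumption here, not taken from the slides) is a decreasing function of the image gradient magnitude, so that W is small on edges:

```python
import numpy as np

def edge_weight(f, eta=0.1):
    """A typical edge-stopping metric W = 1 / (eta + |grad f|):
    W is small where the image f varies quickly, so geodesics and
    active contours are attracted to edges. eta plays the role of
    the epsilon in W + epsilon, keeping W bounded and positive."""
    gy, gx = np.gradient(f.astype(float))
    return 1.0 / (eta + np.sqrt(gx ** 2 + gy ** 2))

# Synthetic step image: the weight drops along the vertical edge.
f = np.zeros((10, 10))
f[:, 5:] = 1.0
W = edge_weight(f)
```

Other decreasing functions of |∇f| (e.g. 1/(1 + |∇f|²)) are equally common; the only requirement is that W be small on the features the curve should lock onto.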
  • 17. Open vs. Closed Curves. (Figures: evolutions γ_s from the initialization γ_0, for an open curve joining x0 to x1 over the weight W(x), and for a closed curve over the image f(x).)
  • 18. Global Minimum with Fast Marching. Geodesic distance map: U_{x0}(x1) = min_{γ(0)=x0, γ(1)=x1} L(γ). Global minimum: U_{x0}(x1) = L(γ*). (Figures: image f, metric W(x), distance U_{x0}(x), geodesic curve γ*(t).)
  • 19. Global Minimum with Fast Marching. Fast O(N log(N)) algorithm: compute U_{x0} with the Fast Marching, then solve the ODE dγ(t)/dt = −∇U_{x0}(γ(t)) with γ(0) = x1.
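The two steps of slide 19 can be sketched with Dijkstra on a 4-neighbor grid standing in for the Fast Marching (a cruder but simpler approximation of U_{x0}) and discrete steepest descent standing in for the ODE integration; all names are illustrative:

```python
import heapq
import numpy as np

def distance_map(W, x0):
    """Approximate U_{x0} by Dijkstra on the 4-neighbor grid graph,
    with edge costs averaging the metric W at both endpoints."""
    n, m = W.shape
    U = np.full((n, m), np.inf)
    U[x0] = 0.0
    heap = [(0.0, x0)]
    while heap:
        d, (i, j) = heapq.heappop(heap)
        if d > U[i, j]:
            continue
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < m:
                nd = d + 0.5 * (W[i, j] + W[a, b])
                if nd < U[a, b]:
                    U[a, b] = nd
                    heapq.heappush(heap, (nd, (a, b)))
    return U

def extract_geodesic(U, x1):
    """Discrete steepest descent on U from x1 back to x0,
    mimicking the ODE gamma'(t) = -grad U_{x0}(gamma(t))."""
    path, (i, j) = [x1], x1
    while U[i, j] > 0:
        nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= i + di < U.shape[0] and 0 <= j + dj < U.shape[1]]
        i, j = min(nbrs, key=lambda p: U[p])
        path.append((i, j))
    return path

W = np.ones((20, 20))          # uniform metric: geodesics are shortest paths
U = distance_map(W, (0, 0))
path = extract_geodesic(U, (10, 10))
```

Unlike Dijkstra, the real Fast Marching solves the continuous Eikonal equation ||∇U|| = W and does not suffer from the metrication error of the grid graph, but the compute-then-backtrack structure is the same.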
  • 20. Overview • Parametric Edge-based Active Contours • Implicit Edge-based Active Contours • Region-based Active Contours
  • 21. Level Sets. Level-set curve representation: {γ_s(t) : t ∈ [0, 1]} = {x ∈ ℝ² : φ_s(x) = 0}. Example: a circle of radius r, φ_s(x) = ||x − x0|| − r; a square of "radius" r, φ_s(x) = ||x − x0||_∞ − r. (Figure: regions φ_s(x) < 0 and φ_s(x) > 0.)
  • 22. Level Sets. Union of domains: φ_s = min(φ_s¹, φ_s²). Intersection of domains: φ_s = max(φ_s¹, φ_s²).
  • 23. Level Sets. Popular choice: the (signed) distance to the curve, φ_s(x) = ± min_t ||γ_s(t) − x||. There is an infinite number of mappings γ_s ↦ φ_s.
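The min/max calculus of slide 22 is easy to check numerically; a small sketch, where the grid resolution and the circle helper are illustrative choices:

```python
import numpy as np

# Two signed-distance functions of circles on a grid; their zero level
# sets bound the disks {phi <= 0}, and min/max implement union/intersection.
n = 64
Y, X = np.mgrid[0:n, 0:n] / float(n)

def circle(cx, cy, r):
    """Signed distance to the circle of center (cx, cy) and radius r."""
    return np.sqrt((X - cx) ** 2 + (Y - cy) ** 2) - r

phi1 = circle(0.35, 0.5, 0.2)
phi2 = circle(0.65, 0.5, 0.2)
phi_union = np.minimum(phi1, phi2)   # union of the two disks
phi_inter = np.maximum(phi1, phi2)   # intersection (the lens in the middle)
```

Note that min and max of two signed distances are no longer exact signed distances (only away from the other shape), which is one reason the re-initialization of slide 31 is needed during an evolution.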
  • 24. Level Sets Evolution. Dictionary parametric ↔ implicit. Position: x = γ_s(t). Normal: n_s(t) = −∇φ_s(x)/||∇φ_s(x)||. Curvature: κ_s(x) = div(∇φ_s/||∇φ_s||)(x).
  • 25. Level Sets Evolution. The evolution PDE dγ_s(t)/ds = β(γ_s(t), n_s(t), κ_s(t)) n_s(t) becomes dφ_s(x)/ds = ||∇φ_s(x)|| β(x, −∇φ_s(x)/||∇φ_s(x)||, div(∇φ_s/||∇φ_s||)(x)). All the level sets evolve together.
  • 26. Proof. Evolution PDE: dγ_s(t)/ds = β(γ_s(t), n_s(t), κ_s(t)) n_s(t) (*). Definition of the level sets: ∀t, φ_s(γ_s(t)) = 0.
  • 27. Proof. Differentiating with respect to s: ⟨∇φ_s(x), dγ_s(t)/ds⟩ + ∂φ_s/∂s(x) = 0 for x = γ_s(t) (**).
  • 28. Proof. (*) + (**): ∂φ_s/∂s(x) = −β(x, n_s(t), κ_s(t)) ⟨∇φ_s(x), n_s(t)⟩.
  • 29. Proof. For x on the curve, ⟨∇φ_s(x), n_s(t)⟩ = −||∇φ_s(x)||, hence dφ_s(x)/ds = ||∇φ_s(x)|| β(x, −∇φ_s(x)/||∇φ_s(x)||, div(∇φ_s/||∇φ_s||)(x)).
  • 30. Implicit Geodesic Active Contours. Evolution PDE: dφ_s/ds = ||∇φ_s|| div(W ∇φ_s/||∇φ_s||). Comparison with explicit active contours: (−) a 2D instead of a 1D equation; (+) allows topology changes.
  • 31. Implicit Geodesic Active Contours. Re-initialization: reset φ_s(x) = ± min_t ||γ_s(t) − x||, i.e. solve the Eikonal equation ||∇φ_s|| = 1 with φ_s(γ_s(t)) = 0.
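One explicit step of the implicit evolution can be sketched with finite differences. This is an assumption-laden sketch: central differences via `np.gradient`, a small `eps` to avoid division by zero, and a constant weight W = 1, which reduces the PDE of slide 30 to the mean-curvature motion.

```python
import numpy as np

def level_set_step(phi, tau, eps=1e-8):
    """Explicit step of d phi/ds = |grad phi| div(grad phi / |grad phi|),
    i.e. the geodesic active contour PDE with constant weight W = 1."""
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx ** 2 + gy ** 2) + eps
    curv = np.gradient(gx / norm, axis=1) + np.gradient(gy / norm, axis=0)
    return phi + tau * norm * curv

# A circular front shrinks under the flow: the area {phi < 0} decreases.
n = 64
Y, X = np.mgrid[0:n, 0:n]
phi = np.sqrt((X - 32.0) ** 2 + (Y - 32.0) ** 2) - 10.0
area0 = (phi < 0).sum()
for _ in range(20):
    phi = level_set_step(phi, 0.2)
area1 = (phi < 0).sum()
```

The explicit scheme is only stable for small steps (a CFL-type bound of order of the squared grid spacing); semi-implicit schemes are commonly preferred in practice.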
  • 32. Multiple Fluids Dynamics. See Ron Fedkiw's homepage, http://physbam.stanford.edu/~fedkiw/ (multiple gases, fluid/air interface).
  • 33. Overview • Parametric Edge-based Active Contours • Implicit Edge-based Active Contours • Region-based Active Contours
  • 34. Energy Depending on Region. Optimal segmentation [0, 1]² = Ω ∪ Ωᶜ: min_Ω L1(Ω) + L2(Ωᶜ) + λ R(Ω), with the regularization R(Ω) = |∂Ω| (length of the boundary) and L1, L2 the data fidelities. Chan-Vese binary model: L1(Ω) = ∫_Ω |I(x) − c1|² dx; more general models are possible.
  • 35. Energy Depending on Region. Level set implementation: Ω = {x : φ(x) > 0}. Smoothed Heaviside: H_ε(x) = 1/2 (1 + (2/π) atan(x/ε)). Then L1(Ω) ≈ ∫ H_ε(φ(x)) |I(x) − c1|² dx and R(Ω) ≈ ∫ ||∇(H_ε ∘ φ)(x)|| dx.
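The smoothed Heaviside is one line, and its limiting behavior is easy to sanity-check (a sketch, with an arbitrary ε):

```python
import numpy as np

def heaviside_eps(x, eps):
    """Smoothed Heaviside H_eps(x) = 1/2 (1 + (2/pi) atan(x/eps)):
    a smooth step that converges to the sharp Heaviside as eps -> 0.
    Its derivative, H_eps'(x) = eps / (pi (eps**2 + x**2)), is the
    smoothed Dirac mass appearing in the Chan-Vese gradient."""
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(x / eps))
```

The symmetry H_ε(x) + H_ε(−x) = 1 is what lets the outside term of the energy be written with either H_ε(−φ) or 1 − H_ε(φ).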
  • 36. Descent Schemes. For a given c = (c1, c2) ∈ ℝ²: min_φ E_c(φ) = ∫ H_ε(φ(x)) |I(x) − c1|² + H_ε(−φ(x)) |I(x) − c2|² + λ ||∇(H_ε ∘ φ)(x)|| dx.
  • 37. Descent Schemes. Descent with respect to φ: φ^(k+1) = φ^(k) − τ_k ∇E_c(φ^(k)), with ∇E_c(φ) = H_ε'(φ(x)) G(x) and G(x) = |I(x) − c1|² − |I(x) − c2|² − λ div(∇φ/||∇φ||)(x).
  • 38. Descent Schemes. Limit ε → 0: ∇E_c(φ) → δ_{{φ=0}}(x) ||∇φ(x)|| G(x). Numerically, use ∇E_c(φ) = ||∇φ(x)|| G(x).
  • 39. Update of c. Joint minimization: min_{φ, c1, c2} E_c(φ).
  • 40. Update of c. Update of φ: φ^(k+1) = φ^(k) − τ_k ∇E_{c^(k)}(φ^(k)).
  • 41. Update of c. Update of (c1, c2): (c1^(k+1), c2^(k+1)) = argmin_{c1, c2} E_c(φ^(k)), which gives c1^(k+1) = c(φ^(k)) and c2^(k+1) = c(−φ^(k)), where c(φ) = ∫ I(x) H(φ(x)) dx / ∫ H(φ(x)) dx.
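For a fixed φ the optimal constants are just the region means, which can be sketched directly (using the sharp Heaviside, i.e. the ε → 0 limit; the helper name and the test image are illustrative):

```python
import numpy as np

def region_mean(I, phi):
    """c(phi) = integral of I * H(phi) / integral of H(phi), with the
    sharp Heaviside H = 1_{phi > 0}: the mean of I over {phi > 0}."""
    H = (phi > 0).astype(float)
    return (I * H).sum() / max(H.sum(), 1.0)

# On a piecewise-constant image matching the region, the means are exact.
n = 32
Y, X = np.mgrid[0:n, 0:n]
inside = (X - 16) ** 2 + (Y - 16) ** 2 < 64    # a disk
I = np.where(inside, 1.0, 0.0)
phi = np.where(inside, 1.0, -1.0)              # positive inside the disk
c1 = region_mean(I, phi)                        # mean inside the region
c2 = region_mean(I, -phi)                       # mean outside the region
```

Alternating this closed-form update of (c1, c2) with the gradient step on φ from slide 40 is the standard Chan-Vese iteration.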
  • 42. Example of Evolution. Use a de-localized initialization γ^(0).
  • 43. Conclusion. Curve evolution ↔ energy minimization. Parametric vs. level set representation, with a dictionary to translate curve properties and energy gradients. Edge-based vs. region-based energies.