Robert Collins
CSE486, Penn State




                         Lecture 14
                     Parameter Estimation
                     Readings T&V Sec 5.1 - 5.3
Robert Collins
CSE486, Penn State
                     Summary: Transformations

                                           Euclidean


                                           similarity


                                             affine


                                           projective
Robert Collins
CSE486, Penn State
                     Parameter Estimation

       We will talk about estimating parameters of
       1) Geometric models (e.g. lines, planes, surfaces)
       2) Geometric transformations (any of the parametric
           transformations we have been talking about)


       Least-squares is a general strategy to address both!
Robert Collins
CSE486, Penn State    Parameter Estimation:
                     Fitting Geometric Models
        General Idea:
        • Want to fit a model to raw image features (data)
             (the features could be points, edges, even regions)
        • Parameterize the model so that a model instance is an element
             of R^n, i.e. model instance = (a1, a2, …, an)
        • Define an error function E(model_i, data) that measures how
             well a given model instance describes the data
        • Solve for the model instance that minimizes E
Robert Collins
CSE486, Penn State
                     Example : Line Fitting

        General Idea:
        • Want to fit a model to raw image features (data)
             (the features could be points, edges, even regions)
        • Parameterize the model so that a model instance is an element
             of R^n, i.e. model instance = (a1, a2, …, an)
        • Define an error function E(model_i, data) that measures how
             well a given model instance describes the data
        • Solve for the model instance that minimizes E
Robert Collins
CSE486, Penn State
                             Point Feature Data
                     Point features = {(xi,yi) | i = 1,…,n}

        % point features; columns: x  y
        pts = [...
           17 81;
           23 72;
           35 73;
           37 58;
           45 50;
           57 56;
           61 36;
           70 32;
           80 32;
           84 19]
Robert Collins
CSE486, Penn State
                     Example : Line Fitting

        General Idea:
        • Want to fit a model to raw image features (data)
             (the features could be points, edges, even regions)
        • Parameterize the model so that a model instance is an element
             of R^n, i.e. model instance = (a1, a2, …, an)
        • Define an error function E(model_i, data) that measures how
             well a given model instance describes the data
        • Solve for the model instance that minimizes E
Robert Collins
CSE486, Penn State
                          Line Parameterization


       y = m*x + b

       m = slope (rise over run, dy/dx)
       b = y-intercept (where the line crosses the y axis)

       Model instance = (m, b)

                     (The astute student will note a problem
                      with representing vertical lines)
Robert Collins
CSE486, Penn State
                     Example : Line Fitting

        General Idea:
        • Want to fit a model to raw image features (data)
             (the features could be points, edges, even regions)
        • Parameterize the model so that a model instance is an element
             of R^n, i.e. model instance = (a1, a2, …, an)
        • Define an error function E(model_i, data) that measures how
             well a given model instance describes the data
        • Solve for the model instance that minimizes E
Robert Collins
                                                        (L.S. is just one type of
CSE486, Penn State
                             Least Squares              error function)

       • Given a line (m, b)
       • the distance of point (xi, yi) to the line is the vertical distance

             di = yi - (m*xi + b)

       • E is the sum of squared distances over all points

             E(m, b) = Σi [ yi - (m*xi + b) ]^2
Robert Collins
CSE486, Penn State
                     Example : Line Fitting

        General Idea:
        • Want to fit a model to raw image features (data)
             (the features could be points, edges, even regions)
        • Parameterize the model so that a model instance is an element
             of R^n, i.e. model instance = (a1, a2, …, an)
        • Define an error function E(model_i, data) that measures how
             well a given model instance describes the data
        • Solve for the model instance that minimizes E
Robert Collins
CSE486, Penn State
                     Calculus to Find Extrema

           • Take first derivatives of E with respect to m and b
           • Set the equations to zero:

                 ∂E/∂m = -2 Σi xi [ yi - (m*xi + b) ] = 0
                 ∂E/∂b = -2 Σi [ yi - (m*xi + b) ] = 0

           • Solve for m, b      (note the equivalence to linear regression)

               *DOING DERIVATION ON THE BOARD*
Robert Collins
CSE486, Penn State
                     Least Squares Solution




     m = -0.8528
     b = 94.3059
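
     A minimal MATLAB sketch (not from the original slides) of the slope-intercept
     least-squares fit on the point data given earlier; the variable names are my own:

         pts = [17 81; 23 72; 35 73; 37 58; 45 50; ...
                57 56; 61 36; 70 32; 80 32; 84 19];
         x = pts(:,1);  y = pts(:,2);
         A = [x, ones(size(x))];     % design matrix for the unknowns [m; b]
         p = A \ y;                  % least-squares solution of A*p = y
         m = p(1)                    % approx -0.8528
         b = p(2)                    % approx 94.3059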
Robert Collins
CSE486, Penn State
                     Problem with Parameterization
       •There is a problem with our parameterization,
         namely (m,b) is undefined for vertical lines

       •More general line parameterization:

             a*x + b*y + c = 0
         such that
             a^2 + b^2 + c^2 = 1

       •Question for class: why do we need the quadratic constraint?
       •Another question: what is the relation of this to y = mx + b?
Robert Collins
CSE486, Penn State
                     LS with Algebraic Distance

          Algebraic distance of point (xi, yi) to the line  a*x + b*y + c = 0:

               di = a*xi + b*yi + c

          Will derive on board.
          Result: minimize  E = Σi (a*xi + b*yi + c)^2   subject to  a^2 + b^2 + c^2 = 1

           (a,b,c) is the eigenvector associated with the smallest eigenvalue
           of the 3x3 matrix M'M, where M has one row [xi yi 1] per point
           (that eigenvalue will only be 0 if there is no noise, and therefore
           all the points lie exactly on a line)
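
     A minimal MATLAB sketch (an assumption, continuing from the fit above) of the
     algebraic-distance solution via the smallest eigenvalue:

         M = [x, y, ones(size(x))];   % one row [xi yi 1] per point
         [V, D] = eig(M' * M);        % eigen-decomposition of the 3x3 matrix M'M
         [~, k] = min(diag(D));       % index of the smallest eigenvalue
         abc = V(:, k);               % (a, b, c) with a^2 + b^2 + c^2 = 1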
Robert Collins
CSE486, Penn State
                     LS with Algebraic Distance




            Not much different from the linear regression line (in this case),
            but we can rest assured that our program won’t blow up when
            it sees a vertical line!
Robert Collins
CSE486, Penn State
                     Algebraic Least Squares Issues
        • Note: I didn’t draw the error vectors on the plot

        • That’s because I don’t know what to draw…

        • Main problem with algebraic distances: Hard to say
          precisely what it is you are minimizing, since algebraic
          distances are rarely physically meaningful quantities
Robert Collins
CSE486, Penn State
                      Orthogonal Least Squares
        •Minimize orthogonal (geometric) distance.
        •Makes sense physically, but harder to derive
        •Representation:

             a*x + b*y + c = 0
          such that
             a^2 + b^2 = 1

          With this constraint, a*xi + b*yi + c is the signed distance of
          (xi, yi) from the line, measured perpendicular to the line.

        (compare with algebraic distance)
Robert Collins
CSE486, Penn State
                     Orthogonal Least Squares

          Harder to derive.

          Key insight: the best-fit line must pass through
              the center of mass of the set of points!
          Move the center of mass to the origin.
          This reduces the problem to finding a unit
               vector normal to the line: (a, b) s.t. a^2 + b^2 = 1
          This will be the minimum eigenvector of the scatter
               matrix of the (centered) points.
          Finally, solve for c:  c = -(a*xmean + b*ymean)
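
     A minimal MATLAB sketch (an assumption, same point data as before) of the
     orthogonal least-squares fit following the steps above:

         xm = mean(x);  ym = mean(y);     % center of mass
         U = [x - xm, y - ym];            % centered points
         [V, D] = eig(U' * U);            % 2x2 scatter matrix of the centered points
         [~, k] = min(diag(D));
         ab = V(:, k);                    % unit normal (a, b), a^2 + b^2 = 1
         c  = -(ab(1)*xm + ab(2)*ym);     % line passes through the center of mass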
Robert Collins

              Orthogonal Least Squares Solution
CSE486, Penn State




           [Plot: orthogonal least-squares fit; the line passes through the
            center of mass of the points]

                     “distance” means what we intuitively expect
                     (i.e. distance to the closest point on the line, or
                     minimum distance to the line)
Robert Collins
CSE486, Penn State      Parameter Estimation:
                     Estimating a Transformation
    Let’s say we have found point matches between two images, and
    we think they are related by some parametric transformation (e.g.
    translation; scaled Euclidean; affine). How do we estimate the
    parameters of that transformation?

   General Strategy
     • Least-Squares estimation from point correspondences

   Two important (related) questions:
        •How many degrees of freedom?
        •How many point correspondences are needed?
Robert Collins

                Example: Translation Estimation
CSE486, Penn State




                equations:                 matrix form:
                   x' = x + tx                [x'; y'] = [x; y] + [tx; ty]
                   y' = y + ty

        How many degrees of freedom?
        How many independent variables are there?  Two (tx and ty)

        How many point correspondences are needed?
        Each correspondence (x,y) => (x’,y’) provides two equations,
        so we need DoF / 2 = 2 / 2 = 1 correspondence.
Robert Collins

                Example: Translation Estimation
CSE486, Penn State



                [Figure: four point correspondences (x1,y1)=>(x1’,y1’) …
                 (x4,y4)=>(x4’,y4’) between the two images]

      Least Squares Estimation:

      Minimize   E(tx, ty) = Σi [ (xi + tx - xi’)^2 + (yi + ty - yi’)^2 ]   wrt tx, ty
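
    A minimal MATLAB sketch (an assumption; P and Pp are hypothetical N-by-2 arrays
    of corresponding (x, y) and (x’, y’) points) of the least-squares translation:

        t  = mean(Pp - P, 1);    % minimizer of the summed squared residuals
        tx = t(1);  ty = t(2);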
Robert Collins
CSE486, Penn State
                         Let’s try another example

         Similarity transformation

                [ x' ]   [ a  -b   c ] [ x ]
                [ y' ] = [ b   a   d ] [ y ]
                [ 1  ]   [ 0   0   1 ] [ 1 ]


                     note: a = s cosθ
                                             **ON THE BOARD**
                           b = s sinθ
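
    A minimal MATLAB sketch (an assumption; P and Pp are hypothetical N-by-2 arrays
    of matched points) of the least-squares similarity estimate, stacking the two
    equations each correspondence provides:

        x = P(:,1);  y = P(:,2);  xp = Pp(:,1);  yp = Pp(:,2);
        A = [x, -y, ones(size(x)), zeros(size(x));    % rows from x' = a*x - b*y + c
             y,  x, zeros(size(x)), ones(size(x))];   % rows from y' = b*x + a*y + d
        p = A \ [xp; yp];                             % p = [a; b; c; d]
        s = hypot(p(1), p(2));                        % recovered scale s
        theta = atan2(p(2), p(1));                    % recovered rotation angle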
Robert Collins
CSE486, Penn State
                          Practical Issue

              Once we have estimated a transformation,
              how can we (un)warp image pixel values
              to produce a new picture?
Robert Collins

                Warping & Bilinear Interpolation
CSE486, Penn State




             Given a transformation between two images
             (coordinate systems), we want to “warp” one
             image into the coordinate system of the other.

             We will call the coordinate system we are
             mapping from the “source” image.

             We will call the coordinate system we are
             mapping to the “destination” image.
Robert Collins
CSE486, Penn State
                             Warping Example
      Transformation in this case is a projective transformation
      (general 3x3 matrix, operating on homogeneous coords)




                                    H

  from Hartley & Zisserman

                      Source Image        Destination Image
   We will have a lot more to say about this in a future lecture.
Robert Collins
CSE486, Penn State
                       Forward Warping

           Source image            Destination image

             x         H                 H(x)




          •For each pixel x in the source image
          •Determine where it goes as H(x)
          •Color the destination pixel            Problems?
Robert Collins
CSE486, Penn State
                     Forward Warping Problem

           Source image               Destination image




                          magnified




          Can leave gaps!
Robert Collins
CSE486, Penn State
                         Backward Warping         (No gaps)


           Source image            Destination image

                H-1(x)                  x
                         H-1
                                              ?



          •For each pixel x in the destination image
          •Determine where it comes from as H-1(x)
          •Get color from that location
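
    A minimal MATLAB sketch (an assumption; Hinv, src, dstH, dstW are hypothetical
    inputs and bounds checking is omitted) of backward warping with a
    nearest-neighbor lookup:

        dst = zeros(dstH, dstW);
        for r = 1:dstH
            for c = 1:dstW
                p  = Hinv * [c; r; 1];        % destination pixel -> source location
                xs = p(1) / p(3);
                ys = p(2) / p(3);
                dst(r, c) = src(round(ys), round(xs));   % color from that location
            end
        end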
Robert Collins
CSE486, Penn State
                                  Interpolation
         What do we mean by “get color from that location”?
         Consider grey values. What is intensity at (x,y)?

                     I(i , j+1)   I(i+1 , j+1)
                                                 Nearest Neighbor:
                                                  Take color of pixel
                                                  with closest center.

                                  (x,y)
                                                  I(x,y) = I(i+1,j)

                      I(i , j)     I(i+1 , j)
Robert Collins
CSE486, Penn State
                                  Bilinear interpolation
         What do we mean by “get color from that location”?
         Consider grey values. What is intensity at (x,y)?

                     I(i , j+1)       I(i+1 , j+1)
                                                     Bilinear Interpolation:
                                                       Weighted average


                                      (x,y)



                      I(i , j)        I(i+1 , j)
Robert Collins
CSE486, Penn State
                                  Bilinear interpolation
         What do we mean by “get color from that location”?
         Consider grey values. What is intensity at (x,y)?

                     I(i , j+1)        I(i+1 , j+1)
                                                      Bilinear Interpolation:
                                                        Weighted average
                                  A1   A2
                                                      I(x,y) = A3*I(i,j)
                                 A3    A4                    + A4*I(i+1,j)
                                                             + A2*I(i+1,j+1)
                                                             + A1*I(i,j+1)
                      I(i , j)         I(i+1 , j)
Robert Collins
CSE486, Penn State
                      Bilinear Interpolation, Math
        First, consider linear interpolation

                       (i , j)                       (i+1 , j)


        Intuition: Given two pixel values, what should the value
        be at some intermediate point between them?

            (i , j)                      (i+1 , j)       If close to (i,j), should
                                                         be intensity similar to I(i,j)


            (i , j)                      (i+1 , j)       If equidistant from both,
                                                         should be average of the
                                                         two intensities

            (i , j)                      (i+1 , j)       If close to (i+1,j), should
                                                         be intensity similar to I(i+1,j)
Robert Collins
CSE486, Penn State
                     Linear Interpolation

          Let a be the fractional distance of the sample point from (i, j)
          toward (i+1, j), so the point is at offset a with (1 - a) remaining.

          [Plot: intensity (y axis) vs. relative location (x axis); a straight
           line from I(i, j) at location 0 to I(i+1, j) at location 1]

          Instantiating the line equation at relative location a and solving:

              y = (1 - a) * I(i, j) + a * I(i+1, j)
Robert Collins
CSE486, Penn State
                     Bilinear Interpolation, Math
Robert Collins
CSE486, Penn State
                         Bilinear Interpolation, Math

          Now do the same in the y direction.  Let b be the fractional offset
          from row j toward row j+1.  Interpolate along x within each row,
          then combine the two results linearly along y:

              I = (1 - b) * [ (1 - a)*I(i, j)   + a*I(i+1, j)   ]
                +      b  * [ (1 - a)*I(i, j+1) + a*I(i+1, j+1) ]
Robert Collins
CSE486, Penn State
                     Bilinear Interpolation, Math

          Multiplying out gives a weighted average of the four neighboring pixels:

              I = (1-a)(1-b)*I(i, j)   +  a(1-b)*I(i+1, j)
                + (1-a)*b*I(i, j+1)    +  a*b*I(i+1, j+1)
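
    A minimal MATLAB sketch (an assumption, not from the slides) of the formula
    above as a function; note that MATLAB images index as I(row, col):

        function v = bilinear_sample(I, x, y)
            i = floor(x);  j = floor(y);    % integer corner below/left of (x, y)
            a = x - i;     b = y - j;       % fractional offsets
            v = (1-a)*(1-b)*I(j, i)   + a*(1-b)*I(j, i+1) ...
              + (1-a)*b*I(j+1, i)     + a*b*I(j+1, i+1);
        end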
Robert Collins
CSE486, Penn State
                     Image Warping in Matlab

      interp2 is Matlab’s built-in function for image warping

      Usage: interp2(X,Y,Z,XI,YI)

          [Diagram: sample grids X, Y with data values Z; query grids XI, YI
           produce the interpolated output ZI]
Robert Collins
CSE486, Penn State
                     Tips on Using Interp2

       For our purposes, we can assume X and Y are
       just normal image pixel coords.

       Simpler Usage: interp2(Z,XI,YI)

          [Diagram: same layout — data Z with query grids XI, YI producing ZI]
Robert Collins
CSE486, Penn State
                     Tips on Using Interp2 (cont)

       How does it work?

       Consider computing the value of ZI at row R and col C.


          [Diagram: the output entry ZI(R, C) at row R, column C]
Robert Collins
CSE486, Penn State
                     Tips on Using Interp2 (cont)

       XI and YI are two arrays the same size as ZI.
       For a given row, col (R,C), the value (XI(R,C), YI(R,C))
       tells which (X,Y) coord of the orig image to take.
       That is: Z(YI(R,C), XI(R,C)).

          [Diagram: the output entry ZI(R, C) and the corresponding query
           entries XI(R, C) and YI(R, C)]
Robert Collins
CSE486, Penn State
                     Tips on Using Interp2 (cont)
          [Diagram: XI(R, C) gives the column and YI(R, C) the row of Z to
           sample; the fetched value is Z(YI(R, C), XI(R, C))]
Robert Collins
CSE486, Penn State
                     Tips on Using Interp2 (cont)

            interp2 takes care of bilinear interpolation for you,
            in case YI(R,C) and XI(R,C) are not integer coords.

            There are optional arguments to interp2 to change
            the way the interpolation is calculated. We can
            go with the default, which is bilinear interp.
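
    A hedged example (assuming im, xi, yi as in the examples that follow): the
    interpolation method can also be requested explicitly; 'linear' (bilinear
    for 2-D grids) is the default.

        foo_bilinear = interp2(im, xi, yi, 'linear');
        foo_nearest  = interp2(im, xi, yi, 'nearest');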
Robert Collins
CSE486, Penn State
                                    Meshgrid

        A useful function when using interp2 is meshgrid

        >> [xx,yy] = meshgrid(1:4,1:3)

        xx =

               1     2     3     4
               1     2     3     4
               1     2     3     4

                                      An array suitable for use as XI

        yy =

               1     1     1     1
               2     2     2     2
               3     3     3     3

                                      An array suitable for use as YI
Robert Collins
CSE486, Penn State
                             Interp2 Examples

         im = double(imread('cameraman.tif'));
         size(im)
         ans =
            256   256

         [xi, yi] = meshgrid(1:256, 1:256);

         foo = interp2(im, xi, yi);
         imshow(uint8(foo));




                                                 Result: just a copy of
                                                 the image.
Robert Collins
CSE486, Penn State
                             Interp2 Examples

         im = double(imread('cameraman.tif'));
         size(im)
         ans =
            256   256

         [xi, yi] = meshgrid(1:256, 1:256);

         foo = interp2(im, xi/2, yi/2);
         imshow(uint8(foo));




                                                 Result: scale up by 2
                                                 (but stuck in 256x256 image)
Robert Collins
CSE486, Penn State
                             Interp2 Examples

         im = double(imread('cameraman.tif'));
         size(im)
         ans =
            256   256

         [xi, yi] = meshgrid(1:256, 1:256);

         foo = interp2(im, 2*xi, 2*yi);
         imshow(uint8(foo));




                                                 Result: scale down by 2
                                                 (within 256x256 image, pad with 0)
Robert Collins
CSE486, Penn State
                          Interp2 Examples
      Confusion alert!
      foo = interp2(im, xi/2, yi/2);


                     Divide coords by 2 to
                       scale up by 2


      foo = interp2(im, 2*xi, 2*yi);


                     Multiply coords by 2 to
                      scale down by 2

       interp2 wants the inverse of the coordinate transform for the
       geometric operation you want (it uses backward warping)
Robert Collins
CSE486, Penn State
                                Interp2 Examples
        A more complicated example: scale down by 2,
        but around center of image (128,128), not (0,0)

        Recall concatenation of transformation matrices:

           H =  [1 0 128]   [1/2  0   0]   [1 0 -128]
                [0 1 128] * [ 0  1/2  0] * [0 1 -128]
                [0 0  1 ]   [ 0   0   1]   [0 0   1 ]

                bring origin        scale down by 2     bring (128,128)
                back to (128,128)   around origin       to the origin

                                               BUT, BE CAREFUL….
Robert Collins
CSE486, Penn State
                        Interp2 Examples

                               This specifies how we want the
                               original image to map into the
                               new image.

                                 H


                                 H-1

          Interp2 wants to know how
          new image coords map back
          to the original image coords.
Robert Collins
CSE486, Penn State
                             Interp2 Examples

         im = double(imread('cameraman.tif'));
         size(im)
         ans =
            256   256

         [xi, yi] = meshgrid(1:256, 1:256);

         foo = interp2(im, 2*xi-128, 2*yi-128);
         imshow(uint8(foo));




                                                  Result
Robert Collins
CSE486, Penn State
                                Interp2 Examples
   More generally
   (e.g. for any 3x3 transformation
    matrix in homogeneous coords)
         im = double(imread('cameraman.tif'));
         size(im)
         ans =
            256   256

         [xi, yi] = meshgrid(1:256, 1:256);

         h = [1 0 128;0 1 128; 0 0 1] * [1/2 0 0; 0 1/2 0; 0 0 1] * [1 0 -128; 0 1 -128; 0 0 1];
         h = inv(h);   %TAKE INVERSE FOR USE WITH INTERP2
         xx = (h(1,1)*xi+h(1,2)*yi+h(1,3))./(h(3,1)*xi+h(3,2)*yi+h(3,3));
         yy = (h(2,1)*xi+h(2,2)*yi+h(2,3))./(h(3,1)*xi+h(3,2)*yi+h(3,3));
         foo = uint8(interp2(im,xx,yy));
         figure(1); imshow(foo)




                                                   Result

  • 53. Robert Collins CSE486, Penn State Interp2 Examples More generally (e.g. for any 3x3 transformation matrix in homogeneous coords) im = double(imread(‘cameraman.tif’)); size(im) ans = 256 256 [xi, yi] = meshgrid(1:256, 1:256); h = [1 0 128;0 1 128; 0 0 1] * [1/2 0 0; 0 1/2 0; 0 0 1] * [1 0 -128; 0 1 -128; 0 0 1]; h = inv(h); %TAKE INVERSE FOR USE WITH INTERP2 xx = (h(1,1)*xi+h(1,2)*yi+h(1,3))./(h(3,1)*xi+h(3,2)*yi+h(3,3)); yy = (h(2,1)*xi+h(2,2)*yi+h(2,3))./(h(3,1)*xi+h(3,2)*yi+h(3,3)); foo = uint8(interp2(im,xx,yy)); figure(1); imshow(foo) Result