Fitting
Fitting: Motivation
• We’ve learned how to
detect edges, corners,
blobs. Now what?
• We would like to form a
higher-level, more
compact representation of
the features in the image
by grouping multiple
features according to a
simple model
[Image: street sign at 9300 Harris Corners Pkwy, Charlotte, NC]
Source: K. Grauman
Fitting
• Choose a parametric model to represent a
set of features
[Examples: simple models (lines, circles); complicated model (car)]
Fitting
• Choose a parametric model to represent a
set of features
• Membership criterion is not local
• Can’t tell whether a point belongs to a given model just by
looking at that point
• Three main questions:
• What model represents this set of features best?
• Which of several model instances gets which feature?
• How many model instances are there?
• Computational complexity is important
• It is infeasible to examine every possible set of parameters
and every possible combination of features
Fitting: Issues
• Noise in the measured feature locations
• Extraneous data: clutter (outliers), multiple lines
• Missing data: occlusions
Case study: Line detection
Fitting: Issues
• If we know which points belong to the line,
how do we find the “optimal” line parameters?
• Least squares
• What if there are outliers?
• Robust fitting, RANSAC
• What if there are many lines?
• Voting methods: RANSAC, Hough transform
• What if we’re not even sure it’s a line?
• Model selection
Least squares line fitting
Data: (x_1, y_1), …, (x_n, y_n)
Line equation: y_i = m x_i + b
Find (m, b) to minimize

E = \sum_{i=1}^{n} (y_i - m x_i - b)^2

[Figure: data points (x_i, y_i) and the fitted line y = mx + b]
Least squares line fitting
Data: (x_1, y_1), …, (x_n, y_n)
Line equation: y_i = m x_i + b
Find (m, b) to minimize

E = \sum_{i=1}^{n} (y_i - m x_i - b)^2
  = \left\| \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix} - \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix} \begin{bmatrix} m \\ b \end{bmatrix} \right\|^2
  = \| Y - XB \|^2
  = (Y - XB)^T (Y - XB)
  = Y^T Y - 2 (XB)^T Y + (XB)^T (XB)

\frac{dE}{dB} = 2 X^T X B - 2 X^T Y = 0

Normal equations: X^T X B = X^T Y, the least squares solution to XB = Y.
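As a concrete illustration (a sketch, not part of the original slides), the normal equations can be solved in a few lines of NumPy; np.linalg.lstsq minimizes ||XB − Y||² directly and is more stable than forming XᵀX explicitly:

```python
import numpy as np

def fit_line_least_squares(x, y):
    """Fit y = m*x + b by minimizing E = sum_i (y_i - m*x_i - b)^2."""
    X = np.column_stack([x, np.ones_like(x)])  # rows [x_i, 1]
    # lstsq minimizes ||X B - y||^2; equivalent to solving the normal
    # equations X^T X B = X^T y, but numerically more stable.
    B, *_ = np.linalg.lstsq(X, y, rcond=None)
    return B  # (m, b)

# Example: noisy samples of y = 2x + 1
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, size=x.shape)
m, b = fit_line_least_squares(x, y)  # m ≈ 2, b ≈ 1
```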
Problem with “vertical” least squares
• Not rotation-invariant
• Fails completely for vertical lines
Total least squares
Distance between point (x_i, y_i) and line ax + by = d (with a^2 + b^2 = 1): |a x_i + b y_i - d|

Find (a, b, d) to minimize the sum of squared perpendicular distances:

E = \sum_{i=1}^{n} (a x_i + b y_i - d)^2

[Figure: points (x_i, y_i), the line ax + by = d, and unit normal N = (a, b)]
Total least squares
Distance between point (x_i, y_i) and line ax + by = d (with a^2 + b^2 = 1): |a x_i + b y_i - d|

Find (a, b, d) to minimize the sum of squared perpendicular distances E = \sum_{i=1}^{n} (a x_i + b y_i - d)^2, with unit normal N = (a, b).

Setting the derivative with respect to d to zero:

\frac{\partial E}{\partial d} = -2 \sum_{i=1}^{n} (a x_i + b y_i - d) = 0
\quad\Rightarrow\quad
d = \frac{a}{n} \sum_{i=1}^{n} x_i + \frac{b}{n} \sum_{i=1}^{n} y_i = a \bar{x} + b \bar{y}

Substituting d back:

E = \sum_{i=1}^{n} \left( a (x_i - \bar{x}) + b (y_i - \bar{y}) \right)^2
  = \left\| \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} \right\|^2
  = (UN)^T (UN)

\frac{dE}{dN} = 2 (U^T U) N = 0

Solution to (U^T U) N = 0, subject to \|N\|^2 = 1: eigenvector of U^T U associated with the smallest eigenvalue (least squares solution to homogeneous linear system UN = 0).
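A minimal NumPy sketch of this recipe (illustrative, not from the slides): center the points, then take the eigenvector of UᵀU with the smallest eigenvalue as the unit normal:

```python
import numpy as np

def fit_line_total_least_squares(x, y):
    """Fit a*x + b*y = d with a^2 + b^2 = 1, minimizing perpendicular distances."""
    x_bar, y_bar = x.mean(), y.mean()
    U = np.column_stack([x - x_bar, y - y_bar])  # centered coordinates
    # Unit normal N = (a, b): eigenvector of U^T U with the smallest eigenvalue.
    # np.linalg.eigh returns eigenvalues in ascending order for symmetric matrices.
    _, eigvecs = np.linalg.eigh(U.T @ U)
    a, b = eigvecs[:, 0]
    d = a * x_bar + b * y_bar  # from d = a*x_bar + b*y_bar derived above
    return a, b, d
```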
Total least squares
U = \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix}
\qquad
U^T U = \begin{bmatrix} \sum_{i=1}^{n} (x_i - \bar{x})^2 & \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) \\ \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) & \sum_{i=1}^{n} (y_i - \bar{y})^2 \end{bmatrix}

U^T U is the second moment matrix of the points.
Total least squares
U = \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix}
\qquad
U^T U = \begin{bmatrix} \sum_{i=1}^{n} (x_i - \bar{x})^2 & \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) \\ \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) & \sum_{i=1}^{n} (y_i - \bar{y})^2 \end{bmatrix}

U^T U is the second moment matrix of the points.

[Figure: the fitted line has unit normal N = (a, b) and passes through the centroid (\bar{x}, \bar{y}); each data point contributes the centered vector (x_i - \bar{x}, y_i - \bar{y})]
Least squares as likelihood maximization
• Generative model: line
points are corrupted by
Gaussian noise in the
direction perpendicular to
the line
\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} u \\ v \end{bmatrix} + \varepsilon \begin{bmatrix} a \\ b \end{bmatrix}

where (u, v) is a point on the line ax + by = d, (a, b) is the normal direction, and ε is zero-mean Gaussian noise with std. dev. σ.
Least squares as likelihood maximization
• Generative model: line
points are corrupted by
Gaussian noise in the
direction perpendicular to
the line
As before, each observed point (x, y) is a point (u, v) on the line ax + by = d displaced along the normal: (x, y) = (u, v) + ε (a, b).

Likelihood of points given line parameters (a, b, d):

P(x_1, \ldots, x_n \mid a, b, d) = \prod_{i=1}^{n} P(x_i \mid a, b, d) \propto \prod_{i=1}^{n} \exp\left( -\frac{(a x_i + b y_i - d)^2}{2 \sigma^2} \right)

Log-likelihood:

L(x_1, \ldots, x_n \mid a, b, d) = -\frac{1}{2 \sigma^2} \sum_{i=1}^{n} (a x_i + b y_i - d)^2

Maximizing the likelihood is therefore equivalent to minimizing the total least squares error.
Least squares for general curves
• We would like to minimize the sum of squared
geometric distances between the data points
and the curve
[Figure: data point (x_i, y_i) at geometric distance d((x_i, y_i), C) from curve C]
Least squares for conics
• Equation of a general conic:
C(a, x) = a · x = a x^2 + b x y + c y^2 + d x + e y + f = 0,
a = [a, b, c, d, e, f],
x = [x^2, xy, y^2, x, y, 1]
• Minimizing the geometric distance is non-linear even for
a conic
• Algebraic distance: C(a, x)
• Algebraic distance minimization by linear least squares:
\begin{bmatrix}
x_1^2 & x_1 y_1 & y_1^2 & x_1 & y_1 & 1 \\
x_2^2 & x_2 y_2 & y_2^2 & x_2 & y_2 & 1 \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\
x_n^2 & x_n y_n & y_n^2 & x_n & y_n & 1
\end{bmatrix}
\begin{bmatrix} a \\ b \\ c \\ d \\ e \\ f \end{bmatrix} = 0
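As an illustrative sketch (not from the slides), the algebraic-distance fit can be computed with an SVD, using the simplest constraint ‖a‖ = 1 to rule out the trivial solution a = 0; the next slide discusses constraint choices in more detail:

```python
import numpy as np

def fit_conic_algebraic(x, y):
    """Fit a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 by algebraic distance.

    Minimizes ||D a||^2 subject to ||a|| = 1, the simplest non-trivial
    constraint; ellipse-specific constraints lead to the generalized
    eigenvalue problem discussed on the next slide.
    """
    D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    # The right singular vector for the smallest singular value minimizes ||D a||.
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]  # [a, b, c, d, e, f]
```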
Least squares for conics
• Least squares system: Da = 0
• Need constraint on a to prevent trivial solution
• Discriminant: b^2 - 4ac
• Negative: ellipse
• Zero: parabola
• Positive: hyperbola
• Minimizing squared algebraic distance subject to
constraints leads to a generalized eigenvalue
problem
• Many variations possible
• For more information:
• A. Fitzgibbon, M. Pilu, and R. Fisher. Direct least-squares fitting of ellipses. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(5):476–480, May 1999.
Least squares: Robustness to noise
Least squares fit to the red points:
Least squares: Robustness to noise
Least squares fit with an outlier:
Problem: squared error heavily penalizes outliers
Robust estimators
• General approach: minimize

\sum_{i} \rho\left( r_i(x_i, \theta); \sigma \right)

where r_i(x_i, θ) is the residual of the ith point w.r.t. model parameters θ, and ρ is a robust function with scale parameter σ.

The robust function ρ behaves like squared distance for small values of the residual u but saturates for larger values of u.
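The slides do not fix a particular ρ; one common choice with exactly this behavior (an assumption here, not named in the slides) is the Geman-McClure function:

\rho(u; \sigma) = \frac{u^2}{\sigma^2 + u^2}

For |u| \ll \sigma this grows like u^2 / \sigma^2 (squared distance), and for |u| \gg \sigma it saturates at 1, so a single gross outlier contributes at most a bounded error.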
Choosing the scale: Just right
The effect of the outlier is eliminated

Choosing the scale: Too small
The error value is almost the same for every point and the fit is very poor

Choosing the scale: Too large
Behaves much the same as least squares
Robust estimation: Notes
• Robust fitting is a nonlinear optimization
problem that must be solved iteratively
• Least squares solution can be used for
initialization
• Adaptive choice of scale:
“magic number” times median residual
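A hedged sketch combining these notes (iteratively reweighted least squares with the Geman-McClure ρ from above; the constant 1.4826 is the standard MAD-to-σ "magic number", an assumption not fixed by the slides):

```python
import numpy as np

def robust_fit_line(x, y, iters=20):
    """Iteratively reweighted least squares with Geman-McClure weights (a sketch)."""
    m, b = np.polyfit(x, y, 1)  # least squares initialization
    for _ in range(iters):
        r = y - (m * x + b)  # residuals under the current fit
        sigma = 1.4826 * np.median(np.abs(r)) + 1e-12  # adaptive scale
        w = sigma**2 / (sigma**2 + r**2) ** 2  # IRLS weight rho'(r)/r, up to a constant
        X = np.column_stack([x, np.ones_like(x)])
        # Weighted normal equations: X^T W X B = X^T W y
        m, b = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return m, b
```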
RANSAC
• Robust fitting can deal with a few outliers –
what if we have very many?
• Random sample consensus (RANSAC):
Very general framework for model fitting in
the presence of outliers
• Outline
• Choose a small subset uniformly at random
• Fit a model to that subset
• Find all remaining points that are “close” to the model and
reject the rest as outliers
• Do this many times and choose the best model
M. A. Fischler and R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol. 24, pp. 381–395, 1981.
RANSAC for line fitting
Repeat N times:
• Draw s points uniformly at random
• Fit line to these s points
• Find inliers to this line among the remaining
points (i.e., points whose distance from the
line is less than t)
• If there are d or more inliers, accept the line
and refit using all inliers
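A sketch of this loop for s = 2 (illustrative; parameter names follow the slide, and the final refit reuses the fit_line_total_least_squares helper sketched earlier):

```python
import numpy as np

def ransac_line(points, t, d, N, rng=None):
    """RANSAC line fitting with s = 2 (the minimum needed to define a line)."""
    rng = rng or np.random.default_rng()
    best_inliers = None
    for _ in range(N):
        (x1, y1), (x2, y2) = points[rng.choice(len(points), size=2, replace=False)]
        a, b = y2 - y1, x1 - x2  # normal to the segment direction
        norm = np.hypot(a, b)
        if norm == 0:
            continue  # degenerate sample: coincident points
        a, b = a / norm, b / norm
        d0 = a * x1 + b * y1  # line through the sample: a*x + b*y = d0
        inliers = points[np.abs(points @ np.array([a, b]) - d0) < t]
        if len(inliers) >= d and (best_inliers is None or len(inliers) > len(best_inliers)):
            best_inliers = inliers
    if best_inliers is None:
        return None  # no sample reached d inliers
    # Refit using all inliers of the best line (total least squares, sketched earlier)
    return fit_line_total_least_squares(best_inliers[:, 0], best_inliers[:, 1])
```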
Choosing the parameters
• Initial number of points s
• Typically minimum number needed to fit the model
• Distance threshold t
• Choose t so probability for inlier is p (e.g. 0.95)
• Zero-mean Gaussian noise with std. dev. σ: t^2 = 3.84σ^2 (3.84 is the 95% quantile of the χ^2 distribution with one degree of freedom)
• Number of samples N
• Choose N so that, with probability p, at least one random
sample is free from outliers (e.g. p=0.99) (outlier ratio: e)
Source: M. Pollefeys
Choosing the parameters
• Initial number of points s
• Typically minimum number needed to fit the model
• Distance threshold t
• Choose t so probability for inlier is p (e.g. 0.95)
• Zero-mean Gaussian noise with std. dev. σ: t^2 = 3.84σ^2
• Number of samples N
• Choose N so that, with probability p, at least one random
sample is free from outliers (e.g. p=0.99) (outlier ratio: e)
A sample of size s is all-inlier with probability (1 − e)^s, so all N samples contain an outlier with probability (1 − (1 − e)^s)^N. Requiring this failure probability to be 1 − p gives:

\left( 1 - (1 - e)^s \right)^N = 1 - p
\quad\Rightarrow\quad
N = \frac{\log(1 - p)}{\log\left( 1 - (1 - e)^s \right)}

N for p = 0.99, as a function of sample size s and proportion of outliers e:

proportion of outliers e
s      5%   10%   20%   25%   30%   40%   50%
2       2     3     5     6     7    11    17
3       3     4     7     9    11    19    35
4       3     5     9    13    17    34    72
5       4     6    12    17    26    57   146
6       4     7    16    24    37    97   293
7       4     8    20    33    54   163   588
8       5     9    26    44    78   272  1177
Source: M. Pollefeys
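The table values follow directly from the formula; a small helper (a sketch) reproduces them:

```python
import math

def ransac_num_samples(p, e, s):
    """N such that, with probability p, at least one of N size-s samples is outlier-free."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

ransac_num_samples(0.99, 0.50, 2)  # -> 17, matching the table above
ransac_num_samples(0.99, 0.30, 4)  # -> 17, matching the table above
```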
Choosing the parameters
• Initial number of points s
• Typically minimum number needed to fit the model
• Distance threshold t
• Choose t so probability for inlier is p (e.g. 0.95)
• Zero-mean Gaussian noise with std. dev. σ: t^2 = 3.84σ^2
• Number of samples N
• Choose N so that, with probability p, at least one random
sample is free from outliers (e.g. p=0.99) (outlier ratio: e)
\left( 1 - (1 - e)^s \right)^N = 1 - p
\quad\Rightarrow\quad
N = \frac{\log(1 - p)}{\log\left( 1 - (1 - e)^s \right)}

Source: M. Pollefeys
Choosing the parameters
• Initial number of points s
• Typically minimum number needed to fit the model
• Distance threshold t
• Choose t so probability for inlier is p (e.g. 0.95)
• Zero-mean Gaussian noise with std. dev. σ: t^2 = 3.84σ^2
• Number of samples N
• Choose N so that, with probability p, at least one random
sample is free from outliers (e.g. p=0.99) (outlier ratio: e)
• Consensus set size d
• Should match expected inlier ratio
Source: M. Pollefeys
Adaptively determining the number of samples
• Inlier ratio e is often unknown a priori, so pick
worst case, e.g. 50%, and adapt if more
inliers are found, e.g. 80% would yield e=0.2
• Adaptive procedure:
• N = ∞, sample_count = 0
• While N > sample_count
– Choose a sample and count the number of inliers
– Set e = 1 – (number of inliers)/(total number of points)
– Recompute N from e:
– Increment the sample_count by 1
N = \frac{\log(1 - p)}{\log\left( 1 - (1 - e)^s \right)}
Source: M. Pollefeys
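A direct transcription of this procedure as a sketch; count_inliers is a hypothetical model-specific helper (fit a model to the sample, count its inliers) that the slides leave abstract:

```python
import math
import numpy as np

def adaptive_ransac(points, s, p, count_inliers):
    """Adaptively determine the number of RANSAC samples (procedure above)."""
    rng = np.random.default_rng()
    N, sample_count = math.inf, 0
    while N > sample_count:
        sample = points[rng.choice(len(points), size=s, replace=False)]
        e = 1 - count_inliers(sample, points) / len(points)
        e = min(e, 1 - 1e-9)  # keep the log argument positive when a sample has no inliers
        if e == 0:
            break  # every point is an inlier; one sample suffices
        N = math.log(1 - p) / math.log(1 - (1 - e) ** s)
        sample_count += 1
    return sample_count
```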
RANSAC pros and cons
• Pros
• Simple and general
• Applicable to many different problems
• Often works well in practice
• Cons
• Lots of parameters to tune
• Can’t always get a good initialization of the model based on
the minimum number of samples
• Sometimes too many iterations are required
• Can fail for extremely low inlier ratios
• We can often do better than brute-force sampling