Lecture 4
Reconstruction from two views
Joaquim Salvi
Universitat de Girona
Visual Perception
2
Lecture 4: Reconstruction from two views
Contents
4. Reconstruction from two views
4.1 Shape from X
4.2 Triangulation principle
4.3 Epipolar geometry – Modelling
4.4 Epipolar geometry – Calibration
4.5 Constraints in stereo vision
4.6 Experimental comparison of methods
4.7 Sample: Mobile robot performing 3D mapping
4
Lecture 4: Reconstruction from two views
4.1 Shape from X
Techniques based on:
– Modifying the intrinsic camera parameters
e.g. Depth from Focus/Defocus and Depth from Zooming
– Considering an additional source of light onto the scene
e.g. Shape from Structured Light and Shape from Photometric Stereo
– Considering additional surface information
e.g. Shape from Shading, Shape from Texture and Shape from Geometric Constraints
– Multiple views
e.g. Shape from Stereo and Shape from Motion
Shape from Focus/Defocus
5
Lecture 4: Reconstruction from two views
4.1 Shape from X
Techniques based on:
– Modifying the intrinsic camera parameters
e.g. Depth from Focus/Defocus and Depth from Zooming
– Considering an additional source of light onto the scene
e.g. Shape from Structured Light and Shape from Photometric Stereo
– Considering additional surface information
e.g. Shape from Shading, Shape from Texture and Shape from Geometric Constraints
– Multiple views
e.g. Shape from Stereo and Shape from Motion
Shape from Structured Light
6
Lecture 4: Reconstruction from two views
4.1 Shape from X
Techniques based on:
– Modifying the intrinsic camera parameters
e.g. Depth from Focus/Defocus and Depth from Zooming
– Considering an additional source of light onto the scene
e.g. Shape from Structured Light and Shape from Photometric Stereo
– Considering additional surface information
e.g. Shape from Shading, Shape from Texture and Shape from Geometric Constraints
– Multiple views
e.g. Shape from Stereo and Shape from Motion
Shape from Shading
7
Lecture 4: Reconstruction from two views
4.1 Shape from X
Techniques based on:
– Modifying the intrinsic camera parameters
e.g. Depth from Focus/Defocus and Depth from Zooming
– Considering an additional source of light onto the scene
e.g. Shape from Structured Light and Shape from Photometric Stereo
– Considering additional surface information
e.g. Shape from Shading, Shape from Texture and Shape from Geometric Constraints
– Multiple views
e.g. Shape from Stereo and Shape from Motion
Shape from Stereo
8
Lecture 4: Reconstruction from two views
Contents
4. Reconstruction from two views
4.1 Shape from X
4.2 Triangulation principle
4.3 Epipolar geometry – Modelling
4.4 Epipolar geometry – Calibration
4.5 Constraints in stereo vision
4.6 Experimental comparison of methods
4.7 Sample: Mobile robot performing 3D mapping
10
Lecture 4: Reconstruction from two views
4.2 Triangulation principle
[Figure: world coordinate system {W}, camera coordinate systems {C} and {C'}, image coordinate systems {I} and {I'}, and retinal coordinate systems {R} and {R'} of a stereo pair.]
11
Lecture 4: Reconstruction from two views
4.2 Triangulation principle
[Figure: stereo pair with the optical rays through the image points Pu and P'u, direction vectors u and v, and the 3D point Pw.]
Pw = Pu + m u
Pw = P'u + m' v
Steps:
1 - Pu + m u = P'u + m' v
2 - Expand to x, y, z
3 - Solve for m and m'
4 - Compute Pw
12
Lecture 4: Reconstruction from two views
4.2 Triangulation principle
[Figure: in practice the two optical rays r and s (with direction vectors u and v) do not intersect exactly; the reconstructed point is only an estimate of the 3D object point Pw.]
Pw = Pu + m u
Pw = P'u + m' v
13
Lecture 4: Reconstruction from two views
4.2 Triangulation principle
[Figure: two skew rays from the camera centres WOc1 and WOc2 through the image points WP2D1 and WP2D2, with direction vectors u and v, the closest points Pa and Pb on each ray, and the reconstructed point WP3D.]
http://astronomy.swin.edu.au/~pbourke/geometry/lineline3d/
Pa = P1 + mua (P2 - P1)
Pb = P3 + mub (P4 - P3)
Two different ways:
1) Minimize the distance between the points:
Min || Pb - Pa ||²
Min || P1 + mua (P2 - P1) - P3 - mub (P4 - P3) ||²
finding mua and mub once expanded to x, y and z.
2) Compute the dot product between the vectors:
(Pa - Pb)ᵀ (P2 - P1) = 0
(Pa - Pb)ᵀ (P4 - P3) = 0
because the segment Pa–Pb is perpendicular to both rays;
finding mua and mub once expanded to Pa, Pb and x, y and z.
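A minimal sketch of this closest-point construction (not the lecture's own code; the function name and the NumPy dependency are assumptions): it solves the two perpendicularity conditions for mua and mub and returns the midpoint of the segment Pa–Pb as the reconstructed point.

```python
import numpy as np

def midpoint_triangulation(P1, P2, P3, P4):
    """Closest point between the rays P1 + mua*(P2-P1) and P3 + mub*(P4-P3).

    Solves (Pa-Pb).(P2-P1) = 0 and (Pa-Pb).(P4-P3) = 0 for mua and mub,
    then returns the midpoint of the shortest segment joining both rays.
    """
    u = P2 - P1                      # direction of the first ray
    v = P4 - P3                      # direction of the second ray
    w = P1 - P3
    a, b, c = u @ u, u @ v, v @ v    # coefficients of the 2x2 linear system
    d, e = u @ w, v @ w
    denom = a * c - b * b            # close to 0 when the rays are parallel
    mua = (b * e - c * d) / denom
    mub = (a * e - b * d) / denom
    Pa = P1 + mua * u
    Pb = P3 + mub * v
    return 0.5 * (Pa + Pb)           # reconstructed 3D point

# Example: two rays that almost intersect near (1, 1, 5)
Pw = midpoint_triangulation(np.array([0., 0., 0.]), np.array([1., 1., 5.]),
                            np.array([0.2, 0., 0.]), np.array([1., 1.01, 5.]))
```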
14
Lecture 4: Reconstruction from two views
4.2 Triangulation principle
In practice we can use Least-Squares. Each camera gives a projection equation:
s1 (u1, v1, 1)ᵀ = [ A11 A12 A13 A14 ; A21 A22 A23 A24 ; A31 A32 A33 A34 ] (X, Y, Z, 1)ᵀ
s2 (u2, v2, 1)ᵀ = [ B11 B12 B13 B14 ; B21 B22 B23 B24 ; B31 B32 B33 B34 ] (X, Y, Z, 1)ᵀ
Eliminating the scale factors s1 and s2 yields a linear system Q X = C in the unknown point X = (X, Y, Z)ᵀ:
[ u1·A31−A11  u1·A32−A12  u1·A33−A13 ]             [ A14−u1·A34 ]
[ v1·A31−A21  v1·A32−A22  v1·A33−A23 ] (X, Y, Z)ᵀ = [ A24−v1·A34 ]
[ u2·B31−B11  u2·B32−B12  u2·B33−B13 ]             [ B14−u2·B34 ]
[ v2·B31−B21  v2·B32−B22  v2·B33−B23 ]             [ B24−v2·B34 ]
solved in the least-squares sense: X = (Qᵀ Q)⁻¹ Qᵀ C.
Add additional rows (two per view) if we have additional views of the same point.
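The linear system above maps directly to a few lines of NumPy. This is a sketch under the assumption that A and B are the full 3×4 projection matrices of the two cameras; the function and variable names are illustrative.

```python
import numpy as np

def linear_triangulation(A, B, uv1, uv2):
    """Triangulate one 3D point from two 3x4 projection matrices A and B
    and the pixel observations uv1 = (u1, v1), uv2 = (u2, v2)."""
    rows, rhs = [], []
    for P, (u, v) in ((A, uv1), (B, uv2)):
        # u*(P31 X + P32 Y + P33 Z + P34) = P11 X + P12 Y + P13 Z + P14, idem for v
        rows.append(u * P[2, :3] - P[0, :3]); rhs.append(P[0, 3] - u * P[2, 3])
        rows.append(v * P[2, :3] - P[1, :3]); rhs.append(P[1, 3] - v * P[2, 3])
    Q, C = np.array(rows), np.array(rhs)
    X, *_ = np.linalg.lstsq(Q, C, rcond=None)    # least-squares solution of Q X = C
    return X                                     # (X, Y, Z); add two rows per extra view
```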
15
Lecture 4: Reconstruction from two views
Contents
4. Reconstruction from two views
4.1 Shape from X
4.2 Triangulation principle
4.3 Epipolar geometry – Modelling
4.4 Epipolar geometry – Calibration
4.5 Constraints in stereo vision
4.6 Experimental comparison of methods
4.7 Sample: Mobile robot performing 3D mapping
17
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
[Figure: two cameras with optical centres OC and OC', image planes I and I', a 3D point M projected onto m and m', the epipoles e and e', the epipolar lines lm' and l'm, the epipolar plane Π and the transformation CKC' between the camera frames.]
• Focal points, epipoles and epipolar lines
• e is defined by OC' in {I}; e' is defined by OC in {I'}
• m defines an epipolar line l'm in {I'}; m' defines an epipolar line lm' in {I}
• All epipolar lines intersect at the epipole
18
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
[Figure: epipolar geometry of Camera 1 and epipolar geometry of Camera 2 on a real image pair, showing the epipoles, the epipolar lines and two zoomed areas (Area 1 and Area 2) with correspondence points.]
19
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
[Figure: image pair with the epipole marked in each image.]
20
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
[Figure: the epipolar plane Π defined by the optical centres OC, OC' and the 3D point M, with projections m, m' and epipoles e, e'.]
• The epipolar geometry concerns the problem of computing the plane Π.
• A plane is defined by the cross product of two vectors.
• M is unknown; m and m' are known.
• {W} is located at {C} or {C'} and Π can be computed at {C} or {C'} → 4 solutions
21
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
[Figure: the same epipolar plane Π, now with the epipolar lines lm' and l'm and the transformations CKC' and C'K'C between the two camera frames.]
• The epipolar geometry concerns the problem of computing the plane Π.
• A plane is defined by the cross product of two vectors.
• M is unknown; m and m' are known.
• {W} is located at {C} or {C'} and Π can be computed at {C} or {C'} → 4 solutions
22
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
• Assume {W} at {C}
[Figure: epipolar plane Π with the 3D point P (P' in the second camera frame), image points m and m', epipoles e and e', and the transformation CKC' = [R t; 0 1].]
With the world frame at {C}, a point P expressed in {C} is expressed in {C'} as
P' = R P + t
The epipoles are the projections of the opposite optical centre:
e' ≅ t   (projection of OC in {I'})
e ≅ −Rᵀ t   (projection of OC' in {I})
23
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
• Assume {W} at {C}
[Figure: same geometry, now showing the epipolar lines lm' in {I} and l'm in {I'}.]
P' = R P + t,   e' ≅ t,   e ≅ −Rᵀ t
Since the epipolar lines are contained in the plane Π, we can define each line by a cross product of two vectors, obtaining the vector orthogonal to the line:
l'm ≅ e' × (R m + t) = t × R m = [t]x R m
lm' ≅ e × Rᵀ(m' − t) = −Rᵀ(t × m') ≅ Rᵀ [t]xᵀ m'
24
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
[Figure: same geometry with the epipolar lines lm' and l'm.]
The epipolar lines are
l'm = [t]x R m
lm' = Rᵀ [t]xᵀ m'
A point lies on its own epipolar line, so they are orthogonal and their inner product (their cosine) is 0:
m'ᵀ l'm = 0  →  m'ᵀ [t]x R m = 0
mᵀ lm' = 0  →  mᵀ Rᵀ [t]xᵀ m' = 0
The Fundamental matrix is defined by this inner product of a point with its epipolar line.
25
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
Now we consider the intrinsics: points in pixels instead of metric coordinates.
m̃ = A m → m = A⁻¹ m̃      and      m̃' = A' m' → m' = A'⁻¹ m̃'
with the intrinsic matrix
A = [ αu 0 u0 ; 0 αv v0 ; 0 0 1 ]
Substituting into the epipolar constraints, and using (A B)ᵀ = Bᵀ Aᵀ and (A⁻¹)ᵀ = (Aᵀ)⁻¹ = A⁻ᵀ:
0 = m'ᵀ [t]x R m = (A'⁻¹ m̃')ᵀ [t]x R (A⁻¹ m̃) = m̃'ᵀ A'⁻ᵀ [t]x R A⁻¹ m̃
0 = mᵀ Rᵀ [t]xᵀ m' = (A⁻¹ m̃)ᵀ Rᵀ [t]xᵀ (A'⁻¹ m̃') = m̃ᵀ A⁻ᵀ Rᵀ [t]xᵀ A'⁻¹ m̃'
Therefore:
F = A'⁻ᵀ [t]x R A⁻¹   with   m̃'ᵀ F m̃ = 0
F' = A⁻ᵀ Rᵀ [t]xᵀ A'⁻¹   with   m̃ᵀ F' m̃' = 0
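To make the pixel-level relation concrete, here is a small sketch (not part of the original slides) that builds F = A'⁻ᵀ [t]x R A⁻¹ from known intrinsics and extrinsics and checks the epipolar constraint on a synthetic point; all numeric values are made up.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0., -t[2], t[1]],
                     [t[2], 0., -t[0]],
                     [-t[1], t[0], 0.]])

def fundamental_from_calibration(A1, A2, R, t):
    """F = A'^-T [t]x R A^-1, so that m2~^T F m1~ = 0 in pixel coordinates."""
    F = np.linalg.inv(A2).T @ skew(t) @ R @ np.linalg.inv(A1)
    return F / np.linalg.norm(F)        # F is only defined up to a scale factor

# Quick check on a synthetic configuration (all values are made up)
A1 = A2 = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
c, s = np.cos(0.1), np.sin(0.1)
R = np.array([[c, 0., s], [0., 1., 0.], [-s, 0., c]])   # small rotation about Y
t = np.array([0.2, 0.05, 0.01])
F = fundamental_from_calibration(A1, A2, R, t)
P = np.array([0.1, -0.05, 2.0])                 # 3D point in the first camera frame
m1 = A1 @ P;            m1 /= m1[2]
m2 = A2 @ (R @ P + t);  m2 /= m2[2]
print(m2 @ F @ m1)                              # ~0: the epipolar constraint holds
```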
26
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
F and F' are related by a transpose. So,
F' = Fᵀ    and    F = F'ᵀ
Demonstration:
F' = A⁻ᵀ Rᵀ [t]xᵀ A'⁻¹ = (A'⁻ᵀ [t]x R A⁻¹)ᵀ = Fᵀ
The same derivation can be made assuming the origin at {C'}, obtaining two more fundamental matrices that are also equivalent to F and F'.
27
Lecture 4: Reconstruction from two views
4.3 Epipolar Geometry – Modelling
The Essential matrix is the calibrated case of the Fundamental matrix.
• The intrinsic parameters are known: A and A' are known.
The problem is reduced to estimating E or E':
F = A'⁻ᵀ [t]x R A⁻¹  →  E = [t]x R
F' = A⁻ᵀ Rᵀ [t]xᵀ A'⁻¹  →  E' = Rᵀ [t]xᵀ
Monocular stereo (a single moving camera) is a simplified case of F in which A = A', reducing the complexity of computing F:
F = A⁻ᵀ [t]x R A⁻¹,   F' = A⁻ᵀ Rᵀ [t]xᵀ A⁻¹
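Under the same assumptions as the sketch above, E can be obtained either directly from the motion or from F by removing the intrinsics, since A'ᵀ F A = A'ᵀ A'⁻ᵀ [t]x R A⁻¹ A = [t]x R. A short sketch (names are illustrative):

```python
import numpy as np

def essential_from_motion(R, t):
    """E = [t]x R, directly from the extrinsics."""
    tx = np.array([[0., -t[2], t[1]],
                   [t[2], 0., -t[0]],
                   [-t[1], t[0], 0.]])
    return tx @ R

def essential_from_fundamental(F, A1, A2):
    """E = A2^T F A1 (calibrated case; use A1 = A2 for monocular stereo)."""
    return A2.T @ F @ A1

# Both expressions agree up to the unknown scale factor of F.
```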
28
Lecture 4: Reconstruction from two views
Contents
4. Reconstruction from two views
4.1 Shape from X
4.2 Triangulation principle
4.3 Epipolar geometry – Modelling
4.4 Epipolar geometry – Calibration
4.5 Constraints in stereo vision
4.6 Experimental comparison of methods
4.7 Sample: Mobile robot performing 3D mapping
30
Lecture 4: Reconstruction from two views
4.4 Epipolar Geometry – Calibration
Each point correspondence gives one epipolar constraint:
m'ᵢᵀ F mᵢ = 0,   with mᵢ = (xᵢ, yᵢ, 1)ᵀ and m'ᵢ = (x'ᵢ, y'ᵢ, 1)ᵀ
Operating, we obtain:
Uₙ f = 0
f = (F11, F12, F13, F21, F22, F23, F31, F32, F33)ᵀ
uᵢ = (xᵢ x'ᵢ, yᵢ x'ᵢ, x'ᵢ, xᵢ y'ᵢ, yᵢ y'ᵢ, y'ᵢ, xᵢ, yᵢ, 1)
The epipolar geometry is defined as:
Uₙ = (u₁, u₂, ..., uₙ)ᵀ
The Eight Point Method
31
Lecture 4: Reconstruction from two views
F is defined up to a scale factor, so we can fix one of its components to 1. Let's fix F33 = 1 (the first solution of the homogeneous system Uₙ f = 0 is f = 0, which is NOT WANTED). Then:
Uₙ f = −1ₙ
f = (F11, F12, F13, F21, F22, F23, F31, F32)ᵀ
uᵢ = (xᵢ x'ᵢ, yᵢ x'ᵢ, x'ᵢ, xᵢ y'ᵢ, yᵢ y'ᵢ, y'ᵢ, xᵢ, yᵢ)
Uₙ = (u₁, u₂, ..., uₙ)ᵀ
With exactly eight points, f = Uₙ⁻¹ (−1ₙ); with more points we solve in the least-squares sense:
Uₙᵀ Uₙ f = −Uₙᵀ 1ₙ  →  f = −(Uₙᵀ Uₙ)⁻¹ Uₙᵀ 1ₙ   (Least-Squares)
4.4 Epipolar Geometry – Calibration
The Eight Point Method with Least Squares
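A minimal sketch of this estimator, assuming m1 and m2 are (n, 2) arrays of corresponding pixel points with n ≥ 8; names and shapes are illustrative, not the notation of the original survey.

```python
import numpy as np

def eight_point_ls(m1, m2):
    """Eight-point method with F33 fixed to 1: solve Un f = -1n in the
    least-squares sense, where m2_h^T F m1_h = 0 for each correspondence."""
    x, y = m1[:, 0], m1[:, 1]        # points in the first image
    xp, yp = m2[:, 0], m2[:, 1]      # corresponding points in the second image
    Un = np.column_stack([x * xp, y * xp, xp,
                          x * yp, y * yp, yp,
                          x, y])     # one row u_i per correspondence (8 unknowns)
    f, *_ = np.linalg.lstsq(Un, -np.ones(len(m1)), rcond=None)
    return np.append(f, 1.0).reshape(3, 3)   # F with F33 = 1
```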
32
Lecture 4: Reconstruction from two views
F = A'⁻ᵀ [t]x R A⁻¹, where the skew-symmetric matrix of t is
[t]x = [ 0 −tz ty ; tz 0 −tx ; −ty tx 0 ]
F has to be rank-2 because [t]x is rank-2.
The first solution of Uₙ f = 0 is f = 0, which is NOT WANTED.
Any homogeneous system of equations
Uₙ f = 0,   f = (F11, F12, F13, F21, F22, F23, F31, F32, F33)ᵀ
can be solved by SVD so that f lies in the nullspace of Uₙ = U D Vᵀ:
[U, D, V] = svd(Uₙ)
Hence f corresponds to a multiple of the column of V that belongs to the unique singular value of D equal to 0 (with noisy data, the smallest singular value).
Note that f is only known up to a scale factor.
4.4 Epipolar Geometry – Calibration
The Eight Point Method with Eigen Analysis
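A sketch of the eigen-analysis variant: f is taken as the right singular vector of Uₙ associated with its smallest singular value, and the rank-2 constraint can optionally be enforced afterwards by zeroing the smallest singular value of F. As the conclusions later note, normalizing the pixel coordinates beforehand improves the conditioning; that step is omitted here for brevity.

```python
import numpy as np

def eight_point_svd(m1, m2, force_rank2=True):
    """Eight-point method solved by SVD; F is returned up to a scale factor."""
    x, y = m1[:, 0], m1[:, 1]
    xp, yp = m2[:, 0], m2[:, 1]
    Un = np.column_stack([x * xp, y * xp, xp,
                          x * yp, y * yp, yp,
                          x, y, np.ones(len(m1))])
    _, _, Vt = np.linalg.svd(Un)
    F = Vt[-1].reshape(3, 3)          # nullspace vector (smallest singular value)
    if force_rank2:
        # Project F onto the rank-2 manifold by zeroing its smallest singular value
        U, D, Vt2 = np.linalg.svd(F)
        D[2] = 0.0
        F = U @ np.diag(D) @ Vt2
    return F / np.linalg.norm(F)
```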
33
Lecture 4: Reconstruction from two views
Contents
4. Reconstruction from two views
4.1 Shape from X
4.2 Triangulation principle
4.3 Epipolar geometry – Modelling
4.4 Epipolar geometry – Calibration
4.5 Constraints in stereo vision
4.6 Experimental comparison of methods
4.7 Sample: Mobile robot performing 3D mapping
35
Lecture 4: Reconstruction from two views
4.5 Constraints in stereo vision
[Figure: stereo pair observing a 3D point M projected onto m and m'.]
Intrinsics: Optics & Internal Geometry        Extrinsics: Camera Pose
s m = A K M        s' m' = A' K' M
3D Reconstruction requires the intrinsics A, A' and the extrinsics K, K'.
Constraints:
• The Correspondence Problem → F/E matrix
• Stereo Configurations:
  • Calibrated Stereo: Intrinsics and Extrinsics known → Triangulation!
  • Uncalibrated Stereo: Intrinsics and Extrinsics unknown → F matrix
  • Calibrated Monocular: Intrinsics known, Extrinsics unknown → E matrix
  • Uncalibrated Monocular: Intrinsics and Extrinsics unknown → F matrix
36
Lecture 4: Reconstruction from two views
Contents
4. Reconstruction from two views
4.1 Shape from X
4.2 Triangulation principle
4.3 Epipolar geometry – Modelling
4.4 Epipolar geometry – Calibration
4.5 Constraints in stereo vision
4.6 Experimental comparison of methods
4.7 Sample: Mobile robot performing 3D mapping
38
Lecture 4: Reconstruction from two views
4.6 Experimental comparison – methods
Method                                     | Linear | Iterative | Robust | Optimisation    | Rank-2
Seven point (7p)                           | X      |           |        | —               | yes
Eight point (8p)                           | X      |           |        | LS or Eig.      | no
Rank-2 constraint                          | X      |           |        | LS              | yes
Iterative Newton-Raphson                   |        | X         |        | LS              | no
Linear iterative                           |        | X         |        | LS              | no
Non-linear minimization in parameter space |        | X         |        | Eig.            | yes
Gradient technique                         |        | X         |        | LS or Eig.      | no
FNS                                        |        | X         |        | AML             | no
CFNS                                       |        | X         |        | AML             | no
M-Estimator                                |        |           | X      | LS or Eig.      | no / yes
LMedS                                      |        |           | X      | 7p / LS or Eig. | no
RANSAC                                     |        |           | X      | 7p / Eig.       | no
MLESAC                                     |        |           | X      | AML             | no
MAPSAC                                     |        |           | X      | AML             | no
LS: Least-Squares   Eig.: Eigen Analysis   AML: Approximate Maximum Likelihood
39
Lecture 4: Reconstruction from two views
[Figure: synthetic methodology showing the image plane of camera 1 and the image plane of camera 2.]
4.6 Experimental comparison – Methodology
40
Lecture 4: Reconstruction from two views
4.6 Experimental comparison – Synthetic images
[Plot: mean and std. (in pixels) for each method.]
Methods: 1.- 7-Point; 2.- 8-Point with Least-Squares; 3.- 8-Point with Eigen Analysis; 4.- Rank-2 Constraint
Linear methods: good results if the points are well located and there are no outliers.
41
Lecture 4: Reconstruction from two views
4.6 Experimental comparison – Synthetic images
[Plot: mean and std. (in pixels) for each method.]
Methods: 5.- Iterative Linear; 6.- Iterative Newton-Raphson; 7.- Minimization in parameter space; 8.- Gradient using LS; 9.- Gradient using Eigen; 10.- FNS; 11.- CFNS
Iterative methods: can cope with noise but are inefficient in the presence of outliers.
42
Lecture 4: Reconstruction from two views
4.6 Experimental comparison – Synthetic images
[Plot: mean and std. (in pixels) for each method.]
Methods: 12.- M-Estimator using LS; 13.- M-Estimator using Eigen; 14.- M-Estimator proposed by Torr; 15.- LMedS using LS; 16.- LMedS using Eigen; 17.- RANSAC; 18.- MLESAC; 19.- MAPSAC
Robust methods: cope with both noise and outliers.
43
Lecture 4: Reconstruction from two views
4.6 Experimental comparison – Synthetic images
[Plot: computing time of the linear, iterative and robust methods.]
Methods: 1.- 7-Point; 2.- 8-Point with Least-Squares; 3.- 8-Point with Eigen Analysis; 4.- Rank-2 Constraint; 5.- Iterative Linear; 6.- Iterative Newton-Raphson; 7.- Minimization in parameter space; 8.- Gradient using LS; 9.- Gradient using Eigen; 10.- FNS; 11.- CFNS; 12.- M-Estimator using LS; 13.- M-Estimator using Eigen; 14.- M-Estimator proposed by Torr; 15.- LMedS using LS; 16.- LMedS using Eigen; 17.- RANSAC; 18.- MLESAC; 19.- MAPSAC.
44
Lecture 4: Reconstruction from two views
4.6 Experimental comparison – Real images
45
Lecture 4: Reconstruction from two views
[Plots: mean and std. (in pixels) for each method on real images.]
Methods: 1.- 7-Point; 2.- 8-Point with Least-Squares; 3.- 8-Point with Eigen Analysis; 4.- Rank-2 Constraint; 5.- Iterative Linear; 6.- Iterative Newton-Raphson; 7.- Minimization in parameter space; 8.- Gradient using LS; 9.- Gradient using Eigen; 10.- FNS; 11.- CFNS; 12.- M-Estimator using LS; 13.- M-Estimator using Eigen; 14.- M-Estimator proposed by Torr; 15.- LMedS using LS; 16.- LMedS using Eigen; 17.- RANSAC; 18.- MLESAC; 19.- MAPSAC.
4.6 Experimental comparison – Real images
46
Lecture 4: Reconstruction from two views
• Survey of 15 methods of computing F and up to 19 different
implementations
• Description of the estimators from an algorithmic point of view
• Conditions: Gaussian noise, outliers and real images
– Linear methods: Good results if the points are well located and
the correspondence problem previously solved (without outliers)
– Iterative methods: Can cope with noise but inefficient in the
presence of outliers
– Robust methods: Cope with both noise and outliers
• Least-squares is worse than eigen analysis and approximate
maximum likelihood
• Rank-2 matrices are preferred if a good geometry is required
• Better results when data are previously normalized
4.6 Experimental comparison – Conclusions
47
Lecture 4: Reconstruction from two views
Publications
– X. Armangué and J. Salvi. Overall View Regarding Fundamental Matrix
Estimation. Image and Vision Computing, IVC, pp. 205-220, Vol. 21, Issue
2, February 2003.
– J. Salvi. An approach to coded structured light to obtain three dimensional
information. PhD Thesis. University of Girona, 1997. Chapter 3.
– J. Salvi, X. Armangué, J. Pagès. A survey addressing the fundamental
matrix estimation problem. IEEE International Conference on Image
Processing, ICIP 2001, Thessaloniki, Greece, October 2001.
More Information: http://eia.udg.es/~qsalvi/
4.6 Experimental comparison – Conclusions
48
Lecture 4: Reconstruction from two views
Contents
4. Reconstruction from two views
4.1 Shape from X
4.2 Triangulation principle
4.3 Epipolar geometry – Modelling
4.4 Epipolar geometry – Calibration
4.5 Constraints in stereo vision
4.6 Experimental comparison of methods
4.7 Sample: Mobile robot performing 3D mapping
50
Lecture 4: Reconstruction from two views
4.7 Sample: Mobile robot performing 3D mapping
• Building a 3D map of an unknown environment using a stereo camera system
• Localization of the robot in the map
• Providing a new useful sensor for the robot control architecture
[Photo: GRILL mobile robot with a stereo camera system]
51
Lecture 4: Reconstruction from two views
[Diagram: GRILL mobile robot (Pioneer 2) hardware: onboard control computer and onboard vision computer (PC104+), microcontroller, motor encoders and sonar, frame grabbers A and B, cameras A and B, Ethernet link between the two computers, wireless Ethernet and radio video emitter, USB and RS-232 connections; inside and outside stereo vision systems.]
4.7 3D mapping – Robot components
52
Lecture 4: Reconstruction from two views
[Diagram: processing levels from the camera to the description level. Image acquisition (A/D); low-level image processing (filtering, gradients, remove distortion, calibration); high-level image processing (feature extraction, correspondence problem, tracking, 3D information, motion estimation); description level (map building and localization). The blocks are grouped into camera modelling and calibration, stereo vision and reconstruction, and localization and map building.]
4.7 3D mapping – Data flow diagram
53
Lecture 4: Reconstruction from two views
[Diagram: overall data flow. Cameras A and B produce sequences A and B (image flow); 2D image processing produces 2D points (2D points flow); 3D image processing produces 3D points (3D points flow); map building and localization produce the robot position (position flow), the 3D map and the trajectory.]
4.7 3D mapping – Data flow diagram
54
Lecture 4: Reconstruction from two views
[Diagram: detailed data flow. For each camera (A and B): RGB to I → Remove Distortion → Corners. The undistorted images and corner lists feed the Spatial Cross Correlation (between both cameras) and the Temporal Cross Correlation (between consecutive frames of each camera), using the image buffers and image-point buffers. The spatial matches feed the Stereo Reconstruction; the 3D points feed the 3D Tracker and the Outliers Detection, followed by Local Localization, Global Localization and Build 3D Map, producing the 3D map and the trajectory.]
4.7 3D mapping – Data flow diagram
55
Lecture 4: Reconstruction from two views
• Cameras are calibrated
• Both stereo images are
obtained simultaneously
4.7 3D mapping – Input sequence
56
Lecture 4: Reconstruction from two views
4.7 3D mapping – RGB to I
• Description
– Converting a color image
to an intensity image
• Input
– Color image (RGB)
• Output
– Intensity image
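As a small illustration of this step (a sketch, not the robot's actual implementation), the conversion can be done with OpenCV:

```python
import cv2
import numpy as np

# Stand-in colour frames for the images grabbed from cameras A and B
frame_a = np.zeros((480, 640, 3), dtype=np.uint8)
frame_b = np.zeros((480, 640, 3), dtype=np.uint8)

# Colour (BGR in OpenCV) to intensity
gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
```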
57
Lecture 4: Reconstruction from two views
• Description
– Removing distortion of an
image using camera
calibration parameters
• Input
– Distorted image
• Output
– Undistorted image
4.7 3D mapping – Remove Distortion
[Images: intensity images and undistorted images]
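A sketch of the undistortion step with OpenCV's camera model; the intrinsic matrix and distortion coefficients below are placeholders, not the real calibration of the GRILL cameras.

```python
import cv2
import numpy as np

# Placeholder intrinsics and radial/tangential distortion coefficients
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0., 0., 1.]])
dist = np.array([-0.25, 0.10, 0.001, 0.001, 0.0])   # k1, k2, p1, p2, k3

gray = np.zeros((480, 640), dtype=np.uint8)          # stand-in intensity image
undistorted = cv2.undistort(gray, K, dist)
```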
58
Lecture 4: Reconstruction from two views
4.7 3D mapping – Corners
• Description
– Detection of corners using
a variant of Harris corners
detector
• Input
– Undistorted image
• Output
– Corners list
[Images: undistorted images and detected corners]
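The slides mention a variant of the Harris detector; a minimal sketch using OpenCV's Harris-based corner extraction (all parameter values are illustrative):

```python
import cv2
import numpy as np

undistorted = np.zeros((480, 640), dtype=np.uint8)   # stand-in undistorted image

# Harris-based corner detection; returns up to 300 corners as (x, y) positions
corners = cv2.goodFeaturesToTrack(undistorted, maxCorners=300,
                                  qualityLevel=0.01, minDistance=8,
                                  useHarrisDetector=True, k=0.04)
corner_list = [] if corners is None else corners.reshape(-1, 2)
```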
59
Lecture 4: Reconstruction from two views
4.7 3D mapping – Spatial Cross Correlation
• Description
– Spatial cross correlation
using fundamental matrix
obtained from camera
calibration parameters
• Input
– Undistorted image A
– Corners list A
– Undistorted image B
– Corners list B
• Output
– Spatial points list
– Spatial matches list
[Images: detected corners and the resulting points and matches list]
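A sketch of how the corner lists of cameras A and B could be matched: candidates are kept only if they lie close to the epipolar line given by the fundamental matrix from calibration (assumed to satisfy m_Bᵀ F m_A = 0), and the best candidate is chosen by normalized cross correlation over a small window. Function names, window size and thresholds are illustrative.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation between two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

def window(img, x, y, win):
    """Square window centred on (x, y); None if it falls outside the image."""
    x, y = int(round(x)), int(round(y))
    if x - win < 0 or y - win < 0 or x + win >= img.shape[1] or y + win >= img.shape[0]:
        return None
    return img[y - win:y + win + 1, x - win:x + win + 1]

def spatial_match(img_a, img_b, pts_a, pts_b, F, win=7, max_line_dist=1.5, min_ncc=0.8):
    """Match corners of camera A to corners of camera B along epipolar lines."""
    matches = []
    for (xa, ya) in pts_a:
        pa = window(img_a, xa, ya, win)
        if pa is None:
            continue
        l = F @ np.array([xa, ya, 1.0])          # epipolar line of (xa, ya) in image B
        l /= np.hypot(l[0], l[1])                # so that l . p is a distance in pixels
        best, best_score = None, min_ncc
        for (xb, yb) in pts_b:
            if abs(l @ np.array([xb, yb, 1.0])) > max_line_dist:
                continue                         # candidate too far from the epipolar line
            pb = window(img_b, xb, yb, win)
            if pb is None:
                continue
            score = ncc(pa, pb)
            if score > best_score:
                best, best_score = (xb, yb), score
        if best is not None:
            matches.append(((xa, ya), best))
    return matches
```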
60
Lecture 4: Reconstruction from two views
4.7 3D mapping – Temporal Cross Correlation
• Description
– Temporal cross correlation
using small windows
search
• Input
– Previous undistorted
image
– Previous corners list
– Current undistorted image
– Current corners list
• Output
– Temporal points list
– Temporal matches list
[Images: detected corners and the resulting points and matches list]
61
Lecture 4: Reconstruction from two views
4.7 3D mapping – Stereo Reconstruction
• Description
– Stereo reconstruction by
triangulation using camera
calibration parameters
• Input
– Spatial points list
– Spatial matches list
• Output
– 3D points list
Points and matches list
3D points list
62
Lecture 4: Reconstruction from two views
4.7 3D mapping – 3D Tracker
• Description
– Tracking 3D points using
temporal cross correlation
• Input
– 3D points list
– Temporal points list A
– Temporal matches list A
– Temporal point list B
– Temporal matches list B
• Output
– 3D points history
– Points history A
– Matches history A
– Points history B
– Matches history B
[Images: points and matches tracker, and 3D points tracker]
63
Lecture 4: Reconstruction from two views
4.7 3D mapping – Outliers Detection
• Description
– Detection of outliers
comparing distance
between current and
previous 3D points list
• Input
– Odometry position
– Current 3D points list
– Previous 3D points list
• Output
– Outliers list
Current and previous 3D points with outliers
Outlier Example
Current and Previous 3D points without outliers
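A sketch of the outlier test described above, assuming (R_odo, t_odo) expresses the odometry motion that maps previous-frame coordinates into the current frame; the distance threshold is illustrative.

```python
import numpy as np

def detect_outliers(prev_pts, curr_pts, R_odo, t_odo, max_dist=0.10):
    """Flag tracked 3D points whose position, after compensating the robot
    motion given by odometry, differs from the current reconstruction by
    more than max_dist (same units as the points, e.g. metres).

    prev_pts, curr_pts: (n, 3) arrays of the same tracked points in the
    previous and current frames.
    """
    predicted = prev_pts @ R_odo.T + t_odo        # previous points moved into the current frame
    dist = np.linalg.norm(curr_pts - predicted, axis=1)
    return dist > max_dist                        # boolean mask marking the outliers
```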
64
Lecture 4: Reconstruction from two views
4.7 3D mapping – Local Localization
• Description
– Computing the absolute
position from the map by
minimizing distance
between the projection of
3D map points in cameras
and current 2D points and
matches.
• Input
– Odometry position as initial guess
– Previous 3D points list
from the map
– Current 2D points list
– Current 2D matches list
• Output
– Absolute position
Map absolute position
65
Lecture 4: Reconstruction from two views
4.7 3D mapping – Global Localization
• Description
– Computing the trajectory followed by the robot
• Input
– Local position
• Output
– Global position
66
Lecture 4: Reconstruction from two views
4.7 3D mapping – Building 3D Map
• Description
– Building the 3D map from the 3D points whose track history is longer than n frames
• Input
– Global position
– Current 3D points list
– Previous 3D points list
• Output
– 3D map
[Figure: the resulting 3D map]
67
Lecture 4: Reconstruction from two views
4.7 3D mapping – Video

More Related Content

Viewers also liked

Lecture 5 Pattern Projection Techniques
Lecture 5   Pattern Projection TechniquesLecture 5   Pattern Projection Techniques
Lecture 5 Pattern Projection Techniques
Joaquim Salvi
 
Kitchen Occupation Project Presentation
Kitchen Occupation Project PresentationKitchen Occupation Project Presentation
Kitchen Occupation Project Presentation
MattiasTiger
 
Epipolarna - Project Presentation - Tracking
Epipolarna - Project Presentation - TrackingEpipolarna - Project Presentation - Tracking
Epipolarna - Project Presentation - Tracking
MattiasTiger
 
Epipolar Geometri Görüntü Oluşumu
Epipolar Geometri Görüntü OluşumuEpipolar Geometri Görüntü Oluşumu
Epipolar Geometri Görüntü Oluşumu
Ahmet Sancak Şanlı
 
エピポーラ幾何 (Epipolar geometry)
エピポーラ幾何 (Epipolar geometry)エピポーラ幾何 (Epipolar geometry)
エピポーラ幾何 (Epipolar geometry)
Shohei Mori
 
Image-Based Rendering 各手法の直感的理解
Image-Based Rendering各手法の直感的理解Image-Based Rendering各手法の直感的理解
Image-Based Rendering 各手法の直感的理解
Shohei Mori
 
EM 3D reconstruction
EM 3D reconstructionEM 3D reconstruction
EM 3D reconstruction
Fan Zhitao
 
"High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro...
"High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro..."High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro...
"High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro...
Edge AI and Vision Alliance
 
3D Printing
3D Printing3D Printing
3D Printing
Seminar Links
 

Viewers also liked (9)

Lecture 5 Pattern Projection Techniques
Lecture 5   Pattern Projection TechniquesLecture 5   Pattern Projection Techniques
Lecture 5 Pattern Projection Techniques
 
Kitchen Occupation Project Presentation
Kitchen Occupation Project PresentationKitchen Occupation Project Presentation
Kitchen Occupation Project Presentation
 
Epipolarna - Project Presentation - Tracking
Epipolarna - Project Presentation - TrackingEpipolarna - Project Presentation - Tracking
Epipolarna - Project Presentation - Tracking
 
Epipolar Geometri Görüntü Oluşumu
Epipolar Geometri Görüntü OluşumuEpipolar Geometri Görüntü Oluşumu
Epipolar Geometri Görüntü Oluşumu
 
エピポーラ幾何 (Epipolar geometry)
エピポーラ幾何 (Epipolar geometry)エピポーラ幾何 (Epipolar geometry)
エピポーラ幾何 (Epipolar geometry)
 
Image-Based Rendering 各手法の直感的理解
Image-Based Rendering各手法の直感的理解Image-Based Rendering各手法の直感的理解
Image-Based Rendering 各手法の直感的理解
 
EM 3D reconstruction
EM 3D reconstructionEM 3D reconstruction
EM 3D reconstruction
 
"High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro...
"High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro..."High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro...
"High-resolution 3D Reconstruction on a Mobile Processor," a Presentation fro...
 
3D Printing
3D Printing3D Printing
3D Printing
 

Similar to Lecture 4 Reconstruction from Two Views

Computer Vision sfm
Computer Vision sfmComputer Vision sfm
Computer Vision sfm
Wael Badawy
 
Computer Vision Structure from motion
Computer Vision Structure from motionComputer Vision Structure from motion
Computer Vision Structure from motion
Wael Badawy
 
New geometric interpretation and analytic solution for quadrilateral reconstr...
New geometric interpretation and analytic solution for quadrilateral reconstr...New geometric interpretation and analytic solution for quadrilateral reconstr...
New geometric interpretation and analytic solution for quadrilateral reconstr...
Joo-Haeng Lee
 
Object Size Detector - Computer Vision
Object Size Detector - Computer VisionObject Size Detector - Computer Vision
Object Size Detector - Computer Vision
Shyama Bhuvanendran
 
相机模型经典Camera
相机模型经典Camera相机模型经典Camera
相机模型经典Camera
海彦 庞
 
SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)
SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)
SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)
Matthew O'Toole
 
998-isvc16
998-isvc16998-isvc16
Focus set based reconstruction of micro-objects
Focus set based reconstruction of micro-objectsFocus set based reconstruction of micro-objects
Focus set based reconstruction of micro-objects
Jan Wedekind
 
Fisheye Omnidirectional View in Autonomous Driving
Fisheye Omnidirectional View in Autonomous DrivingFisheye Omnidirectional View in Autonomous Driving
Fisheye Omnidirectional View in Autonomous Driving
Yu Huang
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
grssieee
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
grssieee
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
grssieee
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
grssieee
 
Lecture Summary : Camera Projection
Lecture Summary : Camera Projection Lecture Summary : Camera Projection
Lecture Summary : Camera Projection
홍배 김
 
426 Lecture5: AR Registration
426 Lecture5: AR Registration426 Lecture5: AR Registration
426 Lecture5: AR Registration
Mark Billinghurst
 
Microlensing Modelling
Microlensing ModellingMicrolensing Modelling
Microlensing Modelling
Ashna Sharan
 
The flow of baseline estimation using a single omnidirectional camera
The flow of baseline estimation using a single omnidirectional cameraThe flow of baseline estimation using a single omnidirectional camera
The flow of baseline estimation using a single omnidirectional camera
TELKOMNIKA JOURNAL
 
Solving the Pose Ambiguity via a Simple Concentric Circle Constraint
Solving the Pose Ambiguity via a Simple Concentric Circle ConstraintSolving the Pose Ambiguity via a Simple Concentric Circle Constraint
Solving the Pose Ambiguity via a Simple Concentric Circle Constraint
Dr. Amarjeet Singh
 
Poster_Final
Poster_FinalPoster_Final
Poster_Final
Nicholas Chehade
 

Similar to Lecture 4 Reconstruction from Two Views (19)

Computer Vision sfm
Computer Vision sfmComputer Vision sfm
Computer Vision sfm
 
Computer Vision Structure from motion
Computer Vision Structure from motionComputer Vision Structure from motion
Computer Vision Structure from motion
 
New geometric interpretation and analytic solution for quadrilateral reconstr...
New geometric interpretation and analytic solution for quadrilateral reconstr...New geometric interpretation and analytic solution for quadrilateral reconstr...
New geometric interpretation and analytic solution for quadrilateral reconstr...
 
Object Size Detector - Computer Vision
Object Size Detector - Computer VisionObject Size Detector - Computer Vision
Object Size Detector - Computer Vision
 
相机模型经典Camera
相机模型经典Camera相机模型经典Camera
相机模型经典Camera
 
SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)
SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)
SIGGRAPH 2014 Course on Computational Cameras and Displays (part 4)
 
998-isvc16
998-isvc16998-isvc16
998-isvc16
 
Focus set based reconstruction of micro-objects
Focus set based reconstruction of micro-objectsFocus set based reconstruction of micro-objects
Focus set based reconstruction of micro-objects
 
Fisheye Omnidirectional View in Autonomous Driving
Fisheye Omnidirectional View in Autonomous DrivingFisheye Omnidirectional View in Autonomous Driving
Fisheye Omnidirectional View in Autonomous Driving
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
 
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
A SELF-ADJUSTIVE GEOMETRIC CORRECTION METHOD FOR SERIOUSLY OBLIQUE AERO IMAGE...
 
Lecture Summary : Camera Projection
Lecture Summary : Camera Projection Lecture Summary : Camera Projection
Lecture Summary : Camera Projection
 
426 Lecture5: AR Registration
426 Lecture5: AR Registration426 Lecture5: AR Registration
426 Lecture5: AR Registration
 
Microlensing Modelling
Microlensing ModellingMicrolensing Modelling
Microlensing Modelling
 
The flow of baseline estimation using a single omnidirectional camera
The flow of baseline estimation using a single omnidirectional cameraThe flow of baseline estimation using a single omnidirectional camera
The flow of baseline estimation using a single omnidirectional camera
 
Solving the Pose Ambiguity via a Simple Concentric Circle Constraint
Solving the Pose Ambiguity via a Simple Concentric Circle ConstraintSolving the Pose Ambiguity via a Simple Concentric Circle Constraint
Solving the Pose Ambiguity via a Simple Concentric Circle Constraint
 
Poster_Final
Poster_FinalPoster_Final
Poster_Final
 

More from Joaquim Salvi

Presentació Politècnica UdG
Presentació Politècnica UdGPresentació Politècnica UdG
Presentació Politècnica UdG
Joaquim Salvi
 
Lecture 2 Camera Calibration
Lecture 2   Camera CalibrationLecture 2   Camera Calibration
Lecture 2 Camera Calibration
Joaquim Salvi
 
Lecture 1 Rigid Body Transformations
Lecture 1   Rigid Body TransformationsLecture 1   Rigid Body Transformations
Lecture 1 Rigid Body Transformations
Joaquim Salvi
 
Tema 6 Lògica Programable
Tema 6   Lògica ProgramableTema 6   Lògica Programable
Tema 6 Lògica Programable
Joaquim Salvi
 
Tema 5 Sistemes Seqüencials
Tema 5   Sistemes SeqüencialsTema 5   Sistemes Seqüencials
Tema 5 Sistemes Seqüencials
Joaquim Salvi
 
Tema 4 Sistemes Combinacionals
Tema 4   Sistemes CombinacionalsTema 4   Sistemes Combinacionals
Tema 4 Sistemes Combinacionals
Joaquim Salvi
 
Tema 3 Àlgebra de Boole
Tema 3   Àlgebra de BooleTema 3   Àlgebra de Boole
Tema 3 Àlgebra de Boole
Joaquim Salvi
 
Tema 2 Representació de la informació
Tema 2   Representació de la informacióTema 2   Representació de la informació
Tema 2 Representació de la informació
Joaquim Salvi
 
Tema 1 Introducció a l'Estructura i a la Tecnologia de Computadors
Tema 1 Introducció a l'Estructura i a la Tecnologia de ComputadorsTema 1 Introducció a l'Estructura i a la Tecnologia de Computadors
Tema 1 Introducció a l'Estructura i a la Tecnologia de Computadors
Joaquim Salvi
 

More from Joaquim Salvi (9)

Presentació Politècnica UdG
Presentació Politècnica UdGPresentació Politècnica UdG
Presentació Politècnica UdG
 
Lecture 2 Camera Calibration
Lecture 2   Camera CalibrationLecture 2   Camera Calibration
Lecture 2 Camera Calibration
 
Lecture 1 Rigid Body Transformations
Lecture 1   Rigid Body TransformationsLecture 1   Rigid Body Transformations
Lecture 1 Rigid Body Transformations
 
Tema 6 Lògica Programable
Tema 6   Lògica ProgramableTema 6   Lògica Programable
Tema 6 Lògica Programable
 
Tema 5 Sistemes Seqüencials
Tema 5   Sistemes SeqüencialsTema 5   Sistemes Seqüencials
Tema 5 Sistemes Seqüencials
 
Tema 4 Sistemes Combinacionals
Tema 4   Sistemes CombinacionalsTema 4   Sistemes Combinacionals
Tema 4 Sistemes Combinacionals
 
Tema 3 Àlgebra de Boole
Tema 3   Àlgebra de BooleTema 3   Àlgebra de Boole
Tema 3 Àlgebra de Boole
 
Tema 2 Representació de la informació
Tema 2   Representació de la informacióTema 2   Representació de la informació
Tema 2 Representació de la informació
 
Tema 1 Introducció a l'Estructura i a la Tecnologia de Computadors
Tema 1 Introducció a l'Estructura i a la Tecnologia de ComputadorsTema 1 Introducció a l'Estructura i a la Tecnologia de Computadors
Tema 1 Introducció a l'Estructura i a la Tecnologia de Computadors
 

Recently uploaded

South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)
Academy of Science of South Africa
 
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdfANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
Priyankaranawat4
 
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPLAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
RAHUL
 
Natural birth techniques - Mrs.Akanksha Trivedi Rama University
Natural birth techniques - Mrs.Akanksha Trivedi Rama UniversityNatural birth techniques - Mrs.Akanksha Trivedi Rama University
Natural birth techniques - Mrs.Akanksha Trivedi Rama University
Akanksha trivedi rama nursing college kanpur.
 
Walmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdfWalmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdf
TechSoup
 
How to Create a More Engaging and Human Online Learning Experience
How to Create a More Engaging and Human Online Learning Experience How to Create a More Engaging and Human Online Learning Experience
How to Create a More Engaging and Human Online Learning Experience
Wahiba Chair Training & Consulting
 
Life upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for studentLife upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for student
NgcHiNguyn25
 
The History of Stoke Newington Street Names
The History of Stoke Newington Street NamesThe History of Stoke Newington Street Names
The History of Stoke Newington Street Names
History of Stoke Newington
 
Chapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptxChapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptx
Denish Jangid
 
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
GeorgeMilliken2
 
How to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 InventoryHow to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 Inventory
Celine George
 
How to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRMHow to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRM
Celine George
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
Nicholas Montgomery
 
Your Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective UpskillingYour Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective Upskilling
Excellence Foundation for South Sudan
 
คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1
คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1
คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1
สมใจ จันสุกสี
 
Wound healing PPT
Wound healing PPTWound healing PPT
Wound healing PPT
Jyoti Chand
 
How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17
Celine George
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
Nicholas Montgomery
 
How to Build a Module in Odoo 17 Using the Scaffold Method
How to Build a Module in Odoo 17 Using the Scaffold MethodHow to Build a Module in Odoo 17 Using the Scaffold Method
How to Build a Module in Odoo 17 Using the Scaffold Method
Celine George
 
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptxNEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
iammrhaywood
 

Recently uploaded (20)

South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)
 
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdfANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
 
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPLAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
 
Natural birth techniques - Mrs.Akanksha Trivedi Rama University
Natural birth techniques - Mrs.Akanksha Trivedi Rama UniversityNatural birth techniques - Mrs.Akanksha Trivedi Rama University
Natural birth techniques - Mrs.Akanksha Trivedi Rama University
 
Walmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdfWalmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdf
 
How to Create a More Engaging and Human Online Learning Experience
How to Create a More Engaging and Human Online Learning Experience How to Create a More Engaging and Human Online Learning Experience
How to Create a More Engaging and Human Online Learning Experience
 
Life upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for studentLife upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for student
 
The History of Stoke Newington Street Names
The History of Stoke Newington Street NamesThe History of Stoke Newington Street Names
The History of Stoke Newington Street Names
 
Chapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptxChapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptx
 
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
 
How to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 InventoryHow to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 Inventory
 
How to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRMHow to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRM
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
 
Your Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective UpskillingYour Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective Upskilling
 
คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1
คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1
คำศัพท์ คำพื้นฐานการอ่าน ภาษาอังกฤษ ระดับชั้น ม.1
 
Wound healing PPT
Wound healing PPTWound healing PPT
Wound healing PPT
 
How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
 
How to Build a Module in Odoo 17 Using the Scaffold Method
How to Build a Module in Odoo 17 Using the Scaffold MethodHow to Build a Module in Odoo 17 Using the Scaffold Method
How to Build a Module in Odoo 17 Using the Scaffold Method
 
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptxNEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
 

Lecture 4 Reconstruction from Two Views

  • 1. Lecture 4 Reconstruction from two views Joaquim Salvi Universitat de Girona Visual Perception
  • 2. 2 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 3. 3 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 4. 4 Lecture 4: Reconstruction from two views 4.1 Shape from X Techniques based on: – Modifying the intrinsic camera parameters i.e. Depth from Focus/Defocus and Depth from Zooming – Considering an additional source of light onto the scene i.e. Shape from Structured Light and Shape from Photometric Stereo – Considering additional surface information i.e. Shape from Shading, Shape from Texture and Shape from Geometric Constraints – Multiple views i.e. Shape from Stereo and Shape from Motion Shape from Focus/Defocus
  • 5. 5 Lecture 4: Reconstruction from two views 4.1 Shape from X Techniques based on: – Modifying the intrinsic camera parameters i.e. Depth from Focus/Defocus and Depth from Zooming – Considering an additional source of light onto the scene i.e. Shape from Structured Light and Shape from Photometric Stereo – Considering additional surface information i.e. Shape from Shading, Shape from Texture and Shape from Geometric Constraints – Multiple views i.e. Shape from Stereo and Shape from Motion Shape from Structured Light
  • 6. 6 Lecture 4: Reconstruction from two views 4.1 Shape from X Techniques based on: – Modifying the intrinsic camera parameters i.e. Depth from Focus/Defocus and Depth from Zooming – Considering an additional source of light onto the scene i.e. Shape from Structured Light and Shape from Photometric Stereo – Considering additional surface information i.e. Shape from Shading, Shape from Texture and Shape from Geometric Constraints – Multiple views i.e. Shape from Stereo and Shape from Motion Shape from Shading
  • 7. 7 Lecture 4: Reconstruction from two views 4.1 Shape from X Techniques based on: – Modifying the intrinsic camera parameters i.e. Depth from Focus/Defocus and Depth from Zooming – Considering an additional source of light onto the scene i.e. Shape from Structured Light and Shape from Photometric Stereo – Considering additional surface information i.e. Shape from Shading, Shape from Texture and Shape from Geometric Constraints – Multiple views i.e. Shape from Stereo and Shape from Motion Shape from Stereo
  • 8. 8 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 9. 9 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 10. 10 Lecture 4: Reconstruction from two views 4.2 Triangulation principle World coordinate system WZ WY WX WO  W Camera’ coordinate system 'CY 'CX 'CZ 'CO 'C 'IY 'IX 'IO  'I 'RY 'RX  'R 'RO Camera coordinate system CY CX CZ CO  C IY IXIO  I RY RX RO  R
  • 11. 11 Lecture 4: Reconstruction from two views 4.2 Triangulation principle [Figure: the optical rays through Pu and P'u, with directions u and v, intersect at the world point Pw.] Pw = Pu + m u and Pw = P'u + m' v. Steps: 1 - set Pu + m u = P'u + m' v; 2 - expand to x, y, z; 3 - solve for m and m'; 4 - compute Pw.
  • 12. 12 Lecture 4: Reconstruction from two views 4.2 Triangulation principle [Figure: with noise the two rays u and v are skew, so the 3D object point Pw is estimated by the reconstructed point P̂w between the closest points Pr and Ps of the two rays.] Pw = Pu + m u, Pw = P'u + m' v.
  • 13. 13 Lecture 4: Reconstruction from two views 4.2 Triangulation principle [Figure: the two rays p and q from the camera centres through the 2D image points, with Pa and Pb the closest points between them.] Pa = P1 + mua (P2 - P1), Pb = P3 + mub (P4 - P3). Two different ways to find mua and mub (see http://astronomy.swin.edu.au/~pbourke/geometry/lineline3d/): (1) Minimize the distance between the points: min || Pb - Pa ||^2 = min || P1 + mua (P2 - P1) - P3 - mub (P4 - P3) ||^2, finding mua and mub once expanded to x, y and z. (2) Use the dot products (Pa - Pb)^T (P2 - P1) = 0 and (Pa - Pb)^T (P4 - P3) = 0, because the segment Pa-Pb is perpendicular to both lines; again find mua and mub once expanded to Pa, Pb and x, y, z. A code sketch follows.
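A minimal NumPy sketch of the second (dot-product) approach above, assuming the two rays are given by the point pairs P1, P2 and P3, P4 as on the slide; the reconstructed point is taken as the midpoint of Pa and Pb.

```python
import numpy as np

def closest_points_between_rays(P1, P2, P3, P4):
    """Closest points Pa on line P1->P2 and Pb on line P3->P4 (skew rays).

    Imposes (Pa - Pb).(P2 - P1) = 0 and (Pa - Pb).(P4 - P3) = 0 and solves
    for mua, mub in Pa = P1 + mua*(P2 - P1), Pb = P3 + mub*(P4 - P3).
    """
    u, v = P2 - P1, P4 - P3
    w = P1 - P3
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w, v @ w
    denom = a * c - b * b            # ~0 when the two rays are parallel
    mua = (b * e - c * d) / denom
    mub = (a * e - b * d) / denom
    Pa, Pb = P1 + mua * u, P3 + mub * v
    return Pa, Pb, 0.5 * (Pa + Pb)   # midpoint used as the reconstructed point
```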
  • 14. 14 Lecture 4: Reconstruction from two views 4.2 Triangulation principle In practice we can use least-squares. With the 3x4 projection matrices A (camera 1) and B (camera 2): s [u v 1]^T = A [X Y Z 1]^T and s' [u' v' 1]^T = B [X Y Z 1]^T. Eliminating the scale factors s and s' gives a linear system Q X = C in the unknowns (X, Y, Z), with two rows per camera: (A11 - u A31) X + (A12 - u A32) Y + (A13 - u A33) Z = u A34 - A14, (A21 - v A31) X + (A22 - v A32) Y + (A23 - v A33) Z = v A34 - A24, plus the analogous rows built from B, u', v'. The least-squares solution is X = (Q^T Q)^-1 Q^T C. Add additional rows if we have additional views of the same point. A code sketch follows.
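A minimal sketch of this least-squares triangulation, assuming A and B are the two 3x4 projection matrices named on the slide (the function and variable names are mine); two extra rows per additional view could be appended to Q and C in the same way.

```python
import numpy as np

def triangulate_linear(A, B, uv1, uv2):
    """Least-squares triangulation: two rows of Q X = C per view.

    A, B   : 3x4 projection matrices of camera 1 and camera 2.
    uv1/2  : (u, v) pixel coordinates of the same point in each image.
    """
    rows, rhs = [], []
    for P, (u, v) in ((A, uv1), (B, uv2)):
        rows.append([P[0, 0] - u * P[2, 0], P[0, 1] - u * P[2, 1], P[0, 2] - u * P[2, 2]])
        rhs.append(u * P[2, 3] - P[0, 3])
        rows.append([P[1, 0] - v * P[2, 0], P[1, 1] - v * P[2, 1], P[1, 2] - v * P[2, 2]])
        rhs.append(v * P[2, 3] - P[1, 3])
    Q = np.asarray(rows)
    C = np.asarray(rhs)
    X, *_ = np.linalg.lstsq(Q, C, rcond=None)   # (Q^T Q)^-1 Q^T C
    return X                                     # (X, Y, Z) in world coordinates
```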
  • 15. 15 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 17. 17 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling [Figure: cameras {C} and {C'} related by CKC', image planes I and I', 3D point M projected to m and m', epipoles e and e', epipolar lines lm' and l'm.] • Focal points, epipoles and epipolar lines • e is defined by OC' in {I}, e' is defined by OC in {I'} • m defines an epipolar line in {I'}; m' defines an epipolar line in {I} • All epipolar lines intersect at the epipole
  • 18. 18 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling [Figure: epipolar geometry of camera 1 and camera 2 on a real image pair, showing the epipoles, the epipolar lines and corresponding points in two zoomed areas (Area 1 and Area 2).]
  • 19. 19 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling [Figure: another image pair with the epipoles and epipolar lines overlaid.]
  • 20. 20 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling • The epipolar geometry concerns the problem of computing the plane Π • A plane is defined by the cross product between two vectors • M is unknown, m and m' are known • {W} is located at {C} or {C'} and Π can be computed at {C} or {C'} → 4 solutions
  • 21. 21 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling [Same figure, now with the epipolar lines lm' and l'm labelled.] • The epipolar geometry concerns the problem of computing the plane Π • A plane is defined by the cross product between two vectors • M is unknown, m and m' are known • {W} is located at {C} or {C'} and Π can be computed at {C} or {C'} → 4 solutions
  • 22. 22 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling • Assume {W} at {C}. With CKC' = [R t; 0 1], a point P expressed in {C} is expressed in {C'} as P' = R P + t, so the two (metric) projection matrices are [I | 0] and [R | t]. The epipole e' is the projection of OC into {I'}: e' = [R | t] (0, 0, 0, 1)^T = t. The epipole e is the projection of OC' = -R^T t into {I}: e = [I | 0] (-R^T t, 1)^T = -R^T t.
  • 23. 23 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling • Assume {W} at {C}. Since the epipolar lines are contained in the plane Π, we can define each line by a cross product of two vectors, obtaining the vector orthogonal to the line: l'm = e' × P' = t × (R m + t) = [t]x R m, and lm = e × P ∝ (R^T t) × (R^T m' - R^T t) = R^T (t × m') = R^T [t]x m' (up to scale).
  • 24. 24 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling • The fundamental matrix is defined by the inner product of a point with its epipolar line: since the point lies on the line, they are orthogonal and their inner product (cosine) is 0. With l'm = [t]x R m and lm = R^T [t]x m': m'^T l'm = 0 ⇒ m'^T [t]x R m = 0, and m^T lm = 0 ⇒ m^T R^T [t]x m' = 0.
  • 25. 25 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling • Now we consider the intrinsics: points in pixels instead of metric coordinates. With m̃ = A m and m̃' = A' m' (so m = A^-1 m̃ and m' = A'^-1 m̃', where A = [αu 0 u0; 0 αv v0; 0 0 1]): 0 = m'^T [t]x R m = m̃'^T A'^-T [t]x R A^-1 m̃ and 0 = m^T R^T [t]x m' = m̃^T A^-T R^T [t]x A'^-1 m̃' (using (AB)^T = B^T A^T and (A^-1)^T = (A^T)^-1). Hence F = A'^-T [t]x R A^-1 with m̃'^T F m̃ = 0, and F' = A^-T R^T [t]x A'^-1 with m̃^T F' m̃' = 0.
  • 26. 26 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling • F and F' are related by a transpose: F' = F^T and F = F'^T (up to scale). Demonstration: F'^T = (A^-T R^T [t]x A'^-1)^T = A'^-T [t]x^T R A^-1 = -A'^-T [t]x R A^-1 ∝ F, since [t]x is skew-symmetric and F is only defined up to a scale factor. The same derivation can be made assuming the origin at {C'}, obtaining two more fundamental matrices that are also equivalent to F and F'.
  • 27. 27 Lecture 4: Reconstruction from two views 4.3 Epipolar Geometry – Modelling • The essential matrix is the calibrated case of the fundamental matrix: the intrinsic parameters A and A' are known, so from F = A'^-T [t]x R A^-1 and F' = A^-T R^T [t]x A'^-1 the problem is reduced to estimating E = [t]x R or E' = R^T [t]x. • Monocular stereo (a single moving camera) is a simplified version of F where A = A', reducing the complexity of computing F: F = A^-T [t]x R A^-1, F' = A^-T R^T [t]x A^-1. A code sketch of these relations follows.
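A small NumPy sketch of these relations, assuming R, t (the extrinsics between the two cameras) and the intrinsic matrices A, A' are known; it simply evaluates E = [t]x R and F = A'^-T [t]x R A^-1 (the helper names are mine).

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]_x such that skew(t) @ m == np.cross(t, m)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential_from_rt(R, t):
    """E = [t]_x R  (calibrated case)."""
    return skew(t) @ R

def fundamental_from_calibration(R, t, A, A_prime):
    """F = A'^(-T) [t]_x R A^(-1), so that m~'^T F m~ = 0 in pixel coordinates."""
    return np.linalg.inv(A_prime).T @ skew(t) @ R @ np.linalg.inv(A)
```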
  • 28. 28 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 30. 30 Lecture 4: Reconstruction from two views 4.4 Epipolar Geometry – Calibration The Eight Point Method. The epipolar geometry is defined as m̃'^T F m̃ = 0, i.e. [x'i y'i 1] F [xi yi 1]^T = 0 for each correspondence i. Operating, we obtain Un f = 0 with f = (F11, F12, F13, F21, F22, F23, F31, F32, F33)^T, ui = (xi x'i, yi x'i, x'i, xi y'i, yi y'i, y'i, xi, yi, 1)^T and Un = (u1, u2, ..., un)^T.
  • 31. 31 Lecture 4: Reconstruction from two views 4.4 Epipolar Geometry – Calibration The Eight Point Method with Least Squares. The trivial solution of Un f = 0 is f = 0, which is not wanted. F is defined up to a scale factor, so we can fix one of its components to 1; let's fix F33 = 1. Then f = (F11, F12, F13, F21, F22, F23, F31, F32)^T, ui = (xi x'i, yi x'i, x'i, xi y'i, yi y'i, y'i, xi, yi)^T, and the system becomes Un f = -1n (a vector of -1s), whose least-squares solution is f = (Un^T Un)^-1 Un^T (-1n).
  • 32. 32 Lecture 4: Reconstruction from two views 4.4 Epipolar Geometry – Calibration The Eight Point Method with Eigen Analysis. Again, the trivial solution f = 0 of Un f = 0 is not wanted. Since F = A'^-T [t]x R A^-1 and [t]x = [0 -tz ty; tz 0 -tx; -ty tx 0] is rank-2, F has to be rank-2. Any system of equations Un f = 0 can be solved by SVD so that f lies in the nullspace of Un = U D V^T: [U,D,V] = svd(Un). Hence f corresponds to a multiple of the column of V that belongs to the singular value of D equal to 0 (in practice, the smallest one). Note that f is only known up to a scaling factor. A code sketch follows.
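A minimal NumPy sketch of this SVD-based eight-point estimate (without the data normalization that the conclusions in 4.6 recommend), assuming m1 and m2 are matched pixel coordinates satisfying m̃'^T F m̃ = 0.

```python
import numpy as np

def eight_point(m1, m2):
    """Eight-point estimate of F from n >= 8 correspondences (in pixels).

    m1, m2 : (n, 2) arrays of matched points so that m2'^T F m1 = 0.
    Solves Un f = 0 by SVD and then enforces the rank-2 constraint on F.
    """
    x, y = m1[:, 0], m1[:, 1]
    xp, yp = m2[:, 0], m2[:, 1]
    # one row u_i per correspondence, ordered as on the slide
    Un = np.column_stack([x * xp, y * xp, xp, x * yp, y * yp, yp, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(Un)
    F = Vt[-1].reshape(3, 3)          # right singular vector of the smallest singular value
    # rank-2 constraint: zero out the smallest singular value of F
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    F = U @ np.diag(S) @ Vt
    return F / np.linalg.norm(F)      # F is only defined up to scale
```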
  • 33. 33 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 35. 35 Lecture 4: Reconstruction from two views 4.5 Constraints in stereo vision [Figure: the 3D point M projected into both images, s m = A K M and s' m' = A' K' M.] Intrinsics (optics and internal geometry): A, A'. Extrinsics (camera pose): K, K'. Constraints: • The correspondence problem → F/E matrix • Stereo configurations: • Calibrated stereo: intrinsics and extrinsics known → triangulation! • Uncalibrated stereo: intrinsics and extrinsics unknown → F matrix • Calibrated monocular: intrinsics known, extrinsics unknown → E matrix (see the sketch below) • Uncalibrated monocular: intrinsics and extrinsics unknown → F matrix
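For the calibrated-monocular configuration, a one-line sketch (an assumption based on the relation F = A'^-T E A^-1 from section 4.3, not a formula stated on this slide) recovering E from an estimated F and the known intrinsics:

```python
import numpy as np

def essential_from_fundamental(F, A, A_prime):
    """Calibrated monocular case: from F = A'^(-T) E A^(-1) it follows E = A'^T F A."""
    return A_prime.T @ F @ A
```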
  • 36. 36 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 38. 38 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – methods (optimisation used and whether the rank-2 constraint is enforced; LS: Least-Squares, Eig: Eigen Analysis, AML: Approximate Maximum Likelihood)
Linear methods: Seven point (7p) — optimisation: —, rank-2: yes; Eight point (8p) — LS or Eig., rank-2: no; Rank-2 constraint — LS, rank-2: yes.
Iterative methods: Iterative Newton-Raphson — LS, rank-2: no; Linear iterative — LS, rank-2: no; Non-linear minimization in parameter space — Eig., rank-2: yes; Gradient technique — LS or Eig., rank-2: no; FNS — AML, rank-2: no; CFNS — AML, rank-2: no.
Robust methods: M-Estimator — LS or Eig., rank-2: no / yes; LMedS — 7p / LS or Eig., rank-2: no; RANSAC — 7p / Eig., rank-2: no; MLESAC — AML, rank-2: no; MAPSAC — AML, rank-2: no.
  • 39. 39 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – Methodology [Figure: image plane of camera 1 and image plane of camera 2.]
  • 40. 40 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – Synthetic images [Plots: mean and std. of the residual, in pixels.] Methods: 1.- 7-Point; 2.- 8-Point with Least-Squares; 3.- 8-Point with Eigen Analysis; 4.- Rank-2 Constraint. Linear methods: good results if the points are well located and there are no outliers.
  • 41. 41 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – Synthetic images [Plots: mean and std. of the residual, in pixels.] Methods: 5.- Iterative Linear; 6.- Iterative Newton-Raphson; 7.- Minimization in parameter space; 8.- Gradient using LS; 9.- Gradient using Eigen; 10.- FNS; 11.- CFNS. Iterative methods: can cope with noise but are inefficient in the presence of outliers.
  • 42. 42 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – Synthetic images [Plots: mean and std. of the residual, in pixels.] Methods: 12.- M-Estimator using LS; 13.- M-Estimator using Eigen; 14.- M-Estimator proposed by Torr; 15.- LMedS using LS; 16.- LMedS using Eigen; 17.- RANSAC; 18.- MLESAC; 19.- MAPSAC. Robust methods: cope with both noise and outliers; a minimal RANSAC sketch follows.
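A minimal RANSAC sketch in the spirit of these robust methods (not the exact implementation compared here): it reuses the eight_point helper sketched in section 4.4 and scores each hypothesis with the symmetric point-to-epipolar-line distance; iteration count and inlier threshold are illustrative assumptions.

```python
import numpy as np

def epipolar_distance(F, m1, m2):
    """Symmetric point-to-epipolar-line distance (in pixels) for each match."""
    ones = np.ones((len(m1), 1))
    p1 = np.hstack([m1, ones])
    p2 = np.hstack([m2, ones])
    l2 = p1 @ F.T                    # epipolar lines in image 2 (l' = F m)
    l1 = p2 @ F                      # epipolar lines in image 1 (l = F^T m')
    d2 = np.abs(np.sum(l2 * p2, axis=1)) / np.hypot(l2[:, 0], l2[:, 1])
    d1 = np.abs(np.sum(l1 * p1, axis=1)) / np.hypot(l1[:, 0], l1[:, 1])
    return 0.5 * (d1 + d2)

def ransac_fundamental(m1, m2, n_iter=500, thresh=1.0, rng=np.random.default_rng(0)):
    """RANSAC over minimal 8-point samples; refits F on the best inlier set."""
    best_inliers = np.zeros(len(m1), dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(len(m1), 8, replace=False)
        F = eight_point(m1[idx], m2[idx])            # solver sketched in section 4.4
        inliers = epipolar_distance(F, m1, m2) < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return eight_point(m1[best_inliers], m2[best_inliers]), best_inliers
```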
  • 43. 43 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – Synthetic images [Plot: computing time of the linear, iterative and robust methods.] 1.- 7-Point; 2.- 8-Point with Least-Squares; 3.- 8-Point with Eigen Analysis; 4.- Rank-2 Constraint; 5.- Iterative Linear; 6.- Iterative Newton-Raphson; 7.- Minimization in parameter space; 8.- Gradient using LS; 9.- Gradient using Eigen; 10.- FNS; 11.- CFNS; 12.- M-Estimator using LS; 13.- M-Estimator using Eigen; 14.- M-Estimator proposed by Torr; 15.- LMedS using LS; 16.- LMedS using Eigen; 17.- RANSAC; 18.- MLESAC; 19.- MAPSAC.
  • 44. 44 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – Real images
  • 45. 45 Lecture 4: Reconstruction from two views 4.6 Experimental comparison – Real images [Plots: mean and std. of the residual, in pixels.] Methods: 1.- 7-Point; 2.- 8-Point with Least-Squares; 3.- 8-Point with Eigen Analysis; 4.- Rank-2 Constraint; 5.- Iterative Linear; 6.- Iterative Newton-Raphson; 7.- Minimization in parameter space; 8.- Gradient using LS; 9.- Gradient using Eigen; 10.- FNS; 11.- CFNS; 12.- M-Estimator using LS; 13.- M-Estimator using Eigen; 14.- M-Estimator proposed by Torr; 15.- LMedS using LS; 16.- LMedS using Eigen; 17.- RANSAC; 18.- MLESAC; 19.- MAPSAC.
  • 46. 46 Lecture 4: Reconstruction from two views • Survey of 15 methods of computing F and up to 19 different implementations • Description of the estimators from an algorithmic point of view • Conditions: Gaussian noise, outliers and real images – Linear methods: Good results if the points are well located and the correspondence problem previously solved (without outliers) – Iterative methods: Can cope with noise but inefficient in the presence of outliers – Robust methods: Cope with both noise and outliers • Least-squares is worse than eigen analysis and approximate maximum likelihood • Rank-2 matrices are preferred if a good geometry is required • Better results when data are previously normalized 4.6 Experimental comparison – Conclusions
  • 47. 47 Lecture 4: Reconstruction from two views Publications – X. Armangué and J. Salvi. Overall View Regarding Fundamental Matrix Estimation. Image and Vision Computing, IVC, pp. 205-220, Vol. 21, Issue 2, February 2003. – J. Salvi. An approach to coded structured light to obtain three dimensional information. PhD Thesis. University of Girona, 1997. Chapter 3. – J. Salvi, X. Armangué, J. Pagès. A survey addressing the fundamental matrix estimation problem. IEEE International Conference on Image Processing, ICIP 2001, Thessaloniki, Greece, October 2001. More Information: http://eia.udg.es/~qsalvi/ 4.6 Experimental comparison – Conclusions
  • 48. 48 Lecture 4: Reconstruction from two views Contents 4. Reconstruction from two views 4.1 Shape from X 4.2 Triangulation principle 4.3 Epipolar geometry – Modelling 4.4 Epipolar geometry – Calibration 4.5 Constraints in stereo vision 4.6 Experimental comparison of methods 4.7 Sample: Mobile robot performing 3D mapping
  • 50. 50 Lecture 4: Reconstruction from two views 4.7 Sample: Mobile robot performing 3D mapping • Building a 3D map from an unknown environment using a stereo camera system • Localization of the robot in the map • Providing a new useful sensor for the robot control architecture GRILL Mobile robot with a stereo camera system
  • 51. 51 Lecture 4: Reconstruction from two views 4.7 3D mapping – Robot components [Diagram: GRILL mobile robot (Pioneer 2). Control system: onboard control computer (PC104+), microcontroller, motor encoders, sonar. Vision system: onboard vision computer (PC104+), frame grabbers A and B, cameras A and B, radio video emitter. The two computers use Ethernet cards on the PCI bus, a wireless Ethernet link, USB and RS-232 connections. Inside and outside views of the stereo vision system.]
  • 52. 52 Lecture 4: Reconstruction from two views 4.7 3D mapping – Data flow diagram [Diagram: image acquisition (camera, A/D) → low-level image processing (gradients, filtering, calibration, remove distortion, feature extraction) → high-level image processing (correspondence problem, tracking, motion estimation, 3D information) → description level (map building, localization); grouped into camera modelling and calibration, stereo vision and reconstruction, and localization and map building.]
  • 53. 53 Lecture 4: Reconstruction from two views 4.7 3D mapping – Data flow diagram [Diagram: cameras A and B produce image sequences (image flow) → 2D image processing produces 2D points (2D points flow) → 3D image processing produces 3D points (3D points flow) → map building and localization produces the robot position (position flow), the 3D map and the trajectory.]
  • 54. 54 Lecture 4: Reconstruction from two views 4.7 3D mapping – Data flow diagram [Diagram: for each camera (A and B): RGB to I → Remove Distortion → Corners; the corners feed a Spatial Cross Correlation between A and B and a Temporal Cross Correlation against the image and image-points buffers; the spatial matches feed Stereo Reconstruction → 3D Tracker → Outliers Detection → Local Localization → Global Localization → Build 3D Map, producing the 3D map and the trajectory.]
  • 55. 55 Lecture 4: Reconstruction from two views 4.7 3D mapping – Input sequence • Cameras are calibrated • Both stereo images are obtained simultaneously
  • 56. 56 Lecture 4: Reconstruction from two views 4.7 3D mapping – RGB to I • Description – Converting a color image to an intensity image • Input – Color image (RGB) • Output – Intensity image
  • 57. 57 Lecture 4: Reconstruction from two views 4.7 3D mapping – Remove Distortion • Description – Removing the distortion of an image using the camera calibration parameters • Input – Distorted image • Output – Undistorted image [Images: intensity images and undistorted images.] A code sketch of this preprocessing follows.
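A minimal OpenCV sketch covering this step and the previous one (color-to-intensity conversion followed by undistortion), assuming camera_matrix and dist_coeffs come from the camera calibration step (the names are mine):

```python
import cv2

def preprocess(frame_bgr, camera_matrix, dist_coeffs):
    """RGB to I (slide 56) followed by Remove Distortion (slide 57)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)       # intensity image
    return cv2.undistort(gray, camera_matrix, dist_coeffs)   # undistorted image
```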
  • 58. 58 Lecture 4: Reconstruction from two views 4.7 3D mapping – Corners • Description – Detection of corners using a variant of the Harris corner detector • Input – Undistorted image • Output – Corners list [Images: undistorted images with the detected corners overlaid.]
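The slide only says a Harris variant is used; as a stand-in, a short OpenCV sketch using goodFeaturesToTrack with the Harris response (the parameter values are illustrative assumptions):

```python
import cv2
import numpy as np

def detect_corners(undistorted_gray, max_corners=300):
    """Return an (n, 2) array of corner coordinates (x, y)."""
    corners = cv2.goodFeaturesToTrack(undistorted_gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=8,
                                      useHarrisDetector=True, k=0.04)
    if corners is None:
        return np.empty((0, 2))
    return corners.reshape(-1, 2)
```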
  • 59. 59 Lecture 4: Reconstruction from two views 4.7 3D mapping – Spatial Cross Correlation • Description – Spatial cross correlation using the fundamental matrix obtained from the camera calibration parameters • Input – Undistorted image A – Corners list A – Undistorted image B – Corners list B • Output – Spatial points list – Spatial matches list [Images: detected corners and the resulting points and matches lists.] A code sketch follows.
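A minimal sketch of an epipolar-constrained correlation matcher in the spirit of this step: for each corner of camera A, only corners of camera B near its epipolar line l' = F m are scored with normalized cross-correlation. Window size, band width and NCC threshold are illustrative assumptions, and there is no border handling.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9))

def match_spatial(img_a, img_b, corners_a, corners_b, F, win=7, band=2.0, min_ncc=0.8):
    """Correlate corners of A only with corners of B within `band` pixels of the epipolar line."""
    half = win // 2
    pb = np.hstack([corners_b, np.ones((len(corners_b), 1))])
    matches = []
    for xa, ya in corners_a:
        l = F @ np.array([xa, ya, 1.0])                 # epipolar line in image B
        dist = np.abs(pb @ l) / np.hypot(l[0], l[1])    # corner-to-line distances
        pa = img_a[int(ya) - half:int(ya) + half + 1,
                   int(xa) - half:int(xa) + half + 1].astype(float)
        if pa.shape != (win, win):                      # too close to the border
            continue
        best_score, best_pt = min_ncc, None
        for xb, yb in corners_b[dist < band]:
            qb = img_b[int(yb) - half:int(yb) + half + 1,
                       int(xb) - half:int(xb) + half + 1].astype(float)
            if qb.shape == (win, win):
                score = ncc(pa, qb)
                if score > best_score:
                    best_score, best_pt = score, (float(xb), float(yb))
        if best_pt is not None:
            matches.append(((float(xa), float(ya)), best_pt))
    return matches
```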
  • 60. 60 Lecture 4: Reconstruction from two views 4.7 3D mapping – Temporal Cross Correlation • Description – Temporal cross correlation using a small search window • Input – Previous undistorted image – Previous corners list – Current undistorted image – Current corners list • Output – Temporal points list – Temporal matches list [Images: detected corners and the resulting points and matches lists.]
  • 61. 61 Lecture 4: Reconstruction from two views 4.7 3D mapping – Stereo Reconstruction • Description – Stereo reconstruction by triangulation using the camera calibration parameters • Input – Spatial points list – Spatial matches list • Output – 3D points list [Images: points and matches list and the reconstructed 3D points list.] A code sketch follows.
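A short sketch tying this step back to section 4.2: it builds the 3x4 projection matrices from the calibrated intrinsics and extrinsics (A1, K1, A2, K2 are my names, not the slide's) and reuses the triangulate_linear sketch from the triangulation section.

```python
import numpy as np

def reconstruct(A1, K1, A2, K2, spatial_matches):
    """Triangulate every spatial match (uv1, uv2) into a 3D point.

    A1, A2 : 3x3 intrinsic matrices; K1, K2 : 3x4 extrinsic matrices [R | t].
    """
    P1, P2 = A1 @ K1, A2 @ K2                     # 3x4 projection matrices
    # triangulate_linear: least-squares sketch from section 4.2
    return np.array([triangulate_linear(P1, P2, uv1, uv2)
                     for uv1, uv2 in spatial_matches])
```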
  • 62. 62 Lecture 4: Reconstruction from two views 4.7 3D mapping – 3D Tracker • Description – Tracking 3D points using the temporal cross correlation • Input – 3D points list – Temporal points list A – Temporal matches list A – Temporal points list B – Temporal matches list B • Output – 3D points history – Points history A – Matches history A – Points history B – Matches history B [Images: points and matches tracker, 3D points tracker.]
  • 63. 63 Lecture 4: Reconstruction from two views 4.7 3D mapping – Outliers Detection • Description – Detection of outliers by comparing the distance between the current and previous 3D points lists • Input – Odometry position – Current 3D points list – Previous 3D points list • Output – Outliers list [Images: current and previous 3D points with outliers (outlier example) and without outliers.] A code sketch follows.
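The slide only states that the current and previous 3D point lists are compared using the odometry position; a minimal sketch under the assumption that the odometry motion (R, t) is used to predict where static points should now be, flagging points that moved too far (the distance threshold is an illustrative assumption):

```python
import numpy as np

def detect_outliers(prev_pts, curr_pts, odometry_motion, max_dist=0.10):
    """Flag tracked 3D points that moved more than max_dist (metres) after
    compensating the robot motion predicted by odometry."""
    R, t = odometry_motion                    # robot motion between the two frames
    predicted = prev_pts @ R.T + t            # where static points should now be
    dist = np.linalg.norm(curr_pts - predicted, axis=1)
    return dist > max_dist                    # boolean mask of outliers
```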
  • 64. 64 Lecture 4: Reconstruction from two views 4.7 3D mapping – Local Localization • Description – Computing the absolute position with respect to the map by minimizing the distance between the projections of the 3D map points into the cameras and the current 2D points and matches • Input – Odometry position as initial guess – Previous 3D points list from the map – Current 2D points list – Current 2D matches list • Output – Absolute position [Image: absolute position in the map.] A code sketch follows.
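A minimal sketch of this reprojection-error minimization, assuming the pose is parametrised as a Rodrigues rotation vector plus a translation and refined with SciPy's least_squares from the odometry guess (the parametrisation and all names are assumptions, not the slide's exact formulation):

```python
import cv2
import numpy as np
from scipy.optimize import least_squares

def project(A, R, t, X):
    """Project 3D map points X (n, 3) with intrinsics A and pose (R, t) to pixels."""
    uvw = (X @ R.T + t) @ A.T
    return uvw[:, :2] / uvw[:, 2:3]

def localize(A, map_points_3d, observed_2d, pose0):
    """Refine the 6-DOF pose so projected map points match the observed 2D points."""
    def residuals(p):
        R, _ = cv2.Rodrigues(p[:3].reshape(3, 1))   # rotation vector -> matrix
        return (project(A, R, p[3:], map_points_3d) - observed_2d).ravel()
    return least_squares(residuals, pose0).x        # pose0: odometry as initial guess
```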
  • 65. 65 Lecture 4: Reconstruction from two views 4.7 3D mapping – Global Localization • Description – Computing the trajectory followed by the robot • Input – Local position • Output – Global position
  • 66. 66 Lecture 4: Reconstruction from two views 4.7 3D mapping – Building 3D Map • Description – Building the 3D map from the 3D points whose tracking history is longer than n frames • Input – Global position – Current 3D points list – Previous 3D points list • Output – 3D map [Image: the resulting 3D map.]
  • 67. 67 Lecture 4: Reconstruction from two views 4.7 3D mapping – Video