Lidar-Stereo Calibration for Self-Driving Vehicles
1. Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setups
Carlos Guindel, Jorge Beltrán, David Martín and Fernando García
Intelligent Systems Laboratory · Universidad Carlos III de Madrid
IEEE 20th International Conference on Intelligent Transportation Systems
Yokohama · 17 October 2017
4. Perception systems in vehicles
• Topologies with complementary sensory modalities
Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setups
C. Guindel, J. Beltrán et al. · ITSC 2017
• Cameras and stereo-vision systems: appearance information, cost-effective, dense 3D information
• Range scanners and multi-layer 3D lidar scanners: high accuracy, 360º field of view
• Overlapping FOVs demand a correspondence between data representations: extrinsic calibration is required for data fusion
[Diagram: complementary sensor modalities on the IVVI 2.0 Research Platform]
5. Previous works
• Camera-to-range calibration in robotic/automotive platforms
• Complex setups / lack of generalization ability
• Strong assumptions are usually made: sensor resolution, limited pose range, environment structure,…
• Assessment of calibration methods
• Ground-truth of extrinsic parameters cannot be obtained in practice
Geiger et al., ICRA 2012 · Velas et al., WSCG 2014 · Levinson & Thrun, RSS 2013
6. Proposal overview
• Stereo-vision system–multi-layer lidar calibration
• Suitable for use with different models of lidar scanners (e.g. 16-layer)
• Very different relative poses are allowed
• Performed within a reasonable time using a simple setup
8. Calibration algorithm
• Calibration target
• Process overview
• Single point of view
• Holes visible from the camera and intersected by at least 2 lidar beams
• No alignment required
[Diagram: process overview. Camera and lidar data each undergo target segmentation and circles segmentation; the results are registered to obtain 𝝃CL = (𝑡𝑥, 𝑡𝑦, 𝑡𝑧, 𝜙, 𝜃, 𝜓)]
10. Data representation
• 3D point clouds, 𝒫0 = {(𝑥, 𝑦, 𝑧)}: stereo matching of the left and right images yields the camera cloud 𝒫0ᶜ, while the lidar scan directly provides 𝒫0ˡ
• Stereo matching: accuracy in the depth estimation is required (SGM)
• The border localization problem will be tackled using intensity
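The construction of the camera cloud 𝒫0ᶜ from an SGM disparity map is the usual stereo back-projection. The sketch below is illustrative (the function name and toy intrinsics are not taken from the authors' implementation):

```python
import numpy as np

def disparity_to_cloud(disparity, f, baseline, cx, cy):
    """Back-project a dense disparity map (e.g. from SGM) into a 3D
    point cloud in the left-camera frame; pixels with no valid
    disparity (d <= 0) are discarded."""
    v, u = np.nonzero(disparity > 0)
    d = disparity[v, u].astype(float)
    z = f * baseline / d            # depth from the stereo geometry
    x = (u - cx) * z / f
    y = (v - cy) * z / f
    return np.column_stack((x, y, z))

# Toy 2x2 disparity map; with f = 500 px and the 12 cm baseline used in
# the slides, a 10 px disparity corresponds to 6 m of depth.
disp = np.array([[10.0, 0.0], [20.0, 10.0]])
cloud = disparity_to_cloud(disp, f=500.0, baseline=0.12, cx=1.0, cy=1.0)
print(cloud[:, 2])  # → [6. 3. 6.]
```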
12. Target segmentation · Step 1
• Extracting the points belonging to discontinuities in the target
• Successive segmentations: 𝒫𝑖 = {(𝑥, 𝑦, 𝑧)} ⊆ 𝒫𝑖−1
• Step 1 (camera and lidar): plane model extraction on the point clouds 𝒫0, then removal of the points far from the planes, yielding 𝒫1ᶜ and 𝒫1ˡ
• Plane model extraction: random sample consensus (RANSAC) with a tight threshold (1 cm) and the requirement for the plane to be roughly parallel to the vertical axis (tol: 0.55 rad)
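The constrained plane extraction can be sketched as a small RANSAC loop. This is an illustrative plain-NumPy re-implementation (the original presumably relies on a point cloud library), using the 1 cm threshold and 0.55 rad axis tolerance as defaults:

```python
import numpy as np

def ransac_vertical_plane(points, thresh=0.01, axis_tol=0.55, iters=200, rng=None):
    """Minimal RANSAC plane fit that only accepts hypotheses whose plane
    is roughly parallel to the vertical axis, i.e. whose normal is nearly
    orthogonal to 'up'. Returns a boolean inlier mask."""
    rng = np.random.default_rng(rng)
    up = np.array([0.0, 0.0, 1.0])            # assumed vertical axis
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                       # degenerate (collinear) sample
            continue
        n /= norm
        # deviation of the plane from vertical, in radians
        if np.arcsin(np.clip(abs(n @ up), 0.0, 1.0)) > axis_tol:
            continue
        dist = np.abs((points - sample[0]) @ n)
        inliers = dist < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# Synthetic test: an 8x8 grid on the vertical plane x = 0 plus two outliers.
yy, zz = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
plane = np.column_stack((np.zeros(64), yy.ravel(), zz.ravel()))
pts = np.vstack((plane, [[1.0, 0.5, 0.5], [2.0, 0.2, 0.8]]))
mask = ransac_vertical_plane(pts, rng=0)
print(mask.sum())  # → 64
```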
13. Target segmentation · Step 2
• Camera: Sobel filtering is applied to the left image, and only the points of 𝒫1ᶜ that project onto Sobel edges (intensity discontinuities) are kept, yielding 𝒫2ᶜ
• Lidar: for every point in 𝒫1ˡ, points that do not belong to a range discontinuity are filtered out (threshold: 50 cm), yielding 𝒫2ˡ
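The camera branch of this step can be sketched as follows. The Sobel operator is written out in plain NumPy for self-containment (an OpenCV call would be the natural choice in practice), and the per-point pixel coordinates `uv` as well as the magnitude threshold are hypothetical bookkeeping for the illustration:

```python
import numpy as np

def sobel_magnitude(img):
    """Sobel gradient magnitude, computed with explicit 3x3 windows."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros(img.shape)
    gy = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy)

def keep_discontinuities(cloud, uv, edges, min_mag=100.0):
    """Keep only the points whose pixel lies on a strong intensity edge."""
    mask = edges[uv[:, 1], uv[:, 0]] > min_mag
    return cloud[mask]

# A vertical step image: the edge responds only near the step.
img = np.zeros((5, 8)); img[:, 4:] = 255.0
edges = sobel_magnitude(img)
cloud = np.array([[0.1, 0.2, 3.0], [0.4, 0.5, 6.0]])
uv = np.array([[3, 2], [0, 2]])       # pixel (u, v) of each 3D point
kept = keep_discontinuities(cloud, uv, edges)
print(len(kept))  # → 1
```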
15. Circles segmentation · Step 1
• Getting rid of the points not belonging to the circles: target boundaries
• Camera: a 3D-line RANSAC over 𝒫2ᶜ discards the straight target boundaries, yielding 𝒫3ᶜ
• Lidar: geometrical constraints (keep only the rings where a circle is possible; remove the outer points), yielding 𝒫3ˡ
16. Circles segmentation · Step 2
• Detecting the center of the holes
• Camera/lidar: 𝒫3 is aligned with the XY plane, a 2D circle RANSAC extracts the 4 centers and the radius, the alignment is undone, and geometrical constraints validate the 4 center coordinates
• 2D search: only three points are required for the circle model extraction
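The "only three points are required" remark follows from the circumcircle construction, which is the minimal sample of a 2D circle RANSAC once the cloud is aligned with the XY plane. A sketch:

```python
import numpy as np

def circle_from_3pts(p1, p2, p3):
    """Circumcircle of three non-collinear 2D points: center and radius.
    This is the minimal model used by a 2D circle RANSAC."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        raise ValueError("collinear points")
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    center = np.array([ux, uy])
    return center, np.linalg.norm(center - np.array([ax, ay]))

# Three points on the unit circle centered at (2, 1):
c, r = circle_from_3pts((3.0, 1.0), (2.0, 2.0), (1.0, 1.0))
print(c, r)  # → [2. 1.] 1.0
```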
17. Circles segmentation · Step 3
• Robustness against noise
• The 4 center coordinates obtained at each instant 𝑡0, 𝑡1, …, 𝑡𝑁 are accumulated and grouped by Euclidean clustering (tol: 2 cm), yielding 4 cluster centroids for each modality (camera and lidar)
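The accumulation step can be sketched as a greedy single-linkage clustering with the 2 cm tolerance. This is an illustrative stand-in for a library Euclidean clustering routine, not the authors' code:

```python
import numpy as np

def euclidean_clusters(points, tol=0.02):
    """Group points so that detections within `tol` of some cluster
    member join that cluster; return one centroid per cluster."""
    points = np.asarray(points, float)
    unassigned = set(range(len(points)))
    centroids = []
    while unassigned:
        seed = unassigned.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unassigned
                    if np.linalg.norm(points[i] - points[j]) <= tol]
            for j in near:
                unassigned.remove(j)
                cluster.append(j)
                frontier.append(j)
        centroids.append(points[cluster].mean(axis=0))
    return np.array(centroids)

# Two noisy detections of one hole center plus a distant second center:
dets = [[0.0, 0.0, 1.0], [0.005, 0.0, 1.0], [0.5, 0.0, 1.0]]
print(len(euclidean_clusters(dets)))  # → 2
```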
19. Registration
• Inputs: the 4 reference points from each modality, 𝒑ᶜᵢ (camera) and 𝒑ˡᵢ (lidar)
• Step 1 (translation only): pure translation 𝒕CL = p̄ˡ − p̄ᶜ, the difference of the mean reference points, obtained from an overdetermined system of 12 equations solved via column-pivoting QR decomposition
• Step 2 (translation + rotation): Iterative Closest Point (ICP)
• The composition of both transforms yields 𝝃CL = (𝑡𝑥, 𝑡𝑦, 𝑡𝑧, 𝜙, 𝜃, 𝜓)
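The two registration steps can be sketched over the four matched centers. Step 1 reduces to the difference of centroids (the least-squares solution of the 12-equation system); for step 2 the sketch substitutes the closed-form Kabsch/SVD alignment for ICP, since with four known correspondences both converge to the same rigid transform:

```python
import numpy as np

def register_centers(p_c, p_l):
    """Return (t0, R, t): a pure-translation estimate t0, then a rigid
    transform with p_l ≈ R p_c + t fitted in closed form (Kabsch)."""
    p_c, p_l = np.asarray(p_c, float), np.asarray(p_l, float)
    t0 = p_l.mean(axis=0) - p_c.mean(axis=0)     # Step 1: translation only
    # Step 2: best rotation from the centered cross-covariance via SVD
    qc, ql = p_c - p_c.mean(axis=0), p_l - p_l.mean(axis=0)
    u, _, vt = np.linalg.svd(qc.T @ ql)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against reflections
    R = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = p_l.mean(axis=0) - R @ p_c.mean(axis=0)
    return t0, R, t

# Four coplanar reference points displaced by a pure translation:
p_c = np.array([[0, 0, 2], [0.3, 0, 2], [0.3, 0.3, 2], [0, 0.3, 2]], float)
p_l = p_c + np.array([0.1, -0.2, 0.05])
t0, R, t = register_centers(p_c, p_l)
print(np.round(t0, 3))  # → [ 0.1  -0.2   0.05]
```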
21. Synthetic Test Suite
• Our proposal for quantitative assessment of calibration algorithms
• Exact ground-truth, but also noise and real constraints
• Simulation of sensors and their environment based on Gazebo
• Different calibration scenarios
[Figures: calibration target, stereo-vision system model, Velodyne models]
Gazebo models, plugins and worlds available at http://wiki.ros.org/velo2cam_gazebo · Open source · GPLv2 License
23. Experimental setup
• Using the synthetic test suite
• Nine different calibration setups
• 7 simple setups to evaluate the parameters of the transform
• 2 challenging situations
• Gaussian noise added to the sensor measurements
• Models simulated with real parameters
• 12 cm stereo baseline and 16-layer lidar
• Error metrics, comparing the estimate (𝑹, 𝒕) with the ground truth (𝑹𝑔, 𝒕𝑔):
• Translation error (linear): 𝑒𝑡 = ‖𝒕 − 𝒕𝑔‖
• Rotation error (angular): 𝑒𝑟 = ∠(𝑹−1𝑹𝑔)
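Both metrics are straightforward to compute from an estimated pose and the simulator's exact ground truth; a sketch:

```python
import numpy as np

def calib_errors(R, t, R_g, t_g):
    """Linear error: Euclidean distance between translations.
    Angular error: rotation angle of R^{-1} R_g (R^{-1} = R^T)."""
    e_t = np.linalg.norm(t - t_g)
    dR = R.T @ R_g
    cos_a = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)
    return e_t, np.arccos(cos_a)

# A 5 cm offset and a 0.1 rad yaw error against identity ground truth:
yaw = 0.1
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0, 0.0, 1.0]])
e_t, e_r = calib_errors(R, np.array([0.05, 0.0, 0.0]),
                        np.eye(3), np.zeros(3))
print(round(e_t, 3), round(e_r, 3))  # → 0.05 0.1
```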
24. Experiments · Selection of the length of the window, 𝑁
• Accumulation of cluster centroids (circles segmentation, step 3) over 𝑁 frames
• 𝑁 images and 𝑁 point clouds processed
• Not every window provides clusters to be accumulated
[Plots: translation error (linear) and rotation error (angular) as a function of 𝑁]
25. Experiments · Robustness to noise, 𝐾
• Noise is included in the measurements from the sensors
• Gaussian noise 𝒩(0, 𝐾𝜎0²), with 𝜎0ᶜ = 0.007 (camera) and 𝜎0ˡ = 0.008 m (lidar)
[Plots: translation error (linear) and rotation error (angular) as a function of 𝐾]
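The noise model is simple to reproduce; the helper below is an illustrative sketch (per-coordinate i.i.d. Gaussian perturbation with variance 𝐾𝜎0²):

```python
import numpy as np

def add_noise(points, sigma0, K, rng=None):
    """Perturb each coordinate with Gaussian noise N(0, K * sigma0^2):
    sigma0 is the sensor's base noise level and K scales the variance."""
    rng = np.random.default_rng(rng)
    return points + rng.normal(0.0, np.sqrt(K) * sigma0, size=np.shape(points))

# With K = 4 the lidar standard deviation doubles from 0.008 m to 0.016 m.
noisy = add_noise(np.zeros((100000, 3)), sigma0=0.008, K=4.0, rng=0)
print(round(noisy.std(), 3))  # → 0.016
```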
26. Comparison
• Geiger et al., ICRA 2012
  • Public web toolbox
  • Monocular camera; intrinsics must be provided
  • Tested with HDL-64E & Kinect
• Velas et al., WSCG 2014
  • Public ROS package
  • Monocular camera
  • Not suitable for large pose displacements
  • Tested with HDL-32E
• Both setups recreated in Gazebo with 16-, 32- and 64-layer lidar models
27. Experiments
[Plots: translation and rotation error for the 16-, 32- and 64-layer lidar models]
28. Experiments
[Plots: translation and rotation error for the 16-, 32- and 64-layer lidar models]
29. Results
• IVVI 2.0 platform
• Bumblebee XB3 stereo system: 1280 x 960 images, 12 cm baseline
• Velodyne VLP-16
30. Results
31. Results in real scenarios
Stereo + lidar point clouds aligned
32. Results in real scenarios
Lidar measurements projected on the image
34. Conclusion
• Method for calibration of lidar–stereo-camera setups:
• Without user intervention
• Suitable for close-to-production devices
• Assessment of the calibration methods using advanced simulation
• Exact ground-truth in unlimited calibration scenarios
• Results validate our calibration approach
ROS package available at http://wiki.ros.org/velo2cam_calibration · Open source · GPLv2 License
Future work
• Further testing
• Sensitivity to different stereo matching approaches (e.g. CNN-based), weather/illumination conditions,…
• Monocular camera–multi-layer lidar calibration
• Geometrical information may be extracted from the calibration target
35. Thank you for your attention
Carlos Guindel · cguindel@ing.uc3m.es
IEEE 20th International Conference on Intelligent Transportation Systems
Yokohama · 17 October 2017