4. Background
Fundamental questions of autonomous mobile robot navigation:
Where am I?
Where am I going?
How do I get there?
(Hugh Durrant-Whyte and John Leonard, 1989)
Common requirement: a map
5. Background
Map
Mathematical representation of the environment
Machine understandable
Required to answer all three questions
Examples
Feature maps (landmark maps, point cloud maps)
Occupancy grid maps (OGM)
Distance transform (DT) maps
7. Background
Distance Transforms
Originated in the image processing domain
For a 2D binary image where each pixel belongs either to the set S, the set of 1's that belong to objects, or to the set S̄, the set of 0's that belong to the background, a distance map or distance transform L(S) is an image defined as

\[
L(x) = \min\, d\big[(i, j), \bar{S}\big] \quad \forall x = (i, j) \in S \tag{1}
\]

The distance function d is positive definite and symmetric, and satisfies the triangle inequality
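As a concrete illustration (not part of the original slides), the UDT of Eq. (1) can be computed with SciPy's Euclidean distance transform; the toy image below is an assumption for demonstration.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Toy binary image (illustrative): a 3x3 block of object pixels (the set S)
# surrounded by background pixels (the set S-bar).
img = np.zeros((7, 7), dtype=np.uint8)
img[2:5, 2:5] = 1

# For every nonzero pixel, distance_transform_edt returns the Euclidean
# distance to the nearest zero pixel -- i.e. L(x) = min d(x, S-bar), x in S.
udt = distance_transform_edt(img)

print(udt[3, 3])  # centre of the block: 2.0 (two pixels from the background)
print(udt[0, 0])  # background pixel: 0.0 by definition
```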
8. Background
Distance Transform Variants
Unsigned DT (UDT): distance to the closest object
Signed DT (SDT): distance assigned a sign based on the concept of inside vs. outside
Vector DT (VDT): distance described using orthonormal vector components
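A minimal sketch of how the vector DT relates to the unsigned DT, using SciPy's feature transform to recover the per-cell offset to the closest boundary point (the occupancy layout is illustrative, not the thesis implementation):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Illustrative occupancy layout: obstacle cells are 1, free cells are 0.
occ = np.zeros((6, 6), dtype=np.uint8)
occ[0, :] = 1  # a wall along the top row

# EDT over the free space, plus the index of each cell's closest obstacle.
# A vector DT stores the offset to that closest boundary point rather than
# only its length, so DTu^2 == DTx^2 + DTy^2 holds cell-wise.
dist, (ci, cj) = distance_transform_edt(occ == 0, return_indices=True)
ii, jj = np.indices(occ.shape)
dtx = ci - ii  # row component of the vector to the closest obstacle
dty = cj - jj  # column component

print(dtx[4, 3], dty[4, 3])  # -4 0: four rows up to the wall, no column offset
```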
10. List of Contributions
1. Vector distance transforms (VDT) for environment
representation
2. Sensor model on VDT maps
3. Optimization framework for localization on VDT maps
4. EKF framework for localization on VDT maps
5. Active localization using sparse range measurements
6. Capturing the uncertainty of VDT maps
13. VDT for Environment Representation
Figure: (a) UDT, (b) SDT, (c) x component of VDT and (d) y component of VDT for a box-shaped environment
14. Behavior of Distance Transforms
Figure: (a) UDT, (b) SDT, (c) x component of VDT and (d) y component of VDT along the dotted line in the box-shaped environment
15. Behavior of Interpolated Distance Transforms
Figure: The behaviour of cubic spline interpolations of (a) UDT, (b) SDT, (c) x component of VDT and (d) y component of VDT close to the boundary
16. Gradients of Distance Transforms at the Boundary
Figure: Cubic spline interpolations of the gradients along the x direction of (a) UDT, (b) SDT, (c) VDT x and (d) VDT y close to the boundary
17. Behavior of Distance Transforms at Cut Locus
Figure: Cubic spline interpolations of the gradients along the x direction of (a) UDT, (b) SDT, (c) VDT x and (d) VDT y close to the cut locus
18. VDT for Environment Representation
Property                                          UDT   SDT   VDT
Capable of representing complex 2D environments    ✓     ✗     ✓
Continuous gradients on boundaries                 ✗     ✓     ✓
Continuous function on cut locus                   ✓     ✓     ✗
20. Sensor Model for VDT Maps
The sensor model associates sensor measurements with the map, given the robot state
Likelihood of an observation given the robot state:

\[
P(z \mid x, m) \tag{2}
\]

z: observation, x: robot state, m: map / internal representation of the environment
21. Sensor Model for VDT Maps
Given a VDT map, an observation z_i ∈ Z, z_i = {r_i, θ_i}, and an estimate of the robot pose X_R = (x_R, y_R, φ_R), what is the disparity between the actual and the expected observation?

\[
X_{o_i} = \begin{bmatrix} x_{o_i} \\ y_{o_i} \end{bmatrix}
= \begin{bmatrix} x_R + r_i \sin(\theta_i + \phi_R) \\ y_R + r_i \cos(\theta_i + \phi_R) \end{bmatrix} \tag{3}
\]
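Eq. (3) can be sketched in a few lines; the sin-for-x / cos-for-y convention follows the slide, and all names are illustrative:

```python
import numpy as np

# Sketch of Eq. (3): project range-bearing observations to map-frame
# end points for a hypothesised robot pose (xR, yR, phiR).
def project_scan(pose, ranges, bearings):
    xR, yR, phiR = pose
    ranges, bearings = np.asarray(ranges), np.asarray(bearings)
    xo = xR + ranges * np.sin(bearings + phiR)
    yo = yR + ranges * np.cos(bearings + phiR)
    return np.stack([xo, yo], axis=-1)

pts = project_scan((1.0, 2.0, 0.0), [1.0], [0.0])
print(pts)  # [[1. 3.]]: a 1 m beam at bearing 0 extends along y here
```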
23. Sensor Model for VDT Maps
Assuming

1. each range measurement within a laser scan is independent, and
2. range measurement noise σ_r is the only contributing factor to the sensor noise,

the sensitivity of d_VDT to sensor noise is

\[
\Sigma_{d_{VDT}} = \mathrm{diag}\left(\sigma^2_{DT_x, r_1}, \sigma^2_{DT_y, r_1}, \ldots, \sigma^2_{DT_x, r_i}, \sigma^2_{DT_y, r_i}, \ldots\right) \tag{5}
\]
24. Sensor Model for VDT Maps
\[
\sigma^2_{DT_x, r_i} = J_{DT_x, r_i} \, \sigma^2_r \, J_{DT_x, r_i}, \qquad
\sigma^2_{DT_y, r_i} = J_{DT_y, r_i} \, \sigma^2_r \, J_{DT_y, r_i} \tag{6}
\]

\[
J_{DT_x, r_i} = \left.\frac{\partial d_{DT_x}}{\partial r}\right|_{r_i}
= \left.\frac{\partial DT_x}{\partial x_{o_i}}\right|_{x_{o_i}} \cdot \left.\frac{\partial x_{o_i}}{\partial r}\right|_{r_i}
+ \left.\frac{\partial DT_x}{\partial y_{o_i}}\right|_{y_{o_i}} \cdot \left.\frac{\partial y_{o_i}}{\partial r}\right|_{r_i}
\]
\[
J_{DT_y, r_i} = \left.\frac{\partial d_{DT_y}}{\partial r}\right|_{r_i}
= \left.\frac{\partial DT_y}{\partial x_{o_i}}\right|_{x_{o_i}} \cdot \left.\frac{\partial x_{o_i}}{\partial r}\right|_{r_i}
+ \left.\frac{\partial DT_y}{\partial y_{o_i}}\right|_{y_{o_i}} \cdot \left.\frac{\partial y_{o_i}}{\partial r}\right|_{r_i}
\tag{7}
\]
25. Sensor Model for VDT Maps
A scalar disparity measurement between the expected and the actual observations, inspired by the Chamfer distance:

\[
d_{VCD} = \sum_{i=1}^{2n} d_{VDT}(i)^2 = \sum_{i=1}^{n} \left[ DT_x(X_{o_i})^2 + DT_y(X_{o_i})^2 \right] \tag{8}
\]
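A sketch of Eq. (8) under assumed data: DT_x and DT_y grids of a toy map (a vertical wall) are interpolated so the disparity can be evaluated at continuous observation points; the map and all names are illustrative.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Toy VDT of a vertical wall at x = 4, sampled on a 10x10 grid; the
# interpolators stand in for the continuous VDT map.
xs = ys = np.arange(10.0)
gx, gy = np.meshgrid(xs, ys, indexing="ij")
dtx = RegularGridInterpolator((xs, ys), gx - 4.0)           # x component
dty = RegularGridInterpolator((xs, ys), np.zeros_like(gx))  # y component

def d_vcd(points):
    """Chamfer-inspired scalar disparity of Eq. (8): sum of squared
    distance components at the projected observation points."""
    return float(np.sum(dtx(points) ** 2 + dty(points) ** 2))

print(d_vcd(np.array([[4.0, 3.0]])))  # 0.0: point on the wall
print(d_vcd(np.array([[6.0, 3.0]])))  # 4.0: two cells off the wall
```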
27. Localization
“Where am I?” == Localization
Given
a map
sensor measurements
prior information
figure out where I am.
28. Optimization Framework for Localization
Given

S: sensor measurement
M: map
X_R = (x_R, y_R, φ_R): hypothesized robot pose

if a function f can evaluate the disparity between X_R and the observations, localization becomes a minimization problem:

\[
\hat{X}_R = \operatorname*{argmin}_{x_R, y_R, \phi_R} f(S, M) \tag{9}
\]

Is the scalar disparity measurement d_VCD suitable for this?
29. Optimization Framework for Localization
Figure: Behavior of d_VCD around the true robot position with the orientation fixed at the true value
30. Optimization Framework for Localization
Figure: Behavior of d_VCD around the true robot orientation with the position fixed at the true value
31. Optimization Framework for Localization
\[
\hat{X}_R = \operatorname*{argmin}_{x_R, y_R, \phi_R} d_{VCD}(X_o, DT_v) \tag{10}
\]

Initial guess required
Similar to C-LOG
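The minimization in Eq. (10) can be sketched with a generic optimizer; the analytic toy map and all names here are illustrative stand-ins for the interpolated VDT, not the thesis implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy analytic "VDT" of a vertical wall at x = 4.
def dtx(p):
    return p[:, 0] - 4.0     # x component of the distance vector

def dty(p):
    return np.zeros(len(p))  # the wall is infinite in y

def project(pose, ranges, bearings):  # Eq. (3) convention
    xR, yR, phiR = pose
    xo = xR + ranges * np.sin(bearings + phiR)
    yo = yR + ranges * np.cos(bearings + phiR)
    return np.stack([xo, yo], axis=-1)

ranges = np.array([2.0, 2.0])
bearings = np.array([np.pi / 2, np.pi / 2])  # beams pointing towards the wall

def d_vcd(pose):  # Eq. (8)
    p = project(pose, ranges, bearings)
    return np.sum(dtx(p) ** 2 + dty(p) ** 2)

# Initial guess near the truth, as the framework requires (Eq. 10).
est = minimize(d_vcd, x0=np.array([1.5, 0.0, 0.0]), method="Nelder-Mead")
print(est.fun < 1e-3)  # the disparity is driven to (near) zero
```

A single wall constrains only part of the pose, so the minimum is a manifold; real scans of structured environments pin down all three degrees of freedom.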
32. Optimization Framework for Localization
Uncertainty estimation

Sensitivity of the pose estimate to the error in the sensor measurement Σ_z = diag(σ²_r)
Pose derived using nonlinear least squares optimization
Use the implicit function theorem

\[
\mathrm{cov}(\hat{X}_R) = J \, \Sigma_z \, J^{T} \tag{11}
\]
33. Optimization Framework for Localization
\[
J = -H^{-1} \cdot
\begin{bmatrix}
\dfrac{\partial^2 d_{VCD}}{\partial x_r \partial r} \\[6pt]
\dfrac{\partial^2 d_{VCD}}{\partial y_r \partial r} \\[6pt]
\dfrac{\partial^2 d_{VCD}}{\partial \phi_r \partial r}
\end{bmatrix} \tag{12}
\]

\[
H =
\begin{bmatrix}
\dfrac{\partial^2 d_{VCD}}{\partial x_r^2} & \dfrac{\partial^2 d_{VCD}}{\partial x_r \partial y_r} & \dfrac{\partial^2 d_{VCD}}{\partial x_r \partial \phi_r} \\[6pt]
\dfrac{\partial^2 d_{VCD}}{\partial x_r \partial y_r} & \dfrac{\partial^2 d_{VCD}}{\partial y_r^2} & \dfrac{\partial^2 d_{VCD}}{\partial y_r \partial \phi_r} \\[6pt]
\dfrac{\partial^2 d_{VCD}}{\partial x_r \partial \phi_r} & \dfrac{\partial^2 d_{VCD}}{\partial y_r \partial \phi_r} & \dfrac{\partial^2 d_{VCD}}{\partial \phi_r^2}
\end{bmatrix} \tag{13}
\]

Calculating these derivatives is possible because of the good behavior of the VDT at the boundary
34. Optimization Framework for Localization
Simulation dataset: ground truth available

Figure: Trajectory estimate of the optimization framework
36. Optimization Framework for Localization
Localization of a UAV in Kentland, VA, USA for MBZIRC 2017
Flight controller estimates roll and pitch
Height obtained from a range sensor
3-DOF localization using a monocular camera
Edge map of the flat ground terrain represented using a VDT
37. Optimization Framework for Localization
Edge pixels (λ_i, μ_i) are projected to the ground using the hypothesized robot position (x_r, y_r, z_r), orientation R, and focal length f:

\[
X_{o_i} = \begin{bmatrix} x_{o_i} \\ y_{o_i} \end{bmatrix}
= \begin{bmatrix}
x_r + z_r \dfrac{\lambda_i R_{1,1} + \mu_i R_{1,2} - f R_{1,3}}{\lambda_i R_{3,1} - \mu_i R_{3,2} + f R_{3,3}} \\[10pt]
y_r + z_r \dfrac{\lambda_i R_{2,1} + \mu_i R_{2,2} - f R_{2,3}}{\lambda_i R_{3,1} - \mu_i R_{3,2} + f R_{3,3}}
\end{bmatrix} \tag{14}
\]

Figure: Bogey 5
43. EKF Framework for Localization
Observation model

A standard EKF calculates an expected measurement
The innovation is the disparity between the expected and actual measurements
In a VDT map, this disparity is implicitly captured in the map

\[
h(\hat{X}_{R_{k|k-1}}, z_k) = d_{VDT}(X_{o_k}) = 0 \tag{19}
\]
44. EKF Framework for Localization
Innovation:

\[
\nu_k = h(\hat{X}_{R_{k|k-1}}, z_k) = d_{VDT}(X_{o_k}) \tag{20}
\]

A vector of distance values at the projected laser end points
46. EKF Framework for Localization
Calculation of the Kalman gain and the update equations follow the standard EKF:

\[
K_k = P_{k|k-1} \cdot H_x^{T} \cdot S_k^{-1} \tag{24}
\]
\[
\hat{X}_k = \hat{X}_{k|k-1} + K_k \cdot \nu_k \tag{25}
\]
\[
P_k = P_{k|k-1} - K_k \cdot S_k \cdot K_k^{T} \tag{26}
\]
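The standard EKF update with the VDT innovation can be sketched in a few lines; the toy numbers and names are illustrative, not from the thesis.

```python
import numpy as np

# Sketch of the EKF update, Eqs. (24)-(26), where the innovation nu is
# the vector of interpolated distance values d_VDT(Xo) itself, since the
# expected "measurement" on the map is zero (Eq. 19).
def ekf_update(x_pred, P_pred, nu, Hx, R):
    """x_pred: predicted pose (3,), P_pred: (3,3) covariance,
    nu: innovation (m,), Hx: (m,3) observation Jacobian, R: (m,m) noise."""
    S = Hx @ P_pred @ Hx.T + R            # innovation covariance
    K = P_pred @ Hx.T @ np.linalg.inv(S)  # Kalman gain, Eq. (24)
    x = x_pred + K @ nu                   # state update, Eq. (25)
    P = P_pred - K @ S @ K.T              # covariance update, Eq. (26)
    return x, P

# Toy numbers: one distance reading of 0.5 with unit x-direction gradient.
x, P = ekf_update(np.zeros(3), np.eye(3), np.array([0.5]),
                  np.array([[1.0, 0.0, 0.0]]), np.array([[0.25]]))
print(x[0], P[0, 0])  # 0.4 0.2: the x estimate and its variance both move
```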
47. EKF Framework for Localization
Simulation dataset: ground truth available

Figure: Trajectory estimate of the EKF framework
56. Active Localization
LiDARs are high-frequency, high-resolution sensors
The EKF framework can handle sparse measurements, even a single range measurement
Using a rotating single-point laser to localize
Active localization: determining the direction of the single-point laser
Based on information gain
59. Active Localization
The sensor rotation range is divided into n segments
Once a segment is selected as the goal, the sensor rotates from its current position to the farthest end of the selected segment in a sweeping motion
The sensor continuously observes the environment during the sweep
Once the sensor reaches the goal, a new goal is calculated
60. Active Localization
Calculating the goal

Given the current position estimate and uncertainty:
Ray trace at a fixed angular resolution to generate hypothetical observations
Update the pose uncertainty using the hypothetical observations
Pick the segment that produces the best cost function value

For the i-th segment, the cost function is defined as

\[
\mathrm{cost}(i) = \mathrm{trace}(P_{k_i}) + \lambda \cdot d \tag{27}
\]

P_{k_i}: uncertainty of the pose estimate using the hypothetical observations in the i-th segment
d: angular distance between the current position and the i-th segment
λ: tuning parameter
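The segment selection with the cost of Eq. (27) can be sketched as follows; the hypothetical-update covariances and distances are made-up values for illustration.

```python
import numpy as np

# Sketch of Eq. (27): pick the sensing segment with the best trade-off
# between expected pose uncertainty and angular travel. In the real system
# P_hyp[i] comes from EKF updates with hypothetical ray-traced observations.
def best_segment(P_hyp, ang_dist, lam=0.1):
    costs = [np.trace(P) + lam * d for P, d in zip(P_hyp, ang_dist)]
    return int(np.argmin(costs)), costs

P_hyp = [np.diag([0.4, 0.4, 0.1]),   # segment 0: modest information gain
         np.diag([0.1, 0.1, 0.05]),  # segment 1: most informative
         np.diag([0.3, 0.3, 0.1])]   # segment 2
ang_dist = [0.5, 2.0, 3.0]           # angular distance to each segment

idx, costs = best_segment(P_hyp, ang_dist)
print(idx)  # 1: the information gain outweighs the longer sweep
```

The tuning parameter λ trades uncertainty reduction against sweep time; a large λ makes the sensor prefer nearby segments.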
63. Uncertain VDT Maps
Desired properties of a map built from point cloud data
Continuous
Captures uncertainty of map building process
Simpler Sensor Model
64. Uncertain VDT Maps
Problem definition:

Given
a set of noisy robot poses
noisy observations obtained at those poses

how do we represent the environment using the observations while capturing the uncertainties?
65. Uncertain VDT Maps
If u(x), x ∈ ℝ², is the distance from point x to a set of points S,
u(x) ⟹ UDT
u satisfies the Eikonal equation

\[
|\nabla u| = 1, \qquad u|_S = 0 \tag{28}
\]
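The Eikonal property can be checked numerically on a toy UDT (illustrative; it holds away from the boundary and the cut locus):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Toy map: a single wall along the top row, so u equals the row index and
# |grad u| = 1 everywhere in the free space.
occ = np.zeros((50, 50), dtype=np.uint8)
occ[0, :] = 1                      # the set S
u = distance_transform_edt(occ == 0)

gy, gx = np.gradient(u)            # finite differences, unit grid spacing
grad_norm = np.sqrt(gx ** 2 + gy ** 2)
print(grad_norm[25, 25])           # 1.0 at an interior point
```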
66. Uncertain VDT Maps
Given the relationship between the UDT and the VDT,

\[
DT_u(x, y)^2 = DT_x(x, y)^2 + DT_y(x, y)^2 \tag{29}
\]

a PDE can be derived as

\[
\frac{\left(DT_x \frac{\partial DT_x}{\partial x} + DT_y \frac{\partial DT_y}{\partial x}\right)^2 + \left(DT_x \frac{\partial DT_x}{\partial y} + DT_y \frac{\partial DT_y}{\partial y}\right)^2}{DT_x^2 + DT_y^2} = 1 \tag{30}
\]

with boundary conditions

\[
DT_x(x, y) = 0, \quad DT_y(x, y) = 0, \quad \forall (x, y) \in S \tag{31}
\]
67. Uncertain VDT Maps
Parametric approximation of a solution to a PDE:

\[
\mathcal{L} u = f, \quad x \in \Omega \tag{32}
\]
\[
\mathcal{B} u = g, \quad x \in \partial\Omega \tag{33}
\]

f, g: given functions
𝓛, 𝓑: differential operators
∂Ω: boundary of the bounded open domain Ω

If a parametric function u_a(x, β_i) exists that is arbitrarily close to u, the β_i can be optimized to obtain u = u_a(x, β_i) using the objective function h:

\[
h = \int_{\Omega} \|\mathcal{L} u_a - f\|^2 \, dV + \int_{\partial\Omega} \|\mathcal{B} u_a - g\|^2 \, dS \tag{34}
\]
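The structure of the objective h can be sketched in one dimension, where the exact distance function to the single boundary point S = {0} is u(x) = x (a toy, not the thesis setup):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# One-dimensional toy of Eq. (34): the Eikonal residual |u_a'| = 1 on the
# interior plus the boundary condition u_a(0) = 0, with the parametric
# family u_a(x) = beta * x.
def h(beta):
    interior = (np.abs(beta) - 1.0) ** 2  # ||L u_a - f||^2, constant in x here
    boundary = (beta * 0.0) ** 2          # ||B u_a - g||^2 at x = 0
    return interior + boundary

res = minimize_scalar(h, bounds=(0.0, 5.0), method="bounded")
print(round(res.x, 3))  # 1.0: u_a recovers the distance function u(x) = x
```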
69. Uncertain VDT Maps
\[
h_v = \int_{\Omega} \left[
\frac{\left(DT_x \frac{\partial DT_x}{\partial x} + DT_y \frac{\partial DT_y}{\partial x}\right)^2 + \left(DT_x \frac{\partial DT_x}{\partial y} + DT_y \frac{\partial DT_y}{\partial y}\right)^2}{DT_x^2 + DT_y^2} - 1
\right]^2 dV
+ \int_{\partial\Omega} \left( DT_x^2 + DT_y^2 \right) dS \tag{38}
\]

Optimize for the control points P_ij using h_v
Neural networks may also be candidates for the approximation function
70. Uncertain VDT Maps
Integrating uncertainty

Each point in a point cloud X_o is

\[
X_{o_i} = \begin{bmatrix} x_{o_i} \\ y_{o_i} \end{bmatrix}
= \begin{bmatrix} x_r + r_i \cos(\theta_i - \phi_r) \\ y_r + r_i \sin(\theta_i - \phi_r) \end{bmatrix} \tag{39}
\]

x̂_R = (x_r, y_r, φ_r)^T: robot pose
cov(x̂_R): robot pose uncertainty
S_rθ = {(r_i, θ_i)}: laser range-bearing observations
σ_r: laser range measurement noise
71. Uncertain VDT Maps
Uncertainty of each point in the point cloud:

\[
\mathrm{cov}(X_{o_i}) =
\frac{\partial X_{o_i}}{\partial \hat{x}_R} \, \mathrm{cov}(\hat{x}_R) \, \left(\frac{\partial X_{o_i}}{\partial \hat{x}_R}\right)^{T}
+ \frac{\partial X_{o_i}}{\partial r} \, \sigma_r^2 \, \left(\frac{\partial X_{o_i}}{\partial r}\right)^{T} \tag{40}
\]
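Eq. (40) in code, with the Jacobians written out under the convention of Eq. (39); the numbers below are illustrative.

```python
import numpy as np

# Sketch of Eq. (40): first-order propagation of pose and range noise into
# each point of the point cloud.
def point_cov(pose, cov_pose, r, theta, sigma_r):
    xr, yr, phir = pose
    c, s = np.cos(theta - phir), np.sin(theta - phir)
    # d(x_oi, y_oi) / d(x_r, y_r, phi_r), from Eq. (39)
    J_pose = np.array([[1.0, 0.0,  r * s],
                       [0.0, 1.0, -r * c]])
    # d(x_oi, y_oi) / dr
    J_r = np.array([[c], [s]])
    return J_pose @ cov_pose @ J_pose.T + (sigma_r ** 2) * (J_r @ J_r.T)

C = point_cov((0.0, 0.0, 0.0), 0.01 * np.eye(3), r=2.0, theta=0.0, sigma_r=0.05)
print(np.round(C, 4))  # heading noise is amplified by the r = 2 lever arm
```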
72. Uncertain VDT Maps
Weights for observations based on uncertainty, ρ_oi, are calculated as

\[
\rho_{o_i} = \left[
\frac{\partial (DT_x^2 + DT_y^2)}{\partial X_{o_i}} \, \mathrm{cov}(X_{o_i}) \, \left(\frac{\partial (DT_x^2 + DT_y^2)}{\partial X_{o_i}}\right)^{T}
\right]^{-1} \tag{41}
\]

Weights for the Eikonal equation, ρ_ei, are determined empirically
73. Uncertain VDT Maps
Objective function with weights incorporated:

\[
h_{v*} = \int_{\Omega} \rho_{e_i} \left[
\frac{\left(DT_x \frac{\partial DT_x}{\partial x} + DT_y \frac{\partial DT_y}{\partial x}\right)^2 + \left(DT_x \frac{\partial DT_x}{\partial y} + DT_y \frac{\partial DT_y}{\partial y}\right)^2}{DT_x^2 + DT_y^2} - 1
\right]^2 dV
+ \int_{\partial\Omega} \rho_{o_i} \left( DT_x^2 + DT_y^2 \right) dS \tag{42}
\]
74. Uncertain VDT Maps
Estimating map uncertainty

P^{x*}_{ij}: optimized values of the control points for DT_x
P^{y*}_{ij}: optimized values of the control points for DT_y

\[
\mathrm{cov}(P^{x*}_{ij}) = J_x \, \mathrm{cov}(X_{o_i}) \, J_x^{T} \tag{43}
\]
\[
\mathrm{cov}(P^{y*}_{ij}) = J_y \, \mathrm{cov}(X_{o_i}) \, J_y^{T} \tag{44}
\]
\[
J_x = -H^{-1} \cdot \frac{\partial^2 h_{v*}}{\partial P^{x*}_{ij} \, \partial r_k} \tag{45}
\]

(implicit function theorem)
85. Uncertain VDT Maps
Demonstrated the use of the Eikonal equation and the boundary condition to build a map
What the best approximation function is remains an open question
Improvements needed for practical use, such as handling the cut locus efficiently and reducing the computational burden using submaps
87. Conclusion
Vector Distance Transform
Continuous
Implicitly captures the geometry
Behavior along the boundary makes it preferable to the UDT
Preferable over the SDT for unstructured environments
Localization using optimization or an EKF with different types of sensors
Uncertainty of the mapping data can be embedded
90. List of Publications
Main Publications
Arukgoda, J., Ranasinghe, R., Dantanarayana, L.,
Dissanayake, G. and Furukawa, T., 2017. Vector Distance
Function Based Map Representation for Robot
Localisation. In The Australian Conference on Robotics
and Automation (ACRA), (Vol. 12). ARAA. ISBN:
978-0-9807404-8-6 ISSN: 1448-2053
Ranasinghe, R., Dissanayake, G., Furukawa, T.,
Arukgoda, J. and Dantanarayana, L., 2017, December.
Environment representation for mobile robot localisation.
In 2017 IEEE International Conference on Industrial and
Information Systems (ICIIS), (pp. 1-6). IEEE. doi:
10.1109/ICIINFS.2017.8300384
91. List of Publications
Main Publications ctd...
Arukgoda, J., Ranasinghe, R. and Dissanayake, G., 2019,
July. Robot Localisation in 3D Environments Using
Sparse Range Measurements. In 2019 IEEE/ASME
International Conference on Advanced Intelligent
Mechatronics (AIM), (pp. 551-558). IEEE. doi:
10.1109/AIM.2019.8868466
Arukgoda, J., Ranasinghe, R. and Dissanayake, G., 2019,
August. Representation of Uncertain Occupancy Maps
with High Level Feature Vectors. In 2019 IEEE 15th
International Conference on Automation Science and
Engineering (CASE), (pp. 1035-1041). IEEE. doi:
10.1109/COASE.2019.8842965
92. List of Publications
Under Review
Jayasuriya, M., Arukgoda, J., Ranasinghe, R. and
Dissanayake, G., 2020, May. Localising PMDs through
CNN Based Perception of Urban Streets. Under review in
2020 International Conference on Robotics and
Automation (ICRA).
93. List of Publications
Other Publications During Candidature
Perera, A., Arukgoda, J., Ranasinghe, R. and
Dissanayake, G., 2017, September. Localization System
for Carers to Track Elderly People in Visits to a Crowded
Shopping Mall. In 2017 International Conference on
Indoor Positioning and Indoor Navigation (IPIN), (pp.
1-8). IEEE. doi: 10.1109/IPIN.2017.8115936
Unicomb, J., Dantanarayana, L., Arukgoda, J.,
Ranasinghe, R., Dissanayake, G. and Furukawa, T., 2017,
September. Distance function based 6DOF localization
for unmanned aerial vehicles in GPS denied environments.
In 2017 IEEE/RSJ International Conference on Intelligent
Robots and Systems (IROS) (pp. 5292-5297). IEEE.
10.1109/IROS.2017.8206421
94. List of Publications
Other Publications During Candidature ctd...
Hodges, J., Attia, T., Arukgoda, J., Kang, C., Cowden,
M., Doan, L., Ranasinghe, R., Abdelatty, K., Dissanayake,
G. and Furukawa, T., 2019. Multistage Bayesian
Autonomy for High-precision Operation in a Large Field.
Journal of Field Robotics, (Vol 36 (1)), (pp.183-203).
doi: 10.1002/rob.21829