Collaborative Sensing for Automated Vehicles
Todd Humphreys, Lakshay Narula, Matthew Murrian, Daniel Lachapelle
Department of Aerospace Engineering
The University of Texas at Austin
April 11, 2018
5G-Enabled Collaborative ADAS
Motivation
Both current and proposed ADAS are limited in important
ways. Even when equipped with an impressive array of
sensors, individual vehicles have blind spots in standard
traffic conditions: they can't see far enough to make a left-
hand turn onto a high-speed roadway, or around corners,
or through large nearby vehicles. Lack of vehicle situational
awareness due to such blind spots will unfortunately be
the cause of many accidents, including fatalities, for
connected and automated vehicles in the years to come.
Strategy: Collaborative Sensing via 5G
We view the “blind spot” problem as an opportunity to
develop safer ADAS for connected and (semi)-automated
vehicles.
Future vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) connectivity will permit vehicles to
relay their positions and velocities to each other with
millisecond latency, enabling tight coordinated platooning
and efficient intersection management. More ambitiously,
broadband V2V and V2I enabled by 5G wireless networks
will permit vehicles to share unprocessed or lightly-
processed sensor data, allowing ad-hoc networks of
vehicles and infrastructure to function as a single sensing
organism.
Multiple connected vehicles sharing, e.g., radar data
eliminate blind spots and thereby reduce the risk of
collision with cyclists, pedestrians, and other vehicles.
Conditional occupancy map: The prior distribution is altered by
conditioning with sensor data. In this example, multiple vehicles
with radar collaborate to “clear out” the occupancy distribution.
Blue: pedestrians, Green: cyclists, Red: motor vehicles.
Sensor data (radar, cameras, lidar) conditions the prior
occupancy and velocity distributions, killing off particles whose
conditional probability drops below a threshold. The
interaction of particles and sensing data is governed by sensor
measurement models that account for sensor noise, field of
view, and sensitivity to particles of a given type and velocity.
For example, the interaction of radar sensing and slow-moving
pedestrian particles is weak given pedestrians’ poor radar
reflectivity.
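As a hypothetical sketch of this particle-conditioning step (the particle representation, detection probabilities, and survival threshold below are all illustrative assumptions, not project values):

```python
def radar_detect_prob(particle_type, speed):
    """Probability that radar would have detected this particle.
    Slow pedestrians reflect radar poorly, so radar evidence is weak for them."""
    base = {"pedestrian": 0.2, "cyclist": 0.5, "vehicle": 0.9}[particle_type]
    return min(1.0, base + 0.05 * speed)  # Doppler: faster movers are easier to see

def condition_particles(particles, radar_clear, threshold=0.15):
    """Down-weight particles in cells radar reports as empty; kill off those
    whose conditional weight drops below the threshold."""
    survivors = []
    for p in particles:
        if p["cell"] in radar_clear:
            # If radar saw nothing here, occupancy by p becomes unlikely in
            # proportion to how detectable p would have been.
            p["weight"] *= 1.0 - radar_detect_prob(p["type"], p["speed"])
        if p["weight"] >= threshold:
            survivors.append(p)
    return survivors

particles = [
    {"cell": (0, 0), "type": "pedestrian", "speed": 0.0, "weight": 1.0},
    {"cell": (0, 0), "type": "vehicle", "speed": 0.0, "weight": 1.0},
    {"cell": (1, 1), "type": "vehicle", "speed": 0.0, "weight": 1.0},
]
out = condition_particles(particles, radar_clear={(0, 0)})
```

Note how the pedestrian particle survives the radar "all clear" while the vehicle particle in the same cell is killed: exactly the weak radar-pedestrian interaction described above.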
Using the conditional occupancy and velocity distributions, the
Bayesian risk for collision with a particle at velocity v is
calculated as shown in the equation below, which sums up the
risk for all particle types (pedestrian, cyclist, vehicle) within a
small region of the map.
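A hedged reconstruction of this risk sum, consistent with the description above (the cost term C_t(v) and the region notation R are assumed symbols, not taken from the original):

```latex
R(\mathcal{R}) \;=\; \sum_{t \,\in\, \{\text{ped},\,\text{cyc},\,\text{veh}\}}
\int_{\mathcal{R}} \! \int
p\bigl(\text{occupied by type } t \text{ at } x \text{ with velocity } v
\,\bigm|\, \text{sensor data}\bigr)\, C_t(v)\; \mathrm{d}v\, \mathrm{d}x
```

where R is the small map region and C_t(v) weights the severity of a collision with a particle of type t at velocity v.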
Assessing the Safety Benefit of Collaborative Sensing
GEOSLAM: Globally-referenced electro-optical SLAM:
A tightly-coupled precise GNSS-vision system appropriate for
fusing data from multiple cooperating vehicles
Figure: GEOSLAM block diagram (cameras, PpRx, PpFusion, feature identifier, keyframe selector, bundle adjustment, tracking; standard GNSS position, carrier phase measurements, GNSS antennas, precise antenna positions, visually-derived pose information, GNSS-aided camera position).
UT SAVES Sensorium v2.0:
Stereo cameras, dual-antenna triple-frequency software-defined GNSS,
industrial-grade IMU (8 deg/hr gyros), automotive radar, LTE connectivity
A platform for Collaborative Sensing Experimentation
During the course of the Honda project, a first Sensorium v2.0 was
completed and tested in field trials. A second Sensorium v2.0
is under construction: it is now structurally complete, but its
sensors and computer have not yet been installed.
In April 2018, we performed a demonstration of between-vehicle data sharing using our Sensorium hardware platform
and our GEOSLAM data fusion software. We used a single Sensorium for this demonstration, but took imagery
from multiple viewpoints to simulate image sharing between vehicles. The demonstration scenario was meant to
simulate collaborative sensing of a pedestrian near an intersection where blind spots are common due to other
vehicles blocking the sensing horizon.
Demo of 5G-Enhanced ADAS via Data Sharing
Figure: camera imagery with mapped feature points overlaid. Left: no map points on Distracted Daniel. Right: some map points on Distracted Daniel.
Figure: trajectory of the first vehicle (start to end); Distracted Daniel not detected.
The first vehicle enters the area where Daniel is standing via the
trajectory shown, performing globally-referenced 3D visual
mapping along the way. But no features from Daniel are
present in the map because he was never visible to the
vehicle’s cameras.
Figure: trajectory of the second vehicle (start to end), passing Distracted Daniel.
A second vehicle enters the area where Daniel is standing via the
trajectory shown, performing globally-referenced 3D visual
mapping along the way. Several features from Daniel are present
in the unified map, allowing both vehicles to benefit from
decimeter-accurate knowledge of Daniel’s position.
Collaborative Precise Map-Making using
only Standard GNSS
Q: Given that tightly-coupled vision+GNSS
processing averages down GNSS errors over
many runs, could standard GNSS solutions
(pseudorange + Doppler) be sufficient for
sub-50-cm-accurate mapping in urban
environments?
Research Statement
If vehicles are able to collaboratively update the map over multiple
sessions, then the GNSS errors average out across all sessions with
appropriate weighting.
If GNSS errors are spatially and temporally zero-mean, then the
resulting visual map becomes arbitrarily accurate with sufficiently many
sessions through an area.
This project investigates the nature of GNSS errors and develops a GNSS-aided
collaborative visual SLAM pipeline.
The question that remains: are the GNSS errors at every map location
asymptotically zero-mean? Or do some location-dependent biases
persist in time-separated GNSS measurements?
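The averaging claim can be illustrated with a toy simulation (the session count, noise levels, and random seed are all assumed for illustration):

```python
import random

def weighted_mean(errors, variances):
    """Inverse-variance-weighted average: the 'appropriate weighting' for
    combining sessions of unequal quality."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, errors)) / sum(weights)

random.seed(0)
# 100 sessions through the same map point, each with zero-mean GNSS error
# and a session-specific noise level between 0.5 m and 1.0 m.
sigmas = [0.5 + 0.5 * random.random() for _ in range(100)]
errors = [random.gauss(0.0, s) for s in sigmas]

# Residual map error after fusing all sessions: far below any single session.
map_error = abs(weighted_mean(errors, [s * s for s in sigmas]))
```

Under the zero-mean assumption the residual shrinks roughly as 1/sqrt(N); the open question above is precisely whether real GNSS errors satisfy that assumption at every map location.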
Error Sources
Thermal noise: zero-mean Gaussian, readily averages
Satellite orbits and clocks: use IGS rapid orbit and clock products in post-processing
Ionospheric modeling errors
A common bias in the model is acceptable
for positioning: the error is absorbed in the
clock bias or DCB estimate.
However, systematic asymmetry may lead
to persistent biases in the position
solution.
What kind of asymmetry?
• Different model errors for satellites at different elevations
• Inherent asymmetry in satellite geometry
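A minimal numerical sketch of the point that a common bias is absorbed into the clock estimate, assuming a made-up four-satellite geometry (none of these numbers come from the project):

```python
import math

def solve(A, y):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(A)
    M = [row[:] + [yi] for row, yi in zip(A, y)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

sats = [(20e6, 0, 0), (0, 20e6, 0), (0, 0, 20e6), (12e6, 12e6, 12e6)]
rx = (0.0, 0.0, 0.0)  # linearization point = true receiver position

def solve_pos(pseudoranges):
    """One least-squares step for [dx, dy, dz, clock_bias] about the true position."""
    A = [[-(s[k] - rx[k]) / dist(s, rx) for k in range(3)] + [1.0] for s in sats]
    y = [p - dist(s, rx) for p, s in zip(pseudoranges, sats)]
    return solve(A, y)

clean = [dist(s, rx) for s in sats]
biased = [p + 7.5 for p in clean]  # identical 7.5 m bias on every channel
dx, dy, dz, clk = solve_pos(biased)
# Position error stays near zero; the entire 7.5 m lands in the clock estimate.
```

By contrast, an asymmetric bias (different per satellite, correlated with elevation or geometry) does not cancel this way and leaks into the position solution, which is the concern above.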
Tropospheric modeling errors: similar to ionospheric errors but comparatively benign, more on this later…
Ionospheric Modeling Errors
Figure: ionospheric modeling errors under WAAS corrections vs. IGS corrections;
≈ 3 TECU peak-to-peak swing (50 cm differential) in one case,
≈ 1 TECU peak-to-peak swing (16 cm differential) in the other.
Multipath Error
Persistent reflections from asymmetric
reflectors can lead to persistent biases in
the navigation solution
Unlike the study of ionospheric modeling
errors for application in urban mapping,
multipath errors cannot be characterized
with data from survey stations with a clear
view of the sky.
1. Scalable study in simulation
2. Validation with empirical data
Based on Land Mobile Satellite Channel
Model [1].
LOS: interaction with buildings, trees, poles
NLOS echoes: stochastic
Geometric echoes: reflective, persistent
1. Simulate multipath echoes
2. Scalar tracking with narrow correlator
3. Navigation filter (EKF)
Naïve receiver: persistent biases after 100
sessions
Ideal NLOS rejection: accept signals with at
least 10 dB LOS advantage
Realistic receiver: NLOS exclusion based on a normalized
innovation test
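A hedged sketch of such a normalized-innovation gate (the 3-sigma gate value and the toy pseudoranges are assumptions for illustration):

```python
def innovation_gate(measurements, predictions, sigmas, gate=3.0):
    """Split channel indices into accepted and rejected (likely-NLOS) sets by
    testing the normalized innovation |z - z_hat| / sigma against a gate."""
    accepted, rejected = [], []
    for i, (z, z_hat, s) in enumerate(zip(measurements, predictions, sigmas)):
        (accepted if abs(z - z_hat) / s <= gate else rejected).append(i)
    return accepted, rejected

# An NLOS echo travels a longer path, so its pseudorange carries a positive
# bias far above the noise floor; the gate flags it before the EKF update.
z     = [20000.3, 21000.1, 22012.0, 23000.2]  # measured pseudoranges (m)
z_hat = [20000.0, 21000.0, 22000.0, 23000.0]  # predicted from the EKF state (m)
sig   = [1.0, 1.0, 1.0, 1.0]                  # per-channel noise sigma (m)
ok, nlos = innovation_gate(z, z_hat, sig)     # channel 2 is rejected as NLOS
```

Unlike the ideal 10 dB LOS-advantage rule, this test needs only quantities the navigation filter already computes, which is why it stands in for a realistic receiver.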
Data Collection
Data collected on two separate days north of the UT Austin campus,
within 1 mile of the nearest LDRN reference station.
The chosen loop is ≈ 1 km long, and was repeated 25 times over the
course of the two data collection campaigns.
Pedestrian bridge
Tall buildings
Standalone GNSS tracks are tightly clustered, but exhibit a day-specific bias – most likely due to
ionospheric and tropospheric modeling errors
2018-Jan-15 Sessions 2017-Dec-21 Sessions
Figure: Standalone Standard GNSS Errors w.r.t. Precise GNSS (East, North, and Up error in meters vs. distance along path in meters, both sessions).
Figure: Code-Phase Differential GNSS Errors w.r.t. Precise GNSS (East, North, and Up error in meters vs. distance along path in meters, both sessions).
The average horizontal errors over multiple sessions are clearly within 50 cm. This is the
expected performance with negligible ionospheric and tropospheric errors.
Multi-Session Mapping with GEOSLAM
Figure: GNSS Measurement and GEOSLAM Estimate Errors w.r.t. Precise GNSS (East, North, and Up error in meters vs. distance along path in meters, shown for multiple sessions).
Resulting 3D model has mm-level resolution and sub-cm absolute accuracy
Research goal: Multi-UAV fully-automated collaborative high-resolution mapping