1. LiDAR in the Adverse Weather:
Dust, Snow, Rain and Fog (2)
Yu Huang
Sunnyvale, California
Yu.huang07@gmail.com
2. Outline
• Canadian Adverse Driving Conditions Dataset, 2020, 2
• Deep multimodal sensor fusion in unseen adverse weather, 2020, 8
• RADIATE: A Radar Dataset for Automotive Perception in Bad Weather,
2021, 4
• Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection, 2021, 7
• Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in
Adverse Weather, 2021, 8
• DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR
Point Clouds in Severe Winter Weather, 2021, 9
3. Canadian Adverse Driving Conditions Dataset
• The Canadian Adverse Driving Conditions (CADC) dataset was
collected with the Autonomoose autonomous vehicle platform, based
on a modified Lincoln MKZ.
• The dataset, collected during winter within the Region of Waterloo,
Canada, is an autonomous vehicle dataset that focuses specifically on
adverse driving conditions.
• It contains 7,000 frames of annotated data, collected in a variety of
winter weather conditions, from 8 cameras (Ximea MQ013CG-E2), a
lidar (VLP-32C) and a GNSS+INS system (Novatel OEM638).
• Lidar frame annotations that represent ground truth for 3D object
detection and tracking have been provided by Scale AI.
4. Canadian Adverse Driving Conditions Dataset
Autonomoose, autonomous vehicle testing platform
A map of data collected for CADC
7. Canadian Adverse Driving Conditions Dataset
Top-down lidar view of each snowfall level with the corresponding front camera image. Top: the left image pair
shows light snow and the right medium snow. Bottom: the left image pair shows heavy snow and the right extreme snow.
8. Deep Multimodal Sensor Fusion in Unseen Adverse Weather
• The fusion of multimodal sensor streams, such as camera, lidar, and radar
measurements, plays a critical role in object detection for autonomous vehicles.
• While existing methods exploit redundant information under good conditions,
they fail in adverse weather, where the sensory streams can be asymmetrically distorted.
• To address this data challenge, this paper presents a multi-modal dataset
acquired over 10,000 km of driving in northern Europe.
• Although this is a large multimodal dataset in adverse weather, with 100k
labels for lidar, camera, radar and gated NIR sensors, it does not by itself
suffice for training, as extreme weather is rare.
• It presents a deep fusion network for robust fusion without a large corpus of
labeled training data covering all asymmetric distortions.
• Departing from proposal-level fusion, a single-shot model adaptively fuses
features, driven by measurement entropy.
• The dataset and all models will be published.
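The entropy-steered fusion idea can be illustrated with a minimal sketch. This is not the paper's network; it only shows how a per-sensor Shannon entropy could be computed and used as an adaptive fusion weight (the function names and the simple linear weighting are our own simplification):

```python
import numpy as np

def sensor_entropy(img, bins=256):
    """Shannon entropy [bits] of an 8-bit sensor image, estimated
    from its intensity histogram. A distorted or blinded stream
    (e.g. lidar in dense fog) tends to carry less entropy."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_weighted_fusion(feature_maps, images):
    """Weight each sensor's feature map by its normalized entropy,
    so low-information streams contribute less to the fused map."""
    ents = np.array([sensor_entropy(im) for im in images])
    w = ents / ents.sum()
    return sum(wi * f for wi, f in zip(w, feature_maps))
```

In the paper the entropy is fed into learned feature-exchange blocks rather than used as a direct linear weight; the sketch only conveys the steering signal.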
9. Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Existing object detection methods, including efficient
Single-Shot detectors (SSD), are trained on
automotive datasets that are biased towards good
weather conditions. While these methods work well
in good conditions, they fail in rare weather events
(top). Lidar-only detectors, such as the same SSD
model trained on projected lidar depth, might be
distorted due to severe backscatter in fog or snow
(center). These asymmetric distortions are a
challenge for fusion methods that rely on redundant
information. The proposed method (bottom) learns to
tackle unseen (potentially asymmetric) distortions in
multimodal data without seeing training data of these
rare scenarios.
10. Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Right: Geographical coverage of
the data collection campaign
covering two months and 10,000
km in Germany, Sweden,
Denmark and Finland. Top Left:
Test vehicle setup with top-mounted
lidar, gated camera with
flash illumination, RGB camera,
proprietary radar, FIR camera,
weather station and road friction
sensor. Bottom Left: Distribution
of weather conditions throughout
the data acquisition. The driving
data is highly unbalanced with
respect to weather conditions and
only contains adverse conditions
as rare samples.
11. Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Multimodal sensor response of RGB camera, scanning lidar, gated camera and radar in a fog chamber
with dense fog. Reference recordings under clear conditions are shown in the first row, recordings in fog
with visibility of 23m are shown in the second row.
12. Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Overview of architecture consisting of four SSDs branches with deep feature exchange and adaptive fusion
of lidar, RGB camera, gated camera and radar. All sensory data is projected into the camera coordinate
system. To enable a steered fusion in-between sensors, the sensor entropy is provided to each feature
exchange block (red). The deep feature exchange blocks (white) interchange information (blue) with parallel
feature extraction blocks. The fused feature maps are analyzed by SSD blocks (orange).
13. Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Normalized entropy with respect to the clear reference recording for a gated camera, RGB camera, radar
and lidar in varying fog visibilities (left) and changing illumination (right). The entropy has been calculated
based on a dynamic scenario within a controlled fog chamber and a static scenario with changing natural
illumination settings. Note the asymmetric sensor failure for different sensor technologies.
15. RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
• This paper presents the RAdar Dataset In Adverse weaThEr (RADIATE), aiming to facilitate
research on object detection, tracking and scene understanding using radar sensing for
safe autonomous driving.
• RADIATE includes 3 hours of annotated radar images with more than 200K labelled road
actors in total, on average about 4.6 instances per radar image.
• It covers 8 different categories of actors in a variety of weather conditions (e.g., sun,
night, rain, fog and snow) and driving scenarios (e.g., parked, urban, motorway and
suburban), representing different levels of challenge.
• This is a public radar dataset providing high-resolution radar images collected on
public roads, with a large number of road actors labelled.
• Some baseline results of radar based object detection and recognition are given to show
that the use of radar data is promising for automotive applications in bad weather, where
vision and LiDAR can fail.
• RADIATE also has stereo images, 32-channel LiDAR and GPS data, directed at other
applications such as sensor fusion, localisation and mapping.
• The public dataset can be accessed at http://pro.hw.ac.uk/radiate/.
16. RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
Examples from RADIATE. This dataset contains radar,
stereo camera, LiDAR and GPS data. It was collected in
various weather conditions and driving scenarios with
8 categories of annotated objects.
Qualitative results of radar based vehicle detection.
17. RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
18. RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
Sensor setup for data collection.
Category distribution for each scenario.
19. RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
Data in various weather conditions. Top: Image with LiDAR points projected. Middle: Radar with objects
annotated. Bottom: LiDAR with objects projected from radar annotation. Note both image and LiDAR images
are degraded in fog, rain and snow. The yellow circles enclose false LiDAR points caused by snowflakes.
20. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
• Lidar-based object detectors are known to be sensitive to adverse weather conditions
such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and
signal-to-background ratio (SBR).
• As a result, lidar-based object detectors trained on data captured in normal weather tend
to perform poorly in such scenarios.
• However, collecting and labelling sufficient training data in a diverse range of adverse
weather conditions is laborious and prohibitively expensive.
• A physics-based approach is proposed to simulate lidar point clouds in adverse weather conditions.
• These augmented datasets can then be used to train lidar-based detectors to improve
their all-weather reliability.
• Specifically, a hybrid Monte-Carlo based approach treats (i) the effects of large particles
by placing them randomly and comparing their back reflected power against the target,
and (ii) attenuation effects on average through calculation of scattering efficiencies from
the Mie theory and particle size distributions.
• Retraining networks with this augmented data improves mean average precision
evaluated on real world rainy scenes.
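The particle-level ingredients above can be sketched numerically. The Marshall-Palmer size distribution N(D) = N0·exp(-ΛD), with N0 = 8000 m⁻³ mm⁻¹ and Λ = 4.1·R⁻⁰·²¹ mm⁻¹, is the standard form; the extinction integral below uses the large-particle limit Q_ext ≈ 2 instead of a full Mie calculation, so the constants and function names are illustrative and are not LISA's implementation:

```python
import numpy as np

N0 = 8000.0  # Marshall-Palmer intercept [m^-3 mm^-1]

def marshall_palmer_lambda(rain_rate):
    """Slope Lambda of the Marshall-Palmer distribution [mm^-1]
    for a rain rate given in mm/hr."""
    return 4.1 * rain_rate ** -0.21

def sample_droplet_diameters(rain_rate, n, rng=None):
    """Draw droplet diameters [mm] from the exponential
    Marshall-Palmer distribution."""
    if rng is None:
        rng = np.random.default_rng()
    return rng.exponential(scale=1.0 / marshall_palmer_lambda(rain_rate), size=n)

def extinction_coefficient(rain_rate, q_ext=2.0):
    """Extinction coefficient [1/m]: integral of Q_ext * droplet
    cross-section * N(D) over diameter, in the geometric limit
    Q_ext ~ 2. The 1e-6 factor converts mm^2 areas to m^2."""
    lam = marshall_palmer_lambda(rain_rate)  # [mm^-1]
    # closed form: integral of D^2 exp(-lam*D) dD = 2 / lam^3 (D in mm)
    return q_ext * np.pi / 4.0 * 1e-6 * N0 * 2.0 / lam ** 3
```

With these constants the extinction coefficient at 10 mm/hr comes out around 1.6e-3 m⁻¹, the right order of magnitude for heavy rain.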
21. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
LiDAR: a) Bistatic lidar (where the receiver Rx and transmitter
Tx use separate optical paths) parameters relevant for
weather augmentation calculations. b) Light scattering by
a single particle in the wave picture with a cartoon
radiation pattern. c) Light scattering by a random
ensemble of droplets in the ray picture.
Rain: a) Extinction efficiency as a function of rain droplet
diameter. b) Rain particle size distribution using the
Marshall-Palmer distribution for different rain rates. c)
Extinction coefficient as a function of rain rate from Mie
theory (orange) and the asymptotic solution (blue).
22. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
Equations on this slide (shown as images) model: the received intensity via the Beer-Lambert law, the
average extinction coefficient, the maximum detectable range, the extinction coefficients, the standard
deviation for range noise, the width of a Gaussian beam, and the beam waist.
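As a simple illustration of the first three quantities, the two-way Beer-Lambert attenuation and the resulting maximum detectable range can be sketched as follows. This ignores 1/R² beam spreading and is our simplification, not the paper's full model:

```python
import numpy as np

def received_intensity(i0, alpha, r):
    """Two-way Beer-Lambert attenuation: the pulse crosses the
    scattering medium out and back, hence the factor 2 in the
    exponent. alpha is the extinction coefficient [1/m]."""
    return i0 * np.exp(-2.0 * alpha * r)

def max_detectable_range(i0, i_min, alpha):
    """Range at which the attenuated return falls to the detector's
    noise floor i_min (spreading losses neglected)."""
    return np.log(i0 / i_min) / (2.0 * alpha)
```

For example, with a 1:100 dynamic range (i_min = 0.01·i0) and alpha = 0.005 m⁻¹, the maximum detectable range is ln(100)/0.01 ≈ 460 m.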
23. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
Graph of probability of occurrence of rain rate versus
rain rate in mm/hr. Rain rates are sampled from this
distribution for all simulation experiments.
In addition to the rain augmentations, it augments lidar
scenes in the KITTI dataset with snow, moderate
advection fog, and strong advection fog. The algorithm for
the simulation of snow is similar to that of rain.
24. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
A qualitative comparison for simulating a rainy lidar scene
against real world rainy data. Top: A clear weather scene
(Scene A) from the Waymo Open Dataset. Bottom left:
Scene A augmented with rain rate 35 mm/hr. Bottom right:
A real world rainy scene (Scene B). Rain leads to sparser
scenes, increased range uncertainty and reduced visibility.
25. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
A comparison of detection before and after rain simulation for three networks trained on the KITTI dataset, PointPillars
(first column), PV-RCNN (second column), and deformable PV-RCNN (third column). The first row depicts predictions
under normal conditions, while the second row depicts predictions under rainy conditions. The green bounding boxes
are the GT annotations, while the red bounding boxes are the predictions.
26. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
27. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
• It addresses the task of LiDAR-based 3D object detection in foggy weather.
• Collecting and annotating data in such scenarios is time-, labor- and cost-intensive.
• This paper tackles this by simulating physically accurate fog into clear-weather
scenes, so that existing real datasets collected in clear weather can be repurposed.
• 1) develop a physically valid fog simulation method that is applicable to any
LiDAR dataset. This unleashes the acquisition of large-scale foggy training
data at no extra cost. These partially synthetic data can be used to improve
the robustness of several perception methods, such as 3D object detection
and tracking or simultaneous localization and mapping, on real foggy data.
• 2) Through experiments with several SOTA detection approaches, fog
simulation can be leveraged to improve the performance for 3D object
detection in the presence of fog.
• The code is available at www.trace.ethz.ch/lidar_fog_simulation.
28. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
LiDAR returns caused by fog in the (top) scene. (a) shows
the strongest returns and (b) the last returns, color coded
by the LiDAR channel. The returns of the ground are
removed for better visibility of the points introduced by
fog. (red - low, cyan - high, 3D bounding box annotation
in green, ego vehicle dimensions in gray).
LiDAR returns caused by fog in the (top) scene. Color
coded by the LiDAR channel in (a) and by the intensity
in (b). The returns of the ground are removed for
better visibility of the points introduced by fog.
29. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
• To simulate the effect of fog on real-world LiDAR point clouds that have been recorded in
clear weather, it needs to resort to the optical system model that underlies the function
of the transmitter and receiver of the LiDAR sensor.
• It examines a single measurement/point, models the full signal of received power as a
function of the range, and recovers its exact form corresponding to the original
clear-weather measurement.
• This allows the method to operate in the signal domain and to implement the
transformation from clear weather to fog simply by modifying the part of the impulse
response that pertains to the optical channel (i.e. the atmosphere).
• Its central contribution is a direct relation between the response (range-dependent
received signal power) in clear weather and in fog for the same 3D scene; this relation
enables simulating fog on real clear-weather LiDAR measurements.
• Compared to clear weather, the spatial impulse response in fog is more involved, but it
can still be decomposed into two terms, corresponding to the hard and the soft target
respectively.
• Depending on the distance of the hard target from the sensor, the soft-target term of the
response may exhibit a larger maximum value than the hard-target term, which implies
that the measured range changes due to the presence of fog and becomes equal to the
point of maximum of the soft-target term.
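The range-shift effect described in the last bullet can be illustrated with a toy model: the sensor reports the range of the strongest return, and a sufficiently strong soft-target (fog) term near the sensor overshadows the attenuated hard-target peak. The functional forms and constants below are our simplification, not the paper's exact response model:

```python
import numpy as np

def measured_range(r0, alpha, beta_fog, reflectivity=1.0, r_min=1.0):
    """Toy model of the hard- vs soft-target competition.
    Hard target: attenuated peak at the true range r0.
    Soft target: distributed fog backscatter, strongest near the
    sensor. The sensor reports the range of the strongest return."""
    r = np.linspace(r_min, r0, 2000)
    p_soft = beta_fog * np.exp(-2 * alpha * r) / r**2      # fog return profile
    p_hard = reflectivity * np.exp(-2 * alpha * r0) / r0**2  # object return
    if p_soft.max() > p_hard:
        return r[np.argmax(p_soft)]  # fog overshadows the object
    return r0                        # fog too thin: true range kept
```

With thin fog the object at 30 m is still reported; past a certain fog backscatter strength the reported range collapses toward the sensor, mirroring panels (a) and (b) of the figure on the next slide.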
30. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
Sketch of a LiDAR sensor where the transmitter
Tx and the receiver Rx do not have coaxial optics,
but have parallel axes. This is called a bistatic
beam configuration.
The two terms of the received signal power P_R,fog from a single
LiDAR pulse, associated with the solid object that reflects the
pulse (P_R,fog^hard) and the soft fog target (P_R,fog^soft), plotted across
the range domain. While in (a) the fog is not thick enough to
yield a return, in (b) it is thick enough to yield a return that
overshadows the solid object at R0 = 30 m.
31. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
received signal power
hard target term
soft target term
32. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
Comparison of the fog simulation (bottom) to the previous fog
simulation in the STF dataset (middle) with α set to 0.06, which
corresponds to a meteorological optical range (MOR) ≈ 50m. In
the left column, the point cloud is color coded by the intensity
and in the right column it is color coded by the height (z value).
The top row shows the original point cloud.
33. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
34. Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
The (top) row shows predictions by PV-RCNN trained on the original clear weather data (first row in tables above),
the (bottom) row shows predictions by PV-RCNN trained on a mix of clear weather and simulated foggy data
(fourth row in tables above) on three example scenes from the STF dense fog test split. Ground truth boxes in
color, predictions of the model in white.
35. DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
• For autonomous vehicles to viably replace human drivers they must contend with
inclement weather.
• Falling rain and snow introduce noise in LiDAR returns resulting in both false positive and
false negative object detections.
• This article introduces the Winter Adverse Driving dataSet (WADS) collected in the snow
belt region of Michigan’s Upper Peninsula.
• WADS is the first multi-modal dataset featuring dense point-wise labeled sequential LiDAR
scans collected in severe winter weather; weather that would cause an experienced
driver to alter their driving behavior.
• It has labelled and will make available over 7 GB, or 3.6 billion, labelled LiDAR points out
of over 26 TB of total LiDAR and camera data collected.
• It also presents the Dynamic Statistical Outlier Removal (DSOR) filter, a statistical
PCL-based filter capable of removing snow with a higher recall than the SOTA snow
de-noising filter, while being 28% faster.
• The dataset and DSOR filter will be made available at https://bitbucket.org/autonomymtu/dsor_filter.
36. DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Part of Winter Adverse Driving dataSet (WADS) showing
moderate snowfall 0.6 in/hr (1.5 cm/hr). (top) Clutter
from snow particles obscures LiDAR point clouds and
reduces visibility. Here, two oncoming vehicles are
concealed by the snow. (bottom) The DSOR filter is faster
at de-noising snow clutter than the SOTA and enables
object detection in moderate and severe snow.
37. DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Distribution of classes in the WADS dataset. Scenes from suburban driving, including vehicles, roads, and
man-made structures, are included. Two novel classes are introduced: falling snow and accumulated snow.
38. DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
A labeled sequence from the WADS dataset. Every point has a unique label and represents one of 22
classes. Active falling snow (beige) and accumulated snow (off-white) are unique to the dataset.
39. DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
PCL’s SOR (Statistical Outlier Removal) filter is a general
noise removal filter widely used for cleaning point clouds;
It does not account for the non-uniform distribution of
the point cloud and, when applied to a scan with falling
snow, fails to remove it.
The DROR (Dynamic Radius Outlier Removal) filter applies
a threshold to the mean distance of points to their
neighbors in a given radius to remove sparse snowflakes;
To address changing LiDAR point spacing with distance,
the DROR filter changes the search radius as the distance
increases from the sensor; The DROR filter achieves a high
accuracy but fails to produce a clean point cloud at high
snowfall rates.
The DSOR (Dynamic Statistical Outlier Removal) filter is an
extension of PCL’s SOR filter, designed to address the
inherent non-uniformity in point clouds.
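A minimal sketch of the DSOR idea, assuming the published filter follows the description above: SOR's global threshold (mean + s·std of the k-nearest-neighbor distances) is scaled linearly with each point's range from the sensor, so distant, naturally sparse returns are not wrongly discarded. Parameter names and values here are illustrative, not the released filter's:

```python
import numpy as np

def knn_mean_dist(points, k=4):
    """Mean distance of each point to its k nearest neighbors
    (brute force; fine for small clouds, a KD-tree scales better)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    nearest = np.sort(d, axis=1)[:, 1:k + 1]  # skip the zero self-distance
    return nearest.mean(axis=1)

def dsor_filter(points, k=4, s=0.01, range_mult=0.05):
    """DSOR sketch: keep a point if its mean kNN distance is below
    the global SOR threshold scaled by its range from the sensor."""
    md = knn_mean_dist(points, k)
    global_thresh = md.mean() + s * md.std()   # classic SOR threshold
    ranges = np.linalg.norm(points, axis=1)    # distance from the sensor
    keep = md < global_thresh * range_mult * ranges  # dynamic threshold
    return points[keep]
```

On a dense object cluster plus one isolated near-range "snowflake" point, the range-scaled threshold keeps the cluster and drops the snowflake, which is the behavior the figure on the next slide compares against SOR and DROR.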
40. DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Qualitative comparison of the SOR filter, DROR filter and DSOR filter (left to right). The original point cloud shows
snow clutter (orange points) that degrades LiDAR perception. The DSOR filter removes more snow compared to
both the SOR and DROR filters and preserves most of the environmental features.
41. DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Percentage of filtered points as a function of range
(averaged over 100 point clouds). The DSOR filter
outperforms both the SOR and DROR filters in ranges
< 20m where most of the snow is concentrated.
The DSOR filter accurately filters out more snow than
the DROR filter and achieves a higher recall. Note that
the y-axis starts from 40 to better highlight differences
between the filters.