LiDAR in the Adverse Weather:
Dust, Snow, Rain and Fog (2)
Yu Huang
Sunnyvale, California
Yu.huang07@gmail.com
Outline
• Canadian Adverse Driving Conditions Dataset, 2020, 2
• Deep multimodal sensor fusion in unseen adverse weather, 2020, 8
• RADIATE: A Radar Dataset for Automotive Perception in Bad Weather,
2021, 4
• Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection, 2021, 7
• Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in
Adverse Weather, 2021, 8
• DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR
Point Clouds in Severe Winter Weather, 2021, 9
Canadian Adverse Driving Conditions Dataset
• The Canadian Adverse Driving Conditions (CADC) dataset was
collected with the Autonomoose autonomous vehicle platform, based
on a modified Lincoln MKZ.
• The dataset, collected during winter within the Region of Waterloo,
Canada, is an autonomous vehicle dataset that focuses specifically on
adverse driving conditions.
• It contains 7,000 frames of annotated data, collected in a variety of
winter weather conditions, from 8 cameras (Ximea MQ013CG-E2), a
lidar (VLP-32C) and a GNSS+INS system (Novatel OEM638).
• Lidar frame annotations that represent ground truth for 3D object
detection and tracking have been provided by Scale AI.
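For orientation, a minimal loading sketch, assuming the CADC devkit's KITTI-style layout where each lidar frame is a .bin file of packed float32 (x, y, z, intensity) values; the path shown is illustrative, not the guaranteed download structure:

```python
import numpy as np

def load_cadc_lidar(bin_path: str) -> np.ndarray:
    """Load one lidar frame stored as packed float32 (x, y, z, intensity)."""
    return np.fromfile(bin_path, dtype=np.float32).reshape(-1, 4)

# Illustrative path; adjust to the actual CADC download layout.
scan = load_cadc_lidar("2019_02_27/0001/labeled/lidar_points/data/0000000000.bin")
print(scan.shape)  # (N, 4)
```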
Canadian Adverse Driving Conditions Dataset
Autonomoose, autonomous vehicle testing platform
A map of data collected for CADC
Canadian Adverse Driving Conditions Dataset
Canadian Adverse Driving Conditions Dataset
Canadian Adverse Driving Conditions Dataset
Top-down lidar view of each snowfall level with the corresponding front camera image. Top: the left image pair shows
light snow and the right pair medium snow. Bottom: the left image pair shows heavy snow and the right pair extreme snow.
Deep Multimodal Sensor Fusion in Unseen Adverse Weather
• The fusion of multimodal sensor streams, such as camera, lidar, and radar
measurements, plays a critical role in object detection for autonomous vehicles.
• While existing methods exploit redundant information under good conditions, they fail
in adverse weather where the sensory streams can be asymmetrically distorted.
• To address this data challenge, this paper presents a multi-modal dataset
acquired by over 10,000 km of driving in northern Europe.
• Although this is a large multimodal dataset in adverse weather, with 100k
labels for lidar, camera, radar and gated NIR sensors, it does not by itself facilitate
training, as extreme weather is rare.
• It presents a deep fusion network for robust fusion without a large corpus of
labeled training data covering all asymmetric distortions.
• Departing from proposal-level fusion, a single-shot model adaptively fuses
features, driven by measurement entropy.
• The dataset and all models will be published.
Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Existing object detection methods, including efficient
Single-Shot detectors (SSD), are trained on
automotive datasets that are biased towards good
weather conditions. While these methods work well
in good conditions, they fail in rare weather events
(top). Lidar-only detectors, such as the same SSD
model trained on projected lidar depth, might be
distorted due to severe backscatter in fog or snow
(center). These asymmetric distortions are a
challenge for fusion methods that rely on redundant
information. The proposed method (bottom) learns to
tackle unseen (potentially asymmetric) distortions in
multimodal data without seeing training data of these
rare scenarios.
Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Right: Geographical coverage of
the data collection campaign
covering two months and 10,000
km in Germany, Sweden,
Denmark and Finland. Top Left:
Test vehicle setup with top-
mounted lidar, gated camera with
flash illumination, RGB camera,
proprietary radar, FIR camera,
weather station and road friction
sensor. Bottom Left: Distribution
of weather conditions throughout
the data acquisition. The driving
data is highly unbalanced with
respect to weather conditions and
only contains adverse conditions
as rare samples.
Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Multimodal sensor response of RGB camera, scanning lidar, gated camera and radar in a fog chamber
with dense fog. Reference recordings under clear conditions are shown in the first row; recordings in fog
with a visibility of 23 m are shown in the second row.
Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Overview of the architecture, consisting of four SSD branches with deep feature exchange and adaptive fusion
of lidar, RGB camera, gated camera and radar. All sensory data is projected into the camera coordinate
system. To enable a steered fusion in-between sensors, the sensor entropy is provided to each feature
exchange block (red). The deep feature exchange blocks (white) interchange information (blue) with parallel
feature extraction blocks. The fused feature maps are analyzed by SSD blocks (orange).
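To make the entropy-steered exchange concrete, here is a minimal PyTorch-style sketch of one feature exchange step. The module name, the 1x1-convolution mixing, and the channel bookkeeping are illustrative assumptions, not the paper's exact blocks:

```python
import torch
import torch.nn as nn

class EntropyFeatureExchange(nn.Module):
    """Illustrative sketch: exchange per-sensor features, steered by sensor entropy."""
    def __init__(self, channels: int, num_sensors: int = 4):
        super().__init__()
        # 1x1 conv mixes the concatenated features plus one entropy map per sensor
        self.mix = nn.Conv2d(num_sensors * channels + num_sensors,
                             num_sensors * channels, kernel_size=1)

    def forward(self, feats, entropy):
        # feats: list of (B, C, H, W) maps, one per sensor branch
        # entropy: (B, num_sensors, H, W) normalized entropy maps
        mixed = self.mix(torch.cat(feats + [entropy], dim=1))
        # hand the exchanged features back to the parallel branches
        return list(mixed.chunk(len(feats), dim=1))
```

Each branch would then continue its own feature extraction on the returned map, as in the figure.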
Deep Multimodal Sensor Fusion in Unseen Adverse Weather
Normalized entropy with respect to the clear reference recording for a gated camera, RGB camera, radar
and lidar in varying fog visibilities (left) and changing illumination (right). The entropy has been calculated
based on a dynamic scenario within a controlled fog chamber and a static scenario with changing natural
illumination settings. Note the asymmetric sensor failure for different sensor technologies.
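A sketch of how such a normalized entropy can be computed for an 8-bit sensor image, assuming a clear-weather reference recording of the same sensor is available (function names are illustrative):

```python
import numpy as np

def image_entropy(img_u8: np.ndarray) -> float:
    """Shannon entropy (bits per pixel) of an 8-bit image."""
    hist = np.bincount(img_u8.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def normalized_entropy(img_u8: np.ndarray, clear_ref_u8: np.ndarray) -> float:
    """Entropy relative to a clear-weather reference recording."""
    return image_entropy(img_u8) / image_entropy(clear_ref_u8)
```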
Deep Multimodal Sensor Fusion in Unseen Adverse Weather
RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
• This paper presents the RAdar Dataset In Adverse weaThEr (RADIATE), aiming to facilitate
research on object detection, tracking and scene understanding using radar sensing for
safe autonomous driving.
• RADIATE includes 3 hours of annotated radar images with more than 200K labelled road
actors in total, on average about 4.6 instances per radar image.
• It covers 8 different categories of actors in a variety of weather conditions (e.g., sun,
night, rain, fog and snow) and driving scenarios (e.g., parked, urban, motorway and
suburban), representing different levels of challenge.
• This is a public radar dataset which provides high-resolution radar images on public roads
with a large number of road actors labelled.
• Some baseline results of radar-based object detection and recognition are given to show
that the use of radar data is promising for automotive applications in bad weather, where
vision and LiDAR can fail.
• RADIATE also has stereo images, 32-channel LiDAR and GPS data, directed at other
applications such as sensor fusion, localisation and mapping.
• The public dataset can be accessed at http://pro.hw.ac.uk/radiate/.
RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
Examples from RADIATE. This dataset contains radar,
stereo camera, LiDAR and GPS data. It was collected in
various weather conditions and driving scenarios with
8 categories of annotated objects.
Qualitative results of radar-based vehicle detection.
RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
Sensor setup for data collection.
Category distribution for each scenario.
RADIATE: A Radar Dataset for Automotive Perception in
Bad Weather
Data in various weather conditions. Top: Image with LiDAR points projected. Middle: Radar with objects
annotated. Bottom: LiDAR with objects projected from radar annotation. Note that both camera and LiDAR data
are degraded in fog, rain and snow. The yellow circles enclose false LiDAR points caused by snowflakes.
Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
• Lidar-based object detectors are known to be sensitive to adverse weather conditions
such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-
background ratio (SBR).
• As a result, lidar-based object detectors trained on data captured in normal weather tend
to perform poorly in such scenarios.
• However, collecting and labelling sufficient training data in a diverse range of adverse
weather conditions is laborious and prohibitively expensive.
• This paper proposes a physics-based approach to simulate lidar point clouds in adverse weather conditions.
• These augmented datasets can then be used to train lidar-based detectors to improve
their all-weather reliability.
• Specifically, a hybrid Monte-Carlo based approach treats (i) the effects of large particles
by placing them randomly and comparing their back-reflected power against the target,
and (ii) attenuation effects on average through calculation of scattering efficiencies from
Mie theory and particle size distributions.
• Retraining networks with this augmented data improves mean average precision
evaluated on real-world rainy scenes.
Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
LiDAR: a) Bistatic lidar (where the receiver Rx and transmitter
Tx have separate optical paths) parameters relevant for
weather augmentation calculations. b) Light scattering by
a single particle in the wave picture with a cartoon
radiation pattern. c) Light scattering by a random
ensemble of droplets in the ray picture.
Rain: a) Extinction efficiency as a function of rain droplet
diameter. b) Rain particle size distribution using the
Marshall-Palmer distribution for different rain rates. c)
Extinction coefficient as a function of rain rate from Mie
theory (orange) and the asymptotic solution (blue).
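The Marshall-Palmer drop-size distribution in panel b) is the classical exponential model; a minimal sketch (rain rate Rr in mm/hr, drop diameter D in mm):

```python
import numpy as np

def marshall_palmer(D_mm: np.ndarray, rain_rate_mm_hr: float) -> np.ndarray:
    """N(D) = N0 * exp(-Lambda * D), in drops per m^3 per mm of diameter,
    with N0 = 8000 m^-3 mm^-1 and Lambda = 4.1 * Rr^-0.21 mm^-1
    (Marshall & Palmer, 1948)."""
    N0 = 8000.0
    lam = 4.1 * rain_rate_mm_hr ** (-0.21)
    return N0 * np.exp(-lam * D_mm)
```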
Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
Quantities defined on this slide: intensity attenuation by the Beer-Lambert law, the average extinction
coefficient, the maximum detectable range, extinction coefficients, the standard deviation of range noise,
the width of a Gaussian beam, and the beam waist.
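A minimal sketch of how these quantities combine in a LISA-style augmentation: attenuate each return with the Beer-Lambert law, drop returns that fall below the detector's minimum intensity, and jitter the range of the survivors. The extinction coefficient alpha, range-noise sigma, and threshold i_min are user-supplied assumptions here; the Monte-Carlo sampling of large particles is omitted:

```python
import numpy as np

def attenuate_scan(points: np.ndarray, alpha: float,
                   sigma_r: float, i_min: float) -> np.ndarray:
    """points: (N, 4) array of x, y, z, intensity recorded in clear weather."""
    r = np.linalg.norm(points[:, :3], axis=1)
    # two-way Beer-Lambert attenuation with extinction coefficient alpha [1/m]
    i_wet = points[:, 3] * np.exp(-2.0 * alpha * r)
    keep = i_wet > i_min                  # returns below threshold are lost
    out = points[keep].copy()
    out[:, 3] = i_wet[keep]
    # range noise: scale each surviving point radially by a jittered range
    r_kept = r[keep]
    r_noisy = r_kept + np.random.normal(0.0, sigma_r, size=r_kept.shape)
    out[:, :3] *= (r_noisy / np.maximum(r_kept, 1e-6))[:, None]
    return out
```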
Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
Probability of occurrence as a function of rain rate in
mm/hr. Rain rates are sampled from this distribution
for all simulation experiments.
In addition to the rain augmentations, it augments lidar
scenes in the KITTI dataset with snow, moderate
advection fog, and strong advection fog. The algorithm for
the simulation of snow is similar to that of rain.
Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
A qualitative comparison for simulating a rainy lidar scene
against real world rainy data. Top: A clear weather scene
(Scene A) from the Waymo Open Dataset. Bottom left:
Scene A augmented with rain rate 35 mm/hr. Bottom right:
A real world rainy scene (Scene B). Rain leads to sparser
scenes, increased range uncertainty and reduced visibility.
Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
A comparison of detection before and after rain simulation for three networks trained on the KITTI dataset, PointPillars
(first column), PV-RCNN (second column), and deformable PV-RCNN (third column). The first row depicts predictions
under normal conditions, while the second row depicts predictions under rainy conditions. The green bounding boxes
are the GT annotations, while the red bounding boxes are the predictions.
Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
• It addresses the task of LiDAR-based 3D object detection in foggy weather.
• Collecting and annotating data in such scenarios is time-, labor- and cost-intensive.
• This paper tackles the problem by simulating physically accurate fog in clear-weather
scenes, so that existing real datasets collected in clear weather can be repurposed.
• 1) It develops a physically valid fog simulation method that is applicable to any
LiDAR dataset. This unleashes the acquisition of large-scale foggy training
data at no extra cost. These partially synthetic data can be used to improve
the robustness of several perception methods, such as 3D object detection
and tracking or simultaneous localization and mapping, on real foggy data.
• 2) Through experiments with several SOTA detection approaches, it shows that fog
simulation can be leveraged to improve the performance of 3D object
detection in the presence of fog.
• The code is available at www.trace.ethz.ch/lidar_fog_simulation.
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
LiDAR returns caused by fog in the scene shown at the top. (a)
shows the strongest returns and (b) the last returns, color-coded
by the LiDAR channel. The returns of the ground are
removed for better visibility of the points introduced by
fog. (red - low, cyan - high, 3D bounding box annotation
in green, ego vehicle dimensions in gray).
LiDAR returns caused by fog in the scene shown at the top.
Color-coded by the LiDAR channel in (a) and by the intensity
in (b). The returns of the ground are removed for
better visibility of the points introduced by fog.
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
• To simulate the effect of fog on real-world LiDAR point clouds that have been recorded in
clear weather, the method resorts to the optical system model that underlies the
operation of the transmitter and receiver of the LiDAR sensor.
• It examines a single measurement/point, models the full signal of received power as a
function of range, and recovers its exact form corresponding to the original clear-
weather measurement.
• This makes it possible to operate in the signal domain and implement the transformation
from clear weather to fog simply by modifying the part of the impulse response that
pertains to the optical channel (i.e., the atmosphere).
• The key contribution is the construction of a direct relation between the response (range-
dependent received signal power) in clear weather and in fog for the same 3D scene;
this relation enables simulating fog on real clear-weather LiDAR measurements.
• Compared to clear weather, the spatial impulse response in fog is more involved, but it
can still be decomposed into two terms, corresponding to the hard and the soft target
respectively.
• Depending on the distance of the hard target from the sensor, the soft-target term of the
response may exhibit a larger maximum value than the hard-target term, which implies
that the measured range changes due to the presence of fog and becomes equal to the
range at which the soft-target term attains its maximum.
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
Sketch of a LiDAR sensor where the transmitter
Tx and the receiver Rx do not have coaxial optics,
but have parallel axes. This is called a bistatic
beam configuration.
The two terms of the received signal power P_R,fog from a single
LiDAR pulse, associated with the solid object that reflects the
pulse (P_R,fog^hard) and the soft fog target (P_R,fog^soft), plotted across
the range domain. While in (a) the fog is not thick enough to
yield a return, in (b) it is thick enough to yield a return that
overshadows the solid object at R0 = 30 m.
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
Equations on this slide define the received signal power and its hard-target and soft-target terms.
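A hedged reconstruction of the decomposition in the paper's general notation, where α is the fog attenuation coefficient, β the fog backscatter coefficient, R0 the hard-target range, and ξ(R) the crossover function of the bistatic beam geometry (constants omitted):

```latex
P_{R,\mathrm{fog}}(R) = P^{\mathrm{hard}}_{R,\mathrm{fog}}(R) + P^{\mathrm{soft}}_{R,\mathrm{fog}}(R),
\qquad
P^{\mathrm{hard}}_{R,\mathrm{fog}}(R) = e^{-2\alpha R_0}\, P^{\mathrm{hard}}_{R}(R),
\qquad
P^{\mathrm{soft}}_{R,\mathrm{fog}}(R) \propto \beta\, \frac{\xi(R)}{R^{2}}\, e^{-2\alpha R}
\quad \text{for } R < R_0 .
```

The hard term is the clear-weather response attenuated over the two-way path to R0; the soft term accumulates fog backscatter along the beam up to the hard target, which is what can overtake the hard-target peak in thick fog.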
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
Comparison of this fog simulation (bottom) to the previous fog
simulation in the STF dataset (middle), with α set to 0.06, which
corresponds to a meteorological optical range (MOR) ≈ 50m. In
the left column, the point cloud is color coded by the intensity
and in the right column it is color coded by the height (z value).
The top row shows the original point cloud.
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
Fog Simulation on Real LiDAR Point Clouds for 3D Object
Detection in Adverse Weather
The top row shows predictions by PV-RCNN trained on the original clear-weather data (first row in the tables above);
the bottom row shows predictions by PV-RCNN trained on a mix of clear-weather and simulated foggy data
(fourth row in the tables above) on three example scenes from the STF dense fog test split. Ground truth boxes in
color, predictions of the model in white.
DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
• For autonomous vehicles to viably replace human drivers, they must contend with
inclement weather.
• Falling rain and snow introduce noise into LiDAR returns, resulting in both false-positive
and false-negative object detections.
• This article introduces the Winter Adverse Driving dataSet (WADS) collected in the snow
belt region of Michigan’s Upper Peninsula.
• WADS is the first multi-modal dataset featuring dense, point-wise labeled sequential LiDAR
scans collected in severe winter weather: weather that would cause an experienced
driver to alter their driving behavior.
• The authors have labelled, and will make available, over 7 GB (3.6 billion points) of
labelled LiDAR data out of over 26 TB of total LiDAR and camera data collected.
• It also presents the Dynamic Statistical Outlier Removal (DSOR) filter, a statistical PCL-
based filter capable of removing snow with a higher recall than the SOTA snow de-
noising filter while being 28% faster.
• The dataset and DSOR filter will be at https://bitbucket.org/autonomymtu/dsor_filter.
DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Part of the Winter Adverse Driving dataSet (WADS) showing
moderate snowfall of 0.6 in/hr (1.5 cm/hr). (top) Clutter
from snow particles obscures LiDAR point clouds and
reduces visibility. Here, two oncoming vehicles are
concealed by the snow. (bottom) The DSOR filter is faster
at de-noising snow clutter than the SOTA and enables
object detection in moderate and severe snow.
DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Distribution of classes in the WADS dataset. Scenes from suburban driving, including vehicles, roads, and
man-made structures, are included, along with two novel classes specific to adverse winter weather:
falling snow and accumulated snow.
DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
A labeled sequence from the WADS dataset. Every point has a unique label and represents one of 22
classes. Active falling snow (beige) and accumulated snow (off-white) are unique to the dataset.
DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
PCL's SOR (Statistical Outlier Removal) filter is a general
noise removal filter widely used for cleaning point clouds.
It does not account for the non-uniform density of
the point cloud and, when applied to a scan with falling
snow, fails to remove it.
The DROR (Dynamic Radius Outlier Removal) filter removes
sparse snowflakes by thresholding the number of neighbors
each point has within a given radius. To address the
growing LiDAR point spacing with distance, DROR enlarges
the search radius as the distance from the sensor
increases. The DROR filter achieves a high
accuracy but fails to produce a clean point cloud at high
snowfall rates.
The DSOR (Dynamic Statistical Outlier Removal) filter is an
extension of PCL’s SOR filter, designed to address the
inherent non-uniformity in point clouds.
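A minimal sketch of a DSOR-style filter following the recipe described above: compute SOR's global mean and standard deviation of k-nearest-neighbor distances, then scale the resulting threshold linearly with range so that naturally sparse far-field points survive. Parameter names and default values are illustrative, not the paper's exact settings:

```python
import numpy as np
from scipy.spatial import cKDTree

def dsor(points: np.ndarray, k: int = 4,
         std_mult: float = 0.01, range_mult: float = 0.05) -> np.ndarray:
    """points: (N, 3+) array; returns the de-noised subset."""
    xyz = points[:, :3]
    tree = cKDTree(xyz)
    d, _ = tree.query(xyz, k=k + 1)            # neighbors incl. the point itself
    mean_d = d[:, 1:].mean(axis=1)             # mean k-NN distance per point
    g_mean, g_std = mean_d.mean(), mean_d.std()
    base = g_mean + std_mult * g_std           # PCL SOR-style global threshold
    r = np.linalg.norm(xyz, axis=1)
    dyn = base * range_mult * r                # threshold grows with range
    return points[mean_d < dyn]
```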
DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Qualitative comparison of the SOR filter, DROR filter and DSOR filter (left to right). The original point cloud shows
snow clutter (orange points) that degrades LiDAR perception. The DSOR filter removes more snow compared to
both the SOR and DROR filters and preserves most of the environmental features.
DSOR: A Scalable Statistical Filter for Removing Falling Snow
from LiDAR Point Clouds in Severe Winter Weather
Percentage of filtered points as a function of range
(averaged over 100 point clouds). The DSOR filter
outperforms both the SOR and DROR filters in ranges
< 20m where most of the snow is concentrated.
The DSOR filter accurately filters out more snow than
the DROR filter and achieves a higher recall. Note that
the y-axis starts from 40 to better highlight differences
between the filters.
Thanks

More Related Content

What's hot

Drowsiness Detection Presentation
Drowsiness Detection PresentationDrowsiness Detection Presentation
Drowsiness Detection PresentationSaurabh Kawli
 
20190131 lidar-camera fusion semantic segmentation survey
20190131 lidar-camera fusion semantic segmentation survey20190131 lidar-camera fusion semantic segmentation survey
20190131 lidar-camera fusion semantic segmentation surveyTakuya Minagawa
 
Real time drowsy driver detection
Real time drowsy driver detectionReal time drowsy driver detection
Real time drowsy driver detectioncsandit
 
Fisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IVFisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IVYu Huang
 
fusion of Camera and lidar for autonomous driving II
fusion of Camera and lidar for autonomous driving IIfusion of Camera and lidar for autonomous driving II
fusion of Camera and lidar for autonomous driving IIYu Huang
 
3-d interpretation from single 2-d image for autonomous driving II
3-d interpretation from single 2-d image for autonomous driving II3-d interpretation from single 2-d image for autonomous driving II
3-d interpretation from single 2-d image for autonomous driving IIYu Huang
 
EYE TRACKING TECHNOLOGY
EYE TRACKING TECHNOLOGYEYE TRACKING TECHNOLOGY
EYE TRACKING TECHNOLOGYgeothomas18
 
Diabetic Retinopathy Analysis using Fundus Image
Diabetic Retinopathy Analysis using Fundus ImageDiabetic Retinopathy Analysis using Fundus Image
Diabetic Retinopathy Analysis using Fundus ImageManjushree Mashal
 
SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~
SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~
SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~SSII
 
Deep Learning’s Application in Radar Signal Data II
Deep Learning’s Application in Radar Signal Data IIDeep Learning’s Application in Radar Signal Data II
Deep Learning’s Application in Radar Signal Data IIYu Huang
 
Virtual retinal display ppt
Virtual retinal display pptVirtual retinal display ppt
Virtual retinal display pptHina Saxena
 
Multisensor Fusion and Integration - pres
Multisensor Fusion and Integration - presMultisensor Fusion and Integration - pres
Multisensor Fusion and Integration - presPraneel Chand
 
Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...
Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...
Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...Vignesh C
 
Single Image Super Resolution Overview
Single Image Super Resolution OverviewSingle Image Super Resolution Overview
Single Image Super Resolution OverviewLEE HOSEONG
 
Detection and recognition of face using neural network
Detection and recognition of face using neural networkDetection and recognition of face using neural network
Detection and recognition of face using neural networkSmriti Tikoo
 
3D SLAM introcution& current status
3D SLAM introcution& current status3D SLAM introcution& current status
3D SLAM introcution& current statuse8xu
 

What's hot (20)

Drowsiness Detection Presentation
Drowsiness Detection PresentationDrowsiness Detection Presentation
Drowsiness Detection Presentation
 
20190131 lidar-camera fusion semantic segmentation survey
20190131 lidar-camera fusion semantic segmentation survey20190131 lidar-camera fusion semantic segmentation survey
20190131 lidar-camera fusion semantic segmentation survey
 
Real time drowsy driver detection
Real time drowsy driver detectionReal time drowsy driver detection
Real time drowsy driver detection
 
Fisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IVFisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IV
 
fusion of Camera and lidar for autonomous driving II
fusion of Camera and lidar for autonomous driving IIfusion of Camera and lidar for autonomous driving II
fusion of Camera and lidar for autonomous driving II
 
3-d interpretation from single 2-d image for autonomous driving II
3-d interpretation from single 2-d image for autonomous driving II3-d interpretation from single 2-d image for autonomous driving II
3-d interpretation from single 2-d image for autonomous driving II
 
EYE TRACKING TECHNOLOGY
EYE TRACKING TECHNOLOGYEYE TRACKING TECHNOLOGY
EYE TRACKING TECHNOLOGY
 
Diabetic Retinopathy Analysis using Fundus Image
Diabetic Retinopathy Analysis using Fundus ImageDiabetic Retinopathy Analysis using Fundus Image
Diabetic Retinopathy Analysis using Fundus Image
 
SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~
SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~
SSII2021 [TS1] Visual SLAM ~カメラ幾何の基礎から最近の技術動向まで~
 
Face recognition
Face recognitionFace recognition
Face recognition
 
Deep Learning’s Application in Radar Signal Data II
Deep Learning’s Application in Radar Signal Data IIDeep Learning’s Application in Radar Signal Data II
Deep Learning’s Application in Radar Signal Data II
 
Virtual retinal display ppt
Virtual retinal display pptVirtual retinal display ppt
Virtual retinal display ppt
 
Multisensor Fusion and Integration - pres
Multisensor Fusion and Integration - presMultisensor Fusion and Integration - pres
Multisensor Fusion and Integration - pres
 
Data fusion
Data fusionData fusion
Data fusion
 
Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...
Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...
Drowsiness State Detection of Driver using Eyelid Movement- IRE Journal Confe...
 
Single Image Super Resolution Overview
Single Image Super Resolution OverviewSingle Image Super Resolution Overview
Single Image Super Resolution Overview
 
Detection and recognition of face using neural network
Detection and recognition of face using neural networkDetection and recognition of face using neural network
Detection and recognition of face using neural network
 
3D SLAM introcution& current status
3D SLAM introcution& current status3D SLAM introcution& current status
3D SLAM introcution& current status
 
Introduction of slam
Introduction of slamIntroduction of slam
Introduction of slam
 
face detection
face detectionface detection
face detection
 

Similar to LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)

Lidar in the adverse weather: dust, fog, snow and rain
Lidar in the adverse weather: dust, fog, snow and rainLidar in the adverse weather: dust, fog, snow and rain
Lidar in the adverse weather: dust, fog, snow and rainYu Huang
 
Lidar final ppt
Lidar final pptLidar final ppt
Lidar final pptrsarnagat
 
Lidar : light detection and rangeing
Lidar : light detection and rangeingLidar : light detection and rangeing
Lidar : light detection and rangeingRahul Bhagore
 
Introduction to lidar and its application
Introduction to lidar and its applicationIntroduction to lidar and its application
Introduction to lidar and its applicationVedant Srivastava
 
LiDAR and its application in civil engineering
LiDAR  and its application in civil engineeringLiDAR  and its application in civil engineering
LiDAR and its application in civil engineeringchippi babu
 
Lidarpptbysharath copy-140316044543-phpapp01
Lidarpptbysharath copy-140316044543-phpapp01Lidarpptbysharath copy-140316044543-phpapp01
Lidarpptbysharath copy-140316044543-phpapp01bhanu kushangala
 
Raymetrics at Open Coffee Athens #92
Raymetrics at Open Coffee Athens #92Raymetrics at Open Coffee Athens #92
Raymetrics at Open Coffee Athens #92Open Coffee Greece
 
Synthetic aperture radar
Synthetic aperture radar Synthetic aperture radar
Synthetic aperture radar Cigi Cletus
 
Airborne Laser Scanning Remote Sensing with LiDAR.ppt
Airborne Laser Scanning Remote Sensing with LiDAR.pptAirborne Laser Scanning Remote Sensing with LiDAR.ppt
Airborne Laser Scanning Remote Sensing with LiDAR.pptssuser6358fd
 
Lidar technology and it’s applications
Lidar technology and it’s applicationsLidar technology and it’s applications
Lidar technology and it’s applicationskarthik chegireddy
 
Differentiation between primary and secondary LIDAR system of Remote Sensing
Differentiation between primary and secondary LIDAR system of Remote SensingDifferentiation between primary and secondary LIDAR system of Remote Sensing
Differentiation between primary and secondary LIDAR system of Remote SensingNzar Braim
 
IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...
IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...
IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...IRJET Journal
 
Vegetation monitoring using gpm data over mongolia
Vegetation  monitoring using gpm data over mongoliaVegetation  monitoring using gpm data over mongolia
Vegetation monitoring using gpm data over mongoliaGeoMedeelel
 
Application Of Digital Signal Processing In Radar Signals
Application Of Digital Signal Processing In Radar SignalsApplication Of Digital Signal Processing In Radar Signals
Application Of Digital Signal Processing In Radar SignalsRichard Hogue
 

Similar to LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2) (20)

Lidar in the adverse weather: dust, fog, snow and rain
Lidar in the adverse weather: dust, fog, snow and rainLidar in the adverse weather: dust, fog, snow and rain
Lidar in the adverse weather: dust, fog, snow and rain
 
Lidar final ppt
Lidar final pptLidar final ppt
Lidar final ppt
 
Lidar : light detection and rangeing
Lidar : light detection and rangeingLidar : light detection and rangeing
Lidar : light detection and rangeing
 
Introduction to lidar and its application
Introduction to lidar and its applicationIntroduction to lidar and its application
Introduction to lidar and its application
 
LiDAR and its application in civil engineering
LiDAR  and its application in civil engineeringLiDAR  and its application in civil engineering
LiDAR and its application in civil engineering
 
Lidar
LidarLidar
Lidar
 
Lidarpptbysharath copy-140316044543-phpapp01
Lidarpptbysharath copy-140316044543-phpapp01Lidarpptbysharath copy-140316044543-phpapp01
Lidarpptbysharath copy-140316044543-phpapp01
 
LIDAR
LIDARLIDAR
LIDAR
 
The UAE solar Atlas
The UAE solar AtlasThe UAE solar Atlas
The UAE solar Atlas
 
LIDAR
LIDARLIDAR
LIDAR
 
Raymetrics at Open Coffee Athens #92
Raymetrics at Open Coffee Athens #92Raymetrics at Open Coffee Athens #92
Raymetrics at Open Coffee Athens #92
 
Synthetic aperture radar
Synthetic aperture radar Synthetic aperture radar
Synthetic aperture radar
 
Airborne Laser Scanning Remote Sensing with LiDAR.ppt
Airborne Laser Scanning Remote Sensing with LiDAR.pptAirborne Laser Scanning Remote Sensing with LiDAR.ppt
Airborne Laser Scanning Remote Sensing with LiDAR.ppt
 
Lidar technology and it’s applications
Lidar technology and it’s applicationsLidar technology and it’s applications
Lidar technology and it’s applications
 
Differentiation between primary and secondary LIDAR system of Remote Sensing
Differentiation between primary and secondary LIDAR system of Remote SensingDifferentiation between primary and secondary LIDAR system of Remote Sensing
Differentiation between primary and secondary LIDAR system of Remote Sensing
 
Lidar and sensing
Lidar and sensingLidar and sensing
Lidar and sensing
 
IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...
IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...
IRJET- Advanced Border Security Alert for Fishermen and Smart Data Transfer u...
 
Lesnoy dozor small
Lesnoy dozor smallLesnoy dozor small
Lesnoy dozor small
 
Vegetation monitoring using gpm data over mongolia
Vegetation  monitoring using gpm data over mongoliaVegetation  monitoring using gpm data over mongolia
Vegetation monitoring using gpm data over mongolia
 
Application Of Digital Signal Processing In Radar Signals
Application Of Digital Signal Processing In Radar SignalsApplication Of Digital Signal Processing In Radar Signals
Application Of Digital Signal Processing In Radar Signals
 

More from Yu Huang

Application of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous DrivingApplication of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous DrivingYu Huang
 
Data Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous DrivingData Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous DrivingYu Huang
 
Techniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous DrivingTechniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous DrivingYu Huang
 
BEV Joint Detection and Segmentation
BEV Joint Detection and SegmentationBEV Joint Detection and Segmentation
BEV Joint Detection and SegmentationYu Huang
 
BEV Object Detection and Prediction
BEV Object Detection and PredictionBEV Object Detection and Prediction
BEV Object Detection and PredictionYu Huang
 
Fisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VIFisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VIYu Huang
 
Fisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving VFisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving VYu Huang
 
Prediction,Planninng & Control at Baidu
Prediction,Planninng & Control at BaiduPrediction,Planninng & Control at Baidu
Prediction,Planninng & Control at BaiduYu Huang
 
Cruise AI under the Hood
Cruise AI under the HoodCruise AI under the Hood
Cruise AI under the HoodYu Huang
 
Scenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous DrivingScenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous DrivingYu Huang
 
How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?Yu Huang
 
Annotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous DrivingAnnotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous DrivingYu Huang
 
Simulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atgSimulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atgYu Huang
 
Multi sensor calibration by deep learning
Multi sensor calibration by deep learningMulti sensor calibration by deep learning
Multi sensor calibration by deep learningYu Huang
 
Prediction and planning for self driving at waymo
Prediction and planning for self driving at waymoPrediction and planning for self driving at waymo
Prediction and planning for self driving at waymoYu Huang
 
Jointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planningJointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planningYu Huang
 
Data pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous drivingData pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous drivingYu Huang
 
Open Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planningOpen Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planningYu Huang
 
Autonomous Driving of L3/L4 Commercial trucks
Autonomous Driving of L3/L4 Commercial trucksAutonomous Driving of L3/L4 Commercial trucks
Autonomous Driving of L3/L4 Commercial trucksYu Huang
 
3-d interpretation from single 2-d image V
3-d interpretation from single 2-d image V3-d interpretation from single 2-d image V
3-d interpretation from single 2-d image VYu Huang
 

More from Yu Huang (20)

Application of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous DrivingApplication of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous Driving
 
Data Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous DrivingData Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous Driving
 
Techniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous DrivingTechniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous Driving
 
BEV Joint Detection and Segmentation
BEV Joint Detection and SegmentationBEV Joint Detection and Segmentation
BEV Joint Detection and Segmentation
 
BEV Object Detection and Prediction
BEV Object Detection and PredictionBEV Object Detection and Prediction
BEV Object Detection and Prediction
 
Fisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VIFisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VI
 
Fisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving VFisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving V
 
Prediction,Planninng & Control at Baidu
Prediction,Planninng & Control at BaiduPrediction,Planninng & Control at Baidu
Prediction,Planninng & Control at Baidu
 
Cruise AI under the Hood
Cruise AI under the HoodCruise AI under the Hood
Cruise AI under the Hood
 
Scenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous DrivingScenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous Driving
 
How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?
 
Annotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous DrivingAnnotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous Driving
 
Simulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atgSimulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atg
 
Multi sensor calibration by deep learning
Multi sensor calibration by deep learningMulti sensor calibration by deep learning
Multi sensor calibration by deep learning
 
Prediction and planning for self driving at waymo
Prediction and planning for self driving at waymoPrediction and planning for self driving at waymo
Prediction and planning for self driving at waymo
 
Jointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planningJointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planning
 
Data pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous drivingData pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous driving
 
Open Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planningOpen Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planning
 
Autonomous Driving of L3/L4 Commercial trucks
Autonomous Driving of L3/L4 Commercial trucksAutonomous Driving of L3/L4 Commercial trucks
Autonomous Driving of L3/L4 Commercial trucks
 
3-d interpretation from single 2-d image V
3-d interpretation from single 2-d image V3-d interpretation from single 2-d image V
3-d interpretation from single 2-d image V
 

Recently uploaded

Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 
Extrusion Processes and Their Limitations
Extrusion Processes and Their LimitationsExtrusion Processes and Their Limitations
Extrusion Processes and Their Limitations120cr0395
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile servicerehmti665
 
chaitra-1.pptx fake news detection using machine learning
chaitra-1.pptx  fake news detection using machine learningchaitra-1.pptx  fake news detection using machine learning
chaitra-1.pptx fake news detection using machine learningmisbanausheenparvam
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxwendy cai
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidNikhilNagaraju
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130Suhani Kapoor
 
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escortsranjana rawat
 
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Serviceranjana rawat
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...ranjana rawat
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxupamatechverse
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escortsranjana rawat
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...Soham Mondal
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingrakeshbaidya232001
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 

Recently uploaded (20)

Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
 
Extrusion Processes and Their Limitations
Extrusion Processes and Their LimitationsExtrusion Processes and Their Limitations
Extrusion Processes and Their Limitations
 
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile service
 
chaitra-1.pptx fake news detection using machine learning
chaitra-1.pptx  fake news detection using machine learningchaitra-1.pptx  fake news detection using machine learning
chaitra-1.pptx fake news detection using machine learning
 
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINEDJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptx
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfid
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
 
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
 
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
 
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptx
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
 
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptxExploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writing
 
Roadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and RoutesRoadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and Routes
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
 

LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)

  • 1. LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2) Yu Huang Sunnyvale, California Yu.huang07@gmail.com
  • 2. Outline • Canadian Adverse Driving Conditions Dataset, 2020, 2 • Deep multimodal sensor fusion in unseen adverse weather, 2020, 8 • RADIATE: A Radar Dataset for Automotive Perception in Bad Weather, 2021, 4 • Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection, 2021, 7 • Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather, 2021, 8 • DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather, 2021, 9
  • 3. Canadian Adverse Driving Conditions Dataset • The Canadian Adverse Driving Conditions (CADC) dataset was collected with the Autonomoose autonomous vehicle platform, based on a modified Lincoln MKZ. • The dataset, collected during winter within the Region of Waterloo, Canada, autonomous vehicle dataset that focuses on adverse driving conditions specifically. • It contains 7,000 frames collected through a variety of winter weather conditions of annotated data from 8 cameras (Ximea MQ013CG-E2), Lidar (VLP-32C) and a GNSS+INS system (Novatel OEM638). • Lidar frame annotations that represent ground truth for 3D object detection and tracking have been provided by Scale AI.
  • 4. Canadian Adverse Driving Conditions Dataset Autonomoose, autonomous vehicle testing platform A map of data collected for CADC
  • 5. Canadian Adverse Driving Conditions Dataset
  • 6. Canadian Adverse Driving Conditions Dataset
  • 7. Canadian Adverse Driving Conditions Dataset Top down lidar view of each snowfall levels with the corresponding front camera image. Top: left image couple is the light snow and the right side is medium snow. Bottom: left image couple is heavy snow and the right is extreme snow.
  • 8. Deep Multimodal Sensor Fusion in Unseen Adverse Weather • The fusion of multimodal sensor streams, such as camera, lidar, and radar measurements, plays a critical role in object detection for autonomous vehicles. • While existing methods exploit redundant info under good conditions, they fail to in adverse weather where the sensory streams can be asymmetrically distorted. • To address this data challenge, this paper presents a multi-modal dataset acquired by over 10,000 km of driving in northern Europe. • Although this dataset is a large multimodal dataset in adverse weather, with 100k labels for lidar, camera, radar and gated NIR sensors, it does not facilitate training as extreme weather is rare. • It presents a deep fusion network for robust fusion without a large corpus of labeled training data covering all asymmetric distortions. • Departing from proposal-level fusion, a single-shot model adaptively fuses features, driven by measurement entropy. • The dataset and all models will be published.
  • 9. Deep Multimodal Sensor Fusion in Unseen Adverse Weather Existing object detection methods, including efficient Single-Shot detectors (SSD), are trained on automotive datasets that are biased towards good weather conditions. While these methods work well in good conditions, they fail in rare weather events (top). Lidar only detectors, such as the same SSD model trained on projected lidar depth, might be distorted due to severe backscatter in fog or snow (center). These asymmetric distortions are a challenge for fusion methods, that rely on redundant information. The proposed method (bottom) learns to tackle unseen (potentially asymmetric) distortions in multimodal data without seeing training data of these rare scenarios.
  • 10. Deep Multimodal Sensor Fusion in Unseen Adverse Weather Right: Geographical coverage of the data collection campaign covering two months and 10,000 km in Germany, Sweden, Denmark and Finland. Top Left: Test vehicle setup with top- mounted lidar, gated camera with flash illumination, RGB camera, proprietary radar, FIR camera, weather station and road friction sensor. Bottom Left: Distribution of weather conditions throughout the data acquisition. The driving data is highly unbalanced with respect to weather conditions and only contains adverse conditions as rare samples.
  • 11. Deep Multimodal Sensor Fusion in Unseen Adverse Weather Multimodal sensor response of RGB camera, scanning lidar, gated camera and radar in a fog chamber with dense fog. Reference recordings under clear conditions are shown the first row, recordings in fog with visibility of 23m are shown in the second row.
  • 12. Deep Multimodal Sensor Fusion in Unseen Adverse Weather Overview of architecture consisting of four SSDs branches with deep feature exchange and adaptive fusion of lidar, RGB camera, gated camera and radar. All sensory data is projected into the camera coordinate system. To enable a steered fusion in-between sensors, the sensor entropy is provided to each feature exchange block (red). The deep feature exchange blocks (white) interchange information (blue) with parallel feature extraction blocks. The fused feature maps are analyzed by SSD blocks (orange).
  • 13. Deep Multimodal Sensor Fusion in Unseen Adverse Weather Normalized entropy with respect to the clear reference recording for a gated camera, rgb camera, radar and lidar in varying fog visibilities (left) and changing illumination (right). The entropy has been calculated based on a dynamic scenario within a controlled fog chamber and a static scenario with changing natural illumination settings. Note the asymmetric sensor failure for different sensor technologies.
  • 14. Deep Multimodal Sensor Fusion in Unseen Adverse Weather
  • 15. RADIATE: A Radar Dataset for Automotive Perception in Bad Weather • This paper presents the RAdar Dataset In Adverse weaThEr (RADIATE), aiming to facilitate research on object detection, tracking and scene understanding using radar sensing for safe autonomous driving. • RADIATE includes 3 hours of annotated radar images with more than 200K labelled road actors in total, on average about 4.6 instances per radar image. • It covers 8 different categories of actors in a variety of weather conditions (e.g., sun, night, rain, fog and snow) and driving scenarios (e.g., parked, urban, motorway and suburban), representing different levels of challenge. • This is a public radar dataset which provides high-resolution radar images on public roads with a large amount of road actors labelled. • Some baseline results of radar based object detection and recognition are given to show that the use of radar data is promising for automotive applications in bad weather, where vision and LiDAR can fail. • RADIATE also has stereo images, 32-channel LiDAR and GPS data, directed at other applications such as sensor fusion, localisation and mapping. • The public dataset can be accessed at http://pro.hw.ac.uk/radiate/.
• 16. RADIATE: A Radar Dataset for Automotive Perception in Bad Weather Examples from RADIATE. This dataset contains radar, stereo camera, LiDAR and GPS data. It was collected in various weather conditions and driving scenarios with 8 categories of annotated objects. Qualitative results of radar-based vehicle detection.
  • 17. RADIATE: A Radar Dataset for Automotive Perception in Bad Weather
  • 18. RADIATE: A Radar Dataset for Automotive Perception in Bad Weather Sensor setup for data collection. Category distribution for each scenario.
• 19. RADIATE: A Radar Dataset for Automotive Perception in Bad Weather Data in various weather conditions. Top: Image with LiDAR points projected. Middle: Radar with objects annotated. Bottom: LiDAR with objects projected from the radar annotation. Note that both the camera images and the LiDAR scans are degraded in fog, rain and snow. The yellow circles enclose false LiDAR points caused by snowflakes.
• 20. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection • Lidar-based object detectors are known to be sensitive to adverse weather conditions such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR). • As a result, lidar-based object detectors trained on data captured in normal weather tend to perform poorly in such scenarios. • However, collecting and labelling sufficient training data in a diverse range of adverse weather conditions is laborious and prohibitively expensive. • LISA is a physics-based approach to simulating lidar point clouds in adverse weather conditions; a toy sketch of the pipeline follows below. • The augmented datasets can then be used to train lidar-based detectors to improve their all-weather reliability. • Specifically, a hybrid Monte-Carlo approach treats (i) the effects of large particles, by placing them randomly and comparing their back-reflected power against that of the target, and (ii) attenuation effects on average, through scattering efficiencies calculated from Mie theory and particle size distributions. • Retraining networks with this augmented data improves the mean average precision evaluated on real-world rainy scenes.
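A toy numpy version of that pipeline, to make the structure concrete: attenuate intensities with the round-trip Beer-Lambert law, drop returns below a noise floor, and let a fraction of the lost returns scatter back early from a random droplet. Every constant here, including the power-law stand-in for the extinction coefficient, is a placeholder rather than LISA's Mie-theory-derived value:

import numpy as np

def augment_rain(points, intensity, rain_rate, rng=None):
    """Toy rain augmentation for one lidar scan (illustrative sketch).
    points: (N, 3) xyz in meters; intensity: (N,) in [0, 1]; rain_rate in mm/hr."""
    if rng is None:
        rng = np.random.default_rng(0)
    r = np.linalg.norm(points, axis=1)
    alpha = 1e-3 * rain_rate ** 0.6            # placeholder extinction coeff. (1/m)
    atten = np.exp(-2.0 * alpha * r)           # round-trip Beer-Lambert attenuation
    new_intensity = intensity * atten
    detectable = new_intensity > 0.01          # assumed detector noise floor
    # a fraction of lost returns come back early from a random large droplet
    scattered = ~detectable & (rng.random(r.shape) < 0.05)   # assumed probability
    frac = rng.uniform(0.05, 0.6, size=int(scattered.sum())) # assumed droplet range
    out_pts = points.copy()
    out_pts[scattered] = points[scattered] * frac[:, None]
    keep = detectable | scattered
    return out_pts[keep], new_intensity[keep]

In the actual method, the scatter decision compares the back-reflected power of a randomly placed particle against that of the target rather than using a fixed probability.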
• 21. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection LiDAR: a) Bistatic lidar (where the receiver Rx and the transmitter Tx follow different optical paths) and the parameters relevant for weather augmentation calculations. b) Light scattering by a single particle in the wave picture, with a cartoon radiation pattern. c) Light scattering by a random ensemble of droplets in the ray picture. Rain: a) Extinction efficiency as a function of rain droplet diameter. b) Rain particle size distribution using the Marshall-Palmer distribution for different rain rates. c) Extinction coefficient as a function of rain rate from Mie theory (orange) and the asymptotic solution (blue).
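For reference, the Marshall-Palmer drop size distribution shown in panel (b) has the classic exponential form, with $D$ the drop diameter in mm and $R$ the rain rate in mm/hr:

$$N(D) = N_0\, e^{-\Lambda D}, \qquad N_0 = 8000~\mathrm{m^{-3}\,mm^{-1}}, \qquad \Lambda = 4.1\, R^{-0.21}~\mathrm{mm^{-1}}$$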
• 22. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection (Slide of model equations; the quantities shown are: the received intensity attenuated by the Beer-Lambert law, the average extinction coefficient, the maximum detectable range, extinction coefficients, the standard deviation of the range noise, the width of a Gaussian beam, and the beam waist.)
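The central attenuation step is the round-trip Beer-Lambert law; writing $I_0$ for the clear-weather return intensity, $\alpha$ for the average extinction coefficient and $R$ for the range, the attenuated intensity is (a standard form; the remaining slide equations are not reconstructed here):

$$I(R) = I_0\, e^{-2\alpha R}$$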
• 23. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection Probability of occurrence as a function of rain rate (mm/hr). Rain rates are sampled from this distribution for all simulation experiments. In addition to the rain augmentations, lidar scenes in the KITTI dataset are augmented with snow, moderate advection fog, and strong advection fog. The algorithm for the simulation of snow is similar to that of rain.
  • 24. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection A qualitative comparison for simulating a rainy lidar scene against real world rainy data. Top: A clear weather scene (Scene A) from the Waymo Open Dataset. Bottom left: Scene A augmented with rain rate 35 mm/hr. Bottom right: A real world rainy scene (Scene B). Rain leads to sparser scenes, increased range uncertainty and reduced visibility.
• 25. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection A comparison of detection before and after rain simulation for three networks trained on the KITTI dataset: PointPillars (first column), PV-RCNN (second column), and deformable PV-RCNN (third column). The first row depicts predictions under normal conditions, while the second row depicts predictions under rainy conditions. The green bounding boxes are the ground-truth annotations, while the red bounding boxes are the predictions.
  • 26. Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection
• 27. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather • It addresses the task of LiDAR-based 3D object detection in foggy weather. • Collecting and annotating data in such scenarios is time, labor and cost intensive. • This paper tackles the problem by simulating physically accurate fog in clear-weather scenes, so that existing real datasets collected in clear weather can be repurposed. • 1) It develops a physically valid fog simulation method that is applicable to any LiDAR dataset. This unleashes the acquisition of large-scale foggy training data at no extra cost. These partially synthetic data can be used to improve the robustness of several perception methods, such as 3D object detection and tracking or simultaneous localization and mapping, on real foggy data. • 2) Through experiments with several SOTA detection approaches, it shows that the fog simulation can be leveraged to improve the performance of 3D object detection in the presence of fog. • The code is available at www.trace.ethz.ch/lidar_fog_simulation.
• 28. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather Two visualizations of LiDAR returns caused by fog in the (top) scene. First figure: (a) shows the strongest returns and (b) the last returns, color coded by the LiDAR channel; the returns of the ground are removed for better visibility of the points introduced by fog (red - low, cyan - high, 3D bounding box annotation in green, ego vehicle dimensions in gray). Second figure: color coded by the LiDAR channel in (a) and by the intensity in (b), again with the ground returns removed.
• 29. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather • To simulate the effect of fog on real-world LiDAR point clouds that were recorded in clear weather, the method resorts to the optical system model that underlies the operation of the transmitter and receiver of the LiDAR sensor. • It examines a single measurement/point, models the full signal of received power as a function of range, and recovers its exact form corresponding to the original clear-weather measurement. • This allows the method to operate in the signal domain and implement the transformation from clear weather to fog simply by modifying the part of the impulse response that pertains to the optical channel (i.e. the atmosphere); see the sketch below. • The key contribution is a direct relation between the response (range-dependent received signal power) in clear weather and in fog for the same 3D scene; this relation enables simulating fog on real clear-weather LiDAR measurements. • Compared to clear weather, the spatial impulse response in fog is more involved, but it can still be decomposed into two terms, corresponding to the hard and the soft target respectively. • Depending on the distance of the hard target from the sensor, the soft-target term of the response may exhibit a larger maximum value than the hard-target term, which implies that the measured range changes due to the presence of fog and becomes equal to the point of maximum of the soft-target term.
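A sketch of the underlying signal model in the standard pulsed-lidar formulation (notation mine, following common linear-system treatments; the exact form in the paper may differ): the received power at target range $R_0$ is the transmitted pulse $P_T$ convolved with a spatial impulse response $H$,

$$P_R(R_0) = C_A \int_0^{2R_0/c} P_T(t)\, H\!\left(R_0 - \frac{ct}{2}\right) \mathrm{d}t,$$

where $C_A$ collects the system constants. In clear weather, $H$ contains only a hard-target peak at $R_0$; in fog, that peak is attenuated by the two-way transmission $e^{-2\alpha R_0}$, and a soft-target term accumulating the fog backscatter along the beam is added, yielding the decomposition on slide 31.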
• 30. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather Sketch of a LiDAR sensor where the transmitter Tx and the receiver Rx do not have coaxial optics, but have parallel axes. This is called a bistatic beam configuration. The two terms of the received signal power PR,fog from a single LiDAR pulse, associated with the solid object that reflects the pulse (Phard R,fog) and the soft fog target (Psoft R,fog), plotted across the range domain. While in (a) the fog is not thick enough to yield a return, in (b) it is thick enough to yield a return that overshadows the solid object at R0 = 30 m.
• 31. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather (Slide of equations: the received signal power in fog decomposes into a hard-target and a soft-target term, $P_{R,\mathrm{fog}}(R) = P^{\mathrm{hard}}_{R,\mathrm{fog}}(R) + P^{\mathrm{soft}}_{R,\mathrm{fog}}(R)$.)
• 32. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather Comparison of the fog simulation (bottom) to the previous fog simulation used in the STF dataset (middle), with α set to 0.06, which corresponds to a meteorological optical range (MOR) ≈ 50 m. In the left column, the point cloud is color coded by the intensity and in the right column it is color coded by the height (z value). The top row shows the original point cloud.
  • 33. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather
  • 34. Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather The (top) row shows predictions by PV-RCNN trained on the original clear weather data (first row in tables above), the (bottom) row shows predictions by PV-RCNN trained on a mix of clear weather and simulated foggy data (fourth row in tables above) on three example scenes from the STF dense fog test split. Ground truth boxes in color, predictions of the model in white.
• 35. DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather • For autonomous vehicles to viably replace human drivers, they must contend with inclement weather. • Falling rain and snow introduce noise in LiDAR returns, resulting in both false positive and false negative object detections. • This article introduces the Winter Adverse Driving dataSet (WADS), collected in the snow belt region of Michigan's Upper Peninsula. • WADS is the first multi-modal dataset featuring dense point-wise labeled sequential LiDAR scans collected in severe winter weather; weather that would cause an experienced driver to alter their driving behavior. • Over 7 GB, or 3.6 billion, labelled LiDAR points have been released out of over 26 TB of total LiDAR and camera data collected. • It also presents the Dynamic Statistical Outlier Removal (DSOR) filter, a statistical PCL-based filter capable of removing snow with a higher recall than the SOTA snow de-noising filter while being 28% faster. • The dataset and DSOR filter will be available at https://bitbucket.org/autonomymtu/dsor_filter.
• 36. DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather Part of the Winter Adverse Driving dataSet (WADS), showing moderate snowfall at 0.6 in/hr (1.5 cm/hr). (top) Clutter from snow particles obscures LiDAR point clouds and reduces visibility. Here, two oncoming vehicles are concealed by the snow. (bottom) The DSOR filter is faster at de-noising snow clutter than the SOTA and enables object detection in moderate and severe snow.
• 37. DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather Distribution of classes in the WADS dataset. Scenes from suburban driving are included, covering vehicles, roads, and man-made structures. Two novel classes, falling snow and accumulated snow, capture adverse winter weather.
  • 38. DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather A labeled sequence from the WADS dataset. Every point has a unique label and represents one of 22 classes. Active falling snow (beige) and accumulated snow (off-white) are unique to the dataset.
• 39. DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather PCL's SOR (Statistical Outlier Removal) filter is a general noise removal filter widely used for cleaning point clouds; it does not account for the non-uniform density of lidar point clouds and, when applied to a scan with falling snow, fails to remove it. The DROR (Dynamic Radius Outlier Removal) filter removes sparse snowflakes by thresholding the number of neighbors each point has within a search radius; to account for the growing spacing between LiDAR points with distance, DROR increases the search radius with range from the sensor. The DROR filter achieves a high accuracy but fails to produce a clean point cloud at high snowfall rates. The DSOR (Dynamic Statistical Outlier Removal) filter is an extension of PCL's SOR filter, designed to address the inherent non-uniformity in point clouds; a minimal sketch of the idea follows below.
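As promised above, a minimal numpy/scipy sketch of the DSOR idea, based only on the description in this deck; the parameter names, default values, and the exact form of the range scaling are assumptions, not the released implementation:

import numpy as np
from scipy.spatial import cKDTree

def dsor_filter(points, k=5, s=1.0, r=0.05):
    """Sketch of a Dynamic Statistical Outlier Removal style filter.
    points: (N, 3) lidar scan in meters. Returns a boolean keep-mask.
    k: neighbors, s: std multiplier, r: range scaling (assumed defaults)."""
    tree = cKDTree(points)
    # mean distance of each point to its k nearest neighbors
    dists, _ = tree.query(points, k=k + 1)    # column 0 is the point itself
    mean_d = dists[:, 1:].mean(axis=1)
    # global statistics, as in PCL's SOR filter
    global_threshold = mean_d.mean() + s * mean_d.std()
    # let the threshold grow with range so that sparse far-field geometry
    # is not mistaken for snow (the 'dynamic' part)
    ranges = np.linalg.norm(points, axis=1)
    dynamic_threshold = global_threshold * r * ranges
    return mean_d < dynamic_threshold

# usage: keep = dsor_filter(scan); clean = scan[keep]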
  • 40. DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather Qualitative comparison of the SOR filter, DROR filter and DSOR filter (left to right). The original point cloud shows snow clutter (orange points) that degrades LiDAR perception. The DSOR filter removes more snow compared to both the SOR and DROR filters and preserves most of the environmental features.
• 41. DSOR: A Scalable Statistical Filter for Removing Falling Snow from LiDAR Point Clouds in Severe Winter Weather Percentage of filtered points as a function of range (averaged over 100 point clouds). The DSOR filter outperforms both the SOR and DROR filters at ranges < 20 m, where most of the snow is concentrated. The DSOR filter accurately filters out more snow than the DROR filter and achieves a higher recall. Note that the y-axis starts at 40% to better highlight the differences between the filters.