LiDAR in the Adverse Weather:
Dust, Snow, Rain and Fog
Yu Huang
Sunnyvale, California
Yu.huang07@gmail.com
Outline
• Boosting LIDAR-based semantic labeling by cross-modal training data
generation
• Weather Influence and Classification with Automotive Lidar Sensors
• A Benchmark for Lidar Sensors in Fog: Is Detection Breaking Down?
• CNN-based Lidar Point Cloud De-Noising in Adverse Weather
• LIBRE: The Multiple 3D LiDAR Dataset
• Characterization of Multiple 3D LiDARs for Localization and Mapping using
Normal Distributions Transform
• Performance Analysis of 10 Models of 3D LiDARs for Automated Driving
• LaNoising: A Data-driven Approach for 903nm ToF LiDAR Performance
Modeling under Fog
Boosting LIDAR-based semantic labeling by
cross-modal training data generation
• Mobile robots and autonomous vehicles rely on multi-modal sensor setups to
perceive and understand their surroundings.
• Aside from cameras, LiDAR sensors represent a central component of state-of-the
art perception systems.
• In addition to accurate spatial perception, a comprehensive semantic
understanding of the environment is essential for efficient and safe operation.
• This paper presents a deep neural network architecture called LiLaNet for point-
wise, multi-class semantic labeling of semi-dense LiDAR data.
• The network utilizes virtual image projections of the 3D point clouds for efficient
inference (a minimal projection sketch follows this list).
• An automated process is designed for large-scale cross-modal training data
generation called Autolabeling, in order to boost semantic labeling performance
while keeping the manual annotation effort low.
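A minimal sketch of the kind of virtual image projection such a network relies on: scattering the 3D point cloud onto a 2D spherical grid so it can be processed like an image. This is a generic illustration; the function name, resolution, and field-of-view values are assumptions, not LiLaNet's actual parameters.

```python
import numpy as np

def project_to_virtual_image(points, h=64, w=2048,
                             fov_up=np.deg2rad(2.0), fov_down=np.deg2rad(-24.8)):
    """Project an (N, 3) lidar point cloud into an (h, w) range image."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)             # range per point
    yaw = np.arctan2(y, x)                         # azimuth in [-pi, pi]
    pitch = np.arcsin(z / np.maximum(r, 1e-8))     # elevation angle

    u = (((yaw + np.pi) / (2.0 * np.pi)) * w).astype(int) % w     # image column
    v = ((fov_up - pitch) / (fov_up - fov_down) * h).astype(int)  # image row
    valid = (v >= 0) & (v < h)

    image = np.zeros((h, w), dtype=np.float32)     # 0 encodes "no return"
    image[v[valid], u[valid]] = r[valid]
    return image
```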
Boosting LIDAR-based semantic labeling by
cross-modal training data generation
• Based on initial experiments on the discrimination of different semantic classes in
LiDAR data, the original Cityscapes label set is first mapped to a reduced set of 13
semantic classes (see Table 1).
• This reduced label set is better tailored to the specific properties of the data provided by current
LiDAR sensors, where limited spatial resolution and coarse reflectivity estimates prohibit truly
fine-grained semantic differentiation.
Boosting LIDAR-based semantic labeling by
cross-modal training data generation
Boosting LIDAR-based semantic labeling by
cross-modal training data generation
Autolabeling
Each pixel of the reference camera image is classified via an image-based semantic
labeling network. Subsequently, the point cloud is projected into the camera image and
the reference labels are transferred to the corresponding LiDAR points.
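A minimal sketch of this label transfer, assuming a standard pinhole camera model; `K` (camera intrinsics) and `T_cam_lidar` (lidar-to-camera extrinsics) are assumed calibration inputs, and all names are illustrative rather than the authors' code.

```python
import numpy as np

def transfer_labels(points, semantic_image, K, T_cam_lidar, ignore_label=255):
    """points: (N, 3) lidar points; semantic_image: (H, W) per-pixel class ids."""
    h, w = semantic_image.shape
    # Transform lidar points into the camera frame (homogeneous coordinates).
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    labels = np.full(len(points), ignore_label, dtype=np.int32)
    in_front = pts_cam[:, 2] > 0.0                 # keep points ahead of the camera
    uv = (K @ pts_cam[in_front].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)      # perspective division

    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    idx = np.flatnonzero(in_front)[inside]
    labels[idx] = semantic_image[uv[inside, 1], uv[inside, 0]]
    return labels
```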
Boosting LIDAR-based semantic labeling by
cross-modal training data generation
Weather Influence and Classification with
Automotive Lidar Sensors
• Lidar sensors are often used in mobile robots and autonomous vehicles to
complement camera, radar and ultrasonic sensors for environment perception.
• Typically, perception algorithms are trained only to detect moving and static
objects and to estimate the ground, while intentionally ignoring weather effects to
reduce false detections.
• This work presents an in-depth analysis of automotive lidar performance under
harsh weather conditions, i.e. heavy rain and dense fog.
• An extensive data set has been recorded for various fog and rain conditions,
which is the basis for the conducted in-depth analysis of the point cloud under
changing environmental conditions.
• It introduces an approach to detect and classify rain or fog with lidar sensors alone,
achieving a mean intersection over union (mIoU) of 97.14% on a data set recorded in
controlled environments (a sketch of the mIoU metric follows this list).
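For reference, this is how a mean intersection over union such as the reported 97.14% is typically computed from a confusion matrix; a generic sketch, not the authors' evaluation code.

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """pred, target: integer class-id arrays of equal shape."""
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    np.add.at(cm, (target.ravel(), pred.ravel()), 1)   # confusion matrix
    tp = np.diag(cm).astype(np.float64)                # true positives per class
    union = cm.sum(axis=0) + cm.sum(axis=1) - tp       # TP + FP + FN
    return np.nanmean(tp / np.where(union > 0, union, np.nan))
```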
Weather Influence and Classification with
Automotive Lidar Sensors
• For lidar the most challenging conditions are bright sun, fog, rain, dirt and spray.
• Lidar suffers significant attenuation compared to radar in challenging conditions.
• For lidar sensors, objects can disappear behind airborne dust.
• In general, radar sensors are robust against fog, whereas lidar and camera
sensors are strongly affected by it.
• In rain, the measured distance of detected objects changes only slightly, while the
intensity and the number of points decrease dramatically for LiDAR.
• The influence of dust on lidar sensors is systematic and predictable: the lidar
measures the leading edge of a dust cloud, which for the lidar used here occurs from
a transmission of about 70%.
• A 1550 nm lidar sensor outperforms the 905 nm sensor in adverse weather, owing
to the looser restrictions on emitted light power under laser class 1.
Weather Influence and Classification with
Automotive Lidar Sensors
Scala
Weather Influence and Classification with
Automotive Lidar Sensors
VLP16
Weather Influence and Classification with
Automotive Lidar Sensors
A Benchmark for Lidar Sensors in Fog: Is
Detection Breaking Down?
• Autonomous driving at level five does not only mean self-driving in
the sunshine.
• Adverse weather is especially critical because fog, rain, and snow
degrade the perception of the environment.
• In this work, current state-of-the-art light detection and ranging (lidar)
sensors are tested in controlled conditions in a fog chamber.
• It presents current problems and disturbance patterns for four
different state-of-the-art lidar systems.
• It also investigates how tuning internal parameters can improve their
performance in bad weather situations.
A Benchmark for Lidar Sensors in Fog: Is
Detection Breaking Down?
• Attenuation due to fog and smoke in a free-space optical communication link is
not wavelength dependent for visibilities below 500 m and wavelengths
between 600 and 1500 nm.
• Velodyne claims increased capabilities in adverse weather thanks to a digital
signal processor that dynamically tunes the laser power when a clear object
reflection is not obtained from a given laser pulse.
• Both the Velodyne S2 and S3D apply this process automatically and can choose
from eight power levels.
• The noise level describes the uncertainty floor for a detected signal, arising from
spurious light and thermal excitations.
• Velodyne provides a factory calibration containing a slope and focal distance
for each laser in order to correct for beam divergence.
A Benchmark for Lidar Sensors in Fog: Is
Detection Breaking Down?
A Benchmark for Lidar Sensors in Fog: Is
Detection Breaking Down?
CNN-based Lidar Point Cloud De-Noising in
Adverse Weather
• Lidar sensors are frequently used in environment perception for
autonomous vehicles and mobile robotics to complement camera, radar,
and ultrasonic sensors.
• Adverse weather conditions significantly degrade the performance of
lidar-based scene understanding by causing undesired measurement points
that in turn lead to missed detections and false positives.
• In heavy rain or dense fog, water drops can be misinterpreted as objects
in front of the vehicle, which may bring a mobile robot to a full stop.
• This paper presents a CNN-based approach to understand and filter out
such adverse weather effects in point cloud data (a toy example in this
spirit follows the list).
• Data is available at https://github.com/rheinzler/PointCloudDeNoising.
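A toy per-pixel classifier in the spirit of this CNN-based de-noising, assuming 2-channel range/intensity projections as input and one of {clear, rain, fog} per pixel as output; a hypothetical stand-in, not WeatherNet itself.

```python
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Per-pixel weather classification on projected lidar images."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, num_classes, kernel_size=1),  # per-pixel class logits
        )

    def forward(self, x):                 # x: (B, 2, H, W) range + intensity
        return self.net(x)                # (B, num_classes, H, W)

logits = TinyDenoiser()(torch.randn(1, 2, 32, 400))
labels = logits.argmax(dim=1)             # 0 = clear, 1 = rain, 2 = fog per pixel
```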
CNN-based Lidar Point Cloud De-Noising in
Adverse Weather
• De-noising algorithms in 3D space are often based on spatial features to discard
noise points caused by rain or snow.
• 2D depth image de-noising is mainly based on dense depth information obtained
by stereo vision and depth cameras.
• Traditional algorithms developed over years for camera image de-noising can be
applied in a straightforward fashion.
• These approaches can be split into three different categories: (1) spatial, (2)
statistical, and (3) segmentation-based methods.
• Lidar point clouds are significantly less dense compared to camera images,
particularly at larger distances.
CNN-based Lidar Point Cloud De-Noising in
Adverse Weather
• In the 3D domain, many approaches are based on the spatial vicinity or statistical
distribution of the point cloud, such as the statistical outlier removal (SOR) and
radius outlier removal (ROR) filters (minimal sketches of both follow this list).
• The SOR filter defines the vicinity of a point by its mean distance to its k nearest
neighbors, compared against a threshold derived from the global mean distance and
standard deviation of all points.
• The ROR filter directly counts the number of neighbors within the radius r in
order to decide whether a point is filtered or not.
• Approaches based on spatial vicinity consequently discard single reflections
without points in the neighborhood.
• However, sparsity ceases to be a valid feature for filtering scatter caused by fog or
drizzle as soon as the density of the water-drop distribution increases.
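Minimal sketches of the SOR and ROR filters described above, using a k-d tree from SciPy; the parameter defaults are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def statistical_outlier_removal(points, k=8, std_ratio=1.0):
    """Keep points whose mean k-neighbor distance stays below the global
    mean plus std_ratio standard deviations."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)   # nearest neighbor is the point itself
    mean_d = dists[:, 1:].mean(axis=1)       # mean distance to the k neighbors
    threshold = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d < threshold]

def radius_outlier_removal(points, r=0.5, min_neighbors=3):
    """Keep points with at least min_neighbors other points within radius r."""
    tree = cKDTree(points)
    counts = np.array([len(n) - 1 for n in tree.query_ball_point(points, r)])
    return points[counts >= min_neighbors]
```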
CNN-based Lidar Point Cloud De-Noising in
Adverse Weather
WeatherNet
CNN-based Lidar Point Cloud De-Noising in
Adverse Weather
CNN-based Lidar Point Cloud De-Noising in
Adverse Weather
CNN-based Lidar Point Cloud De-Noising in
Adverse Weather
WeatherNet segmentation results for road data recorded under light rainfall
LIBRE: The Multiple 3D LiDAR Dataset
• LIBRE (LiDAR Benchmarking and Reference) is a dataset featuring 10 different LiDAR sensors,
covering a range of manufacturers, models, and laser configurations.
• Data from each sensor includes three different environments and configurations:
• static targets, where objects were placed at known distances and measured from a fixed position within
a controlled environment;
• adverse weather, where static obstacles were measured from a moving vehicle, captured in a weather
chamber where LiDARs were exposed to different conditions (fog, rain, strong light);
• and finally, dynamic traffic, where dynamic objects were captured from a vehicle driven on public urban
roads, multiple times at different times of the day, and including supporting sensors such as cameras,
infrared imaging, and odometry devices.
• LIBRE contributions: (1) provide a means for a comparison of currently available LiDARs, and (2)
facilitate the improvement of existing self-driving vehicles and robotics-related software, in terms
of development and tuning of LiDAR-based perception algorithms.
• Data site: https://sites.google.com/g.sp.m.is.nagoya-u.ac.jp/libre-dataset/
LIBRE: The Multiple 3D LiDAR Dataset
LIBRE: The Multiple 3D LiDAR Dataset
LIBRE: The Multiple 3D LiDAR Dataset
LIBRE: The Multiple 3D LiDAR Dataset
LIBRE: The Multiple 3D LiDAR Dataset
LIBRE: The Multiple 3D LiDAR Dataset
Dynamic traffic scenes obtained by applying SOTA algorithms (Autoware) to the point cloud.
LIBRE: The Multiple 3D LiDAR Dataset
“Rain pillars” as detected by a LiDAR
Characterization of Multiple 3D LiDARs for Localization
and Mapping using Normal Distributions Transform
• A comparison of ten 3D LiDAR sensors, covering a range of manufacturers,
models, and laser configurations, for the tasks of mapping and vehicle
localization, using as common reference the Normal Distributions Transform
(NDT) algorithm implemented in the self-driving open source platform Autoware.
• LiDAR data used is a subset of our LiDAR Benchmarking and Reference (LIBRE)
dataset, captured independently from each sensor, from a vehicle driven on
public urban roads multiple times, at different times of the day.
• This study analyzes the performance and characteristics of each LiDAR for
the tasks of (1) 3D mapping, including an assessment of map quality based on mean
map entropy (a sketch of this measure follows), and (2) 6-DOF localization using a
ground-truth reference map.
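A sketch of the mean map entropy measure referenced above: the differential entropy of a Gaussian fitted to each point's local neighborhood, averaged over the map. The radius and the handling of degenerate neighborhoods are illustrative choices.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_map_entropy(points, radius=0.3, min_neighbors=5):
    """points: (N, 3) map point cloud; returns the mean per-point entropy."""
    tree = cKDTree(points)
    entropies = []
    for idx in tree.query_ball_point(points, radius):
        if len(idx) < min_neighbors:
            continue                              # skip degenerate neighborhoods
        cov = np.cov(points[idx].T)               # 3x3 local covariance
        det = np.linalg.det(2.0 * np.pi * np.e * cov)
        if det > 0:
            entropies.append(0.5 * np.log(det))   # Gaussian differential entropy
    return float(np.mean(entropies))
```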
Characterization of Multiple 3D LiDARs for Localization
and Mapping using Normal Distributions Transform
Characterization of Multiple 3D LiDARs for Localization
and Mapping using Normal Distributions Transform
Characterization of Multiple 3D LiDARs for Localization
and Mapping using Normal Distributions Transform
Reference MMS map colored by entropy (a) (green: lower entropy, magenta: higher entropy) and by
plane variance (b) (red: lower variance, green: higher variance) for each point.
Characterization of Multiple 3D LiDARs for Localization
and Mapping using Normal Distributions Transform
Voxel grid filter applied to the OS1-64 point cloud with different maximum range filters.
Qualitative results of localization for each LiDAR.
Performance Analysis of 10 Models of 3D
LiDARs for Automated Driving
• This work compares 10 commonly used 3D LiDARs, establishing several metrics to
assess their performance.
• Various outstanding issues with specific LiDARs were qualitatively identified.
• The accuracy and precision of individual LiDAR beams and accumulated point
clouds are evaluated in a controlled environment at distances from 5 to 180
meters.
• Reflective targets were used to characterize intensity patterns and quantify the
impact of surface reflectivity on accuracy and precision.
• A vehicle and pedestrian mannequin were used as additional targets of interest.
• A thorough assessment of these LiDARs is given, together with their potential
applicability to automated driving tasks.
Performance Analysis of 10 Models of 3D
LiDARs for Automated Driving
Japan Automobile Research Institute (JARI)
Performance Analysis of 10 Models of 3D
LiDARs for Automated Driving
The reflective target, vehicle and pedestrian
targets during experiments as seen from the
experimental vehicle's dashboard camera. Each
third of the reflective target is A0 size. The left-most target is black velvet,
the center target is a regular poster board, and the right-most target is a
diamond-grade reflective sheet.
Performance Analysis of 10 Models of 3D
LiDARs for Automated Driving
Performance Analysis of 10 Models of 3D
LiDARs for Automated Driving
LaNoising: A Data-driven Approach for 903nm
ToF LiDAR Performance Modeling under Fog
• As a critical sensor for high-level autonomous vehicles, LiDAR’s limitations in
adverse weather (e.g. rain, fog, snow, etc.) impede the deployment of self-driving
cars in all weather conditions.
• This paper models the performance of a popular 903 nm ToF LiDAR under
various fog conditions, based on a LiDAR dataset collected in a well-controlled
artificial fog chamber.
• Specifically, a two-stage data-driven method, called LaNoising (la for laser), is
proposed for generating LiDAR measurements under fog conditions (a minimal
sketch of the two stages follows this list).
• In the first stage, a Gaussian Process Regression (GPR) model is trained to
predict whether a laser can successfully output a true detection range,
given the fog visibility.
• If not, then in the second stage, a Mixture Density Network (MDN) provides
a probabilistic prediction of the noisy measurement range.
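A minimal sketch of this two-stage idea under stated assumptions: stage 1 fits a GPR on binary "true detection" outcomes versus fog visibility; stage 2 is stood in for by a small hand-set mixture sampler where the paper trains an MDN. All data values and names below are placeholders, not the paper's.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stage 1: P(true detection | visibility), learned from fog-chamber data.
vis_train = np.array([[10.], [20.], [40.], [60.], [80.], [120.]])  # visibility (m), placeholder
hit_train = np.array([0., 0., 0., 1., 1., 1.])                     # 1 = true range returned
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=20.0)).fit(vis_train, hit_train)

def toy_mdn_sampler(visibility, rng=np.random.default_rng()):
    """Stage-2 placeholder: sample a noisy range from a fixed 2-component
    mixture; a trained MDN would condition these parameters on visibility."""
    k = rng.choice(2, p=[0.7, 0.3])
    return rng.normal(loc=[3.0, 8.0][k], scale=[1.0, 2.5][k])

def simulate_range(visibility, true_range, rng=np.random.default_rng()):
    p_hit = float(np.clip(gpr.predict(np.array([[visibility]]))[0], 0.0, 1.0))
    if rng.random() < p_hit:
        return true_range               # stage 1: the laser still sees the target
    return toy_mdn_sampler(visibility)  # stage 2: draw a noisy fog return
```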
LaNoising: A Data-driven Approach for 903nm
ToF LiDAR Performance Modeling under Fog
Point cloud under clear weather conditions (left) and noising simulation for fog environment (right).
LaNoising: A Data-driven Approach for 903nm
ToF LiDAR Performance Modeling under Fog
Range measurements change with increasing visibility for a laser in a fog environment.
LaNoising: A Data-driven Approach for 903nm
ToF LiDAR Performance Modeling under Fog
• The signal processing system in LiDAR is required to detect the true return signal
with a low signal-to-noise ratio.
• Typically, a thresholding algorithm is applied.
• Under the assumption of homogeneous fog, both the extinction coefficient and the
apparent object surface reflectivity are influenced by the fog.
• However, since the signal processing unit is usually a black box embedded inside
the sensor (for commercial purposes), the analytical form of LiDAR ranging can
hardly be obtained.
• Therefore, a data-driven approach is needed to circumvent this problem, since the
analytic ranging form, which is critical for modeling noise under adverse weather
conditions, cannot be obtained.
LaNoising: A Data-driven Approach for 903nm
ToF LiDAR Performance Modeling under Fog
LaNoising: A Data-driven Approach for 903nm
ToF LiDAR Performance Modeling under Fog
Architecture of MDN
LaNoising: A Data-driven Approach for 903nm
ToF LiDAR Performance Modeling under Fog
GPR training and testing results
Thanks
