Driving Behavior for ADAS
and Autonomous Driving
Yu Huang
Yu.huang07@gmail.com
Sunnyvale, California
Outline
• Driver Behavior Modeling
• Recognizing Driver Behavior
• Driver Behavior Classification Model based on an Intelligent Driving Diagnosis System
• A Learning-Based Autonomous Driver: Emulate Driver’s Intelligence in Car Following
• E2E Learning of Driving Models from Large-scale Video Datasets
• Open Framework for Human-like Autonomous Driving by Inverse Reinforcement Learning
• Driver Action Prediction Using Bidirectional RNN
• DeepTest: Auto Testing of DNN-driven Autonomous Cars
• Deep Learning Based Analysis of Driver Behavior & Interaction with Automation
• A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving
• Personalization of ADAS and Autonomous Driving
Driver Behavior Modeling
• Estimating and predicting traffic situations over time is an essential capability for ADAS and
autonomous driving;
• The distribution of possible situation developments over several seconds can be anticipated
based on a history of noisy measurements of the pose and velocity of road users;
• Fine-grained predictions of future vehicle poses;
• High-level predictions on the level of intended driving routes.
• Predicting traffic situations is only tractable because the transitions of environment states are
not completely random, but instead exhibit many regularities;
• Cars, e.g., move according to kinematic and dynamic constraints.
• Also, traffic is structured by regulations and infrastructure like lanes, signs and traffic lights.
• Traffic participants do not act randomly, i.e., they try to follow the traffic rules and avoid accidents.
• All these aspects combined lead to patterns in the behavior of traffic participants that can be
exploited by an intelligent system to anticipate how a situation will develop.
• The key to this ability is to find suitable models that capture these patterns.
Driver Behavior Modeling
• Driver behavior modeling (DBM) predicts driving maneuvers, driver intent, vehicle and driver
state, and environmental factors, to improve transportation safety and driving experience.
• These models are then typically incorporated into ADAS in the vehicles.
Driver Behavior Modeling
• Lane changing models (a minimal incentive-based sketch follows this list):
• rule-based models
• discrete-choice probabilistic models
• artificial intelligence models
• incentive-based models
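As a hedged illustration of the last category, the sketch below implements an incentive-based lane-change criterion in the spirit of MOBIL on top of a simple car-following acceleration model; all parameter values and function names are illustrative assumptions, not taken from the slides.

```python
# Minimal sketch of an incentive-based lane-change criterion (MOBIL-style).
# The acceleration model and all parameter values are illustrative assumptions.

def idm_acceleration(v, v_lead, gap, v_desired=30.0, a_max=1.5, b_comf=2.0,
                     s0=2.0, t_headway=1.5):
    """Intelligent Driver Model acceleration (illustrative car-following model)."""
    s_star = s0 + v * t_headway + v * (v - v_lead) / (2 * (a_max * b_comf) ** 0.5)
    return a_max * (1 - (v / v_desired) ** 4 - (s_star / max(gap, 0.1)) ** 2)

def should_change_lane(acc_new, acc_old, acc_follower_new, acc_follower_old,
                       politeness=0.3, threshold=0.2, b_safe=4.0):
    """Change lanes if the ego gain outweighs the imposed follower loss (incentive)
    and the new follower is not forced to brake harder than b_safe (safety)."""
    if acc_follower_new < -b_safe:          # safety criterion
        return False
    incentive = (acc_new - acc_old) + politeness * (acc_follower_old - acc_follower_new)
    return incentive > threshold
```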
Driver Behavior Modeling
Driver Behavior Modeling
Driver Behavior Modeling
Lane changing decision models based on traffic characteristics
Classification of available approaches in lane changing studies
Driver Behavior Modeling
• A Driver Assistance Cloud can
provide personalized driver
services, applications, and safety features.
Road Network Modeling
The road network information includes the multi-hierarchical road network and its topological structure,
organized as “regional road network–road–road segment–carriageway–lane”. The correlation
between objects at different hierarchy levels can be maintained by query tables.
Road Network Modeling
• The road network is divided into regional road network, road, road segment, carriageway,
and lane;
• The lanes that possess the same direction in the same road usually have the same traffic
characteristics, such as speed limit, driving direction, vehicle type permission, and topology;
• The lane is taken as the basic modeling unit, and lanes with the same direction and the same
traffic characteristics are polymerized into carriageways (see the sketch below).
• The lane polymerization method not only reduces the data volume for identical lanes, but
also simplifies the topology and improves the efficiency of the network analysis.
• This model describes the relationship between different levels of road network, road, road
segment, and lane by tables.
Lanes Polymerizing into Carriageway
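A hedged data-structure sketch of the lane-polymerization idea follows: lanes on the same road with the same direction and traffic characteristics are grouped into one carriageway record. The class and field names are assumptions made for illustration only.

```python
# Illustrative sketch of lane polymerization into carriageways.

from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Lane:
    lane_id: str
    road_id: str
    direction: str          # e.g. "forward" / "backward"
    speed_limit: float      # km/h
    vehicle_types: tuple    # permitted vehicle types

def polymerize_into_carriageways(lanes):
    """Group lanes by (road, direction, traffic characteristics)."""
    groups = defaultdict(list)
    for lane in lanes:
        key = (lane.road_id, lane.direction, lane.speed_limit, lane.vehicle_types)
        groups[key].append(lane.lane_id)
    # each group of lane ids becomes one carriageway record
    return [{"road_id": k[0], "direction": k[1], "speed_limit": k[2],
             "vehicle_types": k[3], "lanes": v} for k, v in groups.items()]
```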
Road Network Modeling
Road Intersection and Lane Polymerization
Road Network Modeling
Multi-hierarchical Road Network and Topology Relationship
Vehicle Model
Vehicle Classification Hierarchy Structure and Coding
Recognizing Driver Behavior
• Recognizing driver characteristics is by itself not
a simple task, and the additional requirements of
active vehicle safety and vehicle comfort
add to the complexity.
• Information about the driver’s driving skill can
be used to adapt vehicle control parameters to
meet the specific driver’s needs in terms of
vehicle performance and active safety.
• According to the certainty or uncertainty of the
driver model structure, identification
methods can be categorized into 3 types:
• parametric identification,
• nonparametric identification,
• semi-parametric identification.
driver-vehicle steering control systems
driver’s steering control law
the vehicle-driver model
Recognizing Driver Behavior
• Nonparametric models perform better than
parametric models;
• The driver’s own signals (e.g., gas pedal pressure,
brake pedal pressure, steering angle) are more
informative than environment and vehicle signals
(velocity, acceleration, and engine speed);
• The parametric method may have less stringent
input or output requirements, but it needs to
select a set of candidate driver models, which
requires a known forcing function;
• The nonparametric method with a black-box
model only takes into account the relation
between input and output, and ignores the
inner state variables.
Flow diagram of driver model identification
Recognizing Driver Behavior
• Human driving skill and characteristics need to be embedded into the vehicle dynamic
systems to improve the vehicle’s drivability, maneuverability, and fuel economy.
• Characterizing driver behavior and skill exactly is crucial to simulating driver behavior and
optimizing driver-vehicle- environment systems.
• Driver behavior is dynamic, stochastic, and nonlinear, while still obeying certain distributions.
• The mapping from sensory input to the driver’s action output may be strongly nonlinear in
nature; hence, traditional control methods like PID control are unable to simulate it.
• A range of stochastic, nonlinear, and fuzzy approaches (HMMs, hierarchical HMMs, AR-HMMs,
nonlinear regression models, NNs and fuzzy systems) have been used to recognize and
predict driver behavior (a minimal HMM recognition sketch follows below).
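The following is a hedged sketch of HMM-based driver behavior recognition: one Gaussian HMM is trained per behavior class, and a new observation sequence is assigned to the class whose model gives the highest log-likelihood. The feature choice, number of states, and use of hmmlearn are illustrative assumptions.

```python
# Hedged sketch: HMM-based recognition of driver behaviors (e.g. lane change vs. lane keeping).

import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_behavior_hmms(sequences_by_class, n_states=4):
    """sequences_by_class: {label: [np.ndarray of shape (T_i, n_features), ...]}"""
    models = {}
    for label, seqs in sequences_by_class.items():
        X = np.concatenate(seqs)               # hmmlearn expects stacked sequences
        lengths = [len(s) for s in seqs]
        model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
        model.fit(X, lengths)
        models[label] = model
    return models

def recognize(models, observation_seq):
    """Return the behavior label whose HMM best explains the sequence."""
    return max(models, key=lambda label: models[label].score(observation_seq))
```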
Driver Behavior Classification Model based on an Intelligent
Driving Diagnosis System
• Design of a driver behavior classifier
based on an intelligent driving diagnosis
system, with signals acquired by a GPS
data-logging system: position, velocity,
accelerations and steering angle.
• The classifier presents the structure of an
intelligent driver behavior model based
on NNs, using as inputs statistical
transformations of the driving-diagnosis
time signals: steering profiles, pedal
usage, speeding, and departures from the
lane and road.
• Applications: driver identification for
security systems, and classifying a driver
into one of two categories, aggressive
or moderate (a minimal classifier sketch follows below).
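A hedged sketch of such a classifier: each trip’s GPS-derived time signals are collapsed into statistical features and fed to a small feed-forward network that labels the driver as aggressive or moderate. The feature set, network size, and label encoding are illustrative assumptions.

```python
# Hedged sketch of the NN-based aggressive-vs-moderate driver classifier.

import numpy as np
from sklearn.neural_network import MLPClassifier

def statistical_features(speed, accel, steering):
    """Collapse each driving-diagnosis time signal into a fixed-length feature vector."""
    feats = []
    for sig in (speed, accel, steering):
        feats += [np.mean(sig), np.std(sig), np.percentile(sig, 95), np.max(np.abs(sig))]
    return np.array(feats)

# X: one feature vector per trip; y: 0 = moderate, 1 = aggressive (assumed encoding)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500)
# clf.fit(X_train, y_train)
# label = clf.predict(statistical_features(speed, accel, steering).reshape(1, -1))
```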
Driver Behavior Classification Model based on an Intelligent
Driving Diagnosis System
Risky Areas Identification Graphic
Simulated Environment
A Learning-Based Autonomous Driver: Emulate
Driver’s Intelligence in Low Speed Car Following
• A good autonomous driving algorithm should have the following features: use the human
driver’s prior knowledge to better understand traffic scenarios, perform robustly in most traffic scenarios,
and be able to learn from human drivers to improve performance.
• 1st: the control model is built based on a human’s prior-knowledge and could be verified in simulation to
ensure the functionalities of driving.
• 2nd: the learning algorithm is implemented to optimize the prior-knowledge-based control model. This
enables the performance of the autonomous pilot to be improved through learning while avoiding the
problems of purely training-based control models.
• 3rd: an autonomous driver will be able to better interact with human traffic by emulating human driving
behavior, which also retains the potential to exceed human driver performance in some scenarios.
• Autonomous driving is implemented based on a Prediction- and Cost function-Based algorithm (PCB);
• PCB emulates a human driver’s decision process, modeled as traffic scenario prediction and evaluation.
• A learning method to optimize PCB with very limited training data, to predict and evaluate traffic scenarios.
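The following is a hedged sketch of the Prediction- and Cost-function-Based (PCB) decision loop for low-speed car following: each candidate acceleration is rolled forward over a short horizon and scored with human-understandable cost terms, and the lowest-cost action is executed. The cost terms, weights, and rollout model are illustrative assumptions.

```python
# Minimal sketch of a PCB-style prediction + cost-evaluation loop for car following.

import numpy as np

def predict(ego_v, gap, lead_v, accel, horizon=3.0, dt=0.5):
    """Constant-acceleration ego / constant-speed leader rollout."""
    states = []
    for _ in np.arange(0, horizon, dt):
        ego_v = max(0.0, ego_v + accel * dt)
        gap += (lead_v - ego_v) * dt
        states.append((ego_v, gap))
    return states

def cost(states, accel, desired_gap=10.0, w_safety=5.0, w_track=1.0, w_comfort=0.5):
    safety = sum(max(0.0, 2.0 - g) ** 2 for _, g in states)     # penalize dangerously small gaps
    tracking = sum((g - desired_gap) ** 2 for _, g in states)   # keep the desired gap
    comfort = accel ** 2                                        # penalize harsh acceleration
    return w_safety * safety + w_track * tracking + w_comfort * comfort

def pcb_step(ego_v, gap, lead_v, candidates=np.linspace(-3.0, 2.0, 11)):
    """Pick the candidate acceleration with the lowest predicted cost."""
    return min(candidates, key=lambda a: cost(predict(ego_v, gap, lead_v, a), a))
```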
A Learning-Based Autonomous Driver: Emulate
Driver’s Intelligence in Low Speed Car Following
• The autonomous driver model was designed to model a human driver’s decision process,
including prediction and evaluation of traffic scenarios.
• By using prediction, the difficulty of building long-term heuristic cost functions is reduced.
• The evaluation was based on human-understandable cost functions that are easy to extend.
Freeway Driving Modules
Prediction- and Cost Function-Based Algorithm
E2E Learning of Driving Models from Large-scale Video Datasets
• Learning a generic vehicle motion model from large-scale crowd-sourced video data, with
an e2e trainable architecture for learning to predict a distribution over future vehicle
egomotion from instantaneous monocular camera observations and previous vehicle state.
• It incorporates an FCN-LSTM architecture, learned from large-scale crowd-sourced vehicle
action data, and leverages scene segmentation side tasks to improve performance (a minimal
FCN-LSTM sketch follows below).
Autonomous driving is formulated as a future egomotion prediction problem.
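Below is a hedged PyTorch sketch of an FCN-LSTM style driving model: a convolutional encoder summarizes each frame, the previous vehicle state is appended, and an LSTM predicts a distribution over discrete future egomotion actions. The layer sizes, state features, and action set are illustrative assumptions, not the paper’s exact architecture.

```python
import torch
import torch.nn as nn

class FCNLSTMDriver(nn.Module):
    def __init__(self, n_actions=4, state_dim=2, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(                      # fully convolutional frame encoder
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                       # global feature per frame
        )
        self.lstm = nn.LSTM(64 + state_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_actions)           # logits over future egomotion actions

    def forward(self, frames, states):
        # frames: (B, T, 3, H, W); states: (B, T, state_dim), e.g. previous speed / yaw rate
        B, T = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).flatten(1).view(B, T, -1)
        out, _ = self.lstm(torch.cat([feats, states], dim=-1))
        return self.head(out[:, -1])                       # predict the action after the last frame

# logits = FCNLSTMDriver()(torch.randn(2, 8, 3, 90, 160), torch.randn(2, 8, 2))
```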
E2E Learning of Driving Models from Large-scale Video Datasets
Comparison among architectures that can
fuse time-series information with visual inputs.
A method of learning a driving policy
from demonstrated behaviors, formulating the
problem as predicting future feasible actions.
The driving model is defined as the admissibility
of the next motion, i.e., which motion is plausible
given the currently observed world configuration.
The generic models take as input raw pixels and
current and prior vehicle state signals, and predict
the likelihood of future motion, defined over a
range of action or motion granularities, considering
both discrete and continuous settings.
E2E Learning of Driving Models from Large-scale Video Datasets
Comparison of learning approaches. Mediated
Perception relies on semantic-class labels at the
pixel level alone to drive motion prediction. The
Motion Reflex method learns a representation
based on raw pixels. Privileged Training learns
from raw pixels but allows side-training on
semantic segmentation tasks.
The model is able to jointly train motion prediction
and pixel-level supervised tasks.
It can use semantic segmentation as a side task,
following the “privileged” information learning
paradigm (a minimal joint-loss sketch follows below).
This leads to better performance in the experiments.
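A hedged sketch of the privileged-training idea: the motion-prediction loss is combined with a semantic-segmentation side-task loss computed from shared features, and the segmentation head is only used during training. The loss weighting is an illustrative assumption.

```python
import torch.nn.functional as F

def joint_loss(action_logits, action_targets, seg_logits, seg_labels, seg_weight=0.1):
    """Combine the main driving loss with the segmentation side-task loss (training only)."""
    motion_loss = F.cross_entropy(action_logits, action_targets)   # future-action prediction
    seg_loss = F.cross_entropy(seg_logits, seg_labels)             # pixel-level side task
    return motion_loss + seg_weight * seg_loss
```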
An Open Framework for Human-like Autonomous Driving
Using Inverse Reinforcement Learning
• Platform:
• Lane detection or localization and mapping.
• Detection and tracking of mobile objects (DATMO).
• Proprioceptive info.: car’s odometry and engine state (e.g., revolutions per minute, current gear).
• Control. For autonomous driving, the vehicle should expose interfaces to steering angle,
acceleration, brake and gear changing.
• Behavior learning and planning: BEhavior Learning LibrarY (Belly)
• Feature computation;
• (a) lateral displacement with respect to track center;
• (b) absolute speed;
• (c) relative speed with respect to traffic limitations;
• (d) collision distance to an obstacle.
• Cost-function computation (a minimal feature/cost sketch follows this list);
• Motion trajectory planning algorithms;
• Learning algorithms themselves.
• Others: motion prediction, path tracking (velocity and pose controller).
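The sketch below illustrates the feature and cost-function computation used in IRL-style behavior learning: a trajectory is scored by a cost that is linear in hand-crafted features, and the feature weights are what IRL recovers from human demonstrations. The feature definitions and the state dictionary keys are illustrative assumptions.

```python
import numpy as np

def features(state):
    """state: dict with ego pose/speed, lane geometry, nearest obstacle (assumed keys)."""
    return np.array([
        abs(state["lateral_offset"]),                     # (a) displacement from track center
        state["speed"],                                   # (b) absolute speed
        max(0.0, state["speed"] - state["speed_limit"]),  # (c) speed above the traffic limit
        1.0 / max(state["collision_distance"], 1.0),      # (d) inverse distance to obstacle
    ])

def trajectory_cost(trajectory, weights):
    """Linear cost: sum of weighted features over the trajectory (weights learned by IRL)."""
    return sum(float(weights @ features(s)) for s in trajectory)
```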
An Open Framework for Human-like Autonomous Driving
Using Inverse Reinforcement Learning
• To leverage a driving
simulator (TORCS) by
developing a ROS
communication bridge for it.
• Based on that, build an
experimental framework for
the development and
evaluation of human-like
autonomous driving based
on Inverse Reinforcement
Learning (IRL).
Framework overview: (orange) platforms; (brown) IRL
and planning libraries; (blue) additional ROS modules.
Driver Action Prediction Using Bidirectional RNN
• Predicting driver actions early and accurately can help mitigate the effects of potentially
unsafe driving behaviors and avoid possible accidents.
• It formulates driver action prediction as a time series anomaly prediction problem.
• While the anomaly (driver actions of interest) detection might be trivial in this context,
finding patterns that consistently precede an anomaly requires searching for or extracting
features across multi-modal sensory inputs.
• A driver action prediction system is a real-time data acquisition, processing and learning
framework for predicting future or impending driver action.
• It incorporates camera-based knowledge of the driving environment and the driver
themselves, in addition to traditional vehicle dynamics.
• It uses a deep bidirectional RNN to learn the correlation between sensory inputs and impending
driver behavior, achieving accurate, long-horizon action prediction.
• To predict key driver actions including acceleration, braking, lane change and turning at
durations of about 5 seconds before the action is executed by the driver.
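A hedged PyTorch sketch of such a prediction network: a deep bidirectional LSTM reads a window of multi-modal sensory features (vehicle dynamics plus camera-derived driver and scene features) and classifies the impending action (accelerate, brake, lane change, turn, none). The feature dimension, layer sizes, and action set are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DriverActionPredictor(nn.Module):
    def __init__(self, n_features=64, n_actions=5, hidden=128, layers=2):
        super().__init__()
        self.rnn = nn.LSTM(n_features, hidden, num_layers=layers,
                           batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_actions)   # forward + backward hidden states

    def forward(self, x):                # x: (B, T, n_features), a few seconds of sensor data
        out, _ = self.rnn(x)
        return self.head(out[:, -1])     # logits for the action expected ~5 s ahead

# logits = DriverActionPredictor()(torch.randn(8, 100, 64))
```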
Driver Action Prediction Using Bidirectional RNN
Left: The DAP (Driver Action Prediction) system. Right: The Prediction Network based on Deep Bidirectional RNN
DeepTest: Auto Testing of DNN-driven Autonomous Cars
• Most existing testing techniques for DNN-driven vehicles are heavily dependent on the
manual collection of test data under different driving conditions, which becomes prohibitively
expensive as the number of test conditions increases.
• DeepTest, a systematic testing tool for automatically detecting erroneous behaviors of
DNN-driven vehicles that can potentially lead to fatal crashes.
• The tool is designed to automatically generate test cases leveraging real-world changes in
driving conditions like rain, fog, lighting conditions, etc.
• DeepTest systematically explores different parts of the DNN logic by generating test inputs
that maximize the number of activated neurons.
• DeepTest found thousands of erroneous behaviors under different realistic driving
conditions (e.g., blurring, rain, fog, etc.) many of which lead to potentially fatal crashes in 3
top performing DNNs in the Udacity self-driving car challenge.
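A hedged sketch of the DeepTest idea: realistic image transformations (e.g., blur or brightness change approximating fog and lighting) are applied to a seed image, and the model’s behavior is flagged as erroneous when the predicted steering angle deviates from the prediction on the original image by more than a tolerance, i.e., a metamorphic test oracle. The transformation parameters and tolerance are illustrative assumptions.

```python
import cv2

def transformations(image):
    yield "blur", cv2.GaussianBlur(image, (7, 7), 0)
    yield "darker", cv2.convertScaleAbs(image, alpha=0.6, beta=0)     # reduced lighting
    yield "brighter", cv2.convertScaleAbs(image, alpha=1.0, beta=40)  # increased lighting

def metamorphic_test(model, image, tolerance_deg=2.0):
    """model(image) -> steering angle in degrees; returns the violated test cases."""
    baseline = model(image)
    failures = []
    for name, transformed in transformations(image):
        angle = model(transformed)
        if abs(angle - baseline) > tolerance_deg:
            failures.append((name, baseline, angle))
    return failures
```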
DeepTest: Auto Testing of DNN-driven Autonomous Cars
• At a conceptual level, the erroneous corner-case behaviors in DNN-based software are
analogous to logic bugs in traditional software.
• Similar to the bug detection and patching cycle in software development, the erroneous
behaviors of DNNs, once detected, can be fixed by adding the error-inducing inputs to the
training data set and also by possibly changing the model structure / parameters.
• It is very hard to build robust safety-critical systems only using manual test cases.
• Building automated and systematic testing tools for DNN-based software is a novel and
important software engineering problem.
• How do we systematically explore the input-output spaces of an autonomous car DNN?
• How can we synthesize realistic inputs to automate such exploration?
• How can we optimize the exploration process?
• How do we automatically create a test oracle that can detect erroneous behaviors without
detailed manual specifications?
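As a hedged answer to the exploration question, the sketch below computes neuron coverage, the guidance metric DeepTest maximizes: a neuron counts as activated for an input if its per-layer scaled output exceeds a threshold, and coverage is the fraction of activated neurons. How layer activations are extracted from a given framework is left open here and is an assumption.

```python
import numpy as np

def neuron_coverage(activations_per_layer, threshold=0.2):
    """activations_per_layer: list of np.ndarray layer outputs for one input."""
    activated, total = 0, 0
    for act in activations_per_layer:
        a = act.reshape(-1)
        scaled = (a - a.min()) / (a.max() - a.min() + 1e-8)   # scale each layer to [0, 1]
        activated += int((scaled > threshold).sum())
        total += a.size
    return activated / total
```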
DeepTest: Auto Testing of DNN-driven Autonomous Cars
A simple autonomous car DNN that takes inputs from camera, light detection and ranging sensor (LiDAR),
and IR (infrared) sensor, and outputs steering angle, braking decision, and acceleration decision.
DeepTest: Auto Testing of DNN-driven Autonomous Cars
(Upper) A simplified CNN architecture with a
convolution kernel shown on the top-left part
of the input image. The same filter (edges with
same weights) is then moved across the entire
input space, and the dot products are computed
between the edge weights and the outputs of
the connected neurons.
(Lower) A simplified RNN architecture with
loops in its hidden layers. The unrolled version on
the right shows how the loop allows a
sequence of inputs (i.e., images) to be fed to the
RNN, with the steering angle predicted based
on all those images.
Large-Scale Deep Learning Based Analysis of Driver
Behavior and Interaction with Automation
• MIT Autonomous Vehicle Technology (MIT-AVT) study:
• To undertake large-scale real-world driving data collection that includes high-definition video to
fuel the development of deep learning based internal and external perception systems;
• To gain a holistic understanding of how human beings interact with vehicle automation
technology by integrating video data with vehicle state data, driver characteristics, mental models,
and self-reported experiences with technology;
• To identify how technology and other factors related to automation adoption and use can be
improved in ways that save lives.
• 21 Tesla Model S and Model X vehicles, 2 Volvo S90 vehicles, and 2 Range Rover Evoque
vehicles for both long-term (over a year per driver) and medium term (one month per driver)
naturalistic driving data collection.
• The recorded data streams include IMU, GPS, CAN messages, and high-definition video
streams of the driver face, the driver cabin, the forward roadway, and the instrument cluster
(on select vehicles).
• So far, there are 78 participants, 7,146 days of participation, 275,589 miles, and 3.5 billion
video frames.
Large-Scale Deep Learning Based Analysis of Driver
Behavior and Interaction with Automation
Large-Scale Deep Learning Based Analysis of Driver
Behavior and Interaction with Automation
• Deep learning can be defined in two ways:
• A branch of ML that uses NNs with many layers.
• A branch of ML that seeks to form hierarchies of data representation with minimum input from a human
being on the actual composition of the hierarchy.
• The key characteristic of deep learning is the ability of automated representation learning to
use large-scale data to generalize robustly over real-world edge cases that arise in any in-the-
wild application of machine learning: occlusion, lighting, perspective, scale, inter-class variation,
intra-class variation, etc.
• Leveraging the release of large-scale annotated driving datasets, automotive deep learning
research aims to address detection, estimation, prediction, labeling, generation, control, and
planning tasks.
• Fine-grained Face Recognition, Body Pose Estimation, Semantic Scene Perception, Driving State Prediction.
• ADAS Features used:
• Adaptive Cruise Control (ACC), Pilot Assist (in the Volvo), Forward Alert Warning / City Safety (in the Volvo),
Automatic Emergency Braking, Lane Departure Warning (LDW), Lane Keep Assist (LKA), Blind Spot Monitor.
Large-Scale Deep Learning Based Analysis of Driver
Behavior and Interaction with Automation
• RIDER (Real-time Intelligent Driving Environment Recording system) components:
• A real-time clock, GPS, IMU, remote cellular connectivity, and the ability to record up to 6
cameras at 720p resolution.
Large-Scale Deep Learning Based Analysis of Driver
Behavior and Interaction with Automation
The MIT-AVT data pipeline, showing the process of
offloading, cleaning, synchronizing, and extracting
knowledge from data. On the left is the dependency-
constrained, asynchronous, distributed computing
framework. In the middle is the sequence of high level
procedures that perform several levels of knowledge
extraction. On the right are broad categories of data
produced by the pipeline, organized by size.
A Scenario-Adaptive Driving Behavior Prediction
Approach to Urban Autonomous Driving
• Prediction of surrounding vehicles’ driving behaviors plays a crucial role in autonomous
vehicles.
• Most traditional driving behavior prediction models work only for a specific traffic scenario and
cannot be adapted to different scenarios.
• In addition, a priori driving knowledge has rarely been considered sufficiently.
• A scenario-adaptive approach to solve these problems for Autonomous Vehicles (AVs).
• Continuous features of driving behavior were learned by Hidden Markov Models (HMMs).
• A knowledge base was constructed to specify the model adaptation strategies and store a priori
probabilities based on the scenario’s characteristics.
• The target vehicle’s future behavior was predicted considering both a posteriori and a priori
probabilities (a minimal combination sketch follows below).
• The application scope of traditional models can be extended to a variety of scenarios, while the
prediction performance can be improved by the consideration of a priori knowledge.
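A hedged sketch of the scenario-adaptive prediction step: each candidate behavior is scored by combining the HMM likelihood of the observed motion features (a posteriori evidence) with the a priori probability supplied by the scenario knowledge base, in the spirit of Bayes’ rule. The behavior set, prior values, and HMM interface are illustrative assumptions.

```python
import numpy as np

def predict_behavior(hmm_models, observation_seq, scenario_priors):
    """hmm_models: {behavior: trained HMM with .score()};
       scenario_priors: {behavior: prior from the rule-based reasoning system}."""
    log_posteriors = {
        b: model.score(observation_seq) + np.log(scenario_priors.get(b, 1e-6))
        for b, model in hmm_models.items()
    }
    return max(log_posteriors, key=log_posteriors.get)
```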
A Scenario-Adaptive Driving Behavior Prediction
Approach to Urban Autonomous Driving
• The planning layer consists of 3 modules: route planning, decision making, and path planning.
• The route and path planning modules generate proper global routes and safe local trajectories.
• The behavior decision-making module provides safe and reasonable abstract driving actions.
• A state-of-the-art (SoA) decision-making system should have a forward-looking ability: the surrounding
vehicles’ future driving behavior should be predicted accurately.
• AVs should monitor variation in the target vehicle’s movement and understand the current
scenario from the view of the target vehicle; the future behavior is then predicted by prediction models.
• Types of info: vehicle kinematics, the relationships between the target vehicle and surrounding entities
(other vehicles, lane lines), and a priori knowledge (traffic rules and common sense of driving).
• Vehicle kinematics and relations with road entities were considered by almost all existing studies.
• These features can be applied directly by Time-to-X models to predict the behavior of the vehicle.
A Scenario-Adaptive Driving Behavior Prediction
Approach to Urban Autonomous Driving
• Interactions between the target vehicle
and its surrounding vehicles also
have a significant influence on
the target vehicle’s future behavior.
Overview of the driving behavior
prediction approach.
A priori knowledge is
helpful for driving behavior prediction.
The a priori probability is provided by a
rule-based reasoning system.
A Scenario-Adaptive Driving Behavior Prediction
Approach to Urban Autonomous Driving
Layout of the ontology model to describe the traffic scenario
A Scenario-Adaptive Driving Behavior Prediction
Approach to Urban Autonomous Driving
Schematic figure of road entity decomposition
A Scenario-Adaptive Driving Behavior Prediction
Approach to Urban Autonomous Driving
A flow diagram of on-line driving behavior prediction
Personalization of ADAS and Autonomous Driving
• Personalization is categorized into explicit and implicit personalization.
• Explicit personalization requires users to state their preferences and to explicitly change the
system by choosing a particular system setting that suits them best.
• Implicit personalization observes the user’s behavior and derives a user model for the prediction
of user preferences or behavior based on these user data.
• Personalization of ADAS is to make ADAS interventions more efficient and to improve the
driving experience and usability of ADAS, by adapting to individual preferences of the driver.
• Personalization to autonomous driving: to be comfortable for different drivers, the driving
style of an AV should be adapted to the individual driver’s preferences.
• It is data driven: a model of the driver is learned from driving data.
• This model is either used to directly control the vehicle or to parameterize a controller.
Personalization of ADAS and Autonomous Driving
• The personalization process:
• Observe the driving behavior: the assumption is that the driver is most comfortable with a
driving style that is similar to their own.
• Build a model of human driving behavior: a driver model is learned from an individual driver’s
data and either used directly as part of the controller, or realized as a high-level controller
that models the driving behavior and whose parameters are adapted to the specific driver
during personalization, paired with a low-level controller that is responsible for actuating
the vehicle according to the input from the high-level controller.
• Validate the model.
• Off-line playback: recorded driving data are fed into the personalized controller to verify that
the controller correctly reproduces the observed driving behavior.
• Simulation in a traffic simulator: the personalized controller is tested in controlled traffic
situations and often compared with a standard controller.
• Field test: The personalized controller is implemented in a vehicle and tested in real traffic.
Personalization of ADAS and Autonomous Driving
• Personalization of ACC (adaptive cruise control): group-based or individual-based;
• Group: drivers are assigned to one of a small number of representative driving styles for which an
ACC control strategy is implemented;
• Individual: the ACC control strategy tries to best reproduce the driving style of an individual driver,
with the system switching between a learning mode and a running mode (see the sketch below).
• The goal in personalized forward collision warning is to decrease the false alarm rate of the
system and to increase the warning time to give the driver a longer reaction time.
• The aim of personalized lane keeping is to detect the lane departures early and to minimize
the false alarm rate of the system.
• Gap acceptance, the longitudinal adjustments made to find an acceptable gap, and the way the
lane change maneuver itself is performed characterize the individual driving style; all
three aspects are modeled for lane change personalization.
• For autonomous driving personalization, reinforcement learning is used to find a policy
that imitates the expert.
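A hedged sketch of individual-based ACC personalization: the driver’s preferred time headway is estimated from their own car-following data (learning mode) and then used in the ACC spacing policy (running mode). The controller gains and the clipping range are illustrative assumptions.

```python
import numpy as np

def learn_time_headway(gaps, speeds):
    """Median of observed gap/speed ratios during manual car following (learning mode)."""
    speeds = np.maximum(np.asarray(speeds), 0.1)          # avoid division by near-zero speed
    return float(np.clip(np.median(np.asarray(gaps) / speeds), 0.8, 3.0))

def acc_command(gap, ego_speed, lead_speed, t_headway, k_gap=0.2, k_rel=0.5):
    """Desired acceleration tracking the personalized desired gap (running mode)."""
    desired_gap = t_headway * ego_speed
    return k_gap * (gap - desired_gap) + k_rel * (lead_speed - ego_speed)
```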
Thanks

More Related Content

What's hot

Telsa auto pilot
Telsa auto pilotTelsa auto pilot
Telsa auto pilotKRoshni
 
Autonomous vehicles
Autonomous vehiclesAutonomous vehicles
Autonomous vehiclesvishnum379
 
Self driving car
Self driving carSelf driving car
Self driving carzebatasneem
 
Role of localization and environment perception in autonomous driving
Role of localization and environment perception in autonomous drivingRole of localization and environment perception in autonomous driving
Role of localization and environment perception in autonomous drivingQualcomm Research
 
Adaptive cruise control system by NIKHIL R
Adaptive cruise control system by NIKHIL RAdaptive cruise control system by NIKHIL R
Adaptive cruise control system by NIKHIL RNikhil Kashyap
 
Collision Avoidance System
Collision Avoidance SystemCollision Avoidance System
Collision Avoidance SystemSiddharth Mehta
 
Jatin sharma (42162)
Jatin sharma (42162)Jatin sharma (42162)
Jatin sharma (42162)Jatin Sharma
 
Autonomous car
Autonomous carAutonomous car
Autonomous carAnil kale
 
Intelligent Vehicles
Intelligent VehiclesIntelligent Vehicles
Intelligent Vehiclesanakarenbm
 
Cruise control & Adaptive Cruise Control
Cruise control & Adaptive Cruise ControlCruise control & Adaptive Cruise Control
Cruise control & Adaptive Cruise ControlANAND THAKKAR
 

What's hot (20)

Autonomous cars by ihazn
Autonomous cars by ihaznAutonomous cars by ihazn
Autonomous cars by ihazn
 
Telsa auto pilot
Telsa auto pilotTelsa auto pilot
Telsa auto pilot
 
Autonomous vehicles
Autonomous vehiclesAutonomous vehicles
Autonomous vehicles
 
Self driving car
Self driving carSelf driving car
Self driving car
 
Self driving car
Self driving carSelf driving car
Self driving car
 
Autonomous cars
Autonomous carsAutonomous cars
Autonomous cars
 
Autonomous cars
Autonomous carsAutonomous cars
Autonomous cars
 
Role of localization and environment perception in autonomous driving
Role of localization and environment perception in autonomous drivingRole of localization and environment perception in autonomous driving
Role of localization and environment perception in autonomous driving
 
Waymo Driverless car
Waymo Driverless carWaymo Driverless car
Waymo Driverless car
 
AUTONOMOUS VEHICLES
AUTONOMOUS VEHICLESAUTONOMOUS VEHICLES
AUTONOMOUS VEHICLES
 
Adaptive cruise control system by NIKHIL R
Adaptive cruise control system by NIKHIL RAdaptive cruise control system by NIKHIL R
Adaptive cruise control system by NIKHIL R
 
Collision Avoidance System
Collision Avoidance SystemCollision Avoidance System
Collision Avoidance System
 
Adaptive cruise control’
Adaptive cruise control’Adaptive cruise control’
Adaptive cruise control’
 
Jatin sharma (42162)
Jatin sharma (42162)Jatin sharma (42162)
Jatin sharma (42162)
 
Autonomous car
Autonomous carAutonomous car
Autonomous car
 
Intelligent Vehicles
Intelligent VehiclesIntelligent Vehicles
Intelligent Vehicles
 
Cruise control & Adaptive Cruise Control
Cruise control & Adaptive Cruise ControlCruise control & Adaptive Cruise Control
Cruise control & Adaptive Cruise Control
 
Autonomous car
Autonomous carAutonomous car
Autonomous car
 
Autonomous vehicles
Autonomous vehiclesAutonomous vehicles
Autonomous vehicles
 
Lane assist
Lane assistLane assist
Lane assist
 

Similar to Driving behavior for ADAS and Autonomous Driving

Driving Behavior for ADAS and Autonomous Driving III
Driving Behavior for ADAS and Autonomous Driving IIIDriving Behavior for ADAS and Autonomous Driving III
Driving Behavior for ADAS and Autonomous Driving IIIYu Huang
 
Driving Behavior for ADAS and Autonomous Driving IV
Driving Behavior for ADAS and Autonomous Driving IVDriving Behavior for ADAS and Autonomous Driving IV
Driving Behavior for ADAS and Autonomous Driving IVYu Huang
 
Prediction,Planninng & Control at Baidu
Prediction,Planninng & Control at BaiduPrediction,Planninng & Control at Baidu
Prediction,Planninng & Control at BaiduYu Huang
 
Driving Behavior for ADAS and Autonomous Driving IX
Driving Behavior for ADAS and Autonomous Driving IXDriving Behavior for ADAS and Autonomous Driving IX
Driving Behavior for ADAS and Autonomous Driving IXYu Huang
 
Generation of Autonomous Vehicle Validation Scenarios Using Crash Data
Generation of Autonomous Vehicle Validation Scenarios Using Crash DataGeneration of Autonomous Vehicle Validation Scenarios Using Crash Data
Generation of Autonomous Vehicle Validation Scenarios Using Crash DataM. Ilhan Akbas
 
Smart cars
Smart carsSmart cars
Smart carsItcs399
 
Bertrand Fontaine - Deep Learning for driver/passenger detection of car trips
Bertrand Fontaine - Deep Learning for driver/passenger detection of car tripsBertrand Fontaine - Deep Learning for driver/passenger detection of car trips
Bertrand Fontaine - Deep Learning for driver/passenger detection of car tripsHendrik D'Oosterlinck
 
Driving behaviors for adas and autonomous driving XIII
Driving behaviors for adas and autonomous driving XIIIDriving behaviors for adas and autonomous driving XIII
Driving behaviors for adas and autonomous driving XIIIYu Huang
 
Driving Behavior for ADAS and Autonomous Driving V
Driving Behavior for ADAS and Autonomous Driving VDriving Behavior for ADAS and Autonomous Driving V
Driving Behavior for ADAS and Autonomous Driving VYu Huang
 
Scenario Generation for Validating Artifi cial Intelligence Based Autonomous ...
Scenario Generation for Validating Artificial Intelligence Based Autonomous ...Scenario Generation for Validating Artificial Intelligence Based Autonomous ...
Scenario Generation for Validating Artifi cial Intelligence Based Autonomous ...M. Ilhan Akbas
 
LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...
LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...
LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...cscpconf
 
automation.pptx
automation.pptxautomation.pptx
automation.pptxSabarDasal
 
paper Presentation
paper Presentationpaper Presentation
paper PresentationPranesh nair
 
TRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWARE
TRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWARETRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWARE
TRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWAREshrikrishna kesharwani
 
ICMLA 2015 - Car Following Markov Regime Classification and Calibration
ICMLA 2015 - Car Following Markov Regime Classification and CalibrationICMLA 2015 - Car Following Markov Regime Classification and Calibration
ICMLA 2015 - Car Following Markov Regime Classification and CalibrationMohamed AbdElAziz Khamis
 
Detection of Lane and Speed Breaker.pptx
Detection of Lane and Speed Breaker.pptxDetection of Lane and Speed Breaker.pptx
Detection of Lane and Speed Breaker.pptxAryanRoyDishu
 
Self-Driving Cars With Convolutional Neural Networks (CNN.pptx
Self-Driving Cars With Convolutional Neural Networks (CNN.pptxSelf-Driving Cars With Convolutional Neural Networks (CNN.pptx
Self-Driving Cars With Convolutional Neural Networks (CNN.pptxssuserf79e761
 

Similar to Driving behavior for ADAS and Autonomous Driving (20)

Driving Behavior for ADAS and Autonomous Driving III
Driving Behavior for ADAS and Autonomous Driving IIIDriving Behavior for ADAS and Autonomous Driving III
Driving Behavior for ADAS and Autonomous Driving III
 
Driving Behavior for ADAS and Autonomous Driving IV
Driving Behavior for ADAS and Autonomous Driving IVDriving Behavior for ADAS and Autonomous Driving IV
Driving Behavior for ADAS and Autonomous Driving IV
 
Prediction,Planninng & Control at Baidu
Prediction,Planninng & Control at BaiduPrediction,Planninng & Control at Baidu
Prediction,Planninng & Control at Baidu
 
Driving Behavior for ADAS and Autonomous Driving IX
Driving Behavior for ADAS and Autonomous Driving IXDriving Behavior for ADAS and Autonomous Driving IX
Driving Behavior for ADAS and Autonomous Driving IX
 
Generation of Autonomous Vehicle Validation Scenarios Using Crash Data
Generation of Autonomous Vehicle Validation Scenarios Using Crash DataGeneration of Autonomous Vehicle Validation Scenarios Using Crash Data
Generation of Autonomous Vehicle Validation Scenarios Using Crash Data
 
Smart cars
Smart carsSmart cars
Smart cars
 
Bertrand Fontaine - Deep Learning for driver/passenger detection of car trips
Bertrand Fontaine - Deep Learning for driver/passenger detection of car tripsBertrand Fontaine - Deep Learning for driver/passenger detection of car trips
Bertrand Fontaine - Deep Learning for driver/passenger detection of car trips
 
Driving behaviors for adas and autonomous driving XIII
Driving behaviors for adas and autonomous driving XIIIDriving behaviors for adas and autonomous driving XIII
Driving behaviors for adas and autonomous driving XIII
 
Driving Behavior for ADAS and Autonomous Driving V
Driving Behavior for ADAS and Autonomous Driving VDriving Behavior for ADAS and Autonomous Driving V
Driving Behavior for ADAS and Autonomous Driving V
 
Scenario Generation for Validating Artifi cial Intelligence Based Autonomous ...
Scenario Generation for Validating Artificial Intelligence Based Autonomous ...Scenario Generation for Validating Artificial Intelligence Based Autonomous ...
Scenario Generation for Validating Artifi cial Intelligence Based Autonomous ...
 
Final report
Final reportFinal report
Final report
 
LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...
LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...
LANE CHANGE DETECTION AND TRACKING FOR A SAFE-LANE APPROACH IN REAL TIME VISI...
 
automation.pptx
automation.pptxautomation.pptx
automation.pptx
 
paper Presentation
paper Presentationpaper Presentation
paper Presentation
 
TRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWARE
TRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWARETRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWARE
TRAFFIC SIMULATION AT TOLL ROAD SECTION USING VISSIM SOFTWARE
 
ICMLA 2015 - Car Following Markov Regime Classification and Calibration
ICMLA 2015 - Car Following Markov Regime Classification and CalibrationICMLA 2015 - Car Following Markov Regime Classification and Calibration
ICMLA 2015 - Car Following Markov Regime Classification and Calibration
 
TFT2.ppt
TFT2.pptTFT2.ppt
TFT2.ppt
 
FinalReport
FinalReportFinalReport
FinalReport
 
Detection of Lane and Speed Breaker.pptx
Detection of Lane and Speed Breaker.pptxDetection of Lane and Speed Breaker.pptx
Detection of Lane and Speed Breaker.pptx
 
Self-Driving Cars With Convolutional Neural Networks (CNN.pptx
Self-Driving Cars With Convolutional Neural Networks (CNN.pptxSelf-Driving Cars With Convolutional Neural Networks (CNN.pptx
Self-Driving Cars With Convolutional Neural Networks (CNN.pptx
 

More from Yu Huang

Application of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous DrivingApplication of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous DrivingYu Huang
 
The New Perception Framework in Autonomous Driving: An Introduction of BEV N...
The New Perception Framework  in Autonomous Driving: An Introduction of BEV N...The New Perception Framework  in Autonomous Driving: An Introduction of BEV N...
The New Perception Framework in Autonomous Driving: An Introduction of BEV N...Yu Huang
 
Data Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous DrivingData Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous DrivingYu Huang
 
Techniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous DrivingTechniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous DrivingYu Huang
 
BEV Joint Detection and Segmentation
BEV Joint Detection and SegmentationBEV Joint Detection and Segmentation
BEV Joint Detection and SegmentationYu Huang
 
BEV Object Detection and Prediction
BEV Object Detection and PredictionBEV Object Detection and Prediction
BEV Object Detection and PredictionYu Huang
 
Fisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VIFisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VIYu Huang
 
Fisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving VFisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving VYu Huang
 
Fisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IVFisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IVYu Huang
 
Cruise AI under the Hood
Cruise AI under the HoodCruise AI under the Hood
Cruise AI under the HoodYu Huang
 
LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)
LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)
LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)Yu Huang
 
Scenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous DrivingScenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous DrivingYu Huang
 
How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?Yu Huang
 
Annotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous DrivingAnnotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous DrivingYu Huang
 
Simulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atgSimulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atgYu Huang
 
Multi sensor calibration by deep learning
Multi sensor calibration by deep learningMulti sensor calibration by deep learning
Multi sensor calibration by deep learningYu Huang
 
Prediction and planning for self driving at waymo
Prediction and planning for self driving at waymoPrediction and planning for self driving at waymo
Prediction and planning for self driving at waymoYu Huang
 
Jointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planningJointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planningYu Huang
 
Data pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous drivingData pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous drivingYu Huang
 
Open Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planningOpen Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planningYu Huang
 

More from Yu Huang (20)

Application of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous DrivingApplication of Foundation Model for Autonomous Driving
Application of Foundation Model for Autonomous Driving
 
The New Perception Framework in Autonomous Driving: An Introduction of BEV N...
The New Perception Framework  in Autonomous Driving: An Introduction of BEV N...The New Perception Framework  in Autonomous Driving: An Introduction of BEV N...
The New Perception Framework in Autonomous Driving: An Introduction of BEV N...
 
Data Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous DrivingData Closed Loop in Simulation Test of Autonomous Driving
Data Closed Loop in Simulation Test of Autonomous Driving
 
Techniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous DrivingTechniques and Challenges in Autonomous Driving
Techniques and Challenges in Autonomous Driving
 
BEV Joint Detection and Segmentation
BEV Joint Detection and SegmentationBEV Joint Detection and Segmentation
BEV Joint Detection and Segmentation
 
BEV Object Detection and Prediction
BEV Object Detection and PredictionBEV Object Detection and Prediction
BEV Object Detection and Prediction
 
Fisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VIFisheye based Perception for Autonomous Driving VI
Fisheye based Perception for Autonomous Driving VI
 
Fisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving VFisheye/Omnidirectional View in Autonomous Driving V
Fisheye/Omnidirectional View in Autonomous Driving V
 
Fisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IVFisheye/Omnidirectional View in Autonomous Driving IV
Fisheye/Omnidirectional View in Autonomous Driving IV
 
Cruise AI under the Hood
Cruise AI under the HoodCruise AI under the Hood
Cruise AI under the Hood
 
LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)
LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)
LiDAR in the Adverse Weather: Dust, Snow, Rain and Fog (2)
 
Scenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous DrivingScenario-Based Development & Testing for Autonomous Driving
Scenario-Based Development & Testing for Autonomous Driving
 
How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?How to Build a Data Closed-loop Platform for Autonomous Driving?
How to Build a Data Closed-loop Platform for Autonomous Driving?
 
Annotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous DrivingAnnotation tools for ADAS & Autonomous Driving
Annotation tools for ADAS & Autonomous Driving
 
Simulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atgSimulation for autonomous driving at uber atg
Simulation for autonomous driving at uber atg
 
Multi sensor calibration by deep learning
Multi sensor calibration by deep learningMulti sensor calibration by deep learning
Multi sensor calibration by deep learning
 
Prediction and planning for self driving at waymo
Prediction and planning for self driving at waymoPrediction and planning for self driving at waymo
Prediction and planning for self driving at waymo
 
Jointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planningJointly mapping, localization, perception, prediction and planning
Jointly mapping, localization, perception, prediction and planning
 
Data pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous drivingData pipeline and data lake for autonomous driving
Data pipeline and data lake for autonomous driving
 
Open Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planningOpen Source codes of trajectory prediction & behavior planning
Open Source codes of trajectory prediction & behavior planning
 

Recently uploaded

Novel 3D-Printed Soft Linear and Bending Actuators
Novel 3D-Printed Soft Linear and Bending ActuatorsNovel 3D-Printed Soft Linear and Bending Actuators
Novel 3D-Printed Soft Linear and Bending ActuatorsResearcher Researcher
 
Immutable Image-Based Operating Systems - EW2024.pdf
Immutable Image-Based Operating Systems - EW2024.pdfImmutable Image-Based Operating Systems - EW2024.pdf
Immutable Image-Based Operating Systems - EW2024.pdfDrew Moseley
 
KCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosKCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosVictor Morales
 
Main Memory Management in Operating System
Main Memory Management in Operating SystemMain Memory Management in Operating System
Main Memory Management in Operating SystemRashmi Bhat
 
Earthing details of Electrical Substation
Earthing details of Electrical SubstationEarthing details of Electrical Substation
Earthing details of Electrical Substationstephanwindworld
 
Engineering Drawing section of solid
Engineering Drawing     section of solidEngineering Drawing     section of solid
Engineering Drawing section of solidnamansinghjarodiya
 
List of Accredited Concrete Batching Plant.pdf
List of Accredited Concrete Batching Plant.pdfList of Accredited Concrete Batching Plant.pdf
List of Accredited Concrete Batching Plant.pdfisabel213075
 
ROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.ppt
ROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.pptROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.ppt
ROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.pptJohnWilliam111370
 
Input Output Management in Operating System
Input Output Management in Operating SystemInput Output Management in Operating System
Input Output Management in Operating SystemRashmi Bhat
 
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMSHigh Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMSsandhya757531
 
Paper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdf
Paper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdfPaper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdf
Paper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdfNainaShrivastava14
 
CS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdfCS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdfBalamuruganV28
 
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENTFUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENTSneha Padhiar
 
Cost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based questionCost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based questionSneha Padhiar
 
Energy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptxEnergy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptxsiddharthjain2303
 
Robotics Group 10 (Control Schemes) cse.pdf
Robotics Group 10  (Control Schemes) cse.pdfRobotics Group 10  (Control Schemes) cse.pdf
Robotics Group 10 (Control Schemes) cse.pdfsahilsajad201
 
Prach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism CommunityPrach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism Communityprachaibot
 
US Department of Education FAFSA Week of Action
US Department of Education FAFSA Week of ActionUS Department of Education FAFSA Week of Action
US Department of Education FAFSA Week of ActionMebane Rash
 
SOFTWARE ESTIMATION COCOMO AND FP CALCULATION
SOFTWARE ESTIMATION COCOMO AND FP CALCULATIONSOFTWARE ESTIMATION COCOMO AND FP CALCULATION
SOFTWARE ESTIMATION COCOMO AND FP CALCULATIONSneha Padhiar
 

Recently uploaded (20)

Novel 3D-Printed Soft Linear and Bending Actuators
Novel 3D-Printed Soft Linear and Bending ActuatorsNovel 3D-Printed Soft Linear and Bending Actuators
Novel 3D-Printed Soft Linear and Bending Actuators
 
Immutable Image-Based Operating Systems - EW2024.pdf
Immutable Image-Based Operating Systems - EW2024.pdfImmutable Image-Based Operating Systems - EW2024.pdf
Immutable Image-Based Operating Systems - EW2024.pdf
 
KCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosKCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitos
 
Designing pile caps according to ACI 318-19.pptx
Designing pile caps according to ACI 318-19.pptxDesigning pile caps according to ACI 318-19.pptx
Designing pile caps according to ACI 318-19.pptx
 
Main Memory Management in Operating System
Main Memory Management in Operating SystemMain Memory Management in Operating System
Main Memory Management in Operating System
 
Earthing details of Electrical Substation
Earthing details of Electrical SubstationEarthing details of Electrical Substation
Earthing details of Electrical Substation
 
Engineering Drawing section of solid
Engineering Drawing     section of solidEngineering Drawing     section of solid
Engineering Drawing section of solid
 
List of Accredited Concrete Batching Plant.pdf
List of Accredited Concrete Batching Plant.pdfList of Accredited Concrete Batching Plant.pdf
List of Accredited Concrete Batching Plant.pdf
 
ROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.ppt
ROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.pptROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.ppt
ROBOETHICS-CCS345 ETHICS AND ARTIFICIAL INTELLIGENCE.ppt
 
Input Output Management in Operating System
Input Output Management in Operating SystemInput Output Management in Operating System
Input Output Management in Operating System
 
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMSHigh Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
 
Paper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdf
Paper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdfPaper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdf
Paper Tube : Shigeru Ban projects and Case Study of Cardboard Cathedral .pdf
 
CS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdfCS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdf
 
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENTFUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
 
Cost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based questionCost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based question
 
Energy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptxEnergy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptx
 
Robotics Group 10 (Control Schemes) cse.pdf
Robotics Group 10  (Control Schemes) cse.pdfRobotics Group 10  (Control Schemes) cse.pdf
Robotics Group 10 (Control Schemes) cse.pdf
 
Prach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism CommunityPrach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism Community
 
US Department of Education FAFSA Week of Action
US Department of Education FAFSA Week of ActionUS Department of Education FAFSA Week of Action
US Department of Education FAFSA Week of Action
 
SOFTWARE ESTIMATION COCOMO AND FP CALCULATION
SOFTWARE ESTIMATION COCOMO AND FP CALCULATIONSOFTWARE ESTIMATION COCOMO AND FP CALCULATION
SOFTWARE ESTIMATION COCOMO AND FP CALCULATION
 

Driving behavior for ADAS and Autonomous Driving

  • 1. Driving Behavior for ADAS and Autonomous Driving Yu Huang Yu.huang07@gmail.com Sunnyvale, California
  • 2. Outline • Driver Behavior Modeling • Driver Behavior Recognizing • Driver Behavior Classification Model based on an Intelligent Driving Diagnosis System • A Learning-Based Autonomous Driver: Emulate Driver’s Intelligence in Car Following • E2E Learning of Driving Models from Large-scale Video Datasets • Open Framework for Human-like Autonomous Driving by Inverse Reinforcement Learning • Driver Action Prediction Using Bidirectional RNN • DeepTest: Auto Testing of DNN-driven Autonomous Cars • Deep Learning Based Analysis of Driver Behavior & Interaction with Automation • A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving • Personalization of ADAS and Autonomous Driving
  • 3. Driver Behavior Modeling • Estimating and predicting traffic situations over time is an essential capability for ADAS and autonomous driving; • The distribution of possible situation developments over several seconds can be anticipated based on a history of noisy measurements of the pose and velocity of road users; • Fine-grained predictions of future vehicle poses; • High-level predictions on the level of intended driving routes. • Predicting traffic situations is only tractable because the transitions of environment states are not completely random, but instead show a lot of regularities; • Cars, e.g., move according to kinematic and dynamic constraints. • Also, traffic is structured by regulations and infrastructure like lanes, signs and traffic lights. • Traffic participants do not act randomly, i.e. try to follow the traffic rules and avoid accidents. • All these aspects combined lead to patterns in the behavior of traffic participants that can be exploited by an intelligent system to anticipate how a situation will develop. • The key to this ability is to find suitable models that capture these patterns.
  • 4. Driver Behavior Modeling • Driver behavior modeling (DBM) predicts driving maneuvers, driver intent, vehicle and driver state, and environmental factors, to improve transportation safety and driving experience. • These models are then typically incorporated into ADAS in the vehicles.
  • 5. Driver Behavior Modeling • Lane changing models: • rule-based models • discrete-choice probabilistic models • Artificial intelligence models • incentive-based models
  • 8. Driver Behavior Modeling Lane changing decision models based on traffic characteristics Classification of available approaches in lane changing studies
  • 9. Driver Behavior Modeling • Driver Assistance Cloud can provide personalized driver services, applications, & safety.
  • 10. Road Network Modeling The road network information includes multi-hierarchical road network and topological structure, which is a “regional road network–road–road segment–carriageway–lane”. The correlation between objects of different hierarchies can be maintained by the query table.
  • 11. Road Network Modeling • The road network is divided into regional road network, road, road segment, carriageway, and lane; • The lanes that possess the same direction in the same road usually have the same traffic characteristics, such as speed limit, driving direction, vehicle type permission, and topology; • It can take lane as a basic modeling unit and polymerizes the lanes with the same direction and the same traffic characteristics into carriageways. • The lane polymerization method not only reduces the data volume of the same lanes, but also simplifies the topology and improves the efficiency of the network analysis. • This model describes the relationship between different levels of road network, road, road segment, and lane by tables. Lanes Polymerizing into Carriageway
  • 12. Road Network Modeling Road Intersection and Lane Polymerization
  • 13. Road Network Modeling Multi-hierarchical Road Network and Topology Relationship
  • 14. Vehicle Model Vehicle Classification Hierarchy Structure and Coding
• 15. Recognizing Driver Behavior
• Recognizing driver characteristics is by itself not a simple task, and the other requirements of active vehicle safety and comfort add to the complexity.
• Information about the driver’s driving skill can be used to adapt vehicle control parameters to the specific driver’s needs in terms of vehicle performance and active safety.
• According to the certainty or uncertainty of the driver model’s structure, identification methods can be categorized into 3 groups:
  • parametric identification,
  • nonparametric identification,
  • semi-parametric identification.
Figures: driver-vehicle steering control systems; the driver’s steering control law; the vehicle-driver model
• 16. Recognizing Driver Behavior
• Nonparametric models perform better than parametric models;
• Driver signals (e.g., gas pedal pressure, brake pedal pressure, steering angle) are more effective than environment and vehicle signals (velocity, acceleration, and engine speed);
• A parametric method may have less stringent input/output requirements, but it needs a set of candidate driver models and requires a known forcing function;
• A nonparametric (black-box) method only captures the relation between input and output and ignores the inner state variables.
Figure: flow diagram of driver model identification
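As a hedged illustration of the nonparametric (black-box) route, the sketch below fits a small feed-forward network that maps a short history of driver and vehicle signals to the next steering command. The signal set, window length, network size, and the synthetic log are illustrative assumptions, not taken from the slides.

```python
# Minimal sketch (assumptions: synthetic log with columns gas, brake, steer, speed;
# a 10-step history window; a small MLP as the black-box driver model).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
T = 2000
log = np.column_stack([
    rng.uniform(0, 1, T),           # gas pedal pressure (normalized)
    rng.uniform(0, 1, T),           # brake pedal pressure (normalized)
    np.sin(np.linspace(0, 20, T)),  # steering angle (rad), toy signal
    rng.uniform(0, 30, T),          # speed (m/s)
])

WIN = 10  # history window length (illustrative)
X = np.array([log[t - WIN:t].ravel() for t in range(WIN, T - 1)])
y = log[WIN + 1:, 2]                # next-step steering angle as the target

model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X[:1500], y[:1500])
print("held-out R^2:", model.score(X[1500:], y[1500:]))
```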
• 17. Recognizing Driver Behavior
• Human driving skill and characteristics need to be embedded into the vehicle dynamic systems to improve the vehicle’s drivability, maneuverability, and fuel economy.
• Characterizing driver behavior and skill exactly is crucial for simulating driver behavior and optimizing driver-vehicle-environment systems.
• Driver behavior is dynamic, stochastic, and nonlinear, and it follows certain distributions.
• The mapping from sensory input to the driver’s action output can be strongly nonlinear; hence, traditional control methods like PID control are unable to reproduce it.
• A wide range of stochastic, nonlinear, and fuzzy methods (HMMs, hierarchical HMMs, AR-HMMs, nonlinear regression models, NNs, and fuzzy systems) have been used to recognize and predict driver behavior.
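A hedged sketch of the HMM route mentioned above: one GaussianHMM per maneuver class is trained on feature sequences, and a new sequence is labeled by the model with the highest log-likelihood. The feature choice, number of hidden states, and the hmmlearn dependency are assumptions for illustration only.

```python
# Minimal sketch (assumes the hmmlearn package; features are [speed, steering angle]).
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_maneuver_hmm(sequences, n_states=3):
    """Fit one HMM on a list of (T_i, 2) feature sequences for a single maneuver."""
    X = np.concatenate(sequences)
    lengths = [len(s) for s in sequences]
    hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    hmm.fit(X, lengths)
    return hmm

def recognize(models, sequence):
    """Return the maneuver label whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(sequence))

# Toy usage with synthetic sequences for two maneuvers.
rng = np.random.default_rng(0)
lane_keep = [rng.normal([20.0, 0.0], [1.0, 0.02], size=(50, 2)) for _ in range(20)]
lane_change = [rng.normal([20.0, 0.2], [1.0, 0.10], size=(50, 2)) for _ in range(20)]
models = {"lane_keep": train_maneuver_hmm(lane_keep),
          "lane_change": train_maneuver_hmm(lane_change)}
print(recognize(models, lane_change[0]))
```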
• 18. Driver Behavior Classification Model based on an Intelligent Driving Diagnosis System
• Design of a driver behavior classifier based on an intelligent driving diagnosis system, with signals acquired by a GPS data-logging system: position, velocity, accelerations, and steering angle.
• The classifier is an intelligent driver behavior model based on NNs, using as inputs statistical transformations of the driving-diagnosis time signals: steering profiles, pedal use, speeding, and lane/road departures.
• Applications: driver identification for security systems, and classification of a driver into one of two categories, aggressive or moderate.
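As a hedged sketch of such a classifier, the code below trains a small neural network on hand-crafted statistical features of a trip (means, standard deviations, event counts) to label it as aggressive or moderate. The features and labels are synthetic placeholders, not the system’s actual inputs.

```python
# Minimal sketch (assumes scikit-learn; features are illustrative trip statistics:
# mean/std of longitudinal acceleration, std of steering angle, speeding ratio,
# lane-departure count).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n = 400
# Synthetic "moderate" (label 0) and "aggressive" (label 1) trips.
moderate = np.column_stack([rng.normal(0.3, 0.1, n), rng.normal(0.5, 0.1, n),
                            rng.normal(0.05, 0.02, n), rng.uniform(0, 0.1, n),
                            rng.poisson(0.5, n)])
aggressive = np.column_stack([rng.normal(0.8, 0.2, n), rng.normal(1.2, 0.3, n),
                              rng.normal(0.15, 0.05, n), rng.uniform(0.2, 0.6, n),
                              rng.poisson(3.0, n)])
X = np.vstack([moderate, aggressive])
y = np.array([0] * n + [1] * n)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```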
  • 19. Driver Behavior Classification Model based on an Intelligent Driving Diagnosis System Risky Areas Identification Graphic Simulated Environment
• 20. A Learning-Based Autonomous Driver: Emulate Driver’s Intelligence in Low Speed Car Following
• A good autonomous driving algorithm should: use the human driver’s prior knowledge to better understand traffic scenarios, perform robustly in most traffic scenarios, and be able to learn from human drivers to improve performance.
• 1st: the control model is built on a human’s prior knowledge and can be verified in simulation to ensure the driving functionalities.
• 2nd: a learning algorithm is implemented to optimize the prior-knowledge-based control model. This allows the performance of the autonomous pilot to improve through learning while avoiding the problems of purely training-based control models.
• 3rd: the autonomous driver can better interact with human traffic by emulating human driving behavior, while retaining the potential to exceed human driver performance in some scenarios.
• Autonomous driving is implemented with a Prediction- and Cost-function-Based algorithm (PCB);
• PCB emulates a human driver’s decision process, modeled as traffic-scenario prediction and evaluation.
• A learning method optimizes PCB with very limited training data to predict and evaluate traffic scenarios.
• 21. A Learning-Based Autonomous Driver: Emulate Driver’s Intelligence in Low Speed Car Following
• The autonomous driver model was designed to model a human driver’s decision process, including prediction and evaluation of traffic scenarios.
• Using prediction reduces the difficulty of building long-term heuristic cost functions.
• The evaluation is based on human-understandable cost functions, which are easy to extend (see the sketch below).
Figures: freeway driving modules; the Prediction- and Cost-function-Based algorithm
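The following sketch illustrates the prediction-and-evaluation idea for low-speed car following: candidate accelerations are rolled out over a short horizon with a constant-velocity prediction of the lead vehicle, scored with simple human-understandable cost terms (gap keeping, progress, comfort), and the lowest-cost action is chosen. The horizon, cost weights, and constant-velocity assumption are illustrative choices, not the PCB paper’s exact formulation.

```python
# Minimal sketch of a prediction- and cost-function-based car-following step.
import numpy as np

DT, HORIZON = 0.2, 3.0                    # time step (s) and prediction horizon (s) -- assumptions
CANDIDATES = np.linspace(-3.0, 2.0, 11)   # candidate accelerations (m/s^2)

def rollout_cost(gap, v_ego, v_lead, a, w_gap=1.0, w_prog=0.3, w_comf=0.1):
    """Predict the scenario under acceleration a and return an additive cost."""
    cost = w_comf * a ** 2
    for _ in range(int(HORIZON / DT)):
        v_ego = max(0.0, v_ego + a * DT)
        gap += (v_lead - v_ego) * DT       # lead vehicle assumed at constant speed
        desired_gap = 2.0 + 1.5 * v_ego    # standstill distance + time-gap term
        cost += w_gap * max(0.0, desired_gap - gap) ** 2   # safety / gap keeping
        cost += w_prog * max(0.0, v_lead - v_ego)          # progress (don't lag behind)
    return cost

def choose_acceleration(gap, v_ego, v_lead):
    return min(CANDIDATES, key=lambda a: rollout_cost(gap, v_ego, v_lead, a))

print(choose_acceleration(gap=8.0, v_ego=5.0, v_lead=4.0))
```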
• 22. E2E Learning of Driving Models from Large-scale Video Datasets
• Learning a generic vehicle motion model from large-scale crowd-sourced video data, with an e2e trainable architecture that predicts a distribution over future vehicle egomotion from instantaneous monocular camera observations and the previous vehicle state.
• It uses an FCN-LSTM architecture, learned from large-scale crowd-sourced vehicle action data, and leverages scene segmentation side tasks to improve performance. Autonomous driving is formulated as a future egomotion prediction problem.
• 23. E2E Learning of Driving Models from Large-scale Video Datasets
• Comparison among architectures that can fuse time-series information with visual inputs.
• The method learns a driving policy from demonstrated behaviors and formulates the problem as predicting future feasible actions.
• The driving model is defined over the admissibility of the next motion, i.e., which next motion is plausible given the currently observed world configuration.
• The generic models take as input raw pixels and current and prior vehicle state signals, and predict the likelihood of future motion over a range of action or motion granularities, considering both discrete and continuous settings.
• 24. E2E Learning of Driving Models from Large-scale Video Datasets
• Comparison of learning approaches: Mediated Perception relies on semantic-class labels at the pixel level alone to drive motion prediction; the Motion Reflex method learns a representation based on raw pixels; Privileged Training learns from raw pixels but allows side-training on semantic segmentation tasks.
• The model can jointly train motion prediction and pixel-level supervised tasks, using semantic segmentation as a side task following the “privileged” information learning paradigm. This leads to better performance in the experiments (a rough architecture sketch follows).
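Below is a hedged PyTorch sketch of an FCN-LSTM-style driving model: a small convolutional encoder produces per-frame features, which are concatenated with the previous vehicle state and fed to an LSTM that outputs logits over discrete future motions, plus an optional segmentation head as the privileged side task. Layer sizes, the number of action bins, and the segmentation class count are assumptions; this is not the paper’s exact network.

```python
# Minimal sketch (assumes PyTorch; 4 discrete motion bins and 19 segmentation classes
# are illustrative choices).
import torch
import torch.nn as nn

class FCNLSTMDriver(nn.Module):
    def __init__(self, n_actions=4, n_seg_classes=19, state_dim=2, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(            # tiny fully-convolutional encoder
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
        self.seg_head = nn.Conv2d(64, n_seg_classes, 1)   # privileged side task
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.lstm = nn.LSTM(64 + state_dim, hidden, batch_first=True)
        self.action_head = nn.Linear(hidden, n_actions)   # logits over next motions

    def forward(self, frames, states):
        # frames: (B, T, 3, H, W); states: (B, T, state_dim), e.g. previous speed, yaw rate
        B, T = frames.shape[:2]
        feat_maps = self.encoder(frames.flatten(0, 1))    # (B*T, 64, h, w)
        seg_logits = self.seg_head(feat_maps)             # per-frame segmentation logits
        feats = self.pool(feat_maps).flatten(1).view(B, T, -1)
        out, _ = self.lstm(torch.cat([feats, states], dim=-1))
        return self.action_head(out[:, -1]), seg_logits   # next-motion logits, side task

model = FCNLSTMDriver()
logits, seg = model(torch.randn(2, 5, 3, 64, 64), torch.randn(2, 5, 2))
print(logits.shape, seg.shape)
```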
• 25. An Open Framework for Human-like Autonomous Driving Using Inverse Reinforcement Learning
• Platform:
  • Lane detection, or localization and mapping.
  • Detection and tracking of mobile objects (DATMO).
  • Proprioceptive info: the car’s odometry and engine state (e.g., revolutions per minute, current gear).
  • Control: for autonomous driving, the vehicle should expose interfaces to steering angle, acceleration, brake, and gear changing.
• Behavior learning and planning: BEhavior Learning LibrarY (Belly)
  • Feature computation: (a) lateral displacement with respect to the track center; (b) absolute speed; (c) relative speed with respect to traffic limitations; (d) collision distance to an obstacle.
  • Cost-function computation;
  • Motion trajectory planning algorithms;
  • Learning algorithms themselves.
• Others: motion prediction, path tracking (velocity and pose controller).
• 26. An Open Framework for Human-like Autonomous Driving Using Inverse Reinforcement Learning
• A driving simulator (TORCS) is leveraged by developing a ROS communication bridge for it.
• On top of that, an experimental framework is built for the development and evaluation of human-like autonomous driving based on Inverse Reinforcement Learning (IRL); a cost-function sketch follows.
• Framework overview: (orange) platforms; (brown) IRL and planning libraries; (blue) additional ROS modules.
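As a hedged illustration of the IRL idea, the sketch below defines the trajectory cost as a weighted sum of the four features listed on slide 25 and updates the weights with a simple feature-matching gradient (planner feature expectations minus expert feature expectations), in the spirit of maximum-entropy/projection-style IRL. The learning rate, feature normalization, and update rule are illustrative assumptions, not the Belly library’s implementation.

```python
# Minimal sketch of a linear IRL cost and a feature-matching weight update.
import numpy as np

def trajectory_features(traj):
    """traj: array of rows [lateral_offset, speed, rel_speed_to_limit, collision_dist]."""
    lat, v, dv, dcol = traj.T
    return np.array([
        np.mean(np.abs(lat)),                   # lateral displacement from track center
        np.mean(v),                             # absolute speed
        np.mean(np.abs(dv)),                    # deviation from the traffic speed limit
        np.mean(1.0 / np.maximum(dcol, 1.0)),   # inverse collision distance (risk)
    ])

def cost(traj, w):
    return float(w @ trajectory_features(traj))

def irl_update(w, expert_trajs, planned_trajs, lr=0.1):
    """Move weights so planned trajectories match expert feature expectations."""
    f_expert = np.mean([trajectory_features(t) for t in expert_trajs], axis=0)
    f_planner = np.mean([trajectory_features(t) for t in planned_trajs], axis=0)
    return w + lr * (f_planner - f_expert)   # raise cost on over-expressed features

w = np.ones(4)
expert = [np.random.default_rng(0).normal([0.1, 20, 1, 30], 0.5, size=(100, 4))]
planned = [np.random.default_rng(1).normal([0.8, 25, 5, 10], 0.5, size=(100, 4))]
w = irl_update(w, expert, planned)
print("updated weights:", w)
```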
• 27. Driver Action Prediction Using Bidirectional RNN
• Predicting driver actions early and accurately can help mitigate the effects of potentially unsafe driving behaviors and avoid possible accidents.
• Driver action prediction is formulated as a time-series anomaly prediction problem.
• While detecting the anomaly (the driver actions of interest) may be trivial in this context, finding patterns that consistently precede an anomaly requires searching for or extracting features across multi-modal sensory inputs.
• The driver action prediction system is a real-time data acquisition, processing, and learning framework for predicting future or impending driver actions.
• It incorporates camera-based knowledge of the driving environment and of the driver, in addition to traditional vehicle dynamics.
• It uses a deep bidirectional RNN to learn the correlation between sensory inputs and impending driver behavior, achieving accurate prediction with a long horizon.
• It predicts key driver actions, including acceleration, braking, lane change, and turning, about 5 seconds before the action is executed by the driver.
  • 28. Driver Action Prediction Using Bidirectional RNN Left: The DAP (Driver Action Prediction) system. Right: The Prediction Network based on Deep Bidirectional RNN
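A hedged PyTorch sketch of the prediction-network idea: a bidirectional LSTM reads a window of multi-modal sensor features and outputs logits over impending actions (e.g., accelerate, brake, lane change, turn, no action). The feature dimensionality, window length, and layer sizes are assumptions for illustration, not the paper’s exact network.

```python
# Minimal sketch (assumes PyTorch; 32-dim fused sensor features per time step and
# 5 action classes including "no action" are illustrative choices).
import torch
import torch.nn as nn

class DriverActionPredictor(nn.Module):
    def __init__(self, feat_dim=32, hidden=64, n_actions=5):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim, hidden, num_layers=2,
                           batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_actions)   # 2x for the two directions

    def forward(self, x):
        # x: (batch, time, feat_dim) -- a sliding window of fused sensor features
        out, _ = self.rnn(x)
        return self.head(out[:, -1])    # logits for the impending action

model = DriverActionPredictor()
window = torch.randn(8, 50, 32)         # e.g. 50 steps (~5 s at 10 Hz) of features
print(model(window).shape)              # -> torch.Size([8, 5])
```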
• 29. DeepTest: Auto Testing of DNN-driven Autonomous Cars
• Most existing testing techniques for DNN-driven vehicles depend heavily on the manual collection of test data under different driving conditions, which becomes prohibitively expensive as the number of test conditions increases.
• DeepTest is a systematic testing tool for automatically detecting erroneous behaviors of DNN-driven vehicles that can potentially lead to fatal crashes.
• The tool automatically generates test cases leveraging real-world changes in driving conditions such as rain, fog, and lighting.
• DeepTest systematically explores different parts of the DNN logic by generating test inputs that maximize the number of activated neurons (a coverage sketch follows).
• DeepTest found thousands of erroneous behaviors under different realistic driving conditions (e.g., blurring, rain, fog), many of which lead to potentially fatal crashes, in 3 top-performing DNNs from the Udacity self-driving car challenge.
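A hedged sketch of the neuron-coverage idea behind “maximize the number of activated neurons”: run inputs through the network, mark each hidden unit covered once its scaled activation exceeds a threshold, and report the covered fraction. The threshold, the layers inspected, and the tiny PyTorch model are illustrative assumptions, not DeepTest’s exact implementation.

```python
# Minimal sketch of a neuron-coverage metric (assumes PyTorch; threshold 0.2 is illustrative).
import torch
import torch.nn as nn

def neuron_coverage(model, inputs, threshold=0.2):
    """Fraction of hidden units whose min-max-scaled activation exceeds threshold."""
    activations = {}

    def hook(name):
        def fn(_module, _inp, out):
            activations[name] = out.detach()
        return fn

    handles = [m.register_forward_hook(hook(n))
               for n, m in model.named_modules() if isinstance(m, nn.ReLU)]
    model(inputs)
    for h in handles:
        h.remove()

    covered = total = 0
    for act in activations.values():
        a = act.flatten(1)                                  # (batch, units)
        scaled = (a - a.min()) / (a.max() - a.min() + 1e-8)
        covered += int((scaled > threshold).any(dim=0).sum())
        total += a.shape[1]
    return covered / max(total, 1)

net = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))
print(neuron_coverage(net, torch.randn(64, 10)))
```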
• 30. DeepTest: Auto Testing of DNN-driven Autonomous Cars
• At a conceptual level, erroneous corner-case behaviors in DNN-based software are analogous to logic bugs in traditional software.
• Similar to the bug detection and patching cycle in software development, erroneous behaviors of DNNs, once detected, can be fixed by adding the error-inducing inputs to the training data set and possibly by changing the model structure or parameters.
• It is very hard to build robust safety-critical systems using only manual test cases.
• Building automated and systematic testing tools for DNN-based software is a novel and important software engineering problem:
  • How do we systematically explore the input-output spaces of an autonomous car DNN?
  • How can we synthesize realistic inputs to automate such exploration?
  • How can we optimize the exploration process?
  • How do we automatically create a test oracle that can detect erroneous behaviors without detailed manual specifications?
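The sketch below hedges an answer to the second and fourth questions above: it applies simple photometric transformations (brightness change, blur) to a camera frame as synthetic driving-condition variants, and uses a metamorphic relation as the oracle, flagging a test if the predicted steering angle on the transformed frame deviates from the original prediction by more than a tolerance. The transformations, the tolerance, and the `predict_steering` stub are illustrative assumptions.

```python
# Minimal sketch (assumes OpenCV and NumPy; predict_steering is a placeholder for the
# model under test).
import cv2
import numpy as np

def predict_steering(frame):
    # Placeholder for the DNN under test; returns a steering angle in radians.
    return float(np.tanh(frame.mean() / 255.0 - 0.5))

def transforms(frame):
    """Yield (name, transformed_frame) pairs simulating changed driving conditions."""
    yield "brighter", cv2.convertScaleAbs(frame, alpha=1.0, beta=60)   # lighting change
    yield "darker", cv2.convertScaleAbs(frame, alpha=1.0, beta=-60)
    yield "blurred", cv2.GaussianBlur(frame, (7, 7), 0)                # fog/blur-like effect

def metamorphic_test(frame, tolerance_rad=0.05):
    """Flag transformed inputs whose prediction deviates too much from the original."""
    baseline = predict_steering(frame)
    failures = []
    for name, t_frame in transforms(frame):
        if abs(predict_steering(t_frame) - baseline) > tolerance_rad:
            failures.append(name)
    return failures

frame = np.random.randint(0, 256, (66, 200, 3), dtype=np.uint8)  # toy camera frame
print(metamorphic_test(frame))
```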
  • 31. DeepTest: Auto Testing of DNN-driven Autonomous Cars A simple autonomous car DNN that takes inputs from camera, light detection and ranging sensor (LiDAR), and IR (infrared) sensor, and outputs steering angle, braking decision, and acceleration decision.
• 32. DeepTest: Auto Testing of DNN-driven Autonomous Cars
(Upper) A simplified CNN architecture with a convolution kernel shown on the top-left part of the input image. The same filter (shared weights) is moved across the entire input space, and dot products are computed between the filter weights and the outputs of the connected neurons. (Lower) A simplified RNN architecture with loops in its hidden layers. The unrolled version on the right shows how the loop allows a sequence of inputs (i.e., images) to be fed to the RNN, with the steering angle predicted from all of those images.
• 33. Large-Scale Deep Learning Based Analysis of Driver Behavior and Interaction with Automation
• MIT Autonomous Vehicle Technology (MIT-AVT) study:
  • To undertake large-scale real-world driving data collection that includes high-definition video to fuel the development of deep learning based internal and external perception systems;
  • To gain a holistic understanding of how human beings interact with vehicle automation technology by integrating video data with vehicle state data, driver characteristics, mental models, and self-reported experiences with technology;
  • To identify how technology and other factors related to automation adoption and use can be improved in ways that save lives.
• 21 Tesla Model S and Model X vehicles, 2 Volvo S90 vehicles, and 2 Range Rover Evoque vehicles are used for both long-term (over a year per driver) and medium-term (one month per driver) naturalistic driving data collection.
• The recorded data streams include IMU, GPS, CAN messages, and high-definition video streams of the driver’s face, the driver cabin, the forward roadway, and the instrument cluster (on select vehicles).
• So far, there are 78 participants, 7,146 days of participation, 275,589 miles, and 3.5 billion video frames.
  • 34. Large-Scale Deep Learning Based Analysis of Driver Behavior and Interaction with Automation
• 35. Large-Scale Deep Learning Based Analysis of Driver Behavior and Interaction with Automation
• Deep learning can be defined in two ways:
  • A branch of ML that uses NNs with many layers.
  • A branch of ML that seeks to form hierarchies of data representation with minimum input from a human being on the actual composition of the hierarchy.
• The key characteristic of deep learning is the ability of automated representation learning to use large-scale data to generalize robustly over the real-world edge cases that arise in any in-the-wild application of machine learning: occlusion, lighting, perspective, scale, inter-class variation, intra-class variation, etc.
• Leveraging the release of large-scale annotated driving datasets, automotive deep learning research aims to address detection, estimation, prediction, labeling, generation, control, and planning tasks.
• Tasks: fine-grained face recognition, body pose estimation, semantic scene perception, driving state prediction.
• ADAS features used: Adaptive Cruise Control (ACC), Pilot Assist (in the Volvo), Forward Alert Warning / City Safety (in the Volvo), Automatic Emergency Braking, Lane Departure Warning (LDW), Lane Keep Assist (LKA), Blind Spot Monitor.
• 36. Large-Scale Deep Learning Based Analysis of Driver Behavior and Interaction with Automation
• RIDER (Real-time Intelligent Driving Environment Recording system) components: a real-time clock, GPS, IMU, remote cellular connectivity, and the ability to record up to 6 cameras at 720p resolution.
• 37. Large-Scale Deep Learning Based Analysis of Driver Behavior and Interaction with Automation
The MIT-AVT data pipeline, showing the process of offloading, cleaning, synchronizing, and extracting knowledge from data. On the left is the dependency-constrained, asynchronous, distributed computing framework. In the middle is the sequence of high-level procedures that perform several levels of knowledge extraction. On the right are broad categories of data produced by the pipeline, organized by size.
• 38. A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving
• Prediction of surrounding vehicles’ driving behaviors plays a crucial role in autonomous vehicles.
• Most traditional driving behavior prediction models work only for a specific traffic scenario and cannot be adapted to different scenarios.
• In addition, a priori driving knowledge has not been considered sufficiently.
• A scenario-adaptive approach is proposed to solve these problems for Autonomous Vehicles (AVs):
  • Continuous features of driving behavior are learned by Hidden Markov Models (HMMs).
  • A knowledge base specifies the model adaptation strategies and stores a priori probabilities based on the scenario’s characteristics.
  • The target vehicle’s future behavior is predicted considering both a posteriori and a priori probabilities.
• The application scope of traditional models is thus extended to a variety of scenarios, while the prediction performance is improved by considering a priori knowledge.
• 39. A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving
• The planning layer consists of 3 modules: route planning, decision making, and path planning.
  • The route and path planning modules generate proper global routes and safe local trajectories.
  • The behavior decision-making module provides safe and reasonable abstract driving actions.
• A state-of-the-art decision-making system should have a forward-looking ability: the surrounding vehicles’ future driving behavior should be predicted accurately.
• AVs should monitor variation in the target vehicle’s movement and understand the current scenario from the view of the target vehicle; the future behavior is predicted by prediction models.
• Types of information: vehicle kinematics, the relationships between the target vehicle and surrounding entities (other vehicles, lane lines), and a priori knowledge (traffic rules and common sense of driving).
• Vehicle kinematics and relations with road entities are considered by almost all existing studies; these features can be applied directly by Time-to-X models to predict the behavior of the vehicle.
• 40. A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving
• Interactions between the target vehicle and its surrounding vehicles also have a significant influence on the target vehicle’s future behavior.
• A priori knowledge is helpful for driving behavior prediction; the a priori probability is provided by a rule-based reasoning system (a combination sketch follows).
Figure: overview of the driving behavior prediction approach
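A hedged sketch of combining the two probability sources: each candidate behavior gets an HMM log-likelihood (the observation-based evidence) and a scenario-dependent prior from the rule-based knowledge base, and the two are fused with Bayes’ rule. The behavior set, the prior values, and the stubbed log-likelihoods are illustrative assumptions, not the paper’s actual numbers.

```python
# Minimal sketch of fusing a priori (rule-based) and observation-based probabilities.
import numpy as np

BEHAVIORS = ["lane_keep", "lane_change_left", "lane_change_right", "turn"]

def scenario_prior(scenario):
    """Illustrative rule-based priors from a knowledge base keyed by scenario type."""
    priors = {
        "straight_road": [0.70, 0.13, 0.13, 0.04],
        "intersection": [0.40, 0.10, 0.10, 0.40],
    }
    return np.array(priors[scenario])

def predict_behavior(log_likelihoods, scenario):
    """Combine HMM log-likelihoods with the scenario prior via Bayes' rule."""
    log_post = np.array(log_likelihoods) + np.log(scenario_prior(scenario))
    post = np.exp(log_post - log_post.max())   # normalize in a numerically stable way
    post /= post.sum()
    return dict(zip(BEHAVIORS, post))

# Toy usage: log-likelihoods as would come from one HMM per behavior.
print(predict_behavior([-120.0, -118.5, -125.0, -130.0], "intersection"))
```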
  • 41. A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving Layout of the ontology model to describe the traffic scenario
  • 42. A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving Schematic figure of road entity decomposition
  • 43. A Scenario-Adaptive Driving Behavior Prediction Approach to Urban Autonomous Driving A flow diagram of on-line driving behavior prediction
• 44. Personalization of ADAS and Autonomous Driving
• Personalization can be categorized into explicit and implicit personalization.
  • Explicit personalization requires users to state their preferences and to explicitly change the system by choosing the particular system setting that suits them best.
  • Implicit personalization observes the user’s behavior and derives a user model from these data to predict user preferences or behavior.
• Personalization of ADAS aims to make ADAS interventions more efficient and to improve the driving experience and usability of ADAS by adapting to the individual preferences of the driver.
• Personalization of autonomous driving: to be comfortable for different drivers, the driving style of an AV should be adapted to the individual driver’s preferences.
• It is data-driven: a model of the driver is learned from driving data, and this model is either used to directly control the vehicle or to parameterize a controller.
• 45. Personalization of ADAS and Autonomous Driving
• The personalization process:
  • Observe the driving behavior: the assumption is that the driver is most comfortable with a driving style that is similar to their own.
  • Build a model of human driving behavior: a driver model is learned from the data of an individual driver and either used directly as part of the controller, or realized as a high-level controller that models the driving behavior and whose parameters are adapted to the specific driver during personalization, paired with a low-level controller that actuates the vehicle according to the input from the high-level controller.
  • Validate the model:
    • Off-line playback: recorded driving data are fed into the personalized controller to verify that it correctly reproduces the observed driving behavior.
    • Simulation in a traffic simulator: the personalized controller is tested in controlled traffic situations and often compared with a standard controller.
    • Field test: the personalized controller is implemented in a vehicle and tested in real traffic.
• 46. Personalization of ADAS and Autonomous Driving
• Personalization of ACC (adaptive cruise control) is group-based or individual-based:
  • Group-based: drivers are assigned to one of a small number of representative driving styles, for each of which an ACC control strategy is implemented;
  • Individual-based: the ACC control strategy tries to best reproduce the driving style of an individual driver, with the system switching between a learning mode and a running mode (a sketch follows).
• The goal of personalized forward collision warning is to decrease the false alarm rate of the system and to increase the warning time, giving the driver a longer reaction time.
• The aim of personalized lane keeping is to detect lane departures early and to minimize the false alarm rate of the system.
• The gap acceptance, the longitudinal adjustments to find an acceptable gap, and the way the lane change maneuver itself is performed characterize the individual driving style; all three aspects are modeled for lane change personalization.
• For autonomous driving personalization, reinforcement learning is used to find a policy that imitates the expert.
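To make the individual-based ACC idea concrete, the hedged sketch below estimates a driver’s preferred time gap and standstill distance from their own car-following data (the learning mode) and then uses those parameters in a simple gap-and-speed feedback law (the running mode). The linear regression of gap on speed and the gain values are illustrative assumptions, not any production ACC design.

```python
# Minimal sketch of individual-based ACC personalization (learning + running mode).
import numpy as np

def learn_gap_preference(speeds, gaps):
    """Fit gap ~= d0 + tau * speed to the driver's own car-following data."""
    A = np.column_stack([np.ones_like(speeds), speeds])
    d0, tau = np.linalg.lstsq(A, gaps, rcond=None)[0]
    return max(d0, 0.0), max(tau, 0.5)   # clamp to sane values (assumption)

def acc_command(gap, v_ego, v_lead, d0, tau, k_gap=0.3, k_vel=0.5, a_max=2.0):
    """Running mode: track the personalized desired gap with a simple feedback law."""
    desired_gap = d0 + tau * v_ego
    accel = k_gap * (gap - desired_gap) + k_vel * (v_lead - v_ego)
    return float(np.clip(accel, -a_max, a_max))

# Learning mode on logged data (synthetic stand-in for an individual driver).
rng = np.random.default_rng(0)
speeds = rng.uniform(5, 30, 500)
gaps = 2.0 + 1.4 * speeds + rng.normal(0, 1.0, 500)   # this driver keeps ~1.4 s gaps
d0, tau = learn_gap_preference(speeds, gaps)

# Running mode with the personalized parameters.
print(round(tau, 2), acc_command(gap=25.0, v_ego=20.0, v_lead=19.0, d0=d0, tau=tau))
```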