Autonomous vehicles
Technologies & Implications
December, 2017
Why? Safety
• 5.5 million crashes*
• 1.5 million injuries*
• 35 thousand fatalities*
• 93% of fatalities caused by human error*
* US only, data reported per year
Google's Self-Driving Car Project: demo
AUTONOMY LEVELS
Levels of automation, SAE International J3016 Standard:
0: NO AUTOMATION
1: DRIVER ASSISTANCE
2: PARTIAL AUTOMATION
3: CONDITIONAL AUTOMATION
4: HIGH AUTOMATION
5: FULL AUTOMATION
LET’S REVIEW
LEVEL 5: GOAL
FULLY AUTONOMOUS DRIVING
• Any traffic or weather condition
• Any place, at any time
• No human required
LEVEL 4: SPECIFIC OPERATIONAL DOMAINS
FULLY AUTONOMOUS DRIVING, BUT:
• Limited areas of operation
• Limited speeds
• Limited times of the day
MOBILITY AS A SERVICE
• New business model (like Uber, but without a driver)
• Shared mobility: ride-sharing, car-sharing
TIMELINE: ~10 years
LEVEL 3: HANDOFF CONTROL BACK
HANDOFF
• Autonomous mode that may hand control back to the driver
• The system should always detect such conditions
• The system must give sufficient warning
SAFETY ISSUES
Research shows: the longer the driver is disengaged, the longer it takes to regain control.
At 100 km/h a car travels ~30 m every second.
To give a 15-second warning, the system should see ~500 m (5 football fields) ahead!
TIMELINE: deployed on the new Audi A8
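The sight-distance arithmetic on this slide can be checked with a short helper (the function name is mine; the 100 km/h and 15-second figures come from the slide):

```python
def sight_distance_m(speed_kmh: float, warning_s: float) -> float:
    """Distance in meters a car covers during the warning window."""
    speed_ms = speed_kmh * 1000 / 3600  # convert km/h to m/s
    return speed_ms * warning_s

# At 100 km/h the car covers ~27.8 m every second,
# so a 15-second warning requires roughly 400-500 m of sensing range.
print(round(sight_distance_m(100, 1), 1))   # ~27.8
print(round(sight_distance_m(100, 15)))     # ~417
```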
LEVEL 2: HANDOFF AT ANY TIME
HANDOFF
• Only a second or two of warning
• The human must be ready for every such moment!
• The human driver should always supervise the system!
SAFETY ISSUES
The system can fail to see danger ahead, and drivers under-trust or over-trust it.
Example: braking when the adaptive cruise control sensor doesn’t detect debris on the road.
TIMELINE: exists today in Tesla vehicles and others
LEVEL 1: HANDS-ON ASSISTANTS
ADAS / Active Systems
• Adaptive Cruise Control
• Lane Keeping Assistance / Lane Departure Alert
• Pre-collision systems (with pedestrian detection) / Automatic Emergency Braking
• Traffic Jam Assist
• Traffic Sign Recognition
• Intelligent High-beam Control
TIMELINE: exists today as Mobileye devices in many cars, and others
PART 2: AUTONOMOUS TECHNOLOGY OVERVIEW
APPROACH
• Car Platform / Sensors
• Localization & Mapping
• Sensing / Perception
• Route Planning & Movement Control
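The four-stage approach above can be sketched as a single processing loop; every function below is a placeholder stub I invented for illustration, not a real implementation:

```python
# Minimal sketch of the four-stage loop named on this slide.
# All functions and data shapes are hypothetical stubs.

def localize(frame):            # Localization & Mapping
    return {"x": 0.0, "y": 0.0, "heading": 0.0}

def perceive(frame, pose):      # Sensing / Perception
    return {"obstacles": [], "drivable": True}

def plan(world, pose):          # Route Planning
    return ["keep_lane"] if world["drivable"] else ["stop"]

def control(route):             # Movement Control
    return {"steer": 0.0, "throttle": 0.1 if route == ["keep_lane"] else 0.0}

frame = {"camera": None, "lidar": None}   # Car Platform / Sensors
pose = localize(frame)
command = control(plan(perceive(frame, pose), pose))
print(command)  # {'steer': 0.0, 'throttle': 0.1}
```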
Embedded system: Sensors
1. Radars (long/short range)
2. LIDARs (3D point cloud)
3. Ultrasonic (parking)
4. Cameras (narrow & wide angle, omnidirectional)
5. GPS, IMU (inertial measurement unit)
6. Odometry encoders (wheel encoders, …)
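As an illustration of what the wheel encoders in item 6 provide, encoder ticks convert directly to distance traveled; the tick resolution and wheel diameter below are made-up example values:

```python
import math

def odometry_distance_m(ticks: int, ticks_per_rev: int,
                        wheel_diameter_m: float) -> float:
    """Distance traveled implied by a wheel-encoder tick count."""
    revolutions = ticks / ticks_per_rev
    return revolutions * math.pi * wheel_diameter_m

# Hypothetical encoder: 1000 ticks per revolution on a 0.65 m wheel.
d = odometry_distance_m(ticks=5000, ticks_per_rev=1000, wheel_diameter_m=0.65)
print(round(d, 2))  # ~10.21 m over 5 wheel revolutions
```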
Mapping & Localization: Map variations
1. NO MAPS
• Without maps there is no need to localize
2. NAVIGATION MAPS
• Localization at GPS accuracy, ~10 m (5–30 m)
• What happens when you lose the GPS signal (as in big cities)?
3. HIGH-DEFINITION MAPS
• Detailed maps with many landmarks
• Providers like TomTom, HERE
• Localization should be at ~10 cm accuracy
4. 3D MAPS, DENSE POINT CLOUD
• Very detailed 3D point cloud maps
• Google is the main contributor
• Localization should also be at ~10 cm accuracy to localize properly
Mapping & Localization: Data collection
MAP UPDATES
• Should be near real-time.
• Imagine: traffic signs, or even road directions, have changed and no longer match what the car senses!
• Updates should be crowdsourced across car manufacturers and map providers.
• Landmark types: traffic signs, traffic lights, road markings, road structure, reflectors, …
AMOUNT OF DATA COLLECTED
• Data collected per vehicle should be small, which makes HD maps preferable here.
• After sensor post-processing, landmarks are just a small subset of the data.
• Size of data transfer: ~10 KB to 1 MB per kilometer.
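The ~10 KB to 1 MB per kilometer figure translates into a per-vehicle upload budget; the 100 km of daily driving below is an assumed value, not from the slide:

```python
def daily_upload_mb(km_per_day: float, kb_per_km: float) -> float:
    """Map-update upload volume per vehicle per day, in MB."""
    return km_per_day * kb_per_km / 1024

# Using the slide's range of ~10 KB to ~1 MB per kilometer,
# for a hypothetical 100 km of daily driving:
low = daily_upload_mb(100, 10)      # ~1 MB/day at 10 KB/km
high = daily_upload_mb(100, 1024)   # ~100 MB/day at 1 MB/km
print(round(low, 2), round(high, 1))
```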
Sensing: major properties
1. Environmental model
2. 360° awareness
3. Defines drivable paths
Sensing: examples of major problems
1. 3D object detection with semantic 3D bounding boxes
2. Free space segmentation / drivable paths
3. Lane detection / fusion
4. Task-specific issues:
   4.1. Semantic lane information (country road vocabulary, types of merges, …)
   4.2. Road semantic information: slope estimation (predicting road curves)
   4.3. Accurate distance prediction to every object
Sensing: 3D Object Detection & Trajectories
VEHICLES AT ANY ANGLE
• The 3D bounding box is crucial
• Not all faces of the box are visible in every frame
• Color codes mark each side of the car
• Adding complexity: distance/range measurements, trajectory prediction
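A common way to store such a box is center, size, and yaw; the sketch below (a generic representation I chose for illustration, not any specific system's format) computes the ground-plane corners of a yawed box:

```python
import math

def box_ground_corners(cx, cy, length, width, yaw):
    """Four ground-plane (x, y) corners of a yawed 3D bounding box."""
    c, s = math.cos(yaw), math.sin(yaw)
    corners = []
    for dx, dy in [(+1, +1), (+1, -1), (-1, -1), (-1, +1)]:
        lx, ly = dx * length / 2, dy * width / 2   # box-frame offsets
        # rotate by yaw, then translate to the box center
        corners.append((cx + c * lx - s * ly, cy + s * lx + c * ly))
    return corners

# Axis-aligned (yaw = 0) 4 m x 2 m vehicle centered at the origin:
print(box_ground_corners(0, 0, 4, 2, 0.0))
# [(2.0, 1.0), (2.0, -1.0), (-2.0, -1.0), (-2.0, 1.0)]
```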
Sensing: Semantic Segmentation
SEMANTIC FREE SPACE
• What is road and what is not? Context matters a lot.
• We need semantic information to help classify such cases.
• A road separated by a small elevation could be a parking slot!
• What about snow or a wet road (reflections)? Algorithm generalization is key!
Sensing: Lane Detection / Fusion
TASK: LANE FUSION, fusing the center of the lane from several sources:
• Lane detection
• Holistic path planning
• Free space
• Alignment
• Confidence improvements
VIRTUAL MILES: VERIFICATION
MODELING SYNTHETIC SCENARIOS
* Waymo’s simulations (XView)
Driver policy & route planning
DRIVER POLICY
• Planning the future.
• Examples: a double lane merge, or negotiating entry into a roundabout; each is a multi-agent game.
TECHNOLOGY
• Reinforcement learning:
  – States, actions, rewards.
  – Need to classify and predict other drivers’ behavior.
Reference: Survey of motion planning, B. Paden, 2016
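The states/actions/rewards framing can be illustrated with toy tabular Q-learning on a tiny lane-merge MDP; every state, action, and reward below is invented for demonstration and is far simpler than a real driver policy:

```python
import random

# Toy illustration of states/actions/rewards for a lane merge.
# The tiny MDP below is entirely made up for demonstration.
random.seed(0)
states = ["behind_gap", "beside_gap", "merged"]
actions = ["wait", "accelerate", "merge"]
Q = {(s, a): 0.0 for s in states for a in actions}

def step(state, action):
    """Hypothetical transitions: accelerate to the gap, then merge."""
    if state == "behind_gap" and action == "accelerate":
        return "beside_gap", 0.0
    if state == "beside_gap" and action == "merge":
        return "merged", 1.0      # successful merge
    return state, -0.1            # anything else wastes time

alpha, gamma = 0.5, 0.9
for _ in range(500):              # episodes of random exploration
    s = "behind_gap"
    while s != "merged":
        a = random.choice(actions)
        s2, r = step(s, a)
        best_next = max(Q[(s2, a2)] for a2 in actions)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

policy = {s: max(actions, key=lambda a: Q[(s, a)])
          for s in states if s != "merged"}
print(policy)  # learned: accelerate toward the gap, then merge
```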
Summary: safety first
• 1.2M people are killed on the road every year worldwide.
• With proper, very careful execution of this technology we can achieve safer roads.
• New businesses and business models will dramatically change our life experience.
