http://tahoesiliconmountain.com/
Tahoe Silicon Mountain, a network of entrepreneurs and professionals who live and work in the Tahoe-Truckee area, is pleased to welcome Dr. Christos Papachristos to present at Mountain Minds Monday: “Self-Flying Drones: On a Mission to Navigate Dark, Dangerous and Unknown Worlds.”
Even with all our current technical advances, dangerous and unpleasant jobs are still a part of modern life. Imagine a world where flying robots could be used to navigate in any environment, under any possible conditions, and complete risky tasks that humans currently perform.
Dr. Christos Papachristos, Post-Doc Researcher at the Autonomous Robots Lab at the University of Nevada, Reno, will be speaking about how, without the benefit of GPS or previously mapped environments, drones can be used in beyond-line-of-sight operation to autonomously navigate, map and explore dark and dusty, partially sealed or underground, and generally visually-degraded environments like mines or nuclear waste sites.
Dr. Papachristos will discuss the use of regular cameras with flashers, inertial sensors, 3D-structure time-of-flight cameras combined with infrared cameras, ionizing radiation detectors, and the logic behind the algorithms that guide the drones on their missions.
You can learn more about the Autonomous Robots Lab here: http://www.autonomousrobotslab.com/
Mountain Minds Monday will be on Monday, October 9th, 6-8 pm at Pizza on the Hill, in Tahoe Donner at 11509 Northwoods Blvd., Truckee. A $5 fee includes pizza and salad. Before and after the presentation, there will be time for networking.
The event will also be livestreamed and available online as it happens on YouTube: bit.ly/YouTubeTSM
This month’s event is sponsored by New Leaders, Holland & Hart LLP, Molsby & Bordner, LLP and The Lift.
You can find us on LinkedIn and Facebook and at TahoeSiliconMountain.com or sign up for email meeting announcements here: http://bit.ly/TSMEmail
Self-Flying Drones: On a Mission to Navigate Dark, Dangerous and Unknown Worlds
2. Self-Flying Drones: On a Mission to Navigate Dark, Dangerous and Unknown Worlds
Christos Papachristos
Autonomous Robots Lab, University of Nevada, Reno
3. Broader Vision
• In all not-too-distant visions of the future, robots are part of everyday life, fulfilling humanity's need for comfortable transportation, everyday safety and security, a tireless and reliable workforce, or even convenient companionship. The robots of such a future, from single appliances to entire cities, can operate on their own.
• Robotics can promote sustainable and scalable growth, even out societal disparity, improve our quality of life, accelerate scientific progress, and more.
• To reach this scale, a concrete baseline is necessary to provide the foundations of high-level perception, navigation, task-handling, reasoning, etc.
• Autonomy is the key. The absolute baseline is Robust Perception and Autonomous Planning.
4. Motivation
• On one hand, Aerial Robots are exceptional candidates for jobs that require going to remote locations to inspect, map, and monitor their environment. These locations can be:
  • Particularly difficult to reach and/or GPS-denied.
  • Engulfed in complete darkness or white-washed.
  • Hazy due to atmospheric conditions or dust within enclosed spaces.
  • Hazardous to human health.
• Autonomous Aerial Robotic Operation in GPS-denied Degraded Visual Environments.
• Indicative application domains:
  • Nuclear Site Decommissioning
  • Remote Infrastructure Inspection
  • Oil & Gas Industry Inspection
  • Surveillance, Security Monitoring
5. Putting the Pieces together
• Robotic Autonomy: the ability to operate without the need for human action and reasoning, and to make its own choices.
  • Generate a sequence of moves from A to B.
  • Objectives: Exploration, Inspection, …
  • Rules: Collision-free, Power-limitations, …
  • Optimize response. Guarantee constraints.
  • Robot Configuration. Sensor Suite. Processing Components.
  • Multi-modal Perception: Visual, Inertial, LIDAR, Thermal, …
  • Robust State Estimation: GPS-denied, DVE(s), …
A baseline for Autonomous Mobile Robots.
6. Putting the Pieces together
• Starting off with the basics for a simple yet fully-autonomous aerial robot:
7. Visual – Inertial SLAM
• The Simultaneous Localization & Mapping problem: vision-based feature detection and tracking. Detect – Track – Recover Structure from Motion.
• Formulate as an Estimation process with Bayesian Reasoning.
• Correlate robot pose Uncertainty to measurement Uncertainty (landmark 3D positions).
• Information Fusion via the Joint Distribution of processes that contain Uncertainty.
8. Visual – Inertial SLAM
• Visual-only SfM Limitations: the 2D-projective transformation introduces the problem of scale; Inertial (IMU) data have absolute scale.
• Use an Extended Kalman Filter – Prediction Step (see the sketch below).
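The slide's equations are not preserved in the transcript; as a stand-in, here is a minimal sketch of a generic EKF prediction step in Python. The state vector, process model f, its Jacobian F_jac, and process noise Q are all illustrative placeholders, not the lab's actual implementation:

```python
import numpy as np

def ekf_predict(x, P, f, F_jac, Q, u, dt):
    """Generic EKF prediction step.

    x: state estimate (e.g. pose, velocity, IMU biases)
    P: state covariance
    f: process model, x_next = f(x, u, dt), driven by IMU input u
    F_jac: Jacobian of f w.r.t. x, evaluated at (x, u)
    Q: process-noise covariance (accelerometer/gyroscope noise)
    """
    x_pred = f(x, u, dt)          # propagate the estimate
    F = F_jac(x, u, dt)
    P_pred = F @ P @ F.T + Q      # propagate the uncertainty
    return x_pred, P_pred
```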
9. Visual – Inertial SLAM
• Visual-only SfM Limitations: the 2D-projective transformation introduces the problem of scale; Inertial (IMU) data have absolute scale.
• Use an Extended Kalman Filter – Update Step (see the sketch below).
• Visual-Inertial Localization – Altogether:
  • System Model: Propagation of Estimate & Uncertainty based on a Rigid Body model and accelerometer & gyroscope data.
  • Measurement Model: Correction based on landmark-states observation (camera-based feature detection).
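Correspondingly, a minimal generic EKF update step, again as a hedged sketch: h stands in for a camera measurement model (projecting a landmark into the image), H_jac for its Jacobian, and R for the measurement noise:

```python
import numpy as np

def ekf_update(x_pred, P_pred, z, h, H_jac, R):
    """Generic EKF update step with a camera observation z.

    h: measurement model, e.g. projection of a 3D landmark into the image
    H_jac: Jacobian of h w.r.t. the state
    R: measurement-noise covariance (feature-detection noise, in pixels)
    """
    H = H_jac(x_pred)
    y = z - h(x_pred)                       # innovation: observed minus predicted
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y                  # corrected estimate
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new
```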
10. Visual – Inertial SLAM
• Visual – Inertial Localization and Mapping using a Stereo Camera:
  • A reliable stereo camera model gives better landmark estimation statistics.
  • Improved 3D pose estimation.
  • Consistent stereo depth map.
  • “Dense” Reconstruction / Mapping (see the note below).
[Figure: left and right cameras of the stereo pair]
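For reference, a stereo depth map rests on the standard pinhole relation: with focal length f (pixels), baseline b (meters), and disparity d (pixels) between matched left/right features, depth is z = f·b / d. A one-function sketch:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Standard pinhole-stereo depth: z = f * b / d.
    Invalid (zero or negative) disparities map to infinity."""
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, focal_px * baseline_m / d, np.inf)
```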
11. Volumetric Mapping
• Visual-Inertial Localization and Mapping & Dense Reconstruction: from Dense to Volumetric Mapping – a voxel grid built from the point cloud (octomap).
  • Volumetric representation of the known environment.
  • Distinguishes between occupied & free voxels based on a probabilistic hit/miss model of a depth sensor.
  • Efficient representation in memory (octree structure; nodes store an “occupied” probability).
  • Fast node (voxel) lookup (usually hash-table based) given its 3D coordinates. (See the toy sketch below.)
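A toy illustration of the idea (not octomap itself, which is a C++ octree): a hash-map-backed voxel grid storing occupancy in log-odds form, updated with an assumed probabilistic hit/miss sensor model:

```python
import math

class VoxelMap:
    """Toy occupancy grid: dict from integer voxel index to log-odds occupancy."""

    def __init__(self, resolution=0.1, p_hit=0.7, p_miss=0.4):
        self.res = resolution
        self.l_hit = math.log(p_hit / (1 - p_hit))     # log-odds increment on a hit
        self.l_miss = math.log(p_miss / (1 - p_miss))  # log-odds decrement on a miss
        self.cells = {}                                # unmapped voxels are simply absent

    def key(self, point):
        """Fast lookup: 3D coordinates -> integer voxel index (hash-table based)."""
        return tuple(int(math.floor(c / self.res)) for c in point)

    def integrate(self, point, hit):
        k = self.key(point)
        self.cells[k] = self.cells.get(k, 0.0) + (self.l_hit if hit else self.l_miss)

    def p_occupied(self, point):
        l = self.cells.get(self.key(point))
        if l is None:
            return None                                # unknown / unmapped
        return 1.0 - 1.0 / (1.0 + math.exp(l))
```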
12. Volumetric Mapping
• Octomap(s): Volumetric Mapping & Robotic Autonomy. Fast node lookup benefits ray-checking.
[Figure: ray collision checks for landmark visibility, landmarks l1–l4]
• Ray-casting / Ray-checking is the process of checking along a 3D line segment (ray) whether occupied, free, or unmapped voxels are crossed.
  • Checking if a transition from an initial 3D configuration to a desired one (waypoint) will encounter an obstacle.
  • Checking if a 3D landmark lies in Line-of-Sight or is “occluded”. (See the sketch below.)
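A simple sketch of such a ray check against the toy VoxelMap above, using sub-voxel stepping for clarity (octomap performs an exact octree ray traversal instead):

```python
import numpy as np

def ray_is_free(vmap, start, end, occ_threshold=0.5):
    """March from start to end in half-voxel steps.
    Returns False if an occupied or unmapped voxel is crossed."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    length = np.linalg.norm(end - start)
    n_steps = max(1, int(length / (0.5 * vmap.res)))
    for t in np.linspace(0.0, 1.0, n_steps + 1):
        p = vmap.p_occupied(start + t * (end - start))
        if p is None or p > occ_threshold:   # unknown or occupied blocks the ray
            return False
    return True
```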
13. Path-Planning
• Core path-planning principles: Random Sampling – expanding random trees in the known (mapped) and free configuration space.
  • Only collision-free transitions are permitted for every segment.
  • Collision-free navigation along the path. (A minimal tree-growing sketch follows.)
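A minimal RRT-style tree expansion over the structures sketched above, hedged as an illustration (real planners add goal biasing, kd-tree nearest-neighbor search, and proper steering):

```python
import numpy as np

def grow_rrt(vmap, root, bounds_min, bounds_max, step=0.5, n_iters=500):
    """Grow a random tree of collision-free segments.
    Returns (nodes, parents), where parents[i] indexes the parent of node i."""
    nodes, parents = [np.asarray(root, float)], [-1]
    rng = np.random.default_rng()
    for _ in range(n_iters):
        sample = rng.uniform(bounds_min, bounds_max)      # random configuration
        nearest = min(range(len(nodes)),
                      key=lambda i: np.linalg.norm(nodes[i] - sample))
        direction = sample - nodes[nearest]
        if np.linalg.norm(direction) < 1e-9:
            continue
        new = nodes[nearest] + step * direction / np.linalg.norm(direction)
        if ray_is_free(vmap, nodes[nearest], new):        # only collision-free edges
            nodes.append(new)
            parents.append(nearest)
    return nodes, parents
```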
14. Path-Planning
• Core path-planning principles: Receding-Horizon Strategy.
  • A finite number of path-planning moves (e.g. the first segment only) is performed.
  • Real-time feedback from mapping updates the environment knowledge. Based on this updated state, path-planning is re-evaluated.
  • The first move is performed again, with each iteration followed by a new map update. (See the loop sketch below.)
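The overall receding-horizon loop, as a hedged sketch; plan_best_path, execute_first_segment, update_map_from_sensors, and done are stand-ins for whichever planner, controller, mapper, and termination test are actually used:

```python
def receding_horizon_loop(vmap, state, plan_best_path,
                          execute_first_segment, update_map_from_sensors,
                          done):
    """Replan at every step, execute only the first segment, then re-map.

    plan_best_path(vmap, state) -> list of waypoints
    execute_first_segment(state, path) -> new state after flying path[0..1]
    update_map_from_sensors(vmap, state) -> integrates new depth data
    done(vmap) -> True when no informative path remains
    """
    while not done(vmap):
        path = plan_best_path(vmap, state)           # a full horizon is planned...
        if not path:
            break
        state = execute_first_segment(state, path)   # ...but only the first step flies
        update_map_from_sensors(vmap, state)         # map feedback before replanning
    return state
```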
15. Autonomy & Active Perception
• The Objective: Explore a location while mapping with consistency.
Problem Definition
Problem 1: Volumetric Exploration
Given a bounded volume $V_E$, find a collision-free path $\sigma$ starting at an initial configuration $\xi_{init} \in \Xi$ that leads to identifying the free and occupied parts $V_{free}^E$ and $V_{occ}^E$ when being executed, such that there does not exist any collision-free configuration from which any piece of $V_E \setminus \{V_{free}^E, V_{occ}^E\}$ could be perceived.
Problem 2: Belief Uncertainty-aware Planning
Given a $V_M \subset V_E$, find a collision-free path $\sigma_M$ starting at an initial configuration $\xi_0 \in \Xi$ and ending in a configuration $\xi_{final} \in \Xi$ that aims to improve the robot's localization and mapping confidence by following paths of optimized expected robot pose and tracked-landmarks covariance.
Combined Problem
The overall problem is that of exploring an unknown bounded 3D volume $V_E \subset \mathbb{R}^3$, while aiming to minimize the localization and mapping uncertainty as evaluated through a metric over the robot pose and landmarks probabilistic belief.
16. Autonomy & Active Perception
• Objective 1: Volumetric Exploration – Receding-Horizon Exploration & Mapping Path-planner (rhemplanner)
• Two-level path-planning paradigm:
  • Addresses the combined problem in a hierarchical approach.
  • At every iteration, a finite-depth random tree is spanned. Each vertex is annotated with a collected Information Gain – a metric of how much new space is going to be explored.
Planning Layer 1: Volumetric Exploration
17. Autonomy & Active Perception
• Objective 1: Volumetric Exploration – Receding-Horizon Exploration & Mapping Path-planner (rhemplanner)
• Tree-based exploration: At every iteration, a finite-depth random tree is spanned. Each vertex is annotated with the collected Information Gain – a metric of how much new space is going to be explored.
• Within the tree, the path that overall leads to the highest information gain is evaluated. This corresponds to the best path for the given iteration (a sequence of next-best-views as sampled).
• Receding Horizon: For the extracted best path, only the first viewpoint is actually executed.
• The system moves to it, the map is updated, and the process is repeated. (See the sketch below.)
[Figure: tree of candidate views; only the first “Executed Step” is flown]
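A hedged sketch of the exploration layer's gain evaluation over the tree from grow_rrt: annotate each vertex with how much unmapped space its viewpoint would reveal, accumulate gain along branches, and extract the best branch. visible_unmapped_voxels is an assumed helper that would combine the voxel map with ray checks, and the per-edge decay is an illustrative discount, not the planner's exact weighting:

```python
import numpy as np

def branch_depth(parents, i):
    """Number of edges from node i back to the root."""
    d = 0
    while parents[i] >= 0:
        i = parents[i]
        d += 1
    return d

def best_exploration_branch(nodes, parents, visible_unmapped_voxels, decay=0.5):
    """Pick the tree branch with the highest accumulated information gain.

    visible_unmapped_voxels(node) -> count of unmapped voxels visible from
    that viewpoint (assumed helper). Deeper gains are discounted by `decay`.
    """
    gains = [visible_unmapped_voxels(n) for n in nodes]
    total = [0.0] * len(nodes)
    for i in range(len(nodes)):                  # parents precede children in the list
        p = parents[i]
        discounted = decay ** branch_depth(parents, i) * gains[i]
        total[i] = discounted if p < 0 else total[p] + discounted
    best = int(np.argmax(total))
    path = []                                    # walk back to the root
    while best >= 0:
        path.append(nodes[best])
        best = parents[best]
    return path[::-1]                            # root -> leaf; only path[1] is executed
```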
18. Autonomy & Active Perception
• Objective 1: Volumetric Exploration – Receding-Horizon Exploration & Mapping Path-planner (rhemplanner)
• Probabilistic Re-observation term: Maximize newly explored space and try to re-observe the parts where the confidence about whether they are occupied is low. (A sketch of such a gain term follows.)
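One plausible form of such a combined gain term (an illustration, not the planner's exact formula): reward unmapped voxels in view plus, with weight w, already-mapped voxels whose occupancy probability is still close to 0.5:

```python
def view_gain(visible_voxel_probs, w=0.5):
    """Combined exploration / re-observation gain for one viewpoint.

    visible_voxel_probs: occupancy probabilities for voxels in the view
    frustum, with None standing for unmapped voxels.
    """
    gain = 0.0
    for p in visible_voxel_probs:
        if p is None:
            gain += 1.0                             # brand-new space: full reward
        else:
            gain += w * (1.0 - abs(2.0 * p - 1.0))  # peaks at p = 0.5 (least confident)
    return gain
```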
19. Autonomy & Active Perception
• Objective 2: Uncertainty-Aware Path-planning – Receding-Horizon Exploration & Mapping Path-planner (rhemplanner)
• Two-level path-planning paradigm – hierarchical structure:
  • Given an exploration first viewpoint, a second-layer random tree is spanned locally “around” that vertex. Each possible path leading to the end-configuration is annotated with a Belief Gain – a metric of how much the robot belief has improved / deteriorated.
  • The mechanism to propagate the robot's belief has to be established.
Planning Layer 2: Uncertainty-Optimization
20. Autonomy & Active Perception
• Objective 2: Uncertainty-Aware Path-planning – Receding-Horizon Exploration & Mapping Path-planner (rhemplanner)
• Exploit the EKF pipeline used for SLAM to propagate belief: propagate State and Uncertainty along all candidate paths (Predict Step).
• Assume closed-loop dynamics; simulate inertial measurements.
21. Autonomy & Active Perception
• Objective 2: Uncertainty-Aware Path-planning – Receding-Horizon Exploration & Mapping Path-planner (rhemplanner)
• Exploit the EKF pipeline used for SLAM to propagate belief: propagate State and Uncertainty along all candidate paths (Update Step).
• Use the octomap representation to predict landmark visibility / occlusion.
• Perform virtual updates for all landmarks expected to be seen. (See the sketch below.)
[Figure: virtual updates against landmarks l1–l4, subject to occlusion]
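Putting the two EKF steps to planning use, a hedged sketch of belief propagation along one candidate path, reusing the earlier ekf_predict / ekf_update / ray_is_free sketches: predict with simulated inertial inputs, then perform virtual updates only for landmarks a ray check says would be visible. simulate_imu is an illustrative placeholder, and the first three state entries are assumed to be position:

```python
def propagate_belief(x, P, path, landmarks, vmap,
                     f, F_jac, Q, h, H_jac, R, simulate_imu, dt=0.1):
    """Propagate the belief (x, P) along a candidate path without flying it.

    simulate_imu(x, waypoint, dt) -> inertial input the closed-loop system
    is assumed to produce while tracking the waypoint (illustrative).
    h(x, lm), H_jac(x, lm): landmark measurement model and its Jacobian.
    """
    for waypoint in path:
        u = simulate_imu(x, waypoint, dt)
        x, P = ekf_predict(x, P, f, F_jac, Q, u, dt)       # predict step
        for lm in landmarks:
            if ray_is_free(vmap, x[:3], lm):               # landmark in line of sight?
                h_lm = lambda s, lm=lm: h(s, lm)           # bind this landmark
                H_lm = lambda s, lm=lm: H_jac(s, lm)
                z = h_lm(x)                                # virtual measurement: zero innovation
                x, P = ekf_update(x, P, z, h_lm, H_lm, R)  # covariance shrinks; mean unchanged
    return x, P
```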
22. Autonomy & Active Perception
• Objective 2: Uncertainty-Aware Path-planning – Receding-Horizon Exploration & Mapping Path-planner (rhemplanner)
• Finally, compute the propagated covariance matrix for every path.
• The D-optimality metric is a measure of how “small” the corresponding uncertainty ellipsoid is. (See the sketch below.)
• Choose the path that minimizes the D-optimality metric – i.e. minimizes the Uncertainty on arrival.
• This path might turn out to be the original straight segment – the optimum is only selected out of a finite number of randomly sampled trajectories.
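One common way to write the D-optimality criterion (hedged; conventions vary across papers) is the geometric mean of the covariance eigenvalues, D_opt(Σ) = exp(ln det(Σ) / n), which grows with the volume of the uncertainty ellipsoid:

```python
import numpy as np

def d_optimality(P):
    """D-optimality of a covariance matrix: exp(log det(P) / n).
    slogdet is used for numerical stability with small eigenvalues."""
    sign, logdet = np.linalg.slogdet(P)
    assert sign > 0, "covariance must be positive definite"
    return float(np.exp(logdet / P.shape[0]))

# e.g. pick the candidate path whose terminal belief is most confident:
# best = min(paths, key=lambda p: d_optimality(propagate_belief(..., p, ...)[1]))
```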
23. Autonomy & Active Perception
• Receding-Horizon Uncertainty-aware Exploration & Mapping Path-planner (rhemplanner)
24. Autonomy & Active Perception
• Receding-Horizon Uncertainty-aware Exploration & Mapping Path-planner (rhemplanner)
26. Degraded Visual Environments
• Uncertainty-Aware Exploration & Mapping: tying back to the original motivation.
  • Particularly for DVEs, pure volumetric exploration is not sufficient.
  • The selected viewpoints and their sequence will heavily influence the localization of the robot.
• A 1st generation of Multi-Modal sensor fusion for GPS-denied localization and mapping in DVEs.
[Figure: sensor head with NIR Cameras & IR LED(s), Time-of-Flight 3D Camera, IMU]
27. Degraded Visual Environments
• Uncertainty-Aware Exploration & Mapping: tying back to the original motivation.
• A 1st generation of Multi-Modal sensor fusion for GPS-denied localization and mapping in DVEs.
• The same Active Perception approach for reliable autonomy, subject to the challenges of DVEs.
[Figure: sensor head with NIR Cameras & IR LED(s), Time-of-Flight 3D Camera, IMU]
28. Degraded Visual Environments
• Multi-Modal Mapping Unit: a tightly-integrated Multi-Modal sensor (featuring Hardware-synchronization, Expansions, …)
  • Inertial Sensors (accelerometers, gyroscopes)
  • Vision (synchronized with flashing LEDs)
  • Depth Cameras (Time-of-Flight)
  • GPS integration-ready
  • Support for multi-Camera setups
29. Degraded Visual Environments
• Field Experiments: the “real” test – a driving force for improvement.
  • Autonomous Robotic Navigation, Exploration, Inspection and Mapping in GPS-denied DVEs.
  • Technology developed in-house. Demonstrated in Field Experiments.
  • The 1st generation of Multi-Modal sensor fusion: Visual-Inertial / Depth odometry, loosely coupled via an EKF.
30. Degraded Visual Environments
• Multi-Modal Mapping Unit: a tightly-integrated Multi-Modal sensor (featuring Hardware-synchronization, Expansions, …)
31. Nuclearized Robotics
• Field Experiments: the “real” test – a driving force for improvement.
Nuclearized Robots
32. Nuclearized Robotics
• Motivation: The Nuclear Cleanup Mission
  • Nuclear facility decommissioning.
  • Soil and water cleanup.
  • Liquid radioactive waste processing & disposition.
  • Solid radioactive waste treatment, storage and disposal.
  • Nuclear materials and spent nuclear fuel management.
[Figure from DOE – EM]
33. Nuclearized Robotics
• Motivation: DOE – EM Facilities Characterization
  • Flying & roving robots to help characterize DOE – EM facilities. Goal set for demonstration of developed technologies in nuclear analog facilities of DOE – EM.
  • Identification and Semantic Classification of tanks, pipes, and other important structures to intelligently focus the robot exploration and inspection tasks.
  • Radiation, Chemical, and Heat spatial maps are fused with 3D models of the environment.
  • Integrated Planning & Multi-Modal perception for comprehensive mapping of nuclear facilities; Active Perception to improve (while benefiting from) radiation, chemical, and heat estimation.
34. Nuclearized Robotics
• Robotic Detection of Ionizing Radiation – Target Platform: Autonomous Micro Aerial Vehicles
• Detection of Gamma Radiation – a common need and requirement of the decommissioning efforts.
• Key technologies:
  • Miniature CeBr3, CsI, NaI scintillators with a built-in temperature-compensated bias generator and pre-amplifier, alongside a Silicon Photomultiplier (SiPM).
  • Miniature solid-state low-voltage detectors.
  • Gamma cameras (heavy for aerial robots).
35. Nuclearized Robotics
• Robotic Detection of Ionizing Radiation – Gamma Radiation Detection
• Sensor calibration is required too:
  • Radiation detectors can present significant polarity (directional) characteristics. Calibration requires exhaustive tests for different sensor orientations.
  • Differential installation of two gamma detectors will potentially enhance source localization.
  • Integration of spectroscopy through relevant algorithms and a multi-channel analyzer. Calibration against known, characterized sources allows estimation of the detected gamma-photon energy. (See the sketch below.)
  • Estimation / Characterization of the types of sources contributing to a region's radioactivity.
  • “Hunt” for specific expected radioactive source types.
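A minimal illustration of that calibration step, assuming the photopeak channels of known sources (e.g. Cs-137 at 661.7 keV, Co-60 at 1173.2 keV) have been located in the multi-channel analyzer histogram; a linear fit then maps any channel to an energy estimate. The channel numbers below are made up for illustration:

```python
import numpy as np

# photopeak channels observed for known, characterized sources (hypothetical values)
known_channels = np.array([412.0, 731.0])   # MCA channels (illustrative)
known_energies = np.array([661.7, 1173.2])  # keV: Cs-137, Co-60

# linear energy calibration: E = gain * channel + offset
gain, offset = np.polyfit(known_channels, known_energies, 1)

def channel_to_energy_kev(channel):
    """Estimate gamma-photon energy for a raw MCA channel."""
    return gain * channel + offset

print(channel_to_energy_kev(500.0))  # energy estimate for an unknown peak
```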
36. Nuclearized Robotics
• Robotic Detection of Radiation – other radiation types:
  • Neutron Detection is relevant to homeland security and industrial monitoring (e.g. detection of nuclear weapons, personnel monitoring, water content in soil). Detector families: Gas-filled (e.g. He-3), Scintillation, Solid-State.
  • Alpha Detection is very challenging. Alpha particles are the heaviest and most highly charged, so they quickly give up their energy to any medium they pass through.
    • Special detection methods are required, e.g. a scintillation detector [ZnS(Ag)] combined with an Aluminized Mylar film.
    • Requires contact & robotic manipulation.
37. Nuclearized Robotics
• Robotic Detection of Radiation – an Autonomous Exploration & Mapping Aerial Robot for DVE and Nuclear Sites
[Figure: platform carrying the Gamma Detector, Multi-Modal Perception Unit, Scintillation Detector(s), Solid-State Detector(s)]
38. Nuclearized Robotics
• Robotic Detection of Radiation – an Autonomous Exploration & Mapping Aerial Robot for DVE and Nuclear Sites
40. Further Research & Applications
• Robotic Detection of Change
  • Perform real-time change detection.
  • Expand to efficient 3D-to-3D change detection approaches.
  • Incorporate change-driven “curiosity” in planning algorithms.
41. Further Research & Applications
• Curious Robots
  • Employ a human paradigm for interest: collect more meaningful data without necessarily having any explicit mission objective.
  • Ability to focus perceptual attention towards regions that have “Visual Saliency”.
42. Further Research & Applications
• Augmented-Reality Robotics
  • Provide real-time feedback of data annotated with Mission-Relevant information.
  • Take the actual flying away from the human, but still maintain the ability to redirect the robot's attention towards areas of interest.