1. Sensors in Robotics
Li Guan (lguan@cs.unc.edu)
Fall ’06 COMP 790-072 Robotics, Computer Science Dept., UNC-Chapel Hill
Sensors for mobile robotics
Feature extraction
[Figure: Savannah River Site nuclear surveillance robot, from Roland Siegwart]
2. Classification of Sensors
What to measure:
Proprioceptive sensors
measure values internal to the system (robot),
e.g. motor speed, wheel load, heading of the robot, battery status
Exteroceptive sensors
acquire information from the robot's environment,
e.g. distances to objects, intensity of the ambient light, unique features
How to measure:
Passive sensors
measure energy coming from the environment
Active sensors
emit their own energy and measure the environment's reaction
better performance, but some influence on the environment
7. Multi-Stripe Triangulation
To go faster, project multiple stripes
But which stripe is which?
Answer #1: assume surface continuity
e.g. Eyetronics’ ShapeCam
10. Time-Coded Light Patterns
Assign each stripe a unique illumination code over time [Posdamer 82]
[Figure: the coded stripe patterns laid out over space and over time]
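As a sketch of the idea (assuming a plain binary time code in the spirit of [Posdamer 82]; the exact coding scheme is not given on the slide), each of N stripes gets a unique bit pattern projected over ceil(log2 N) frames, and a pixel's observed on/off sequence identifies the stripe that illuminated it:

```python
# Minimal sketch (assumed binary time-coding): each stripe carries a unique
# code over time; a pixel decodes the stripe index from what it observed.
from math import ceil, log2

def stripe_patterns(num_stripes: int):
    """patterns[t][s] is True if stripe s is illuminated in frame t."""
    num_frames = ceil(log2(num_stripes))
    return [[bool((s >> t) & 1) for s in range(num_stripes)]
            for t in range(num_frames)]

def decode(observed_bits) -> int:
    """Recover the stripe index from the per-frame on/off observations."""
    return sum(int(bit) << t for t, bit in enumerate(observed_bits))

if __name__ == "__main__":
    pats = stripe_patterns(8)             # 8 stripes -> 3 frames
    # A pixel lit by stripe 5 sees the bit pattern of 5 (0b101) over time.
    print(decode([p[5] for p in pats]))   # -> 5
```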
11. Direct 3D Depth Sensor
Basic idea: send out a pulse of light (usually a laser) and time how long it takes to return
Pulsed laser
direct measurement of the elapsed time
requires resolving picoseconds
Phase-shift measurement to produce a range estimate
Energy integration
$d = \tfrac{1}{2}\, c\, t$   (range $d$, speed of light $c$, round-trip time $t$)
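A minimal numeric sketch (not from the slides) of the relation above: range from the round-trip time, and the range resolution implied by a given timing resolution.

```python
# Minimal sketch: pulsed time-of-flight range, d = c * t / 2, and the range
# resolution that a given timing resolution allows.
C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to the target from the measured round-trip time."""
    return 0.5 * C * t_seconds

def range_resolution(timing_resolution_s: float) -> float:
    """Smallest range step distinguishable at a given timing resolution."""
    return 0.5 * C * timing_resolution_s

if __name__ == "__main__":
    print(range_from_round_trip(66.7e-9))   # ~66.7 ns round trip -> about 10 m
    print(range_resolution(30e-12))         # 30 ps timing -> about 4.5 mm
```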
12. Pulsed Time of Flight
Advantages:
Large working volume (up to ~100 m)
Disadvantages:
Not-so-great accuracy (at best ~5 mm)
Requires getting timing to ~30 picoseconds
Accuracy does not scale with the working volume (it stays ~5 mm even for small volumes)
Often used for scanning buildings, rooms, archeological sites, etc.
15. Direct Integration: Canesta 3D Camera
2D array of time-of-flight sensors
Jitter too big on a single measurement, but averages out over many
(10,000 measurements → ~100× improvement)
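A small illustrative sketch (not the Canesta pipeline) of why averaging helps: the standard deviation of the mean of N independent readings falls roughly as 1/sqrt(N), so 10,000 readings give about a 100× improvement.

```python
# Averaging N noisy time-of-flight readings shrinks the random jitter by
# about sqrt(N); 10,000 readings -> roughly 100x less jitter than one reading.
import random

def averaged_measurement(true_range_m: float, jitter_std_m: float, n: int) -> float:
    samples = [random.gauss(true_range_m, jitter_std_m) for _ in range(n)]
    return sum(samples) / n

if __name__ == "__main__":
    # Single reading: error on the order of the jitter; averaged: ~jitter/100.
    print(averaged_measurement(2.0, 0.5, 1))
    print(averaged_measurement(2.0, 0.5, 10_000))
```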
19. Sensor Errors
Systematic errors: deterministic errors
caused by factors that can (in theory) be modeled → prediction possible
e.g. calibration of a laser sensor, or of the distortion caused by the optics of a camera
Random errors: non-deterministic errors
no prediction possible
however, they can be described probabilistically
e.g. hue instability of a camera, black-level noise of a camera, ...
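A toy sketch of the distinction (the sensor, its bias model, and the noise level are all invented for illustration): the systematic part can be modeled and removed by calibration, while the random part can only be described statistically.

```python
# Hypothetical range sensor with a deterministic systematic error (scale and
# offset, removable by calibration) plus a random error (zero-mean noise).
import random

SCALE, OFFSET, NOISE_STD = 1.02, 0.05, 0.01  # illustrative error parameters

def raw_reading(true_value: float) -> float:
    systematic = SCALE * true_value + OFFSET   # modelable -> predictable
    rand = random.gauss(0.0, NOISE_STD)        # not predictable per reading
    return systematic + rand

def calibrated_reading(true_value: float) -> float:
    # Calibration inverts the systematic model; only the random part remains.
    return (raw_reading(true_value) - OFFSET) / SCALE
```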
20. Probabilistic Sensor Fusion
Sensors S_1, S_2, S_3, ...
Given the sensor models P_1(Output_1 | Input), P_2(Output_2 | Input), P_3(Output_3 | Input), ...
with Input_k ∈ {Input Status Space}, k = 1, ..., |Input Status Space|
Bayesian inference:
$$P(\mathrm{Input}_k \mid \mathrm{Output}_1, \ldots, \mathrm{Output}_n) = \frac{\prod_{i=1}^{n} P_i(\mathrm{Output}_i \mid \mathrm{Input}_k)}{\sum_{k'} \prod_{i=1}^{n} P_i(\mathrm{Output}_i \mid \mathrm{Input}_{k'})}$$
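A minimal sketch of this fusion rule for a discrete input space (the door example and all names are assumptions, not from the slides): multiply the per-sensor likelihoods of the observed outputs and normalize over the input states.

```python
# Bayesian fusion of n independent sensors over a finite input status space,
# following the formula above.
from typing import Dict, List

def fuse(sensor_models: List[Dict[str, Dict[str, float]]],
         outputs: List[str],
         input_space: List[str]) -> Dict[str, float]:
    """sensor_models[i][input][output] = P_i(output | input)."""
    scores = {}
    for inp in input_space:
        p = 1.0
        for model, out in zip(sensor_models, outputs):
            p *= model[inp][out]               # product of sensor likelihoods
        scores[inp] = p
    total = sum(scores.values()) or 1.0
    return {inp: s / total for inp, s in scores.items()}  # normalize over inputs

if __name__ == "__main__":
    # Two sensors observing whether a door is "open" or "closed".
    m1 = {"open": {"ping": 0.8, "no_ping": 0.2}, "closed": {"ping": 0.3, "no_ping": 0.7}}
    m2 = {"open": {"bright": 0.9, "dark": 0.1}, "closed": {"bright": 0.4, "dark": 0.6}}
    print(fuse([m1, m2], ["ping", "bright"], ["open", "closed"]))
```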
21. Sensor Fusion Example: Probabilistic Visual Hull
Multiple camera sensors
Inward looking
Reconstruct the environment
Jean-Sebastien Franco et al., ICCV '05
Figures from http://graphics.csail.mit.edu/~wojciech/vh/reduction.html
22. Fusion of Multi-View Silhouette Cues Using a Space Occupancy Grid (ICCV '05)
Unreliable silhouettes: do not make a hard decision about their location
Do sensor fusion: use all image information simultaneously
23. Bayesian formulation
Idea: we wish to find the content of the scene from images, as a probability grid
Modeling the forward problem - explaining the image observations given the grid state - is easy; it can be accounted for in a sensor model
Bayesian inference enables the formulation of our initial inverse problem from the sensor model
Simplification for tractability: independent analysis and processing of voxels
24. Modeling
I: color information in the images
B: background color models
F: silhouette detection variable (0 or 1), hidden
G_X: occupancy at voxel X (0 or 1)
Sensor model:
$$P(I \mid G_X, B) = \sum_{F} P(I \mid F, B)\, P(F \mid G_X)$$
Inference:
$$P(G_X \mid I, B) = \frac{\prod_{\mathrm{img,\,pixel}} P(I_{\mathrm{img,pixel}} \mid G_X, B)}{\sum_{G_X} \prod_{\mathrm{img,\,pixel}} P(I_{\mathrm{img,pixel}} \mid G_X, B)}$$
[Figure: occupancy grid with voxel G_X]
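One possible reading of these formulas as code (an illustrative sketch, not the authors' implementation): for each pixel the voxel projects to, marginalize over the hidden silhouette variable F, multiply the per-pixel likelihoods, and normalize over the two occupancy states.

```python
def pixel_likelihood(p_i_given_f, p_f_given_gx):
    # P(I | G_X, B) = sum over F of P(I | F, B) * P(F | G_X), for one pixel
    return sum(p_i_given_f[f] * p_f_given_gx[f] for f in (0, 1))

def voxel_occupancy(pixel_terms):
    # pixel_terms: one entry per (image, pixel) the voxel projects to, with
    #   p_i_given_f[f]    = P(I | F = f, B)   for that pixel
    #   p_f_given_g[g][f] = P(F = f | G_X = g)
    like = [1.0, 1.0]                        # likelihood products for G_X = 0, 1
    for p_i_given_f, p_f_given_g in pixel_terms:
        for g in (0, 1):
            like[g] *= pixel_likelihood(p_i_given_f, p_f_given_g[g])
    return like[1] / (like[0] + like[1])     # P(G_X = 1 | I, B)

# Example: two pixels that both look like foreground, with a mildly leaky
# silhouette model P(F=1 | G_X=1) = 0.9 and P(F=1 | G_X=0) = 0.1.
pixels = [((0.2, 0.8), ((0.9, 0.1), (0.1, 0.9)))] * 2
print(voxel_occupancy(pixels))
```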
26. Further, we can infer occlusion
Foreground object inference is robust to partial occlusion by static occluders
This enables detection of discrepancies between the foreground volume and where its silhouette is actually observed
Example: Old Well dataset with 9 cameras, frame #118, voxels > 90%
30. Other References
M. A. Abidi and R. C. Gonzalez, Data Fusion in Robotics and Machine Intelligence, Academic Press, 1992.
P. K. Allen, Robotic Object Recognition Using Vision and Touch, Kluwer Academic Publishers, 1987.
A. I. Hernandez, G. Carrault, F. Mora, L. Thoraval, G. Passariello, and J. M. Schleich, “Multisensor fusion for atrial and ventricular activity detection in coronary care monitoring,” IEEE Transactions on Biomedical Engineering, vol. 46, no. 10, pp. 1186–1190, 1999.
A. Hernandez, O. Basset, I. Magnin, A. Bremond, and G. Gimenez, “Fusion of ultrasonic and radiographic images of the breast,” in Proc. IEEE Ultrasonics Symposium, pp. 1437–1440, San Antonio, TX, USA, 1996.
32. Sensor Communication
Different Types of Sensors/Drivers
image sensors: camera, MRI, radar…
sound sensors: microphones, hydrophones, seismic sensors.
temperature sensors: thermometers
motion sensors: radar gun, speedometer, tachometer, odometer, turn coordinator
…
Sensor Data Transmission
Size
Format
Frequency
SensorTalk (Honda Research Institute), 2005
33. A Counterpart - RoboTalk
Copyright Lucasfilm Ltd.
Mobile robot with pan-tilt camera
Honda ASIMO humanoid robot
Allen Y. Yang, Hector Gonzalez-Banos, Victor Ng-Thow-Hing, James Davis, “RoboTalk: controlling arms, bases and androids through a single motion interface,” IEEE Int. Conf. on Advanced Robotics (ICAR), 2005.
35. Robot? Sensor?
A PTZ (Pan/Tilt/Zoom) camera
Movable about its horizontal (pan) and vertical (tilt) axes, and adjustable in focal length (zoom)
The NASA Mars Exploration Rover
A specialized sensing robot…
36. Why not just SensorTalk/RoboTalk?
Robot:
QoS – high
Throughput - low
Sensor:
QoS – low
Throughput – may be huge!
37. Objective of SensorTalk
Variety of Sensors
Different requirements (output frequency)
Different input/output
High re-usability of driver and application code (cross-platform)
Multi-user access to the sensor
To build sensors from simpler sensors
Work together with RoboTalk
Think of a sensor as a robot – pan-tilt-zoom camera
Think of a robot as a sensor – NASA Mars Exploration Rover, ASIMO…
38. Objective
A communication tool
Coordinate different types of sensors
Facilitate different types of applications
A protocol
A set of rules to write the drivers & applications
A set of methods to support multiple clients (e.g. write-locking)
A set of modes to transmit output data
40. Model of a Sensor
A service with parameters
Static parameters (input signal, output signal)
Tunable parameters
A client can query all parameters
A client can change tunable parameters that are not locked
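A hypothetical sketch of this sensor-as-a-service model (class and method names are assumptions, not the actual SensorTalk API): static parameters stay fixed, while tunable parameters can be queried by any client and changed only when not write-locked.

```python
# Hypothetical "sensor as a service with parameters" model.
class SensorService:
    def __init__(self, static_params: dict, tunable_params: dict):
        self.static_params = static_params      # e.g. input/output signal types
        self.tunable_params = tunable_params    # e.g. gain, publishing frequency
        self.locked = set()                     # tunables currently write-locked

    def query_parameters(self) -> dict:
        """Any client may query all parameters."""
        return {**self.static_params, **self.tunable_params}

    def set_parameter(self, name: str, value) -> bool:
        """Clients may change tunable parameters that are not locked."""
        if name in self.tunable_params and name not in self.locked:
            self.tunable_params[name] = value
            return True
        return False
```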
41. Example #1: Heat Sensor
Parameters
output format (integer, double)
output value unit (kelvin, °C)
gain
publishing frequency (1 Hz ~ 29.99 Hz)
resolution of the output value
…
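Continuing the hypothetical sketch above, the heat sensor could be described as one such service (parameter names and values are illustrative only):

```python
# Hypothetical instantiation of the heat-sensor example with SensorService.
heat_sensor = SensorService(
    static_params={"input_signal": "temperature", "output_signal": "scalar"},
    tunable_params={
        "output_format": "double",   # integer or double
        "output_unit": "celsius",    # kelvin or celsius
        "gain": 1.0,
        "publish_hz": 10.0,          # allowed range: 1 Hz ~ 29.99 Hz
        "resolution": 0.1,
    },
)
print(heat_sensor.query_parameters())
heat_sensor.set_parameter("publish_hz", 25.0)
```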
43. Example #3: Visual Hull Sensor
Parameters
number of camera views
parameters related to each camera
projection matrix of every view
output format
volume resolution
publishing frequency (1 Hz ~ 60 Hz)
…
45. SensorTalk Scenario
(Server / Client message sequence)
Server up; Client up
Client → Server: subscribe
Server: create a client structure; return client ID
Client → Server: ask for description
Server: return description
Client → Server: control parameter "A"
Server: call function to change "A"; return new "A"
46. SensorTalk Scenario (cont.)
Client → Server: get 1 frame (DIRECT)
Server: get 1 frame from the driver; return the frame
Client: process the frame
Client → Server: get frames (CONTINUOUS)
Server: get 1 frame from the driver; return the frame (repeated for each frame of the stream)
47. SensorTalk Scenario (cont.)
Client → Server: stop stream (CONTINUOUS)
Server: stop getting frames; return SUCCESS
Client → Server: release / disconnect
Client: close program
Server: delete the client structure with that ID; wait for other connections
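A hypothetical end-to-end session following the three scenario slides above; both the mock server and the call names are illustrative assumptions, not the real SensorTalk protocol.

```python
# Hypothetical sketch of the subscribe / describe / control / frame / release
# flow against a trivial in-memory server stub.
import itertools

class MockSensorTalkServer:
    def __init__(self):
        self._ids = itertools.count(1)
        self.clients = {}

    def subscribe(self):                 # create a client structure, return ID
        cid = next(self._ids)
        self.clients[cid] = {"A": 0}
        return cid

    def get_description(self, cid):      # describe the sensor's parameters
        return {"parameters": list(self.clients[cid])}

    def control(self, cid, name, value): # change a parameter, return new value
        self.clients[cid][name] = value
        return value

    def get_frame(self, cid):            # DIRECT mode: one frame per request
        return b"frame"

    def get_frames(self, cid, n=3):      # CONTINUOUS mode: stream of frames
        for _ in range(n):
            yield b"frame"

    def release(self, cid):              # delete the client structure
        del self.clients[cid]

server = MockSensorTalkServer()
cid = server.subscribe()
print(server.get_description(cid))
server.control(cid, "A", 42)
one_frame = server.get_frame(cid)        # DIRECT
for frame in server.get_frames(cid):     # CONTINUOUS, then stop
    pass
server.release(cid)                      # disconnect
```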
48. Demo
2 Virtual Cameras
1 “Visual Hull” sensor
Dataset from http://www.mpi-sb.mpg.de/departments/irg3/kungfu/
A demo video
49. Conclusion
Recent Vision Sensors
Sensor Fusion Framework
More in SLAM
Multiple Sensor Cooperation
More in multiple-robot coordination
1st Summer School on Perception and Sensor Fusion in Mobile Robotics, September 11–16, 2006 – Fermo, Italy
http://psfmr.univpm.it/2005/material.htm
Thanks! Any questions?