EE6221 Robotics and Intelligent Sensors (Part 3)
Lecture 1: Sensors
Guoqiang Hu
School of EEE, NTU
Outline
• Sensor classification
• Sensor characterization
• Motor encoders (position)
• Heading/orientation sensors
• Ground-based beacons
• Active ranging
• Speed sensors
• Vision (sensors, depth from focus, stereo vision, optic flow, etc.)
References:
• Introduction to Autonomous Mobile Robots (ch. 4), R. Siegwart and I. Nourbakhsh, MIT Press, 2004
• Sensors for Mobile Robots: Theory and Applications, H. R. Everett, A K Peters Ltd., 1995
Robot Sensors
• Robot sensors can be divided into two basic classes:
1) Internal state sensors
2) External state sensors
• Internal sensors are used for control of the internal states of the robot/machine.
• External sensors are needed for the robot/machine to react intelligently to its surroundings. Examples: sonar, IR, haptic feedback, tactile sensors, range-of-motion sensors, vision sensors (CCD cameras), etc.
Internal State Sensors
• Internal state sensors consist of devices used to measure the position, velocity, or acceleration of robot joints and/or the end effector. The following devices are some of the sensors that belong to this class:
– Potentiometers
– Synchros
– Linear inductive scales
– Differential transformers (LVDTs and RVDTs)
– Optical interrupters
– Optical encoders
– Tachometers
– Accelerometers
Internal State Sensors
The successful control of most robots depends on being able to obtain accurate information about the joints and/or end effectors.
It is therefore necessary to have devices (transducers) that provide such information and can be readily utilized in a robot for this purpose.
In particular, position, velocity, and/or acceleration (or at least analog or digital representations of these quantities) must be measured to ensure that the robotic machine moves in a desired manner.
Internal State Sensors
The internal state sensors must not only permit the required degree of accuracy to be achieved; they must also be cost-effective, since each of the robot's axes will normally utilize such devices.
The sensor selection, and the decision to place the sensor either on the load side or on the output of the joint actuator itself, are influenced by factors such as:
1) overall sensor cost,
2) power needs for a particular joint,
3) maximum permissible size of the actuator,
4) sensor resolution, and
5) the need to monitor directly the actions of the joint.
External State Sensors
The second class, called external state sensors, is used to monitor the robot's geometric and/or dynamic relation to its task, environment, or the objects that it is handling.
Although it is possible to utilize a robot without any external sensing, in order to achieve greater flexibility and intelligence the robot will require such devices to be aware of its surroundings.
Robotic sensing mainly gives robots the ability to see, touch, hear, and move, and uses algorithms that require environmental feedback.
Robots use different sensors to detect different factors of the environment.
External State Sensors
The usage of external state sensors:
• Vision: visual sensors help robots identify the surrounding environment and take appropriate action.
• Touch: touch patterns enable robots to interpret human emotions in interactive applications. Robots also use touch signals to map the profile of a surface in a hostile environment such as a water pipe.
• Hearing: robots can perceive our emotions through the way we talk. Acoustic and linguistic features are generally used to characterize emotions.
• Movement: motion sensing provides a guidance system for an automated robot to determine the ideal path to perform its task.
External State Sensors
Instruments of external state sensors (not a full list):
• Vision (camera, infrared, laser, RFID): landmark recognition, 3D scene recovery, scene understanding, etc.
• Tactile sensors: four measurable features (force, contact time, repetition, and contact area change) can effectively categorize touch patterns.
• Range finders (lasers, ultrasonic): 3D scene recovery, local positioning.
• GPS (differential GPS): global positioning.
• Odometry (encoders, gyroscopes, accelerometers): relative motion recovery.
The Role of Multi-Sensor Integration and Fusion
The purpose of external sensors is to provide a system with useful information concerning features of interest in the system's environment. This enables the machine to act intelligently via some intelligent programme.
The potential advantages of integrating and/or fusing information from multiple sensors are that the information can be obtained more accurately, concerning features that are impossible to perceive with individual sensors, in less time, and at a lesser cost. These advantages correspond, respectively, to the notions of the redundancy, complementarity, timeliness, and cost of the information provided to the system.
The Role of Multi-Sensor Integration and Fusion
Redundant information is provided by a group of sensors (or a single sensor over time) when each sensor perceives, possibly with a different fidelity, the same features in the environment.
The integration or fusion of redundant information can reduce overall uncertainty and thus serve to increase the accuracy with which the features are perceived by the system.
Multiple sensors providing redundant information can also serve to increase reliability in the case of sensor error or failure.
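The uncertainty reduction from redundant readings can be sketched with inverse-variance weighting, a standard fusion rule (not specific to these slides); the two readings and their variances below are illustrative numbers:

```python
def fuse(m1, var1, m2, var2):
    """Inverse-variance fusion of two redundant readings of the same quantity.
    The fused variance is always smaller than either input variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    mean = (w1 * m1 + w2 * m2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return mean, var

# Two sensors measure the same distance (metres), with different fidelities:
m, v = fuse(10.2, 0.04, 9.9, 0.09)   # fused estimate ~10.11 m, variance < 0.04
```

The fused estimate leans toward the more precise sensor, and its variance is below that of either sensor alone, which is exactly the accuracy gain redundancy provides.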
The Role of Multi-Sensor Integration and Fusion
Complementary information from multiple sensors allows features in the environment to be perceived that are impossible to perceive using just the information from each individual sensor operating separately.
More timely information, as compared to the speed at which it could be provided by a single sensor, may be provided by multiple sensors either because of the actual speed of operation of each sensor, or because of the processing parallelism that may be achieved as part of the integration process.
The Role of Multi-Sensor Integration and Fusion
L tl i f ti i th t t f t ith
Less costly information, in the context of a system with
multiple sensors, is information obtained at a lesser cost as
compared to the equivalent quality of information that could
p q q y
be obtained from a single sensor.
Unless the information provided by the single sensor is
being used for additional functions in the system, the total
cost of the single sensor should be compared to the total
cost of the integrated multi-sensor system
cost of the integrated multi sensor system.
With redundancy, it is possible then to use lower cost
sensors with lower quality that can be combined to
sensors with lower quality that can be combined to
achieve the same quality of reading from a high quality
sensor (thereby being costly).
13
Classification of Sensors for Robots
Classification based on functions:
• Proprioceptive sensors
– Measure the internal state of the system (robot), e.g., angular position of a motor wheel, motor speed, wheel load, battery status, etc.
– Examples: GPS, INS, shaft encoders, compass, etc.
Classification of Sensors for Robots
E t ti
• Exteroceptive sensors
– Determine the measurements of objects relative to a
robot's frame of reference (i.e., information about the
robot s frame of reference (i.e., information about the
robot’s environment)
– Measure: e.g., distance from the robot to another
object
– Keep the robot from colliding with other objects
Types: contact sensors range sensors vision
– Types: contact sensors, range sensors, vision
sensors
• Exproprioceptive sensors (Directional Sensors)
p p p ( )
15
Classification of Sensors for Robots
Cl ifi ti b d h f
Classification based on where energy comes from:
• Active sensors
emit their own energy and measure the reaction
– emit their own energy and measure the reaction
– better performance, but some influence on the
environment and generally consumes more energy
– Examples: ultrasonic, laser, and IR sensors.
• Passive sensors
– receives energy already in the environment
cons me less energ b t often ha e signal and noise
– consume less energy, but often have signal and noise
problems
– Examples: camera
16
p
General Classification
(Classification tables adapted from Siegwart & Nourbakhsh, 2004, ch. 4; not reproduced here.)
Sensor Characteristics
• Size, weight, power consumption, voltage, interface
• Field of view (FOV)
• Range (or full scale)
– lower and upper limits
– e.g., a triangulation range finder may provide unpredictable values outside its working range
• Dynamic range (e.g., camera pixels)
– ratio between the smallest and largest possible values of a changeable quantity (e.g., sound, light, etc.)
– often specified in decibels (ratio between powers)
– e.g., power measurement from 1 mW to 20 W: 10·log10(20/0.001) ≈ 43 dB
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
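The decibel figure on the slide can be reproduced directly; note that the 10·log10 form applies to power ratios (amplitude ratios would use 20·log10):

```python
import math

def dynamic_range_db(p_min, p_max):
    """Dynamic range of a power quantity, in decibels: 10*log10(p_max/p_min)."""
    return 10.0 * math.log10(p_max / p_min)

# The slide's example: power measurements from 1 mW up to 20 W.
dr = dynamic_range_db(0.001, 20.0)   # ~43 dB
```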
Sensor Characteristics
• Resolution
– smallest input change that can be detected (limited by noise or physical limitations, or determined by the A/D conversion)
• Linearity
– variation of the output signal as a function of the input signal
– sometimes expressed as % of full scale
• Hysteresis
– path dependency (e.g., thermostats, some tactile sensors)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Sensor Characteristics
B d idth F
• Bandwidth or Frequency
– the speed at which a sensor can provide a stream of
readings
readings
– usually there is an upper limit depending on the
sensor and the sampling rate
– one has also to consider delay (phase) of the signal
• Sensitivity
ti f t t h t i t h
– ratio of output change to input change
• Cross-sensitivity (and cross-talk)
– sensitivity to other environmental parameters
– sensitivity to other environmental parameters
– influence of other active sensors
21
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Sensor Characteristics
A ( tit d it f )
• Accuracy (exactitude, opposite of error)
– degree of conformity between the measurement and
true value
true value
– often expressed as a portion of the true value:
Accuracy=1-|m-v|/v, m = measured value, v = true
l
value
• Precision (or repeatability)
relates to reproducibility
– relates to reproducibility
of sensor results:
precision=range divided by
precision range divided by
standard deviation
22
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
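The slide's accuracy definition can be computed directly; the 9.8 m reading against a 10 m true value is an illustrative example:

```python
def accuracy(measured, true_value):
    """Accuracy = 1 - |m - v| / v (slide definition; true value must be nonzero)."""
    return 1.0 - abs(measured - true_value) / abs(true_value)

a = accuracy(9.8, 10.0)   # 0.98, i.e. 98% accurate
```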
Types of Errors
S t ti d t i i ti
• Systematic errors: deterministic
– caused by factors that can (in theory) be modeled and predicted
– e.g., distortion caused by the optics of a camera
• Random errors: non-deterministic
– no prediction possible
– however, they may be described probabilistically
however, they may be described probabilistically
– e.g., hue instability of camera, black level noise of
photoreceptors, non-returning echoes in ultrasonic sensors, etc.
• Others
• Others
– cross-sensitivity of sensors, motion blur
– rarely possible to model -> appear as “random” errors but are
ith t ti d
neither systematic nor random
– systematic errors and random errors might be well defined in
controlled environment. This is often not the case for mobile
robots!
23
robots!
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Wheel/Motor Encoders
M iti d f th h l t i
• Measure position or speed of the wheels or steering.
• Integrate wheel movements to get an estimate of the position ->
odometry
• Optical encoders are proprioceptive sensors thus the position
estimation in relation to a fixed reference frame is only valuable for
short movements.
• Typical resolutions: 64 - 2048 increments per revolution.
24
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
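Converting encoder counts to distance rolled is a small calculation; the 2048-count encoder and 5 cm wheel radius below are illustrative, not from the slides:

```python
import math

def wheel_distance(ticks, ticks_per_rev, wheel_radius_m):
    """Distance rolled by a wheel, from encoder counts:
    one revolution corresponds to the wheel circumference 2*pi*r."""
    return 2.0 * math.pi * wheel_radius_m * ticks / ticks_per_rev

# Hypothetical 2048-count encoder on a wheel of 5 cm radius:
d = wheel_distance(2048, 2048, 0.05)   # one full revolution -> ~0.314 m
```

Odometry then integrates these per-wheel distances over time, which is why its error with respect to a fixed frame grows with distance traveled.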
Magnetic compasses
• In use since before 2000 B.C.
– when the Chinese suspended a piece of naturally magnetized magnetite from a silk thread and used it to guide a chariot over land.
• The magnetic field of the Earth
– provides an absolute measure of orientation.
• Large variety of solutions to measure the Earth's magnetic field
– mechanical magnetic compass
– direct measurement of the magnetic field: Hall-effect, flux-gate, magnetoresistive sensors
• Major drawbacks
– weakness of the Earth's magnetic field (0.3-0.6 gauss, i.e., 30-60 μT)
– easily disturbed by magnetic objects or other sources
– does not work in indoor environments (except locally)
• Often used in attitude sensing.
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
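A heading estimate from the horizontal field components can be sketched as follows; the axis convention (x north, y east, sensor held level) is an assumption, since the actual signs depend on how the magnetometer is mounted:

```python
import math

def heading_deg(mx, my):
    """Heading from horizontal magnetic-field components, assuming a level
    sensor with x pointing north and y pointing east: 0 deg = magnetic
    north, increasing clockwise."""
    return math.degrees(math.atan2(my, mx)) % 360.0

h = heading_deg(0.0, 0.2)   # field entirely along +y -> 90 deg (east)
```

Tilt compensation (using an accelerometer) is needed when the sensor is not level, which is one reason compasses are usually fused with other attitude sensors.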
Gyroscopes
Th l f i t i t h i hi l
• The goal of gyroscopic systems is to measure changes in vehicle
orientation by taking advantage of physical laws that produce
predictable effects under rotation.
Th f ti i i i l
• Three functioning principles:
– Mechanical
– Optical
– Micro-electromechanical systems (MEMS)
• Two categories
– Orientation -> directly measure angles (very rare in robotics!)
Orientation directly measure angles (very rare in robotics!)
– Rate gyros -> measure rotation velocity, which can be
integrated…
• A problem common to all gyroscopes is that of drift => unless the
• A problem common to all gyroscopes is that of drift => unless the
error is corrected through reference to some alternate
measurement, the drift will eventually exceed the required accuracy
of the system.
26
of the system.
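Why rate-gyro drift is unavoidable can be seen in a toy integration; the 0.01 rad/s bias and 100 Hz sample rate are illustrative numbers:

```python
def integrate_gyro(rates_rad_s, dt_s):
    """Naively integrate rate-gyro readings into an angle (rad).
    Any constant bias in the readings accumulates linearly with time."""
    angle = 0.0
    for r in rates_rad_s:
        angle += r * dt_s
    return angle

# A stationary gyro whose readings contain a constant 0.01 rad/s bias:
drift = integrate_gyro([0.01] * 1000, 0.01)   # ~0.1 rad of drift after 10 s
```

After only 10 s the estimate is off by about 0.1 rad even though the vehicle never moved, which is why an external reference (compass, GPS, vision) is needed for long-term correction.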
Mechanical Gyroscopes
R l l t ( i i ti ) th t d f
• Rely on angular momentum (gyroscopic inertia): the tendency of a
rotating object to keep rotating at the same angular speed about the
same axis of rotation in the absence of an external torque.
R ti t i ti l t th i i d th t f
• Reactive torque is proportional to the spinning speed, the rate of
precession and the wheel’s inertia.
• No torque can be transmitted from the outer pivot to the wheel axis
> i i i ill th f b t bl
=> spinning axis will therefore be space-stable.
• Nevertheless, remaining friction in the bearings of the gyro axis
introduce some errors over time (drift).
• A high-quality mechanical gyro can cost up to $100’000 and drift
may be kept as low as 0.02°/h
• Mechanically complex => not much used in mobile robotics.
27
Optical Gyroscopes
B d th b h i f ti l t di i t ti
• Based on the behavior of an optical standing wave in a rotating
frame (the Sagnac effect).
• Optical gyroscopes use the interference of light to detect mechanical
t ti
rotation:
– two light beams travel (e.g. along an optical fiber of up to 5 km in
length) in opposite directions
– the beam traveling against the rotation experiences a slightly
shorter path than the other beam
– the resulting phase shift affects how the beams interfere with
each other when they are combined
– the intensity of the combined beam then depends on the rotation
rate of the device
• Resolution can be smaller than 0.0001°/s
28
MEMS Rate Gyros
B d ib ti h i l l t t C i li
• Based on vibrating mechanical elements to sense Coriolis
acceleration.
• Coriolis acceleration is the apparent acceleration that arises in a
t ti f f f
rotating frame of reference:
• Various possible MEMS structure: tuning-fork, vibrating wheel,
wineglass resonator
• MEMS gyros have no rotating parts, have low-power consumption
requirements, and are small => they are quickly replacing
mechanical and optical gyros in robotic applications.
29
Accelerometers and Pressure Sensors
• Accelerometers / inclinometers
• Accelerometers / inclinometers
– An accelerometer is a device that measures the proper acceleration
of the device.
Conceptually an accelerometer behaves as a damped mass on a
– Conceptually, an accelerometer behaves as a damped mass on a
spring.
– MEMS are based on a cantilever beam with a proof mass.
Th f i th b d fl ti i ft iti
– The way of measuring the beam deflection is often capacitive or
piezoresistive.
– MEMS accelerometers can have three axes => inclinometers.
• Pressure / airspeed / altitude
– Deforming membrane (piezoresistive, capacitive, optical,
electromagnetic, etc.)
– Either absolute or differential (Pitot tube)
– Example of the Freescale MPX:
• resolution of 4 cm in altitude
30
resolution of 4 cm in altitude
• variations due to atmospheric pressure changes over one day: up
to 45 m
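The inclinometer use of an accelerometer can be sketched from the gravity vector; this assumes the device is static (no proper acceleration other than gravity), and the axis labels are illustrative:

```python
import math

def tilt_deg(ax, az):
    """Pitch angle (deg) of a static accelerometer from its x and z
    readings (m/s^2): gravity projects onto the tilted axes.
    Valid only when the device is not otherwise accelerating."""
    return math.degrees(math.atan2(ax, az))

t_level = tilt_deg(0.0, 9.81)                              # level -> 0 deg
t_45 = tilt_deg(9.81 / math.sqrt(2), 9.81 / math.sqrt(2))  # -> 45 deg
```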
Inertial Measurement Units (IMU) or Inertial Navigation Systems (INS)
• A device that utilizes sensors such as gyroscopes and accelerometers to estimate the relative position (velocity and acceleration) of a vehicle in motion.
• A complete IMU/INS maintains a 6-DOF estimate of the vehicle pose: position (x, y, z) and orientation (yaw, pitch, and roll).
• This requires integrating information from the sensors in real time.
• IMU/INS are extremely sensitive to measurement errors => an external source of information such as GPS and/or a magnetometer is often required as a correction (cf. ch. 8 for more information on sensor fusion).
• List of IMU/INS products: http://damien.douxchamps.net/research/imu/
(adapted from the Handbook of Robotics, 2008, ch. 20.4)
Beacon-based positioning
• Beacons are signaling devices with a precisely known position.
• Beacon-based navigation is used since the humans started to travel
– Natural beacons (landmarks) like stars, mountains or the sun
– Artificial beacons like lighthouses
• The recently introduced Global Positioning System (GPS)
revolutionized modern navigation technology
e o ut o ed ode a gat o tec o ogy
– Already one of the key sensors for outdoor mobile robotics
– For indoor robots GPS is not applicable
• Major drawback with the use of beacons indoors:
• Major drawback with the use of beacons indoors:
– Beacons require changes in the environment -> costly
– Limit flexibility and adaptability to changing environments
32
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Passive optical beacons
P i t fl ti b f k iti
• Passive retroreflective beacons of known position
• Distance and heading of at least two beacons are measured (e.g.
using a scanning laser range finder)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 5)
33
Active ultrasonic beacons
R b t t k th itt l ti
• Robots must know the emitter locations
• Using time of flight (TOF), they can deduce their position
• Time synchronization is required (e.g. RF or IR)
34
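Deducing position from TOF ranges can be sketched with a simple 2-D trilateration; the beacon layout (three emitters at (0,0), (L,0), (0,L) with L = 10 m) is hypothetical, chosen so the equations reduce to closed form:

```python
import math

def trilaterate(d1, d2, d3, L=10.0):
    """2-D position from ranges to three beacons at (0,0), (L,0), (0,L).
    Subtracting the circle equations pairwise gives linear equations in x, y."""
    x = (d1**2 - d2**2 + L**2) / (2.0 * L)
    y = (d1**2 - d3**2 + L**2) / (2.0 * L)
    return x, y

# Simulate a robot actually at (3, 4): compute the true ranges, then recover it.
d1 = math.hypot(3.0, 4.0)          # range to beacon (0, 0)
d2 = math.hypot(3.0 - 10.0, 4.0)   # range to beacon (10, 0)
d3 = math.hypot(3.0, 4.0 - 10.0)   # range to beacon (0, 10)
x, y = trilaterate(d1, d2, d3)     # -> approximately (3.0, 4.0)
```

In practice the ranges come from c·t with synchronized clocks, and more than three beacons are used in a least-squares fit to average out range noise.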
Global positioning system (GPS)
F t
• Facts:
– Developed for military use by the US (NAVSTAR)
– Recently became accessible for commercial applications
– 24 satellites (including three spares) orbiting the Earth every 12
hours at a height of 20’190 km
– 4 satellites are located in each of six planes inclined by 55° with
p y
respect to the plane of the Earth’s equator
– Location of any GPS receiver is determined through a time of
flight measurement
g
• Technical challenges:
– Time synchronization between the individual satellites and the
GPS receiver
GPS receiver
– Real-time update of the exact location of the satellites
– Precise measurement of the time of flight
Interferences ith other signals reflections
35
– Interferences with other signals, reflections
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Global positioning system (GPS)
(GPS constellation figure adapted from Everett, 1995, section 14.2; not reproduced here.)
Global positioning system (GPS)
Time synchronization:
• Time synchronization:
– ultra-precision time synchronization is extremely important
– roughly 0.3m/ns => position accuracy proportional to precision of time measurement
– atomic clocks on each satellite
atomic clocks on each satellite
• Real time update of the exact location of the satellites:
– monitoring the satellites from a number of widely distributed ground stations
– master station analyses all the measurements and transmits the actual position to each of
h lli
the satellites
• Exact measurement of the time of flight
– the receiver correlates a pseudocode with the same code coming from the satellite
– the delay time for best correlation represents the time of flight
the delay time for best correlation represents the time of flight
– quartz clock on the GPS receivers are not very precise
– the range measurement with four satellites allows to identify the three values (x, y, z) for
the position and the clock correction dT
• Recent commercial GPS receivers have a position accuracy within 20 m (typ. 10 m) in the
horizontal plane and 45 m in the vertical plane, depending on the number of satellites within
line of sight and multipath issues. WAAS allows to get close to 1-2 m accuracy.
• The update rate is typically between 1 and 4 Hz only.
37
p yp y y
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
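The "0.3 m per nanosecond" rule of thumb follows directly from the speed of light, which is why atomic clocks and the dT correction matter so much:

```python
C = 299_792_458.0   # speed of light in vacuum, m/s

def pseudorange_error(clock_error_s):
    """Range error caused by an uncorrected receiver clock error:
    the signal's apparent extra travel time maps to c * dt of distance."""
    return C * clock_error_s

e = pseudorange_error(1e-9)   # 1 ns of clock error -> ~0.3 m of range error
```

A cheap quartz clock drifting by even a microsecond would thus be off by ~300 m, which is exactly why the fourth satellite is used to solve for the clock correction dT alongside (x, y, z).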
TOF Range Sensors
R i f ti
• Range information:
– key element for obstacle avoidance, localization and environment modeling
• Ultrasonic sensors as well as laser range sensors make use of
propagation speed of sound or electromagnetic waves respectively.
The traveled distance of a sound or electromagnetic wave is given
by d=c * t
• Where
– d = distance traveled (usually round-trip)
– c = speed of wave propagation
t ti f fli ht
– t = time of flight
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
38
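Since d = c·t above is the total distance traveled and the signal usually makes a round trip, the range to the target is half of it; the example times below are illustrative:

```python
def target_range(t_round_trip_s, c_m_s):
    """One-way range from a round-trip time of flight: d = c * t / 2."""
    return c_m_s * t_round_trip_s / 2.0

d_sound = target_range(20e-3, 343.0)   # 20 ms of ultrasound echo -> ~3.43 m
d_light = target_range(20e-9, 3.0e8)   # 20 ns of light round trip -> 3 m
```

The six-orders-of-magnitude gap between the two times is the point made on the next slide: laser TOF needs nanosecond-class electronics, ultrasound does not.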
TOF Range Sensors
• Important to keep in mind:
Important to keep in mind:
– Propagation speed of sound: 0.3m/ms
– Propagation speed of electromagnetic signals: 0.3m/ns
3 t d t 10 f lt i t l
– 3 meters correspond to 10ms for an ultrasonic system versus only
10ns for a laser range sensor
=> time of flight with electromagnetic signals involves very fast
l t i
electronics
=> laser range sensors are more expensive and delicate to design
• The quality of TOF range sensors mainly depends on:
– Uncertainties about the exact time of arrival of the reflected signal
– Inaccuracies in the time of fight measure (laser range sensors)
– Opening angle of transmitted beam (especially ultrasonic range
Opening angle of transmitted beam (especially ultrasonic range
sensors)
– Interaction with the target (surface, diffuse/specular reflections)
Variation of propagation speed (sound)
39
– Variation of propagation speed (sound)
– Speed of mobile robot and target (if not at stand still)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Ultrasonic Sensors
T it k t f ( lt i )
• Transmit a packet of (ultrasonic) pressure waves
• Distance d of the echoing object can be calculated based on the
propagation speed of sound c and the time of flight t. d=c*t/2
• Typical frequency: 40 180kHz
• Typical frequency: 40 - 180kHz
• Generation of sound wave: piezo transducer
– transmitter and receiver can be separated or not
• Sound beam propagates in a cone (approx )
• Sound beam propagates in a cone (approx.)
– opening angles around 20 to 70 degrees
– regions of constant depth
segments of an arc
– segments of an arc
(sphere for 3D)
40 Typical intensity distribution of a ultrasonic sensor
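The constant-depth arc is the main angular limitation of sonar: an echo fixes the range but not where on the arc the target sits. A small sketch of that lateral ambiguity, with an illustrative 30° cone:

```python
import math

def arc_chord_width(range_m, opening_deg):
    """Chord width of the constant-depth arc for a sonar cone: an echo at
    this range only localizes the target to within this lateral extent."""
    return 2.0 * range_m * math.sin(math.radians(opening_deg) / 2.0)

w = arc_chord_width(2.0, 30.0)   # at 2 m with a 30 deg cone -> ~1.04 m
```

Because this ambiguity grows linearly with range, sonar maps tend to smear obstacles tangentially, whereas a laser's narrow beam does not.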
Laser Rangefinders
Al k l d LIDAR (LI ht D t ti A d R i )
• Also known as laser radar or LIDAR (LIght Detection And Ranging)
• Transmitted and received beams are coaxial
• Transmitter illuminates a target with a collimated beam (laser)
• Diffuse reflection with most surfaces because wavelength is typ.
824nm
• Receiver detects the time needed for round-trip
p
• An optional mechanism sweeps the light beam to cover the required
scene (in 2D or 3D).
41
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Laser Rangefinders
Ti f fli ht t i ll hi d
• Time of flight measurement is generally achieved
using one of the following methods:
– Pulsed laser (e.g. Sick)
• direct measurement of time of flight
• requires resolving picoseconds (3m = 10ns)
– Phase shift measurement (e.g. Hokuyo)
( g y )
• sensor transmits 100% amplitude modulated
light at a known frequency and measures the
phase shift between the transmitted and
phase shift between the transmitted and
reflected signals
• technically easier than the above method
42
Phase-shift laser Rangefinders
• Distance D, between the beam splitter and the target is given by
(Woodbury et al 1993): D = lamda*theta/(4*pi)
(Woodbury, et al., 1993): D = lamda theta/(4 pi)
where theta is the phase difference between transmitted and
reflected beam and with c the speed of light, f the modulating
frequency
frequency
• Confidence in the range (phase/time estimate) is inversely
proportional to the square of the received signal amplitude.
Hence dark distant objects ill not prod ce as good range
43
• Hence dark, distant objects will not produce as good range
estimates as closer, brighter objects.
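The formula above is easy to evaluate; the 5 MHz modulation frequency below is an illustrative choice, not a value from the slides:

```python
import math

def phase_shift_range(theta_rad, mod_freq_hz, c=299_792_458.0):
    """D = lambda * theta / (4*pi), with lambda = c / f the modulation
    wavelength (formula from Woodbury et al., 1993, as on the slide)."""
    lam = c / mod_freq_hz
    return lam * theta_rad / (4.0 * math.pi)

# 5 MHz modulation (lambda ~ 60 m): a phase shift of pi/2 gives ~7.5 m.
d = phase_shift_range(math.pi / 2.0, 5e6)
```

Since θ is only known modulo 2π, the range is ambiguous every λ/2; lowering the modulation frequency extends the unambiguous range at the cost of resolution.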
2. protection of river banks and bed erosion protection works.ppt
abdatawakjira
 
Applications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdfApplications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdf
Atif Razi
 
Data Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason WebinarData Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason Webinar
UReason
 
AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...
AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...
AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...
Paris Salesforce Developer Group
 
SENTIMENT ANALYSIS ON PPT AND Project template_.pptx
SENTIMENT ANALYSIS ON PPT AND Project template_.pptxSENTIMENT ANALYSIS ON PPT AND Project template_.pptx
SENTIMENT ANALYSIS ON PPT AND Project template_.pptx
b0754201
 
Null Bangalore | Pentesters Approach to AWS IAM
Null Bangalore | Pentesters Approach to AWS IAMNull Bangalore | Pentesters Approach to AWS IAM
Null Bangalore | Pentesters Approach to AWS IAM
Divyanshu
 
一比一原版(USF毕业证)旧金山大学毕业证如何办理
一比一原版(USF毕业证)旧金山大学毕业证如何办理一比一原版(USF毕业证)旧金山大学毕业证如何办理
一比一原版(USF毕业证)旧金山大学毕业证如何办理
uqyfuc
 
TIME TABLE MANAGEMENT SYSTEM testing.pptx
TIME TABLE MANAGEMENT SYSTEM testing.pptxTIME TABLE MANAGEMENT SYSTEM testing.pptx
TIME TABLE MANAGEMENT SYSTEM testing.pptx
CVCSOfficial
 
ITSM Integration with MuleSoft.pptx
ITSM  Integration with MuleSoft.pptxITSM  Integration with MuleSoft.pptx
ITSM Integration with MuleSoft.pptx
VANDANAMOHANGOUDA
 
Call For Paper -3rd International Conference on Artificial Intelligence Advan...
Call For Paper -3rd International Conference on Artificial Intelligence Advan...Call For Paper -3rd International Conference on Artificial Intelligence Advan...
Call For Paper -3rd International Conference on Artificial Intelligence Advan...
ijseajournal
 
Height and depth gauge linear metrology.pdf
Height and depth gauge linear metrology.pdfHeight and depth gauge linear metrology.pdf
Height and depth gauge linear metrology.pdf
q30122000
 
Object Oriented Analysis and Design - OOAD
Object Oriented Analysis and Design - OOADObject Oriented Analysis and Design - OOAD
Object Oriented Analysis and Design - OOAD
PreethaV16
 
SCALING OF MOS CIRCUITS m .pptx
SCALING OF MOS CIRCUITS m                 .pptxSCALING OF MOS CIRCUITS m                 .pptx
SCALING OF MOS CIRCUITS m .pptx
harshapolam10
 
一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理
一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理
一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理
ecqow
 
NATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENT
NATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENTNATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENT
NATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENT
Addu25809
 
Pressure Relief valve used in flow line to release the over pressure at our d...
Pressure Relief valve used in flow line to release the over pressure at our d...Pressure Relief valve used in flow line to release the over pressure at our d...
Pressure Relief valve used in flow line to release the over pressure at our d...
cannyengineerings
 
Accident detection system project report.pdf
Accident detection system project report.pdfAccident detection system project report.pdf
Accident detection system project report.pdf
Kamal Acharya
 

Recently uploaded (20)

Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...
Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...
Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...
 
5G Radio Network Througput Problem Analysis HCIA.pdf
5G Radio Network Througput Problem Analysis HCIA.pdf5G Radio Network Througput Problem Analysis HCIA.pdf
5G Radio Network Througput Problem Analysis HCIA.pdf
 
Software Engineering and Project Management - Introduction, Modeling Concepts...
Software Engineering and Project Management - Introduction, Modeling Concepts...Software Engineering and Project Management - Introduction, Modeling Concepts...
Software Engineering and Project Management - Introduction, Modeling Concepts...
 
2. protection of river banks and bed erosion protection works.ppt
2. protection of river banks and bed erosion protection works.ppt2. protection of river banks and bed erosion protection works.ppt
2. protection of river banks and bed erosion protection works.ppt
 
Applications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdfApplications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdf
 
Data Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason WebinarData Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason Webinar
 
AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...
AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...
AI + Data Community Tour - Build the Next Generation of Apps with the Einstei...
 
SENTIMENT ANALYSIS ON PPT AND Project template_.pptx
SENTIMENT ANALYSIS ON PPT AND Project template_.pptxSENTIMENT ANALYSIS ON PPT AND Project template_.pptx
SENTIMENT ANALYSIS ON PPT AND Project template_.pptx
 
Null Bangalore | Pentesters Approach to AWS IAM
Null Bangalore | Pentesters Approach to AWS IAMNull Bangalore | Pentesters Approach to AWS IAM
Null Bangalore | Pentesters Approach to AWS IAM
 
一比一原版(USF毕业证)旧金山大学毕业证如何办理
一比一原版(USF毕业证)旧金山大学毕业证如何办理一比一原版(USF毕业证)旧金山大学毕业证如何办理
一比一原版(USF毕业证)旧金山大学毕业证如何办理
 
TIME TABLE MANAGEMENT SYSTEM testing.pptx
TIME TABLE MANAGEMENT SYSTEM testing.pptxTIME TABLE MANAGEMENT SYSTEM testing.pptx
TIME TABLE MANAGEMENT SYSTEM testing.pptx
 
ITSM Integration with MuleSoft.pptx
ITSM  Integration with MuleSoft.pptxITSM  Integration with MuleSoft.pptx
ITSM Integration with MuleSoft.pptx
 
Call For Paper -3rd International Conference on Artificial Intelligence Advan...
Call For Paper -3rd International Conference on Artificial Intelligence Advan...Call For Paper -3rd International Conference on Artificial Intelligence Advan...
Call For Paper -3rd International Conference on Artificial Intelligence Advan...
 
Height and depth gauge linear metrology.pdf
Height and depth gauge linear metrology.pdfHeight and depth gauge linear metrology.pdf
Height and depth gauge linear metrology.pdf
 
Object Oriented Analysis and Design - OOAD
Object Oriented Analysis and Design - OOADObject Oriented Analysis and Design - OOAD
Object Oriented Analysis and Design - OOAD
 
SCALING OF MOS CIRCUITS m .pptx
SCALING OF MOS CIRCUITS m                 .pptxSCALING OF MOS CIRCUITS m                 .pptx
SCALING OF MOS CIRCUITS m .pptx
 
一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理
一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理
一比一原版(CalArts毕业证)加利福尼亚艺术学院毕业证如何办理
 
NATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENT
NATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENTNATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENT
NATURAL DEEP EUTECTIC SOLVENTS AS ANTI-FREEZING AGENT
 
Pressure Relief valve used in flow line to release the over pressure at our d...
Pressure Relief valve used in flow line to release the over pressure at our d...Pressure Relief valve used in flow line to release the over pressure at our d...
Pressure Relief valve used in flow line to release the over pressure at our d...
 
Accident detection system project report.pdf
Accident detection system project report.pdfAccident detection system project report.pdf
Accident detection system project report.pdf
 

Lec 1 - Sensors (3hrs).pdf

• 1. EE6221 Robotics and Intelligent Sensors (Part 3) Lecture 1: Sensors Guoqiang Hu School of EEE, NTU
• 2. Outline • Sensor classification • Sensor characterization • Motor encoders (position) • Heading/orientation sensors • Ground-based beacons • Active ranging • Speed sensors • Vision (sensors, depth from focus, stereo vision, optic flow, etc.) Reference: • Introduction to Autonomous Mobile Robots (ch. 4), R. Siegwart and I. Nourbakhsh, MIT Press, 2004 • Sensors for Mobile Robots: Theory and Applications, H.R. Everett, A K Peters Ltd., 1995
• 3. Robot Sensors • Robot sensors can be divided into two basic classes: 1) internal state sensors, and 2) external state sensors. • Internal sensors are used for control of the internal states of the robot/machine. • External sensors are needed for the robot/machine to react intelligently to its surroundings. Examples: sonar, IR, haptic feedback, tactile sensors, range-of-motion sensors, vision sensors (CCD cameras), etc.
• 4. Internal State Sensors • Internal state sensors consist of devices used to measure position, velocity, or acceleration of robot joints and/or the end effectors. The following devices are some of the sensors that belong to this class: • Potentiometers • Synchros • Linear inductive scales • Differential transformers (LVDTs and RVDTs) • Optical interrupters • Optical encoders • Tachometers • Accelerometers
• 5. Internal State Sensors The successful control of most robots depends on being able to obtain accurate information about the joints and/or end effectors. It is necessary to have devices (transducers) that provide such information and can be readily utilized in a robot for this purpose. In particular, position, velocity, and/or acceleration (or at least analog or digital representations of these quantities) must be measured to ensure that the robotic machine moves in a desired manner.
• 6. Internal State Sensors The internal state sensors must not only permit the required degree of accuracy to be achieved, they must also be cost-effective, since each of the robot's axes will normally utilize such devices. The sensor selection, and the decision to place it either on the load side or on the output of the joint actuator itself, is influenced by such factors as: 1) overall sensor cost, 2) power needs for a particular joint, 3) maximum permissible size of the actuator, 4) sensor resolution, and 5) the need to monitor directly the actions of the joint.
• 7. External State Sensors The second class, called external state sensors, is used to monitor the robot's geometric and/or dynamic relation to its task, environment, or the objects that it is handling. Although it is possible to utilize a robot without any external sensing, in order to achieve greater flexibility and intelligent abilities, the robot will require such devices to be aware of its surroundings. Robotic sensing mainly gives robots the ability to see, touch, hear, and move, and uses algorithms that require environmental feedback. Robots use different sensors to detect different factors of the environment.
• 8. External State Sensors The usage of external state sensors: • Vision: visual sensors help robots to identify the surrounding environment and take appropriate action. • Touch: touch patterns enable robots to interpret human emotions in interactive applications. Robots use touch signals to map the profile of a surface in a hostile environment such as a water pipe. • Hearing: robots can perceive our emotion through the way we talk. Acoustic and linguistic features are generally used to characterize emotions. • Movement: provides a guidance system for an automated robot to determine the ideal path to perform its task.
• 9. External State Sensors Instruments of external state sensors (not a full list): • Vision (camera, infrared, laser, RFID) for landmark recognition, 3D scene recovery, scene understanding, etc. • Tactile sensors: four measurable features (force, contact time, repetition, and contact area change) can effectively categorize touch patterns. • Range finders (lasers, ultrasonic) for 3D scene recovery and local positioning. • GPS (differential GPS) for global positioning. • Odometry (encoders, gyroscopes, accelerometers) for absolute motion recovery.
• 10. The Role of Multi-Sensor Integration and Fusion The purpose of external sensors is to provide a system with useful information concerning some features of interest in the system's environment. This enables the machine to act intelligently via some intelligent programme. The potential advantages of integrating and/or fusing information from multiple sensors are that the information can be obtained more accurately, concerning features that are impossible to perceive with individual sensors, in less time, and at a lesser cost. These advantages correspond, respectively, to the notions of the redundancy, complementarity, timeliness, and cost of the information provided to the system.
• 11. The Role of Multi-Sensor Integration and Fusion Redundant information is provided by a group of sensors (or a single sensor over time) when each sensor is perceiving, possibly with a different fidelity, the same features in the environment. The integration or fusion of redundant information can reduce overall uncertainty and thus serve to increase the accuracy with which the features are perceived by the system. Multiple sensors providing redundant information can also serve to increase reliability in the case of sensor error or failure.
• 12. The Role of Multi-Sensor Integration and Fusion Complementary information from multiple sensors allows features in the environment to be perceived that are impossible to perceive using just the information from each individual sensor operating separately. More timely information, as compared to the speed at which it could be provided by a single sensor, may be provided by multiple sensors either because of the actual speed of operation of each sensor, or because of the processing parallelism that may possibly be achieved as part of the integration process.
• 13. The Role of Multi-Sensor Integration and Fusion Less costly information, in the context of a system with multiple sensors, is information obtained at a lesser cost as compared to the equivalent quality of information that could be obtained from a single sensor. Unless the information provided by the single sensor is being used for additional functions in the system, the total cost of the single sensor should be compared to the total cost of the integrated multi-sensor system. With redundancy, it is then possible to use lower-cost sensors of lower quality that can be combined to achieve the same quality of reading as a single high-quality (and thereby costly) sensor.
• 14. Classification of Sensors for Robots Classification based on functions: • Proprioceptive sensors – Measure the internal state of the system (robot), e.g., angular position of motor wheel, motor speed, wheel load, battery status, etc. – Examples: GPS, INS, shaft encoders, compass, etc.
• 15. Classification of Sensors for Robots • Exteroceptive sensors – Determine the measurements of objects relative to a robot's frame of reference (i.e., information about the robot's environment) – Measure, e.g., the distance from the robot to another object – Keep the robot from colliding with other objects – Types: contact sensors, range sensors, vision sensors • Exproprioceptive sensors (directional sensors)
• 16. Classification of Sensors for Robots Classification based on where energy comes from: • Active sensors – emit their own energy and measure the reaction – better performance, but some influence on the environment, and generally consume more energy – Examples: ultrasonic, laser, and IR sensors • Passive sensors – receive energy already present in the environment – consume less energy, but often have signal and noise problems – Example: camera
• 17. General Classification [Figure: general classification table of sensors] (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 18. General Classification [Figure: general classification table of sensors, continued] (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 19. Sensor Characteristics • Size, weight, power consumption, voltage, interface • Field of view (FOV) • Range (or full scale) – lower and upper limits – e.g., a triangulation range finder may provide unpredictable values outside its working range • Dynamic range (e.g., camera pixels) – ratio between the smallest and largest possible values of a changeable quantity (e.g., sound, light, etc.) – often specified in decibels (ratio between powers) – e.g., power measurement from 1 mW to 20 W: 10·log10(20/0.001) = 43 dB (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
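The decibel figure in the slide's example can be checked numerically; a minimal sketch (the 1 mW and 20 W endpoints are the slide's own example values):

```python
import math

def dynamic_range_db(p_min, p_max):
    """Dynamic range in decibels for a power ratio: 10 * log10(p_max / p_min)."""
    return 10 * math.log10(p_max / p_min)

# Slide example: power measurements from 1 mW to 20 W.
print(round(dynamic_range_db(0.001, 20.0)))  # -> 43
```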
• 20. Sensor Characteristics • Resolution – smallest input change that can be detected (due to noise, physical limitations, or determined by the A/D conversion) • Linearity – variation of the output signal as a function of the input signal – sometimes expressed as % of full scale • Hysteresis – path dependency (e.g., thermostats, some tactile sensors) (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 21. Sensor Characteristics • Bandwidth or frequency – the speed at which a sensor can provide a stream of readings – usually there is an upper limit depending on the sensor and the sampling rate – one also has to consider the delay (phase) of the signal • Sensitivity – ratio of output change to input change • Cross-sensitivity (and cross-talk) – sensitivity to other environmental parameters – influence of other active sensors (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 22. Sensor Characteristics • Accuracy (exactitude, opposite of error) – degree of conformity between the measurement and the true value – often expressed as a portion of the true value: accuracy = 1 - |m - v|/v, where m = measured value and v = true value • Precision (or repeatability) – relates to the reproducibility of sensor results: precision = range divided by standard deviation (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
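The two definitions above translate directly into code; a minimal sketch (the sample readings below are illustrative, not from the slides):

```python
import statistics

def accuracy(m, v):
    """Accuracy = 1 - |m - v| / v  (slide definition; m measured, v true value)."""
    return 1 - abs(m - v) / v

def precision(readings):
    """Precision = range of readings divided by their standard deviation (slide definition)."""
    return (max(readings) - min(readings)) / statistics.stdev(readings)

# Hypothetical repeated measurements of a true 10.0 m distance.
readings = [9.8, 10.1, 9.9, 10.2, 10.0]
print(accuracy(statistics.mean(readings), 10.0))
print(precision(readings))
```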
• 23. Types of Errors • Systematic errors: deterministic – caused by factors that can (in theory) be modeled and predicted – e.g., distortion caused by the optics of a camera • Random errors: non-deterministic – no prediction possible – however, they may be described probabilistically – e.g., hue instability of a camera, black-level noise of photoreceptors, non-returning echoes in ultrasonic sensors, etc. • Others – cross-sensitivity of sensors, motion blur – rarely possible to model -> appear as "random" errors but are neither systematic nor random – systematic errors and random errors might be well defined in a controlled environment. This is often not the case for mobile robots! (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 24. Wheel/Motor Encoders • Measure position or speed of the wheels or steering. • Integrate wheel movements to get an estimate of the position -> odometry. • Optical encoders are proprioceptive sensors, thus the position estimation in relation to a fixed reference frame is only valuable for short movements. • Typical resolutions: 64 - 2048 increments per revolution. (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
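As a sketch of how encoder counts are integrated into an odometry estimate, assuming a differential-drive robot (the wheel radius, track width, and tick counts below are illustrative, not from the slides):

```python
import math

def update_pose(x, y, theta, d_left, d_right, track_width):
    """Dead-reckon one odometry step from left/right wheel displacements (m)."""
    d_center = (d_left + d_right) / 2            # distance traveled by the robot center
    d_theta = (d_right - d_left) / track_width   # change in heading (rad)
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta

# Illustrative values: 2048-count encoder, 5 cm wheel radius, 30 cm track width.
ticks_per_rev, wheel_radius, track = 2048, 0.05, 0.30
meters_per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
pose = (0.0, 0.0, 0.0)
# 512 ticks on both wheels: a quarter wheel revolution straight ahead.
pose = update_pose(*pose, 512 * meters_per_tick, 512 * meters_per_tick, track)
print(pose)  # straight-line motion: heading stays 0
```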
• 25. Magnetic Compasses • In use since before 2000 B.C., when the Chinese suspended a piece of natural magnetite from a silk thread and used it to guide a chariot over land. • The magnetic field of the Earth provides an absolute measure of orientation. • Large variety of solutions to measure the Earth's magnetic field – mechanical magnetic compass – direct measurement of the magnetic field: Hall-effect, flux-gate, magnetoresistive sensors • Major drawback – weakness of the Earth's magnetic field (0.3-0.6 μT) – easily disturbed by magnetic objects or other sources – not working in indoor environments (except locally) • Often used in attitude sensing. (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 26. Gyroscopes • The goal of gyroscopic systems is to measure changes in vehicle orientation by taking advantage of physical laws that produce predictable effects under rotation. • Three functioning principles: – Mechanical – Optical – Micro-electromechanical systems (MEMS) • Two categories – Orientation gyros -> directly measure angles (very rare in robotics!) – Rate gyros -> measure rotation velocity, which can be integrated • A problem common to all gyroscopes is that of drift => unless the error is corrected through reference to some alternate measurement, the drift will eventually exceed the required accuracy of the system.
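Rate-gyro output must be integrated to obtain a heading, which is exactly why a constant sensor bias accumulates as drift; a minimal sketch (the 0.01 deg/s bias and 10 Hz sampling rate are assumed example values):

```python
def integrate_heading(rates, dt, bias=0.0):
    """Integrate angular-rate readings (deg/s) sampled every dt seconds into a heading (deg)."""
    heading = 0.0
    for r in rates:
        heading += (r - bias) * dt  # subtract the estimated bias before integrating
    return heading

# A stationary gyro with a 0.01 deg/s bias: after 100 s the uncorrected
# heading estimate has drifted by about 1 degree.
readings = [0.01] * 1000            # 1000 samples at 10 Hz, robot not rotating
print(integrate_heading(readings, 0.1))             # drifted estimate
print(integrate_heading(readings, 0.1, bias=0.01))  # bias-corrected estimate
```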
• 27. Mechanical Gyroscopes • Rely on angular momentum (gyroscopic inertia): the tendency of a rotating object to keep rotating at the same angular speed about the same axis of rotation in the absence of an external torque. • Reactive torque is proportional to the spinning speed, the rate of precession, and the wheel's inertia. • No torque can be transmitted from the outer pivot to the wheel axis => the spinning axis will therefore be space-stable. • Nevertheless, remaining friction in the bearings of the gyro axis introduces some errors over time (drift). • A high-quality mechanical gyro can cost up to $100,000, and drift may be kept as low as 0.02°/h. • Mechanically complex => not much used in mobile robotics.
• 28. Optical Gyroscopes • Based on the behavior of an optical standing wave in a rotating frame (the Sagnac effect). • Optical gyroscopes use the interference of light to detect mechanical rotation: – two light beams travel (e.g., along an optical fiber of up to 5 km in length) in opposite directions – the beam traveling against the rotation experiences a slightly shorter path than the other beam – the resulting phase shift affects how the beams interfere with each other when they are combined – the intensity of the combined beam then depends on the rotation rate of the device • Resolution can be smaller than 0.0001°/s.
• 29. MEMS Rate Gyros • Based on vibrating mechanical elements to sense Coriolis acceleration. • Coriolis acceleration is the apparent acceleration that arises in a rotating frame of reference. • Various possible MEMS structures: tuning fork, vibrating wheel, wineglass resonator. • MEMS gyros have no rotating parts, have low power-consumption requirements, and are small => they are quickly replacing mechanical and optical gyros in robotic applications.
• 30. Accelerometers and Pressure Sensors • Accelerometers / inclinometers – An accelerometer is a device that measures the proper acceleration of the device. – Conceptually, an accelerometer behaves as a damped mass on a spring. – MEMS accelerometers are based on a cantilever beam with a proof mass. – The beam deflection is often measured capacitively or piezoresistively. – MEMS accelerometers can have three axes => inclinometers. • Pressure / airspeed / altitude – Deforming membrane (piezoresistive, capacitive, optical, electromagnetic, etc.) – Either absolute or differential (Pitot tube) – Example of the Freescale MPX: resolution of 4 cm in altitude; variations due to atmospheric pressure changes over one day: up to 45 m.
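A three-axis accelerometer at rest measures only gravity, which is why it can be used as an inclinometer; a minimal sketch (the axis convention, z pointing up with gravity on +z at rest, is an assumption for illustration):

```python
import math

def tilt_angles(ax, ay, az):
    """Roll and pitch (deg) from static accelerometer readings (any consistent unit)."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# Level and at rest: gravity appears entirely on the z axis, so both angles are ~0.
print(tilt_angles(0.0, 0.0, 9.81))
# Tilted so half of gravity projects onto the y axis: nonzero roll.
print(tilt_angles(0.0, 4.905, 8.496))
```

Note that this only works when the device is (nearly) static: any proper acceleration from motion is indistinguishable from a gravity component in a single reading.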
• 31. Inertial Measurement Units (IMU) or Inertial Navigation Systems (INS) • A device that utilizes sensors such as gyroscopes and accelerometers to estimate the relative position (velocity and acceleration) of a vehicle in motion. • A complete IMU/INS maintains a 6-DOF estimate of the vehicle pose: position (x, y, z) and orientation (yaw, pitch, and roll). • This requires integrating information from the sensors in real time. • IMU/INS are extremely sensitive to measurement errors => an external source of information such as GPS and/or a magnetometer is often required to correct this (cf. ch. 8 for more information on sensor fusion). • List of IMU/INS products: http://damien.douxchamps.net/research/imu/ (adapted from the Handbook of Robotics, 2008, ch. 20.4)
• 32. Beacon-Based Positioning • Beacons are signaling devices with a precisely known position. • Beacon-based navigation has been used since humans started to travel – Natural beacons (landmarks) like stars, mountains, or the sun – Artificial beacons like lighthouses • The more recently introduced Global Positioning System (GPS) revolutionized modern navigation technology – Already one of the key sensors for outdoor mobile robotics – For indoor robots, GPS is not applicable • Major drawbacks with the use of beacons indoors: – Beacons require changes in the environment -> costly – Limit flexibility and adaptability to changing environments (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 33. Passive Optical Beacons • Passive retroreflective beacons of known position • Distance and heading of at least two beacons are measured (e.g., using a scanning laser range finder) (adapted from Siegwart & Nourbakhsh, 2004, ch. 5)
• 34. Active Ultrasonic Beacons • Robots must know the emitter locations • Using time of flight (TOF), they can deduce their position • Time synchronization is required (e.g., via RF or IR)
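One way the TOF measurements to known emitters can be turned into a position fix is trilateration; a 2D sketch assuming synchronized clocks, exact one-way ranges, and illustrative beacon coordinates (none of these values are from the slides):

```python
import math

def trilaterate(beacons, ranges):
    """2D position from ranges to three known beacons (linearized circle equations)."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    # Subtracting the first circle equation from the other two gives a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Illustrative setup: ranges would come from c_sound * TOF (~343 m/s in air).
true_pos = (5.0, 3.0)
beacons = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]
ranges = [math.dist(true_pos, b) for b in beacons]
print(trilaterate(beacons, ranges))  # recovers the true position
```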
• 35. Global Positioning System (GPS) • Facts: – Developed for military use by the US (NAVSTAR) – Recently became accessible for commercial applications – 24 satellites (including three spares) orbiting the Earth every 12 hours at a height of 20,190 km – 4 satellites are located in each of six planes inclined by 55° with respect to the plane of the Earth's equator – The location of any GPS receiver is determined through a time-of-flight measurement • Technical challenges: – Time synchronization between the individual satellites and the GPS receiver – Real-time update of the exact location of the satellites – Precise measurement of the time of flight – Interference with other signals, reflections (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 36. Global Positioning System (GPS) [Figure: GPS satellite constellation] (adapted from Everett, 1995, section 14.2)
• 37. Global Positioning System (GPS) • Time synchronization: – ultra-precise time synchronization is extremely important – roughly 0.3 m/ns => position accuracy is proportional to the precision of the time measurement – atomic clocks on each satellite • Real-time update of the exact location of the satellites: – the satellites are monitored from a number of widely distributed ground stations – a master station analyses all the measurements and transmits the actual positions to the satellites • Exact measurement of the time of flight: – the receiver correlates a pseudocode with the same code coming from the satellite – the delay time for best correlation represents the time of flight – quartz clocks on GPS receivers are not very precise – the range measurement with four satellites allows identification of the three position values (x, y, z) and the clock correction dT • Recent commercial GPS receivers have a position accuracy within 20 m (typ. 10 m) in the horizontal plane and 45 m in the vertical plane, depending on the number of satellites within line of sight and multipath issues. WAAS allows accuracy close to 1-2 m. • The update rate is typically only between 1 and 4 Hz. (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
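The 0.3 m/ns figure above follows directly from the speed of light, which is why nanosecond-level clock synchronization matters so much; a quick arithmetic check:

```python
c = 299_792_458.0  # speed of light in vacuum, m/s
ns = 1e-9

# Ranging error caused by a 1 ns timing error: about 0.3 m.
print(c * ns)
# A 100 ns clock error already corresponds to roughly 30 m of range error,
# which is why receivers must solve for the clock correction dT as a fourth unknown.
print(c * 100 * ns)
```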
• 38. TOF Range Sensors
• Range information:
– key element for obstacle avoidance, localization and environment modeling
• Ultrasonic sensors as well as laser range sensors make use of the propagation speed of sound or of electromagnetic waves, respectively. The traveled distance of a sound or electromagnetic wave is given by d = c * t
• Where:
– d = distance traveled (usually round-trip)
– c = speed of wave propagation
– t = time of flight
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 39. TOF Range Sensors
• Important to keep in mind:
– Propagation speed of sound: 0.3 m/ms
– Propagation speed of electromagnetic signals: 0.3 m/ns
– 3 meters correspond to 10 ms for an ultrasonic system versus only 10 ns for a laser range sensor
=> time of flight with electromagnetic signals involves very fast electronics
=> laser range sensors are more expensive and delicate to design
• The quality of TOF range sensors mainly depends on:
– Uncertainties about the exact time of arrival of the reflected signal
– Inaccuracies in the time-of-flight measurement (laser range sensors)
– Opening angle of the transmitted beam (especially ultrasonic range sensors)
– Interaction with the target (surface, diffuse/specular reflections)
– Variation of the propagation speed (sound)
– Speed of the mobile robot and the target (if not at standstill)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
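The timescale gap above is easy to verify. A minimal sketch (using 343 m/s for sound in air rather than the slide's rounded 0.3 m/ms):

```python
C_SOUND = 343.0          # m/s, speed of sound in air at ~20 degrees C
C_LIGHT = 299_792_458.0  # m/s, speed of light

def flight_time(distance_m, c):
    """One-way time of flight for a given distance and propagation speed."""
    return distance_m / c

t_sound = flight_time(3.0, C_SOUND)  # ~8.7 ms
t_light = flight_time(3.0, C_LIGHT)  # ~10 ns
```

The six orders of magnitude between the two times is exactly why laser rangefinders need the "very fast electronics" mentioned above.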
• 40. Ultrasonic Sensors
• Transmit a packet of (ultrasonic) pressure waves
• Distance d of the echoing object can be calculated from the propagation speed of sound c and the time of flight t: d = c * t / 2
• Typical frequency: 40 - 180 kHz
• Generation of the sound wave: piezo transducer
– transmitter and receiver may or may not be separated
• Sound beam propagates in a cone (approx.)
– opening angles around 20 to 70 degrees
– regions of constant depth are segments of an arc (a sphere for 3D)
• Typical intensity distribution of an ultrasonic sensor
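The d = c * t / 2 relation above (the factor 2 accounts for the round trip) can be sketched as follows. The temperature model for the speed of sound is a standard approximation, added here to illustrate the "variation of propagation speed" error source from the previous slide; it is not from the lecture.

```python
def sound_speed(temp_c):
    """Approximate speed of sound in air (m/s) at temperature temp_c (deg C)."""
    return 331.3 + 0.606 * temp_c

def sonar_distance(t_echo_s, temp_c=20.0):
    """Target distance (m) from the round-trip echo time of an ultrasonic ping."""
    return sound_speed(temp_c) * t_echo_s / 2

d = sonar_distance(0.01)  # a 10 ms echo at 20 degrees C -> ~1.72 m
```

Note that a 10 degree temperature error alone shifts the estimate by roughly 2%, which is one reason sonar ranges are less repeatable than laser ranges.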
• 41. Laser Rangefinders
• Also known as laser radar or LIDAR (LIght Detection And Ranging)
• Transmitted and received beams are coaxial
• Transmitter illuminates a target with a collimated beam (laser)
• Diffuse reflection with most surfaces because the wavelength is typ. 824 nm
• Receiver detects the time needed for the round trip
• An optional mechanism sweeps the light beam to cover the required scene (in 2D or 3D)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
• 42. Laser Rangefinders
• Time-of-flight measurement is generally achieved using one of the following methods:
– Pulsed laser (e.g. Sick)
• direct measurement of the time of flight
• requires resolving picoseconds (3 m = 10 ns)
– Phase-shift measurement (e.g. Hokuyo)
• sensor transmits 100% amplitude-modulated light at a known frequency and measures the phase shift between the transmitted and reflected signals
• technically easier than the above method
• 43. Phase-Shift Laser Rangefinders
• Distance D between the beam splitter and the target is given by (Woodbury et al., 1993): D = λθ / (4π), where θ is the phase difference between the transmitted and reflected beams and λ = c/f is the modulation wavelength, with c the speed of light and f the modulating frequency
• Confidence in the range (phase/time estimate) is inversely proportional to the square of the received signal amplitude
• Hence dark, distant objects will not produce as good range estimates as closer, brighter objects
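The Woodbury et al. formula above can be sketched directly. The 5 MHz modulation frequency below is an illustrative choice, not a value from the lecture; it also shows the method's range ambiguity, since phase wraps every λ/2 of target distance.

```python
import math

C = 299_792_458.0  # m/s, speed of light

def phase_shift_range(theta_rad, mod_freq_hz):
    """Range from the measured phase shift theta of amplitude-modulated
    light at modulation frequency f: D = lambda * theta / (4 * pi)."""
    lam = C / mod_freq_hz              # modulation wavelength lambda = c / f
    return lam * theta_rad / (4 * math.pi)

# 5 MHz modulation -> lambda ~ 60 m, unambiguous range lambda/2 = 30 m
d = phase_shift_range(math.pi / 2, 5e6)  # quarter-cycle shift -> ~7.5 m
```

Lowering the modulation frequency extends the unambiguous range but degrades resolution for a given phase-measurement precision, which is the basic design trade-off of phase-shift rangefinders.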