Wearable Computing - Part II: Sensors
© Daniel Roggen www.danielroggen.net droggen@gmail.com
Dimensions of context

Physical
• User location
– Absolute, relative
• User activity
– Manipulative gestures, pointing movements, modes of locomotion, posture, composite and hierarchical activities
• …

Mental
• Emotion awareness
– Own/others
– Sadness, joy
– Depression
• Cognitive awareness
– Cognitive load
– Attention, concentration
– Stress
• …

Social
• Social interactions
– Detect known people
• Social network
– Information exchange
– Optimization of organizations
• Crowd / collective behavior
• …

Environment
• Map of surrounding services
• Environment characteristics
– Temperature, light, humidity
• Radio fingerprints
• …
Sensing context
• Sensors are needed to infer the user’s context
• « Software sensors »
– SMS on mobile phones: e.g. used to infer friend network
– Email reception: e.g. work / home presence
– Calendar: e.g. location, work activities
– …
• Hardware sensors
– (the typical definition of sensors in EE )
– Accelerometers: e.g. gesture recognition
– RFID tags: e.g. object detection
– Reed switch: e.g. door/window open/closed
– GPS: e.g. outdoor location
– WIFI fingerprints: e.g. indoor location
– …
There is no « Drink Sensor »
• Simple sensors (e.g. RFID) can provide a "binary" information
– Presence (e.g. RFID, Proximity infrared sensors)
– Movement (e.g. ADXL345 accelerometer ‘activity/inactivity pin’)
– Fall (e.g. ADXL345 accelerometer ‘freefall pin’)
• But in general « activity-X sensor » does not exist
– Sensor data must be interpreted
– Multiple sensors must be correlated (data fusion)
– Several factors influence the sensor data
• Drinking while standing: the arm reaches the object then the mouth
• Drinking while walking: the arm moves, and also the whole body
• Context is inferred from the sensor data with
– Signal processing
– Machine learning
– Reasoning
• Can be integrated into a « sensor node » or « smart sensor »
– Sensor chip + data processing in a device
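To make the point concrete, here is a minimal, hypothetical rule-based fusion sketch (all names and thresholds are illustrative, not from the slides): a drinking event is only inferred when a wrist-tilt cue from an accelerometer coincides with a cup seen by an RFID reader.

```python
def wrist_tilt(ax, ay, az, threshold=0.7):
    """Crude gesture cue: gravity shifts from the z axis to the x axis
    when the forearm is raised (accelerations in g; axes are assumed)."""
    return ax > threshold

def infer_drinking(accel_sample, rfid_tags):
    """Fuse two weak cues: neither alone is a 'drink sensor'."""
    gesture = wrist_tilt(*accel_sample)
    cup_near = "cup" in rfid_tags
    return gesture and cup_near

# A tilted wrist alone (e.g. pointing) is not enough...
print(infer_drinking((0.9, 0.1, 0.3), set()))    # False
# ...but a tilted wrist while a cup is in RFID range is a plausible drink event
print(infer_drinking((0.9, 0.1, 0.3), {"cup"}))  # True
```

The same structure generalizes: each sensor contributes a weak, ambiguous cue, and only their correlation disambiguates the activity.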
Roggen et al., Collecting complex activity datasets in highly rich networked sensor environments, Proc. INSS, 2010
Accelerometer
• The most common sensor in activity recognition
• Used in mobile phones to rotate screen
• Nintendo Wii
• Highly miniaturized with MEMS technology
[1] ADXL335 datasheet, Analog devices
[2] http://www.silicondesigns.com/tech.html
• ADXL335 (analog output)
– 3D accelerometer
– 4x4 mm
– Vin: 1.8-3.6V
– I: 350 uA
– Analog output: 300mV/g
– <4$/unit
Accelerometer
• Information about static postures and movements
• Static acceleration
– Without user movement (or very slow)
– Earth gravity projected onto accelerometer coordinates
– Senses limb angle
(Figure: gravity vector of magnitude 1 g projected onto the sensor axes Ax and Az; the limb angle α follows from the projections)
• Dynamic acceleration
– Two components: limb acceleration and earth gravity
(Figure: the same gesture performed in the x-z plane and in the x-y plane yields different Ax, Ay, Az signals)
• Challenges:
– On-body rotation
• use acceleration magnitude!
– Plane of gesture
• Different signals!
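The two challenges above can be made concrete in code: the limb angle follows from the static gravity projection, and the acceleration magnitude provides a feature invariant to on-body sensor rotation (a minimal sketch; the axis convention is an assumption).

```python
import math

def limb_angle(ax, az):
    """Static case: gravity projected on the x/z axes encodes the tilt angle α (deg)."""
    return math.degrees(math.atan2(ax, az))

def magnitude(ax, ay, az):
    """Rotation-invariant feature: |a| does not depend on on-body sensor rotation."""
    return math.sqrt(ax**2 + ay**2 + az**2)

# Sensor flat (gravity on z): α = 0°; sensor tilted so gravity is on x: α = 90°
print(round(limb_angle(0.0, 1.0), 1))  # 0.0
print(round(limb_angle(1.0, 0.0), 1))  # 90.0
# Magnitude is 1 g at rest, regardless of how the sensor is mounted
print(magnitude(0.0, 0.0, 1.0), magnitude(1.0, 0.0, 0.0))  # 1.0 1.0
```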
Walking, running, jumping
Figo, Diniz, Ferreira, Cardoso. Preprocessing techniques for context recognition from accelerometer data, Pers Ubiquit Comput, 14:645–662, 2010
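Typical preprocessing, of the kind surveyed in the reference above, slices the signal into windows and computes simple time-domain features per window; a minimal sketch (window length and feature set are illustrative):

```python
import math

def window_features(signal, win=4):
    """Split a 1-D signal into non-overlapping windows of `win` samples
    and compute (mean, standard deviation, energy) per window."""
    feats = []
    for i in range(0, len(signal) - win + 1, win):
        w = signal[i:i + win]
        mean = sum(w) / win
        std = math.sqrt(sum((x - mean) ** 2 for x in w) / win)
        energy = sum(x * x for x in w) / win
        feats.append((mean, std, energy))
    return feats

# A still phase followed by a vigorous phase separates clearly in feature space
print(window_features([1, 1, 1, 1, -2, 2, -2, 2]))  # [(1.0, 0.0, 1.0), (0.0, 2.0, 4.0)]
```

A classifier then operates on these per-window feature vectors rather than on the raw samples.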
Car manufacturing activities
Data from Zappi et al, Activity recognition from on-body sensors: accuracy-power trade-off by dynamic sensor selection, EWSN, 2008
Dataset available at: http://www.wearable.ethz.ch/resources/Dataset
Swim styles
Butterfly (Delphin)
Front crawl (Kraul)
Breaststroke (Brust)
Backstroke (Rücken)
(two pool lengths each)
Bächlin, Förster, Tröster, SwimMaster: A wearable assistant for swimmer, Ubicomp, 2009
Swim styles
Bächlin, Förster, Tröster, SwimMaster: A wearable assistant for swimmer, Ubicomp, 2009
Armstrokes
OPPORTUNITY Dataset
"Activity of daily living" run
• Apartment, early-morning activities
• Loose high-level instructions
• Primitives to high-level activities
• 5 repetitions
• 12 subjects
• ~20 min / run
Drill run
• Scripted sequence
• Only activity primitives
• 20 repetitions
• 12 subjects
• ~30 min / run
http://vimeo.com/8704668
Gyroscope
• Measures rate of turn (degrees / sec)
– Recently introduced in mobile phones (iPhone 4)
– Camera image stabilization
– Augmented reality (with accelerometer and compass as IMU)
• Principle:
– MEMS vibrating structure
– Coriolis force displaces masses
– Change in capacitance
• Suited to
– measure limb rotation
– heading change
– step length
[1] http://www.findmems.com/wikimems-learn/introduction-to-mems-gyroscopes
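Integrating the rate of turn yields the heading change, which is the basis for turn detection as in the snowboarding example that follows; a minimal sketch (sampling interval and threshold are illustrative):

```python
def heading_change(rates_dps, dt):
    """Integrate gyroscope rate-of-turn samples (deg/s) into a heading change (deg)."""
    return sum(r * dt for r in rates_dps)

def is_turn(rates_dps, dt, threshold_deg=60.0):
    """Flag a turn when the accumulated heading change exceeds a threshold."""
    return abs(heading_change(rates_dps, dt)) >= threshold_deg

# 1 s of rotation at a constant 90 deg/s, sampled at 100 Hz
samples = [90.0] * 100
print(round(heading_change(samples, 0.01), 1))  # 90.0 degrees
print(is_turn(samples, 0.01))                   # True
```

In practice the gyro bias must be subtracted before integration, otherwise the heading drifts.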
Detection of snowboard turns with a gyro [1]
[1] Holleczek et al., Recognizing Turns and Other Snowboarding Activities with a Gyroscope, ISWC, 2010
Inertial measurement unit
• Combination of
– 3D accelerometer
– 3D compass
– 3D gyroscope
• Provides device orientation in earth reference frame
– Euler angles
– Quaternions
[1] Xsens MT9 orientation sensor
Inertial measurement unit: principle
• [1]: Kalman filter to estimate 3DoF orientation
– Accelerometer and compass compensate drift in integration of gyroscope
– "Attitude and heading reference system"
– Accelerometer stabilizes attitude: dynamic acceleration is zero on average (earth gravity)
– Compass stabilizes heading (careful with magnetic disturbances, metallic objects!)
[1] MTi and MTx User Manual and Technical Documentation, Xsens, 2009
[2] Torres, Flynn, Angove, Murphy, Mathuna, Motion Tracking Algorithms for Inertial Measurement, BodyNets, 2007
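The Xsens AHRS uses a Kalman filter; the same drift-compensation idea can be illustrated with a simpler 1-axis complementary filter (a sketch under simplifying assumptions, not the actual Xsens algorithm; the gain is illustrative):

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Estimate a 1-axis tilt angle (deg): integrate the gyro (smooth, but drifts)
    and continuously pull towards the accelerometer tilt (noisy, but drift-free)."""
    angle = 0.0
    for rate, (ax, az) in zip(gyro_rates, accels):
        gyro_angle = angle + rate * dt                  # short-term: gyro integration
        accel_angle = math.degrees(math.atan2(ax, az))  # long-term: gravity reference
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
    return angle

# Gyro with a constant +5 deg/s bias while the accelerometer keeps reporting
# gravity straight down (true tilt = 0): the error stays bounded
n = 2000
est = complementary_filter([5.0] * n, [(0.0, 1.0)] * n, dt=0.01)
print(round(est, 2))  # 2.45: bounded by the filter, instead of drifting to n*5*dt = 100
```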
Inertial measurement unit
• One IMU per body segment for body-model reconstruction
[1] Xsens MVN - Inertial Motion Capture
[2] Stiefmeier et al, Wearable Activity Tracking in Car Manufacturing, PCM, 2008
Location awareness in wearables
• Augmented reality annotations (Uratani et al., Wearable Augmented Reality System with Annotation Visualization, CREST, 2004)
• Virtual objects in the physical world (Tinmith AR, University of South Australia)
• Modeling transportation, likely routes, significant places (Patterson, Inferring High-Level Behavior from Low-Level Sensors, Ubicomp 2003; Ashbrook, Using GPS to Learn Significant Location, Personal and Ubiquitous Computing 7(5))
• Location-based activity segmentation (Stiefmeier, Ogris, ETHZ, Uni Passau)
Location awareness
Hightower, Borriello, Location systems for ubiquitous computing, IEEE Computer, pp. 57-66, 2001
• Satellite systems (GPS): + large area coverage; - urban canyons, indoor
• Localization beacons (IR, Bluetooth): + “symbolic” localization (room #); - environment instrumentation
• Existing radios (Wifi, GSM): + indoor, existing deployment; - wireless map needed
• Indoor, application-specific (UWB, e.g. Ubisense): + high accuracy; - instrumentation, short range
• Video tracking: + unobtrusive; - occlusion, privacy
Ideally, for wearable computing:
• Indoor and outdoor use
• No instrumented environment
• No a-priori knowledge / map
• Anytime, anywhere
• Low-power
• Small
GPS revisited
• Simultaneous measurement of time-of-flight (phase) from 4 satellites
• Weak signal: -160dBW, 1575MHz L1
• For outdoors....
• ... but some indoor capability [1]
• Uses lots of energy
[1] Kjærgaard et al., Indoor Positioning Using GPS Revisited, Pervasive, 2010
[2] http://en.wikipedia.org/wiki/GNSS_positioning_calculation
Radio fingerprints
• A radio fingerprint is quite distinctive for a given location:
– Visible radios (Wifi MAC address, GSM towers)
– Signal strength
• APIs for absolute localization from radio fingerprints:
– http://www.opencellid.org/ (cellphone towers)
– Google geolocation API
• In principle not limited to radio:
– information also in sound, visual patterns, etc...
Dead reckoning
• Historically used by sailors: speed estimation and compass
• With wearable sensors [1]:
– Compass to estimate heading
– Count steps (pedometer, accelerometer); integrate acceleration over single steps
[1] Randell, Djiallis, Muller, Personal Position Measurement Using Dead Reckoning, ISWC, 2003
• Quite accurate (~10%) with a correct motion model
– (e.g. compass must be parallel to ground)
• Integration of errors leads to the "closing-the-loop" problem
– Identify already visited points with additional sensors and "close the loop"
– Multiple sensors and Kalman filter
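The step-and-heading principle above can be sketched as follows (the step length and the per-step update rule are illustrative assumptions):

```python
import math

def dead_reckon(steps, step_length=0.7):
    """Pedestrian dead reckoning: advance one step length along the
    compass heading (degrees, 0 = north) for each detected step."""
    x, y = 0.0, 0.0
    for heading_deg in steps:
        h = math.radians(heading_deg)
        x += step_length * math.sin(h)  # east component
        y += step_length * math.cos(h)  # north component
    return x, y

# 10 steps north, then 10 steps east: roughly 7 m east, 7 m north
x, y = dead_reckon([0.0] * 10 + [90.0] * 10)
print(round(x, 2), round(y, 2))  # 7.0 7.0
```

Every step adds a small heading and length error, which is why the position estimate degrades over time without loop closure.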
Dead reckoning with vision (optical flow)
(Figure: characteristic optical-flow fields for forward motion, rotation, and side step)
• Typical motion patterns result in characteristic optical flows
Dead-reckoning from optical flow:
1. Compute optical flow
2. Derive corresponding camera movement (egomotion)
3. Integrate egomotion in a map
[1] Roggen et al, Mapping by seeing: wearable vision-based dead-reckoning, and closing the loop, EuroSSC 2007
Path integration (outdoor)
• Straight walk: 5.5 m; estimated distance: 5.17 m; relative error: 6%
• Square walk (10 m sides): distance 38.8 m; estimated distance: 43.4 m; relative error: 12%
• Challenge: high rotation speeds (30°/s-90°/s)
• Chest-placed camera, h = 145 cm, α = 48°
(Figures: speed profile, footsteps)
[1] Roggen et al, Mapping by seeing: wearable vision-based dead-reckoning, and closing the loop, EuroSSC 2007
Closing the loop
• Errors accumulate during path integration
• Coinciding start and end points of the trajectory do not map onto the same location
A-posteriori correction of motion vectors
1. Recognize identical locations A, B
2. Correct motion vectors so A matches B
→ Idea: closing the loop
Ω_cl = f(Ω) · Ω (corrected motion vectors Ω_cl obtained by transforming the original motion vectors Ω)
[1] Roggen et al, Mapping by seeing: wearable vision-based dead-reckoning, and closing the loop, EuroSSC 2007
• 6-12% distance error
• Underestimation of rotation
– Note: camera only! With a gyro this can be improved
Estimating proximity from passive fingerprints: sound
• Proximity information is important for many applications!
[1] Wirz et al., A wearable, ambient sound-based approach for infrastructureless fuzzy proximity estimation, ISWC 2010
Estimating proximity from passive fingerprints: Wifi
[1] Cugia, Wirz, ETH Tech Rep, 2011
• Devices A, B scan Wifi (MAC addresses)
• Compute fingerprint similarity
– E.g. Jaccard index: normalized number of MAC addresses seen by both A and B
Sensor | WiFi signals heard
S0 W2 W6 W7
S1 W1 W3 W4 W5
S2 W2 W3 W4 W5 W6 W7
S3 W0 W1 W2 W3 W4 W5
S4 W0 W1 W2 W3 W4 W5
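The Jaccard index on the table above can be computed directly (access points abbreviated to the W0..W7 labels of the table):

```python
def jaccard(a, b):
    """Jaccard index: |A ∩ B| / |A ∪ B|, i.e. 1.0 for identical fingerprints, 0.0 for disjoint."""
    return len(a & b) / len(a | b)

# WiFi fingerprints from the table (sets of visible access points)
S0 = {"W2", "W6", "W7"}
S2 = {"W2", "W3", "W4", "W5", "W6", "W7"}
S3 = {"W0", "W1", "W2", "W3", "W4", "W5"}
S4 = {"W0", "W1", "W2", "W3", "W4", "W5"}

print(jaccard(S0, S2))  # 0.5   -> moderate overlap: devices fairly close
print(jaccard(S3, S4))  # 1.0   -> identical fingerprints: devices likely very close
print(jaccard(S0, S3))  # 0.125 -> little overlap: devices far apart
```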
From proximity to topology
• Proximity can be used to find node topology
– Binary case: "sense"/"not sense" (e.g. communication in/out-of range)
– Or distance estimate
• Convert proximity to topology
– Multidimensional scaling [1,2]
• "Anchor nodes" (e.g. with GPS) can "ground" the topology
[1] Koo, Cha, Autonomous Construction of a WiFi Access Point Map Using Multidimensional Scaling, Pervasive, 2011
[2] Chan&So, Efficient Weighted Multidimensional Scaling for Wireless Sensor Network Localization, IEEE Tr Sig Proc, 57(11), 2009
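Classical multidimensional scaling, the core of the methods in [1,2], recovers node coordinates (up to rotation and reflection) from pairwise distances; a minimal numpy sketch:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical MDS: embed points in `dim` dimensions from a pairwise
    distance matrix D, via the double-centered Gram matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    B = -0.5 * J @ (D ** 2) @ J          # Gram matrix of the centered points
    w, V = np.linalg.eigh(B)             # eigendecomposition (ascending order)
    idx = np.argsort(w)[::-1][:dim]      # keep the largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Three nodes with known pairwise distances (a 3-4-5 right triangle)
D = np.array([[0.0, 3.0, 4.0],
              [3.0, 0.0, 5.0],
              [4.0, 5.0, 0.0]])
X = classical_mds(D)
# The recovered coordinates reproduce the distances (up to rotation/reflection)
print(np.round(np.linalg.norm(X[0] - X[1]), 3))  # 3.0
```

With noisy proximity estimates instead of exact distances, the embedding is approximate; anchor nodes then fix the absolute position, orientation, and scale.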
Active capacitive sensing [1,2]
• Challenge: sensing inner body function
– E.g. drinking, heart rate, respiration, etc.
• Principle:
– Capacitor with human body as dielectric
– Body functions affect the dielectric: muscle contractions, joint motions, air entering the lungs, drinking
– Readout using a Colpitts (LC) oscillator
– Suitable for textile integration!
[1] Cheng et al., Active Capacitive Sensing: Exploring a New Wearable Sensing Modality for Activity Recognition, Pervasive, 2011
[2] Lukowicz et al., On-Body Sensing: From Gesture-Based Input to Activity-Driven Interaction, IEEE Computer Magazine
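The readout principle can be illustrated numerically: the Colpitts oscillator's resonant frequency depends on the tank capacitance, so a body-induced capacitance change shifts the frequency (component values below are purely illustrative):

```python
import math

def colpitts_freq(L, c1, c2):
    """Resonant frequency of a Colpitts LC oscillator: the two capacitors
    appear in series, f = 1 / (2*pi*sqrt(L * C_series))."""
    c_series = (c1 * c2) / (c1 + c2)
    return 1.0 / (2.0 * math.pi * math.sqrt(L * c_series))

# Illustrative values: 10 µH inductor, ~100 pF capacitances; a body function
# (e.g. swallowing) changing the body-coupled capacitance shifts the frequency
f_baseline = colpitts_freq(10e-6, 100e-12, 100e-12)
f_swallow = colpitts_freq(10e-6, 110e-12, 100e-12)  # +10% on the body capacitor
print(f_swallow < f_baseline)  # True: larger capacitance, lower frequency
```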
Active capacitive sensing [1,2]
• Characteristic signal for swallowing
• Also reacts to movements (head direction, body movement)
[1] Cheng et al., Active Capacitive Sensing: Exploring a New Wearable Sensing Modality for Activity Recognition, Pervasive, 2011
[2] Lukowicz et al., On-Body Sensing: From Gesture-Based Input to Activity-Driven Interaction, IEEE Computer Magazine
Electrooculography [1]
• Measures the electric potential around the eye
• Can identify saccades, fixations, blinks, and movement patterns
• Alternative technology: camera-based
[1] Bulling et al., What’s in the Eyes for Context-Awareness? Pervasive computing magazine, 2011
Video on the emotional computer
Sensing intense fear (startle event)
• Skin conductance: Galvanic Skin Response = Electrodermal Activity
– Affected by « internal states »: emotions, phobias, arousal, stress
• Controlled by the sympathetic nervous system (SNS) [1]
– Low frequency « tonic » part (0-0.05 Hz)
– Fast changing phasic component (0.05 Hz-1.5 Hz)
• Quickly reacts to the « fight or flight » reflex (e.g. fear / startle event)
[1] Fuller GD. GSR History, & Physiology. San Francisco: Biofeedback Institute San Francisco; 1977.
[2] Schumm et al., Effect of Movements on the Electrodermal Response after a Startle Event, Methods of Information in Medicine, 2008
Wearable EDA sensors from [2]
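The tonic/phasic decomposition can be sketched with a simple moving-average low-pass filter: the slow average approximates the tonic level and the residual is the phasic component (a sketch; the window length would have to be matched to the sampling rate and the 0.05 Hz cut-off):

```python
def tonic_phasic(eda, win=5):
    """Split an EDA signal into a slow 'tonic' level (moving average)
    and a fast 'phasic' residual (signal minus tonic)."""
    half = win // 2
    tonic = []
    for i in range(len(eda)):
        w = eda[max(0, i - half):i + half + 1]
        tonic.append(sum(w) / len(w))
    phasic = [x - t for x, t in zip(eda, tonic)]
    return tonic, phasic

# Flat baseline with a brief startle-like peak: the event shows up in the phasic part
eda = [2.0] * 10 + [2.0, 3.5, 2.8, 2.2] + [2.0] * 10
tonic, phasic = tonic_phasic(eda)
print(max(phasic) > 0.5)  # True: the startle response dominates the phasic component
```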
Experiment
• Walking on a treadmill
• Listening to music
• Loud « bang » at unexpected moments
Schumm et al., Effect of Movements on the Electrodermal Response after a Startle Event, Methods of Information in Medicine, 2008
Experiment
• Visualization with cross-correlogram
Schumm et al., Effect of Movements on the Electrodermal Response after a Startle Event, Methods of Information in Medicine, 2008
• Visualization with cross-correlogram (startle events aligned at t = 0)
• EDA activation after startle…
…but some challenges:
• Physical activity
• Background EDA activation
• No unique response to startle
Multimodal approach?
• Sensing movement and EDA
• …
Sensing cognitive states from eye movements
[1] Hayhoe, Ballard Eye movements in natural behavior, TRENDS in Cognitive Sciences Vol.9 No.4 April 2005
[2] Heisz, Shore, More efficient scanning for familiar faces, Journal of Vision (2008) 8(1):9, 1–10
[3] Bulling et al., What’s in the Eyes for Context-Awareness? Pervasive computing magazine, 2011
Familiar vs. unfamiliar faces [2]
Gaze patterns are consciously and unconsciously
controlled. Can eye movements reveal something
about cognitive states?
• "Eventually, analyzing the link between unconscious eye movements and cognition
might even pave the way for a new genre of pervasive computing systems that can
sense and adapt to a person’s cognitive context. [....] A computing system is
cognition-aware if it can sense and adapt to a person’s cognitive context." [3]
Sensing cognitive states from eye movements [1]
[1] Bulling et al., What’s in the Eyes for Context-Awareness? Pervasive computing magazine, 2011
• Cognition: memory, stress, reasoning, ...
• Can a computer detect whether a picture has already been seen before?
• Significant change in fixation count
• Ongoing: classify into "seen" / "unseen"
• Challenge: outside of the lab!
Applications:
• Memory assistants
• Smart computer interfaces
Social context: sensing crowd behaviors
• Capabilities and properties of mobile systems...
• ...for machine recognition of crowd behavior
• Supports situation awareness
(Figure: crowd behavior patterns: group near exit, clogged, dense, queuing, group split, lane formation)
Social context: sensing crowd behaviors
Wirz, Roggen, Tröster, Decentralized Detection of Group Formations from Wearable Acceleration Sensors, Int. Conf. Social Computing, 2009
Wirz et al. A Methodology towards the Detection of Collective Behavior Patterns by Means of Body-Worn Sensors, Workshop at Pervasive, 2010
Roggen et al., Recognition of crowd behavior from mobile sensors with pattern analysis and graph clustering methods, Submitted to NHM, 2011
Integrating sensors into clothing
• Clothing integration is the "dream" of wearable computing
• "Normal" clothes are extremely ruggedized!
• Challenges for smart-textiles / sensorized garments:
– Washing
– Bending
– Friction
– Integration in standard manufacturing process
– Requires sophisticated equipment!
• Example technology:
– Kapton substrate, thin film transistor
– Thin straps woven with standard machine
Our smart textile fabrication process, Cherenack et al., IEEE Electr. Dev. Lett., vol. no. 7, pp. 740-742, July 2010
A woven temperature sensor, Kinkeldei et al., Proc. of the IEEE Sensors Conference, pp. 1-4, 2009
Smart-textiles
System on Textiles
• Development of conductive fabrics for signal transmission in wearables
• Investigation of the electrical performance: measuring and modeling high-frequency properties
• Simulation and optimization of different fabrics and configurations
• Interconnections
(Figure: woven fabric with conductive fibers)
Unobtrusive context recognition, healthcare (e.g. rehabilitation), sports
Routing Methods Adapted to e-Textiles. I. Locher, T. Kirstein and G. Tröster, Proc. 37th International Symposium on Microelectronics (IMAPS 2004), Long Beach CA, Nov. 14-18, 2004
Smart-textiles
Textile Antennas
• Magnetically coupled antenna
• Signal transmission between pieces of clothing
Textile Body Area Network
(Figure: transmitter and receiver coupled via the magnetic field H)
• Bluetooth antenna
• Signal transmission along the body
• Communication with nearby infrastructure
• Modeling and measuring of electrical properties
• Modeling and measuring of textile-specific properties
Design and Characterization of Purely Textile Patch Antennas. I. Locher, M. Klemm, T. Kirstein and G. Tröster, IEEE Transactions on Advanced Packaging, Vol. 29, No. 4, Nov. 2006, pp. 777-788
Smart-textiles
Textile Temperature Sensor
• 74 g/m²
• Polyester interlining
• Insulated copper wires
(Figure: temperature profile along the index finger)
Temperature Profile Estimation with Smart Textiles. I. Locher, T. Kirstein and G. Tröster, Proc. 1st International Scientific Conference Ambience 05, Tampere, Finland, Sept. 19-20, 2005
Smart-textiles
Capacitive Textile Pressure Sensor
• Electrodes embroidered with conductive yarn on both sides of a compressible spacer
• Layers: conductive textile / spacer (foam, textile) / non-conductive textile
Applications:
• Medicine (decubitus prevention)
• Weight measurement in car seats
Textile Pressure Sensor for Muscle Activity and Motion Detection. J. Meyer, P. Lukowicz, G. Tröster, Proc. 10th IEEE International Symposium on Wearable Computers (ISWC 2006), Montreux, Switzerland, 11.-14. October 2006
Integrating sensors into clothing
• A rapid prototyping technology [1]
[1] Harms et al., Rapid prototyping of smart garments for activity-aware applications, JAISE, 2009
Energy Harvesting
Mechanical µ-Generator for On-Body Sensor Networks
• VDRG: Velocity-Damped Resonant Generator
• CDRG: Coulomb-Damped Resonant Generator
• CFPG: Coulomb-Force Parametric Generator
(Figure: generator output for lower-body and upper-body placements)
Optimization of Inertial Micropower Generators for Human Walking Motion. Thomas von Büren, P. D. Mitcheson, T. C. Green, E. M. Yeatman, A. S. Holmes, and G. Tröster. IEEE Sensors Journal, 2005
Energy Harvesting
Hybrid solar-battery power supply
(Figure: solar cell and Li-Ion battery)
Maximum-Power-Point Tracking for On-Body Context Systems. N.B. Bharatula, J.A. Ward, P. Lukowicz, G. Tröster, Proc. 10th IEEE International Symposium on Wearable Computers (ISWC 2006)
Existing datasets
• PlaceLab dataset: + real behaviors in house; - small number of instances
• TU München kitchen: + marker-free motion capture; - no wearable sensors
• Many others: + application specific; - often not open
• Home setting (van Kasteren): + real behaviors in house; - binary sensors
• Long-term activity (van Laerhoven): + free-living activities; - annotations
http://www.wearable.ethz.ch/resources/Dataset
Collecting large scale datasets: Lessons learned
"Activity of daily living" run
• Apartment, early-morning activities
• Loose high-level instructions
• Primitives to high-level activities
• 5 repetitions
• 12 subjects
• ~20 min / run
Drill run
• Scripted sequence
• Only activity primitives
• 20 repetitions
• 12 subjects
• ~30 min / run
Wearable sensors: sense body movement (and sound)
• Nodes: 24 (mostly IMUs, accel.)
• Modalities: 4
• Networks: 6
Instrumented objects: Sense object use
• Sensor nodes: 12 objects
• Modalities: 2 (acceleration, gyroscope)
• Networks: 1 (Bluetooth)
Accelerometer + gyroscope
FSRs
Xsens
Ambient sensors: sense environment interactions
• Sensor nodes: 36
• Modalities: 9
• Networks: 8
The OPPORTUNITY dataset: key facts
Sensor rich
• 72 sensors (28 sensors in 2.4GHz band)
• 10 modalities
• 15 wired and wireless systems (USB, Bluetooth, 802.15.4, custom)
• 2.5% data loss on wireless sensors
Activity rich
• 12 subjects, ~25 hours of total data
• > 30'000 interaction primitives (object, environment)
• Annotation length = 230% of dataset length
Highly collaborative effort
• 11 days recording session, 10 experimenters
• Subjects instrumented for 5-7 hours
• ~10 hours of annotation effort for 30 min of data
Challenge 1: Data recording with heterogeneous sensor networks
Integration at system level
+ Central control & monitoring
+ Synchronized data acquisition
- Internals of sensor systems
- Fixed real-time merge
Integration at data level
+ Independent data recorders
+ Robustness, flexibility
- Complex control & monitoring
- Offline synchronization
Obtain synchronized data streams for further processing
• 7 computers recording sensor data
– Store data and data reception time
– Coarse NTP synchronisation
– Fine synchronisation with specific gestures (“jump and clap”)
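Fine synchronization from a common event such as the « jump and clap » amounts to finding the time offset that best aligns the two recorded streams; a minimal cross-correlation sketch (integer sample lags, illustrative signals):

```python
def best_offset(a, b, max_lag=50):
    """Return the lag (in samples) of stream b relative to stream a that
    maximizes their correlation: positive lag means b is delayed."""
    def corr_at(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        return sum(x * y for x, y in pairs)
    return max(range(-max_lag, max_lag + 1), key=corr_at)

# A sharp 'clap' spike recorded by two devices, 7 samples apart
clap_a = [0.0] * 20 + [5.0, 9.0, 4.0] + [0.0] * 20
clap_b = [0.0] * 27 + [5.0, 9.0, 4.0] + [0.0] * 13
print(best_offset(clap_a, clap_b))  # 7
```

The recovered lag, combined with the coarse NTP timestamps, aligns the independently recorded streams offline.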
Challenge 2: data handling after recording
• Burst equalization with streaming sensors
• Missing data represented as NaNs
• Stream alignment to video footage
Challenge 3: flexible activity annotations, at all levels
(Annotations span all levels: from ~60 instances at the highest level to ~30'000 instances at the primitive level)
Solution: annotation on multiple tracks, hand-action-object representation
Some lessons learned and best practices
• Complex recordings will see heterogeneous WSN
– System level integration is desired…
– … however data level integration may be sufficient
• Developing custom tools can be beneficial
– Signal/video alignment, annotation
– Now: students can do the work
• Plan for a repository for raw data
– Required already before data can be stored in a database
– Can be huge!
Some lessons learned and best practices
• Test infrastructure locally
– Placement of antennas
– Integration & optimization of the wireless links
• Expect, plan for, and accept data loss or failures
– Independent systems
– Strategy: ignore problem, fix&restart, fix&continue
• Data integrity checks
– Tradeoff: check frequency vs. test subjects' time
• Develop a rapid on-body sensor deployment solution
– Clothing attached/integrated sensors
– Reproducible placement
– More convenient
Some lessons learned and best practices
• Avoid wireless links altogether :)
– Use local data storage
– Wireless only for synchronization
• Do not underestimate logistics
– 10 people during 11 days
– Room rented for 11 days, 3 days of setup
Some readily available sensors
• http://www.intersense.com
• Nintendo WiiMote
• Open source BlueSense acceleration/gyro sensors (http://www.wearable.ethz.ch/resources/educationkit)
• http://www.phidgets.com/
• http://www.sparkfun.com/
• LilyPad (http://hlt.media.mit.edu/?p=34)
• http://www.xsens.com/
Summary
• User requirements
– Small / unobtrusive
– Invisible to the outside
– Comfortable
– Privacy issues
• Context-recognition requirements
– Must be discriminative of the context to recognize
– Minimize subsequent processing (simpler modalities favored)
• Hardware requirements
– Low-power
– Low-computational complexity
• Usually multimodal approaches
– Multiple activities affect a sensor
– Several sensors can disambiguate the activity
For further reading
Context
• Dey, Understanding and Using Context, Personal and Ubiquitous Computing,2001
Activity-aware computing
• Lukowicz et al., On-Body Sensing: From Gesture-Based Input to Activity-Driven Interaction, IEEE Computer, 43(10), pp. 92-96, 2010
• N. Davies, D. Siewiorek, R. Sukthankar, Special Issue: Activity-Based Computing, IEEE Pervasive Computing, 7(2), pp. 20-21, 2008
• Stiefmeier et al, Wearable Activity Tracking in Car Manufacturing, PCM, 2008
Cognitive-affective computing
• R. Picard, Affective computing, MIT Press, 1997
• R. Picard, E. Vyzas, J. Healey, Toward Machine Emotional Intelligence: Analysis of Affective Physiological State, IEEE Transactions on
Pattern Analysis and Machine Intelligence, 23(10), pp. 1175-1191, 2001
• Bulling et al., What’s in the Eyes for Context-Awareness? Pervasive computing magazine, 2011
Location-awareness
• Hightower, Borriello, Location systems for ubiquitous computing, IEEE Computer, pp. 57-66, 2001
• Kjaergaard, Blunck, Godsk, Toftkjaer, Christensen, Grønbæk, Indoor Positioning Using GPS Revisited, Pervasive, 2010
• Koo, Cha, Autonomous Construction of a WiFi Access Point Map Using Multidimensional Scaling, Pervasive, 2011
Social context
• Wirz, Roggen, Tröster, Decentralized Detection of Group Formations from Wearable Acceleration Sensors, Int. Conf. Social Computing,
2009
Datasets
• Roggen et al., Collecting complex activity data sets in highly rich networked sensor environments, INSS, 2010
Context and Sensing frameworks
• Roggen et al., Titan: An Enabling Framework for Activity-Aware "PervasiveApps" in Opportunistic Personal Area Networks, EURASIP Journal on Wireless Communications and Networking, 2011
• Kukkonen, Lagerspetz, Nurmi, Andersson, BeTelGeuse: A Platform for Gathering and Processing Situational Data, IEEE Pervasive
Computing Magazine, 8(2): 49-56, 2009
• Fortino, Guerrieri, Bellifemine, Giannantonio, SPINE2: Developing BSN Applications on Heterogeneous Sensor Nodes, Proc. IEEE
Symposium on Industrial Embedded Systems SIES2009, 2009
• Bannach et al., Rapid Prototyping of Activity Recognition Applications, IEEE Pervasive Computing Magazine, 7(2):22-31, 2008
• Kurz et al., The OPPORTUNITY Framework and Data Processing Ecosystem for Opportunistic Activity and Context Recognition,
International Journal of Sensors, Wireless Communications and Control, 2011
Editor's Notes
Now, let's have a look at how the incorporation of technology that is sensitive and responsive to the presence of people can help in real-life scenarios. Let's assume there's a fire in a building and people are trying to escape. One person walks towards an exit and sees that the corridor is blocked. Why? What's ahead? What should he do? In such a situation, a mobile device could provide relevant information to come to a decision. Another person can be guided efficiently out of the building. In this example, it might be faster to take the much longer escape route since it is not jammed.