
Handling displacement effects in on-body sensor-based activity recognition

So far, little attention has been paid to the limitations of activity recognition systems during out-of-lab daily use. Sensor displacement is one of the major such issues, and it is particularly harmful for inertial on-body sensing. The effect of displacement normally translates into a drift in the signal space that further propagates to the feature level, thus modifying the expected behavior of the predefined recognition systems. Using several sensors and diverse motion-sensing modalities, in this paper we compare two fusion methods to evaluate the importance of decoupling the combination process at the feature and classification levels under realistic sensor configurations. In particular, a 'feature fusion' approach and a 'multi-sensor hierarchical classifier' are considered. The results reveal that the aggregation of sensor-based decisions may overcome the difficulties introduced by the displacement, and confirm the gyroscope as possibly the most displacement-robust sensor modality.

This presentation illustrates part of the work described in the following articles:

* Banos, O., Damas, M., Pomares, H., Rojas, I.: Handling displacement effects in on-body sensor-based activity recognition. In: Proceedings of the 5th International Work-Conference on Ambient Assisted Living and Active Ageing (IWAAL 2013), San José, Costa Rica, December 2-6, 2013
* Banos, O., Damas, M., Pomares, H., Rojas, I.: On the use of sensor fusion to reduce the impact of rotational and additive noise in human activity recognition. Sensors, vol. 12, no. 6, pp. 8039-8054 (2012)



  1. Handling displacement effects in on-body sensor-based activity recognition. IWAAL 2013, San José (Costa Rica). Oresti Baños, Miguel Damas, Héctor Pomares, and Ignacio Rojas. Department of Computer Architecture and Computer Technology, CITIC-UGR, University of Granada, SPAIN. oresti@ugr.es. DG-Research Grant #228398
  2. Context • On-body activity recognition is becoming a reality… – Portable/Wearable – Unobtrusive – Fashionable
  3. Context • On-body activity recognition is becoming a reality… Really? – Reliability • Performance differs depending on who uses the system (age, height, gender,…) and on changes in the user over lifelong use (condition, ageing,…) → Reliable? Perdurable? – Usability/Applicability • Application-specific systems require wearing several different devices to provide different functionalities → Portable? Unobtrusive? Fashionable? Tractable? – Robustness • Sensor anomalies (decalibration, loss of attachment, displacement,…) → Robust?
  4. Problem statement: Collect a training dataset → Train and test the model → The AR system is “ready”
  5. Problem statement: INVARIANT SENSOR SETUP → (IDEALLY) GOOD RECOGNITION
  6. Problem statement: SENSOR SETUP CHANGES → RECOGNITION PERFORMANCE MAY DROP
  7. Concept of sensor displacement • Categories of sensor displacement – Static: position changes that remain fixed across the execution of many activity instances, e.g. when sensors are attached with a displacement each day – Dynamic: the effect of loose fitting of the sensors, e.g. when attached to clothes • Sensor displacement → new sensor position → signal space change • The sensor displacement effect depends on – Original/end position and body part – Activity/gestures/movements performed – Sensor modality (ACC, GYR, MAG) • Sensor displacement = rotation (angular displacement) + translation (linear displacement)
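The rotation-plus-translation decomposition on slide 7 is straightforward to simulate. Below is a minimal Python sketch, not taken from the authors' code, that applies an angular displacement (a rotation about one axis) and a linear displacement (an additive offset) to a synthetic tri-axial inertial signal; the axis, angle, and offset values are arbitrary assumptions.

```python
import numpy as np

def rotation_z(theta):
    """3x3 rotation matrix about the sensor's z axis (theta in radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def displace(signal, theta=np.deg2rad(20), offset=(0.1, 0.0, -0.05)):
    """Model a static displacement: rotation (angular) + offset (linear).
    signal: (N, 3) array of tri-axial samples."""
    return signal @ rotation_z(theta).T + np.asarray(offset)

rng = np.random.default_rng(0)
ideal = rng.normal(size=(500, 3))   # stand-in for an ideal-placement recording
displaced = displace(ideal)         # the same motion seen by a displaced sensor
```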
  8. Sensor displacement effects: changes in the signal space propagate forward through the activity recognition process (e.g., variations in the feature space). (Figure: feature-space plots labeled RCIDEAL, RCSELF, and LCIDEAL = LCSELF.)
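To see how that signal-space drift propagates to the feature level (slide 8), one can compare simple statistical features computed before and after a simulated displacement. A self-contained sketch under the same assumed rotation and offset as above:

```python
import numpy as np

def mean_std_features(signal):
    """Per-axis MEAN and STD, typical features in activity recognition."""
    return np.concatenate([signal.mean(axis=0), signal.std(axis=0)])

rng = np.random.default_rng(0)
ideal = rng.normal(size=(500, 3))        # ideal-placement signal
theta = np.deg2rad(20)                   # assumed angular displacement
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
displaced = ideal @ R.T + np.array([0.1, 0.0, -0.05])  # rotation + offset

# A classifier trained on features of `ideal` is fed features of
# `displaced` at test time; the mismatch is this drift vector:
print(np.round(mean_std_features(displaced) - mean_std_features(ideal), 3))
```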
  9. Dealing with sensor displacement: Feature Fusion
  10. Dealing with sensor displacement: Decision Fusion
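Slides 9 and 10 contrast the two fusion strategies. The hypothetical sketch below, on synthetic data and with scikit-learn's decision tree standing in for the C4.5 reasoner used in the paper, shows the structural difference: feature fusion concatenates all sensors' features into one classifier, whereas decision fusion trains one classifier per sensor and aggregates their predictions at test time (here by simple majority vote).

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n_win, n_sensors, n_feats = 300, 3, 10
X = [rng.normal(size=(n_win, n_feats)) for _ in range(n_sensors)]  # per-sensor features
y = rng.integers(0, 4, size=n_win)                                 # activity labels

# Feature fusion: a single classifier over the concatenated feature vector.
ff = DecisionTreeClassifier(criterion="entropy").fit(np.hstack(X), y)

# Decision fusion: one classifier per sensor, combined by majority vote.
clfs = [DecisionTreeClassifier(criterion="entropy").fit(Xs, y) for Xs in X]

def majority_vote(samples):
    """samples: list of (m, n_feats) arrays, one per sensor."""
    votes = np.stack([c.predict(s) for c, s in zip(clfs, samples)])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

A displaced sensor shifts its whole feature block and thus corrupts every prediction of the feature-fusion tree, while under decision fusion it corrupts only one of the votes being aggregated.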
  11. Multi-Sensor Hierarchical Classifier. (Diagram: per-sensor feature vectors feed source-level classifiers C11,…,CMN; their decisions, weighted by α and β at the source level and by γ and δ at the class level, are summed into the final class decision.) O. Banos, M. Damas, H. Pomares, F. Rojas, B. Delgado-Marquez, and O. Valenzuela. Human activity recognition based on a sensor weighting hierarchical classifier. Soft Computing, 17:333-343, 2013.
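The MSHC goes beyond plain majority voting: each sensor's decision is weighted before aggregation. The toy sketch below illustrates only that weighted-voting idea; the per-sensor, per-class weights (and the idea of deriving them from training accuracy) are illustrative assumptions rather than the paper's exact α/β/γ/δ scheme.

```python
import numpy as np

def weighted_fuse(decisions, weights, n_classes):
    """decisions: predicted class per sensor, shape (n_sensors,).
    weights: per-sensor, per-class weights, shape (n_sensors, n_classes).
    Returns the class with the highest accumulated weighted score."""
    scores = np.zeros(n_classes)
    for s, c in enumerate(decisions):
        scores[c] += weights[s, c]
    return int(scores.argmax())

# Sensors vote [2, 2, 0]; sensor 2 is highly reliable for class 0, so its
# single weighted vote (0.95) beats the two weaker votes for class 2 (0.9).
w = np.array([[0.60, 0.70, 0.50],
              [0.50, 0.60, 0.40],
              [0.95, 0.30, 0.20]])
print(weighted_fuse(np.array([2, 2, 0]), w, n_classes=3))  # -> 0
```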
  12. Study of sensor displacement effects • Analyze – Variability introduced by sensor self-positioning with respect to an ideal setup – Effects of large sensor displacements (extreme de-positioning) – Robustness of sensor fusion to displacement • Scenarios – Ideal-placement – Self-placement – Induced-displacement. O. Banos, M. Damas, H. Pomares, I. Rojas, M. Attila Toth, and O. Amft. A benchmark dataset to evaluate sensor displacement in activity recognition. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, pages 1026-1035, New York, NY, USA, 2012.
  13. Dataset: activity set • Activities intended for: – Body-general motion: Translation | Jumps | Fitness – Body-part-specific motion: Trunk | Upper-extremities | Lower-extremities
  14. Dataset: Study setup • Cardio-fitness room • 9 IMUs (XSENS) → ACC, GYR, MAG • Laptop → data storage and labeling • Camera → offline data validation. http://crnt.sourceforge.net/CRN_Toolbox/Home.html
  15. Dataset: Experimental protocol • Scenario description • Per round: preparation phase (sensor positioning & wiring, Xsens-laptop Bluetooth connection, camera set-up) → exercise execution (20 times / 1 min each) → battery replacement, data downloading → data postprocessing (relabeling, visual inspection, evaluation) • Protocol:
     Round | Sensor deployment   | #subjects | #anomalous sensors
     1st   | Self-placement      | 17        | 3/9
     2nd   | Ideal-placement     | 17        | 0/9
     -     | Mutual-displacement | 3         | {4, 5, 6 or 7}/9
  16. Experimental setup • Data considerations – Data domain: ACC, GYR, MAG and combinations (ACC-GYR, ACC-MAG, GYR-MAG, ACC-GYR-MAG) – ALL sensors • Activity recognition methods – No preprocessing – Segmentation: 6-second sliding window – Features: MEAN, STD, MAX, MIN, MCR – Reasoner: C4.5 decision tree • Sensor displacement scenarios – Ideal (no displacement) – Self (3 out of all sensors) – Induced (7 out of all sensors) • Evaluation – Ideal: 5-fold cross-validation, 100 times – Self/Mutual: tested on a system trained on ideal-placement data
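Slide 16's recognition chain (6-second sliding window; MEAN, STD, MAX, MIN, and MCR features; C4.5 decision tree) can be sketched as follows. The sampling rate, the 50% window overlap, and the use of scikit-learn's entropy-based CART as a surrogate for C4.5 are all assumptions not stated on the slide.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

FS = 50              # assumed sampling rate (Hz)
WIN = 6 * FS         # 6-second window, as on the slide

def mcr(x):
    """Mean-crossing rate: fraction of consecutive samples crossing the mean."""
    centered = x - x.mean()
    return np.mean(np.signbit(centered[:-1]) != np.signbit(centered[1:]))

def window_features(win):
    """win: (WIN, n_channels). MEAN, STD, MAX, MIN, MCR per channel."""
    return np.concatenate([win.mean(0), win.std(0), win.max(0), win.min(0),
                           np.apply_along_axis(mcr, 0, win)])

def segment(signal, step=WIN // 2):
    """Slide the window over an (N, n_channels) signal with 50% overlap."""
    return np.stack([window_features(signal[i:i + WIN])
                     for i in range(0, len(signal) - WIN + 1, step)])

# Toy usage on placeholder data:
rng = np.random.default_rng(2)
signal = rng.normal(size=(FS * 60, 3))   # one minute of 3-axis data
X = segment(signal)
y = rng.integers(0, 4, size=len(X))      # placeholder activity labels
clf = DecisionTreeClassifier(criterion="entropy").fit(X, y)
```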
  17. Fusion systems performance: Feature Fusion vs. Decision Fusion (MSHC). (Results charts.)
  18. Fusion systems performance: Feature Fusion vs. Decision Fusion (MSHC). (Results charts.)
  19. Fusion systems performance: Feature Fusion vs. Decision Fusion (MSHC). (Results charts.)
  20. Conclusions and final remarks • Sensor anomalies (here, displacement) may seriously damage the performance of activity recognition systems, especially single-sensor-based systems • Sensor fusion is proposed to deal with these anomalies • Feature fusion approaches (the most widely used) have been shown to be very sensitive to sensor displacement • Decision fusion copes much better with the effects of sensor displacement, even when a majority of the sensors are highly de-positioned • Of the analyzed sensor modalities, GYR stands out as the most displacement-robust, with a performance drop of less than 5% in the self-placement scenario and 10% under extreme displacement
  21. Thank you for your attention. Questions? Oresti Baños Legrán, Dep. Computer Architecture & Computer Technology, Faculty of Computer & Electrical Engineering (ETSIIT), University of Granada, Granada (SPAIN). Email: oresti@ugr.es. Phone: +34 958 241 778. Fax: +34 958 248 993. Work supported in part by the HPC-Europa2 project funded by the European Commission - DG Research in the Seventh Framework Programme under grant agreement no. 228398 and the FPU Spanish grant AP2009-2244.
