Handling displacement effects in on-body sensor-based activity recognition

1. Handling displacement effects in on-body sensor-based activity recognition
IWAAL 2013, San José (Costa Rica)
Oresti Baños, Miguel Damas, Héctor Pomares, and Ignacio Rojas
Department of Computer Architecture and Computer Technology, CITIC-UGR, University of Granada, SPAIN
oresti@ugr.es
DG-Research Grant #228398
2. Context
• On-body activity recognition is becoming a reality…
  – Portable/Wearable
  – Unobtrusive
  – Fashionable
3. Context
• On-body activity recognition is becoming a reality… Really?
  – Reliability
    • Performance differs depending on who uses the system (age, height, gender, …) and on how people change over lifelong use (physical condition, ageing, …) → Reliable? Durable?
  – Usability/Applicability
    • Application-specific systems require wearing several different devices to provide different functionalities → Portable? Unobtrusive? Fashionable? Tractable?
  – Robustness
    • Sensor anomalies (decalibration, loss of attachment, displacement, …) → Robust?
4. Problem statement
Collect a training dataset → Train and test the model → The AR system is “ready”
5. Problem statement
INVARIANT SENSOR SETUP → (IDEALLY) GOOD RECOGNITION
6. Problem statement
SENSOR SETUP CHANGES → RECOGNITION PERFORMANCE MAY DROP
7. Concept of sensor displacement
• Categories of sensor displacement
  – Static: position changes that remain fixed across the execution of many activity instances, e.g. when sensors are attached with a displacement each day
  – Dynamic: the effect of loose fitting of the sensors, e.g. when attached to clothes
• Sensor displacement → new sensor position → signal space change
• The effect of sensor displacement depends on
  – Original/end position and body part
  – Activity/gestures/movements performed
  – Sensor modality (ACC, GYR, MAG)
• Sensor displacement = rotation (angular displacement) + translation (linear displacement)
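The rotation-plus-translation decomposition is easy to simulate. Below is a minimal sketch (Python with NumPy, both assumptions of this write-up, not tooling from the talk): the rotation matrix models the angular displacement of the sensor frame, while a constant offset is a crude stand-in for the translation-induced change in the measured signal.

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the sensor z-axis (angular displacement)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def displace(acc, theta, offset):
    """Simulate displacement of an (n_samples, 3) accelerometer stream:
    a fixed rotation of the sensor frame plus a constant offset."""
    return acc @ rotation_z(theta).T + offset

rng = np.random.default_rng(0)
acc_ideal = rng.normal(0.0, 1.0, size=(250, 3))           # 5 s at an assumed 50 Hz
acc_moved = displace(acc_ideal, np.deg2rad(30),
                     offset=np.array([0.2, -0.1, 0.0]))   # displaced copy of the stream
```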
8. Sensor displacement effects
Changes in the signal space propagate forward through the activity recognition process (e.g., as variations in the feature space).
(Figure: example signals for ideal vs. self-placement — RC_IDEAL, RC_SELF, LC_IDEAL = LC_SELF.)
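A tiny self-contained check of this propagation (synthetic data, arbitrary 30° angular displacement): the same movement measured in a rotated frame yields different window statistics, so every downstream feature is shifted.

```python
import numpy as np

rng = np.random.default_rng(1)
acc = rng.normal([0.0, 0.0, 9.8], 0.5, size=(300, 3))  # synthetic gravity-dominated signal

th = np.deg2rad(30)  # simulated angular displacement about the x-axis
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(th), -np.sin(th)],
              [0.0, np.sin(th),  np.cos(th)]])
acc_disp = acc @ R.T

# Identical movement, different placement: the window means no longer match,
# so any feature-based recognizer sees a shifted feature space.
print("mean (ideal):    ", acc.mean(axis=0))       # ~ [0, 0, 9.8]
print("mean (displaced):", acc_disp.mean(axis=0))  # ~ [0, -4.9, 8.5]
```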
9. Dealing with sensor displacement: Feature Fusion
10. Dealing with sensor displacement: Decision Fusion
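Slides 9 and 10 contrast the two strategies graphically; the sketch below spells out the structural difference under stated assumptions: scikit-learn's CART tree as a stand-in for the C4.5 reasoner used later, and plain majority voting as the decision-fusion rule (the MSHC on the next slide replaces the vote with learned weights).

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def feature_fusion_fit(X_per_sensor, y):
    """Feature fusion: concatenate every sensor's feature vector into one
    joint vector and train a single classifier on it."""
    return DecisionTreeClassifier().fit(np.hstack(X_per_sensor), y)

def decision_fusion_fit(X_per_sensor, y):
    """Decision fusion: train one independent classifier per sensor."""
    return [DecisionTreeClassifier().fit(X, y) for X in X_per_sensor]

def decision_fusion_predict(models, X_per_sensor):
    """Fuse per-sensor decisions by majority vote
    (class labels assumed to be non-negative integers)."""
    votes = np.stack([m.predict(X) for m, X in zip(models, X_per_sensor)])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

The practical difference under displacement: in feature fusion, one de-positioned sensor corrupts the single joint feature vector seen by the only classifier, whereas in decision fusion it corrupts only one of M votes.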
11. Multi-Sensor Hierarchical Classifier
(Diagram: each sensor source S1…SM produces a feature vector feeding its own bank of class-level classifiers C_m1…C_mN; their outputs are weighted (α_mn, β_mn) and summed per sensor, and the per-sensor decisions are in turn weighted (γ, δ) and fused at the source level into the final class decision.)
O. Banos, M. Damas, H. Pomares, F. Rojas, B. Delgado-Marquez, and O. Valenzuela. Human activity recognition based on a sensor weighting hierarchical classifier. Soft Computing, 17:333-343, 2013.
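A minimal sketch of the weighted hierarchical fusion step, in the spirit of the diagram above; the exact weight definitions (α, β, γ, δ) are given in the cited Soft Computing paper, so the uniform weights in the example are purely illustrative.

```python
import numpy as np

def hierarchical_fuse(scores, class_weights, sensor_weights):
    """Hierarchical fusion of per-sensor class supports.

    scores:         (M, N) — sensor m's support for class n, standing in
                    for the outputs of the class-level classifiers C_mn
    class_weights:  (M, N) — per-sensor, per-class weights (class level)
    sensor_weights: (M,)   — per-sensor reliability weights (source level)
    """
    class_level = scores * class_weights   # weight each classifier's output
    fused = sensor_weights @ class_level   # combine across sensors -> (N,)
    return int(np.argmax(fused))           # final class decision

# Toy example: 3 sensors, 4 classes, uniform class weights, sensor 2 most trusted
scores = np.array([[0.1, 0.6, 0.2, 0.1],
                   [0.2, 0.2, 0.5, 0.1],
                   [0.1, 0.7, 0.1, 0.1]])
print(hierarchical_fuse(scores, np.ones((3, 4)), np.array([0.2, 0.5, 0.3])))  # -> 1
```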
12. Study of sensor displacement effects
• Analyze
  – Variability introduced by subjects' self-positioning of the sensors with respect to an ideal setup
  – Effects of large sensor displacements (extreme de-positioning)
  – Robustness of sensor fusion to displacement
• Scenarios
  – Ideal-placement
  – Self-placement
  – Induced-displacement
O. Banos, M. Damas, H. Pomares, I. Rojas, M. Attila Toth, and O. Amft. A benchmark dataset to evaluate sensor displacement in activity recognition. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, pages 1026-1035, New York, NY, USA, 2012.
13. Dataset: activity set
• Activities intended for:
  – Body-general motion: Translation | Jumps | Fitness
  – Body-part-specific motion: Trunk | Upper-extremities | Lower-extremities
14. Dataset: Study setup
• Cardio-fitness room
• 9 IMUs (XSENS) → ACC, GYR, MAG
• Laptop → data storage and labeling
• Camera → offline data validation
http://crnt.sourceforge.net/CRN_Toolbox/Home.html
15. Dataset: Experimental protocol
• Scenario description
• Protocol:
    Round | Sensor deployment   | #subjects | #anomalous sensors
    1st   | Self-placement      | 17        | 3/9
    2nd   | Ideal-placement     | 17        | 0/9
    -     | Mutual-displacement | 3         | {4, 5, 6 or 7}/9
• Each round: Preparation phase (sensor positioning & wiring, Xsens-laptop Bluetooth connection, camera set-up) → Exercises execution (20 times / 1 min. each) → Battery replacement, data downloading → Data postprocessing (relabeling, visual inspection, evaluation)
16. Experimental setup
• Data considerations
  – Data domain: ACC, GYR, MAG and combinations (ACC-GYR, ACC-MAG, GYR-MAG, ACC-GYR-MAG)
  – ALL sensors
• Activity recognition methods
  – No preprocessing
  – Segmentation: 6-second sliding window
  – Features: MEAN, STD, MAX, MIN, MCR
  – Reasoner: C4.5 decision tree
• Sensor displacement scenarios
  – Ideal (no displacement)
  – Self (3 out of all sensors)
  – Induced (7 out of all sensors)
• Evaluation
  – Ideal: 5-fold cross validation, 100 times
  – Self/Mutual: tested on a system trained on ideal-placement data
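A minimal sketch of the recognition chain just described, assuming a 50 Hz sampling rate (not stated on the slide), non-overlapping windows, MCR implemented as the mean-crossing rate, and scikit-learn's CART tree standing in for C4.5.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

FS = 50        # sampling rate in Hz (assumed; not given on the slide)
WIN = 6 * FS   # 6-second window

def mcr(x):
    """Mean-crossing rate: fraction of consecutive samples crossing the mean."""
    centered = x - x.mean(axis=0)
    return (np.diff(np.sign(centered), axis=0) != 0).mean(axis=0)

def window_features(signal):
    """signal: (n_samples, n_channels). Returns (n_windows, 5 * n_channels)
    with MEAN, STD, MAX, MIN, MCR per channel over non-overlapping windows."""
    n_win = signal.shape[0] // WIN
    feats = []
    for w in range(n_win):
        seg = signal[w * WIN:(w + 1) * WIN]
        feats.append(np.concatenate([seg.mean(0), seg.std(0),
                                     seg.max(0), seg.min(0), mcr(seg)]))
    return np.array(feats)

# Usage: clf = DecisionTreeClassifier().fit(window_features(acc_train), y_train)
```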
17. Fusion systems performance
(Figure: recognition performance of Feature Fusion vs. Decision Fusion (MSHC) across displacement scenarios.)
18. Fusion systems performance
(Figure: Feature Fusion vs. Decision Fusion (MSHC); annotated performance drops range from 5% to 40%.)
19. Fusion systems performance
(Figure: Feature Fusion vs. Decision Fusion (MSHC); annotated performance drops range from 10% to 60%.)
20. Conclusions and final remarks
• Sensor anomalies (here, displacement) may seriously damage the performance of activity recognition systems, especially single-sensor-based systems
• Sensor fusion is proposed to deal with these anomalies
• Feature fusion approaches (the most widely used) have been shown to be very sensitive to sensor displacement
• Decision fusion copes much better with the effects of sensor displacement, even when a majority of the sensors are highly de-positioned
• Of the analyzed sensor modalities, GYR stands out as the most robust to displacement, with a performance drop of less than 5% in the self-placement scenario and 10% under extreme displacement
21. Thank you for your attention. Questions?
Oresti Baños Legrán
Dep. Computer Architecture & Computer Technology
Faculty of Computer & Electrical Engineering (ETSIIT)
University of Granada, Granada (SPAIN)
Email: oresti@ugr.es
Phone: +34 958 241 778
Fax: +34 958 248 993
Work supported in part by the HPC-Europa2 project funded by the European Commission - DG Research in the Seventh Framework Programme under grant agreement no. 228398 and the FPU Spanish grant AP2009-2244.