
Activity recognition based on a multi-sensor meta-classifier


Ensuring ubiquity, robustness and continuity of monitoring is of key importance in activity recognition. To that end, multiple sensor configurations and fusion techniques are increasingly used. In this paper we present a multi-sensor meta-classifier that aggregates the knowledge of several sensor-based decision entities to provide a single, reliable activity classification. The model introduces a new weighting scheme that improves the rating of the impact each entity has on the decision fusion process. Sensitivity and specificity are used as insertion and rejection weighting metrics, respectively, instead of the overall classification accuracy proposed in previous work. For the sake of comparison, both the new and the previous weighting models, together with feature fusion models, are tested on an extensive activity recognition benchmark dataset. The results demonstrate that the new weighting scheme enhances the decision aggregation, leading to an improved recognition system.
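The asymmetric weighting idea from the abstract can be sketched in a few lines: each sensor's per-class vote is weighted by sensitivity when it accepts a class (insertion) and by specificity when it rejects one. This is a minimal illustration of the concept only; the function name, the linear score, and the example numbers are assumptions, not the paper's exact fusion rule.

```python
import numpy as np

def fuse_decisions(votes, sensitivity, specificity):
    """Aggregate per-sensor, per-class binary votes into one activity label.

    votes[m, n]       -> 1 if sensor m's classifier accepts class n, else 0
    sensitivity[m, n] -> weight applied when the classifier accepts (insertion)
    specificity[m, n] -> penalty applied when the classifier rejects (rejection)

    Hypothetical sketch of the sensitivity/specificity weighting idea.
    """
    votes = np.asarray(votes, dtype=float)
    # Accepted classes gain support scaled by sensitivity;
    # rejected classes lose support scaled by specificity.
    score = votes * sensitivity - (1.0 - votes) * specificity
    return int(np.argmax(score.sum(axis=0)))  # class with highest total support

# Two sensors, three activity classes (illustrative numbers)
votes = [[1, 0, 0],
         [1, 1, 0]]
se = np.array([[0.9, 0.6, 0.7],
               [0.8, 0.7, 0.9]])
sp = np.array([[0.8, 0.9, 0.6],
               [0.7, 0.8, 0.8]])
print(fuse_decisions(votes, se, sp))  # -> 0
```

A classifier that accepts everything (high sensitivity, low specificity) thus contributes strongly when voting for a class but weakly when vetoing one, which is the asymmetry the abstract describes.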

This presentation illustrates part of the work described in the following articles:

* Banos, O., Damas, M., Pomares, H., Rojas, F., Delgado-Marquez, B., Valenzuela, O.: Human activity recognition based on a sensor weighting hierarchical classifier. Soft Computing - A Fusion of Foundations, Methodologies and Applications, Springer Berlin / Heidelberg, vol. 17, pp. 333-343 (2013)
* Banos, O., Damas, M., Pomares, H., Rojas, I.: Activity recognition based on a multi-sensor meta-classifier. In: Proceedings of the 2013 International Work-Conference on Artificial Neural Networks (IWANN 2013), Tenerife, Spain, June 12-14 (2013)



  1. 1. Activity recognition based on a multi-sensor hierarchical classifier IWANN 2013, 12-14 June, Tenerife (Spain) Oresti Baños, Miguel Damas, Héctor Pomares and Ignacio Rojas Department of Computer Architecture and Computer Technology, CITIC-UGR, University of Granada, SPAIN DG-Research Grant #228398
  2. 2. Introduction • Activity recognition concept – “Recognize the actions and goals of one or more agents from a series of observations on the agents' actions and the environmental conditions” • Applications (among others) – eHealth (AAL, telerehabilation) – Sports (performance improvement, injury-free pose) – Industrial (assembly tasks, avoidance of risk situations) – Gaming (Kinect, Wii Mote, PlayStationMove) • Categorization by sensor modality – Ambient – On-body 2
  3. 3. Sensing Activity 3 • Ambient sensors
  4. 4. Sensing Activity • Ambient sensors Limitations*
  5. 5. 3rd Generation (and beyond…) 2nd Generation1st Generation Sensing Activity 5 • On-body sensors
  6. 6. Activity Recognition Chain (ARC) 6
  16. 16. Activity Recognition Chain (ARC) 16 SENSOR FUSION
  17. 17. ARC Fusion: Feature Fusion 17
  18. 18. ARC Fusion: Decision Fusion 18
  19. 19. Multi-Sensor Hierarchical Classifier 19 [Architecture diagram: each sensor S1…SM feeds its signal into N class-level binary classifiers C11…CMN, whose acceptance/rejection outputs are weighted by α/β; per-sensor summaries are then weighted by γ/δ at the source level and aggregated into the final fused decision]
  20. 20. Multi-Sensor Hierarchical Classifier 20 N activities M sensors&Class level Source level Fusion
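The two-level structure on these slides (class level weighted by α/β, source level weighted by γ) can be sketched as follows. The exact combination rule, the 0.5 acceptance threshold, and all variable names are assumptions made for illustration; only the symbols α, β, γ and the level structure come from the slides.

```python
import numpy as np

def hierarchical_classify(class_scores, alpha, beta, gamma):
    """Two-level fusion sketched from the slide architecture.

    class_scores[m, n] -> soft output of sensor m's binary classifier for class n
    alpha/beta[m, n]   -> class-level acceptance/rejection weights
    gamma[m]           -> source-level weight of sensor m
    All numeric details are illustrative assumptions.
    """
    accepted = class_scores >= 0.5          # assumed acceptance threshold
    # Class level: accepted classes gain alpha-weighted support,
    # rejected classes are penalized with beta-weighted rejection.
    class_level = np.where(accepted,
                           alpha * class_scores,
                           -beta * (1.0 - class_scores))
    # Source level: scale each sensor's view by its reliability gamma.
    source_level = gamma[:, None] * class_level
    return int(np.argmax(source_level.sum(axis=0)))

# M=2 sensors, N=2 activities (illustrative numbers)
scores = np.array([[0.9, 0.2],
                   [0.6, 0.8]])
a = np.ones((2, 2))
b = np.ones((2, 2))
g = np.array([1.0, 0.5])
print(hierarchical_classify(scores, a, b, g))  # -> 0
```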
  24. 24. Experimental setup: dataset • Fitness benchmark dataset • Up to 33 activities • 9 IMUs (XSENS)  ACC, GYR, MAG • 17 subjects 24 Baños, O., Toth M. A., Damas, M., Pomares, H., Rojas, I., Amft, O.: A benchmark dataset to evaluate sensor displacement in activity recognition. In: 14th International Conference on Ubiquitous Computing (Ubicomp 2012), Pittsburgh, USA, September 5-8, (2012)
  25. 25. Results • Segmentation: sliding window (6 seconds) • Feature extraction: FS1={mean}, FS2={mean,std}, FS3={mean,std,max,min,cr} • Classification: decision tree (C4.5), 10-fold cross-validated, 100 repetitions • [Bar chart: accuracy (60-100%) of Feature Fusion, Weighted Majority Voting and the Multi-Sensor Hierarchical Classifier for 10, 20 and 33 activities under FS1-FS3] 25
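The segmentation and feature-extraction setup on this slide can be sketched for a single sensor channel. The sampling rate, the non-overlapping window policy, and the reading of "cr" as mean-crossing rate are assumptions; the slide specifies only the 6 s window and the FS3 feature names.

```python
import numpy as np

def extract_features(signal, fs_hz=50, win_s=6):
    """Slide a non-overlapping 6 s window over a 1-D channel and compute
    the FS3-style feature set {mean, std, max, min, cr}.

    fs_hz (sampling rate) and the interpretation of 'cr' as mean-crossing
    rate are illustrative assumptions.
    """
    signal = np.asarray(signal, dtype=float)
    win = fs_hz * win_s
    feats = []
    for start in range(0, len(signal) - win + 1, win):
        w = signal[start:start + win]
        # Fraction of consecutive samples that cross the window mean.
        crossings = np.sum(np.diff(np.sign(w - w.mean())) != 0) / len(w)
        feats.append([w.mean(), w.std(), w.max(), w.min(), crossings])
    return np.array(feats)

# 12 s of a synthetic signal at 50 Hz -> two 6 s windows, 5 features each
sig = np.sin(np.linspace(0.0, 20.0, 600))
print(extract_features(sig).shape)  # -> (2, 5)
```

Each window's feature vector would then feed the C4.5 decision tree evaluated with 10-fold cross-validation, as stated on the slide.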
  26. 26. Conclusions • We propose a multi-sensor hierarchical classifier that allows data fusion of multiple sensors – Its asymmetric decision weighting (SE for insertions / SP for rejections) leverages the potential of each classifier for classification, rejection, or both – Especially suited for complex scenarios • Feature fusion and the MSHC perform similarly overall; however – Our method outperforms the former when a more informative feature set is used – The gain is particularly notable for complex recognition scenarios • Our model is expected to be particularly suited to dealing with sensor anomalies (work in progress) 26
  27. 27. On-going work… • Our model is expected to be particularly suited to dealing with sensor anomalies (work in progress) 27 [Bar chart: accuracy (0-100%) of FEAT-FUSION vs. MSHC under Ideal, Self and Induced conditions]
  28. 28. Thank you for your attention. Questions? Oresti Baños Legrán Dep. Computer Architecture & Computer Technology Faculty of Computer & Electrical Engineering (ETSIIT) University of Granada, Granada (SPAIN) Email: Phone: +34 958 241 516 Fax: +34 958 248 993 Work supported in part by the HPC-Europa2 project funded by the European Commission - DG Research in the Seventh Framework Programme under grant agreement no. 228398, the Spanish CICYT Project SAF2010-20558, Junta de Andalucia Project P09-TIC-175476 and the FPU Spanish grant AP2009-2244. 28