
PAMS: A new position-aware multi-sensor dataset for human activity recognition using smartphones


Smartphones are now ubiquitous in daily life. Their processing power, communication bandwidth, and memory capacity have grown considerably in recent years, and the variety of embedded sensors, such as accelerometers, gyroscopes, humidity sensors, and biosensors, opens a new horizon for self-monitoring of daily physical activities. A primary step for any research on detecting daily-life activities is testing a detection method on benchmark datasets. Most early datasets collected only a single type of sensor data, such as accelerometer readings; others did not record the age, weight, or gender of the participating subjects; and some collected data without considering the smartphone's position. In this paper, we introduce a new dataset, called Position-Aware Multi-Sensor (PAMS), which contains both accelerometer and gyroscope data. The gyroscope data boosts the accuracy of activity recognition methods and enables them to detect a wider range of activities. We also take user information into account: based on the biometric attributes of the participants, a separate model is learned to analyze their activities. We concentrate on several major activities, including sitting, standing, walking, running, ascending/descending stairs, and cycling. To evaluate the dataset, we apply various classifiers and compare the outputs to the WISDM dataset; with these classifiers, the average precision for all activities is above 88.5%. We also measure the CPU, memory, and bandwidth usage of the data collection application on the smartphone.
https://ieeexplore.ieee.org/document/8310680/


PAMS: A new position-aware multi-sensor dataset for human activity recognition using smartphones

  1. Hadi Tabatabaee Malazi, Pegah Esfehani. PAMS: A new position-aware multi-sensor dataset for human activity recognition using smartphones. Faculty of Computer Science and Engineering, GC, Shahid Beheshti University, Tehran, Iran.
  2. Why mobile sensing?
     • Smartphones are used ubiquitously: any time and anywhere.
     • The growth of CPU technology: from 2009 (ARM A8, 1 core) to 2015 (ARM A37, 4+4 cores).
     • Mobile cloud computing systems.
     • A rich set of sensors: 3-axial accelerometers, GPS, humidity sensor, temperature sensor, and compass.
     • A wide range of biosensors.
     • Battery efficient: more than 1 day of operation.
     • Ability to connect to other devices.
  3. Applications of activity recognition systems.
  4. Position awareness in activity recognition
     • Activity recognition: many human activity recognition systems have been published in recent years. These systems aim to determine the activities of a person or a group of people based on sensor and/or video observation data.
     • Context aware: activity recognition systems that also understand the user's context, including location, activities (gestures, body posture, modes of locomotion), cognitive/affective states, and the position of devices.
     • Position aware: the position of the smartphone plays a key role in the detection accuracy of various activities.
  5. Specification of HAR datasets — "Activity classification using realistic data from wearable sensors" (2006): sensor type: ambient sensors; 561 attributes; 72,272 instances.
  6. Specification of HAR datasets — "Human activity recognition on smartphones using a multiclass hardware-friendly support vector machine" (International Workshop on Ambient Assisted Living, 2012): sensor type: smartphones; 30 subjects; 561 attributes; 10,299 instances.
  7. Specification of HAR datasets — "Activity recognition using cell phone accelerometers": sensor type: smartphones; 29 subjects; 43 attributes; 4,526 instances.
  8. Specification of HAR datasets — "Activity recognition using cell phone accelerometers": sensor type: wearable sensors; 9 subjects; 120 attributes; 1,419 instances.
  9. Specification of HAR datasets — "Smart devices are different: Assessing and mitigating mobile sensing heterogeneities for activity recognition" (2015): sensor type: smartphones; 9 subjects; 16 attributes; 43,930,257 instances.
  10. Specification of HAR datasets — "Complex human activity recognition using smartphone and wrist-worn motion sensors": sensor type: smartphones; 10 subjects; 7 attributes; 4,658 instances.
  11. Specification of HAR datasets — "Position-aware activity recognition with wearable devices": sensor type: smartphones and smartwatch; 16 subjects; 13 attributes; 70,782 instances.
  12. Specification of HAR datasets — "PAMS: A new position-aware multi-sensor dataset for human activity recognition using smartphones" (2017): sensor type: smartphones; 20 subjects; 120 attributes; 51,300 instances.
  13. The overview of the PAMS dataset collection process:
     • 3D accelerometer and gyroscope data are gathered from the smartphone.
     • The data is collected in segments of 10 readings, and each segment is sent to cloud storage.
     • Rotation around each axis and other information are extracted.
     • Weka tools are used to create the learning model and predict the activity.
     A minimal collection-side sketch follows below.
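To make the collection step concrete, here is a minimal Android sketch, assuming a plain `SensorEventListener` that buffers accelerometer and gyroscope readings into 10-sample segments. Everything except the Android sensor APIs (the class name, the `uploadSegment` hook, the `SENSOR_DELAY_GAME` rate) is a hypothetical assumption; the real PAMS app may batch and upload differently.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

import java.util.ArrayList;
import java.util.List;

public class PamsCollector implements SensorEventListener {
    private static final int SEGMENT_SIZE = 10;          // readings per segment, per the slide
    private final SensorManager sensorManager;
    private final List<float[]> segment = new ArrayList<>();

    public PamsCollector(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    }

    public void start() {
        // Register for the two sensor types PAMS records.
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Buffer each reading; flush a segment every SEGMENT_SIZE readings.
        segment.add(event.values.clone());
        if (segment.size() == SEGMENT_SIZE) {
            uploadSegment(new ArrayList<>(segment));
            segment.clear();
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not used */ }

    private void uploadSegment(List<float[]> readings) {
        // Hypothetical hook: the slides say each segment is sent to cloud storage.
    }
}
```

The sampling rate is not stated in the slides, so the delay constant above is arbitrary.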
  14. Android application: besides collecting sensor data, the app is profiled for memory usage, CPU usage, and network usage.
  15. General information
     • Bio information data: 20 participants, aged 21 to 89 years; height, weight, age, and gender are recorded.
     • Sensor information data: devices include the Samsung Galaxy S4 and S6, Samsung Galaxy Note 5, and others; sensor max range, sensor resolution, sensor vendor, sensor version, and sensor name are logged.
     • Sensory data: segmented raw data and featured data.
  16. Activity information
     • Walking activity: each iteration in the signal indicates a walking step when the phone is on the dominant thigh.
     • Sitting activity: the value of accelerometer Z is close to earth gravity.
     • Running activity: the frequency of the pattern is higher compared to walking.
     A toy illustration of the sitting signature is sketched below.
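As a toy illustration of the sitting signature (not the paper's detector), a window can be flagged as sitting-like when its mean Z acceleration stays near earth gravity; the 9.81 m/s² constant and the 1.0 m/s² tolerance below are assumptions.

```java
public final class SittingHeuristic {
    private static final double GRAVITY = 9.81;   // m/s^2, standard gravity (assumption)
    private static final double TOLERANCE = 1.0;  // m/s^2 slack around gravity (assumption)

    /** Returns true when the mean Z acceleration of a window is close to earth gravity. */
    public static boolean looksLikeSitting(double[] accelZ) {
        double sum = 0.0;
        for (double z : accelZ) sum += z;
        double mean = sum / accelZ.length;
        return Math.abs(mean - GRAVITY) < TOLERANCE;
    }
}
```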
  17. What activity? Distribution of collected activities: walking 31.83%, sitting 24.74%, running 24.56%, standing 14.82%, ascending stairs 9.58%, descending stairs 7.56%, cycling 6.20%, talking on mobile 5.26%. Activities are grouped by metabolic equivalent (MET): low-level activity (MET < 3), medium-level activity (3 < MET < 6), high-level activity (MET > 6).
  18. Feature extraction: non-overlapping size-based windowing with 10 instances per window. Extracted features:
     1. Rotation values: roll, pitch, and yaw (in radians)
     2. Centroid for each axis
     3. Magnitude for each segment
     4. Density for each segment
     5. Standard deviation for each axis
     6. Absolute difference
     7. Mean magnitude
     8. Max value for each axis
     A sketch of some of these window features follows below.
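A hedged sketch of a few of the window features named above, assuming 3-axis windows of 10 samples. The roll/pitch formulas are the standard accelerometer tilt estimates and are an assumption, since the slide only names the features.

```java
public final class WindowFeatures {
    /** Per-axis mean (the slide's "centroid for each axis"); window[i] = {x, y, z}. */
    public static double[] centroid(double[][] window) {
        double[] c = new double[3];
        for (double[] sample : window)
            for (int a = 0; a < 3; a++) c[a] += sample[a];
        for (int a = 0; a < 3; a++) c[a] /= window.length;
        return c;
    }

    /** Per-axis standard deviation over the window. */
    public static double[] stdDev(double[][] window) {
        double[] c = centroid(window);
        double[] s = new double[3];
        for (double[] sample : window)
            for (int a = 0; a < 3; a++) s[a] += Math.pow(sample[a] - c[a], 2);
        for (int a = 0; a < 3; a++) s[a] = Math.sqrt(s[a] / window.length);
        return s;
    }

    /** Mean magnitude of the 3-axis vector over the window. */
    public static double meanMagnitude(double[][] window) {
        double sum = 0.0;
        for (double[] s : window)
            sum += Math.sqrt(s[0] * s[0] + s[1] * s[1] + s[2] * s[2]);
        return sum / window.length;
    }

    /** Roll and pitch (radians) from the mean gravity direction — standard tilt estimate (assumption). */
    public static double[] rollPitch(double[][] accelWindow) {
        double[] g = centroid(accelWindow);
        double roll = Math.atan2(g[1], g[2]);
        double pitch = Math.atan2(-g[0], Math.sqrt(g[1] * g[1] + g[2] * g[2]));
        return new double[] {roll, pitch};
    }
}
```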
  19. Classification results (precision, %)
     • 74.4% of the dataset instances are for the non-dominant hand.
     • Non-iterative activities, such as sitting and standing, have the best precision in all classifiers due to their steady pattern.
     • In most of the iterative activities, the Bayes network has the weakest performance; among all classifiers, random forest has the best average precision.

     | Activity | Random Forest | KNN | Multilayer Perceptron | Bagging | LogitBoost | Bayes Network |
     |---|---|---|---|---|---|---|
     | Sitting | 99.7 | 97.9 | 99.1 | 99.6 | 99.4 | 99.7 |
     | Standing | 99.6 | 94.7 | 99.1 | 97.9 | 99.3 | 100 |
     | Running | 87.1 | 98.4 | 90.7 | 88.7 | 83.0 | 77.9 |
     | Walking | 94.5 | 89.3 | 95.4 | 93.1 | 93.5 | 95.0 |
     | Cycling | 89.8 | 94.8 | 92.6 | 86.8 | 87.2 | 79.6 |
     | Descending stairs | 82.5 | 81.9 | 75.8 | 73.2 | 68.4 | 43.4 |
     | Ascending stairs | 89.0 | 83.5 | 83.0 | 80.2 | 76.1 | 77.7 |
     | Overall | 91.74 | 91.5 | 90.81 | 88.5 | 86.7 | 81.9 |
  20. Classification results, continued (precision, %)
     • 25.5% of the dataset instances are for the non-dominant hand.
     • Multilayer perceptron and random forest have the best precisions and F-measures among the classifiers. It is worth pointing out that the difference between the best and the worst classifier is just a few percent.

     | Activity | Random Forest | KNN | Multilayer Perceptron | Bagging | LogitBoost | Bayes Network |
     |---|---|---|---|---|---|---|
     | Sitting | 99.5 | 99.5 | 99.0 | 99.5 | 99.5 | 99.5 |
     | Standing | 99.7 | 97.2 | 99.7 | 98.3 | 99.7 | 100 |
     | Running | 98.5 | 100 | 100 | 97.8 | 99.2 | 94.3 |
     | Walking | 99.7 | 99.3 | 99.4 | 97.8 | 99.0 | 96.3 |
     | Talking on the phone | 100 | 100 | 100 | 100 | 99.4 | 99.1 |
     | Overall | 99.48 | 99.20 | 99.62 | 98.68 | 99.36 | 97.84 |

     A sketch of reproducing this kind of evaluation with Weka follows below.
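The slides say the models were built with Weka tools; here is a minimal sketch of this kind of per-class evaluation with the Weka Java API, assuming the extracted features are stored in an ARFF file (the file name `pams_features.arff` is hypothetical, and the slides do not state the exact evaluation protocol, so 10-fold cross-validation is an assumption).

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EvaluatePams {
    public static void main(String[] args) throws Exception {
        // Load the feature vectors; the last attribute is assumed to be the activity label.
        Instances data = DataSource.read("pams_features.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Cross-validate random forest, one of the classifiers used in the slides.
        RandomForest rf = new RandomForest();
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(rf, data, 10, new Random(1));

        // Per-class precision, recall, and F-measure, as reported in the tables above.
        System.out.println(eval.toClassDetailsString());
    }
}
```

Swapping `RandomForest` for `weka.classifiers.lazy.IBk` (KNN) or `weka.classifiers.functions.MultilayerPerceptron` reproduces the other columns.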
  21. Comparison and evaluation
     • We compared our dataset, PAMS, with WISDM using the random forest classifier. The major advantage is achieved in the ascending stairs activity with a 12.3% gain, and the lowest improvement belongs to the sitting activity with a 1% gain. The non-iterative activities, which have the least improvement over WISDM, are also accurate.

     | Activity | WISDM (Random Forest) | PAMS (Random Forest) |
     |---|---|---|
     | Standing | 96.30% | 99.60% |
     | Sitting | 98.70% | 99.70% |
     | Walking | 87.30% | 94.50% |
     | Descending stairs | 72.20% | 82.50% |
     | Ascending stairs | 76.70% | 89.00% |

     • Since the multilayer perceptron classifier has the highest precision on WISDM, we also compared the datasets using that classifier. The standing activity with 7.1% and the descending stairs activity with 31.5% have the lowest and highest rise, respectively.

     | Activity | WISDM (Multilayer Perceptron) | PAMS (Multilayer Perceptron) |
     |---|---|---|
     | Standing | 91.90% | 99.10% |
     | Sitting | 95.00% | 99.10% |
     | Walking | 91.70% | 95.40% |
     | Descending stairs | 44.30% | 75.80% |
     | Ascending stairs | 61.50% | 83.00% |
  22. Dataset profiles (pie chart shares: 14%, 24%, and 62%)

     | Gender | Age (avg) | Height (avg, cm) | Weight (avg, kg) |
     |---|---|---|---|
     | Male | 73 | 181 | 74.66 |
     | Male | 30.2 | 177.2 | 79.8 |
     | Female | 33.7 | 164.2 | 63.54 |
  23. Profiler results: we used the random forest classifier for each profile and compared it with the results achieved on the total dataset. The average precision is above 90% for every profile, with improvements for the second and third profiles (P1 and P2) over the total-dataset model. A per-profile training sketch follows below.

     | Activity | P0 | P1 | P2 | Overall |
     |---|---|---|---|---|
     | Sitting | 100 | 100 | 99.8 | 99.7 |
     | Standing | 98.3 | 100 | 100 | 99.6 |
     | Running | 90.2 | 89.8 | 90.4 | 87.1 |
     | Walking | 87.1 | 90.4 | 96.3 | 94.5 |
     | Cycling | 0 | 96.8 | 93.4 | 89.8 |
     | Descending stairs | 88 | 76.6 | 84.8 | 82.5 |
     | Ascending stairs | 83.8 | 91.5 | 86.8 | 89.0 |
     | Overall | 91.23 | 92.15 | 93.07 | 91.74 |
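A hedged sketch of the per-profile modelling idea, assuming the feature file carries a nominal `profile` attribute (a hypothetical name, as is the file name) that can be used to split the data before training one random forest per biometric profile.

```java
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.instance.RemoveWithValues;

public class PerProfileModels {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("pams_features.arff"); // hypothetical file name
        data.setClassIndex(data.numAttributes() - 1);
        int profileAttr = data.attribute("profile").index();    // hypothetical attribute name

        // Train one random forest per biometric profile (P0, P1, P2).
        for (int p = 0; p < data.attribute(profileAttr).numValues(); p++) {
            RemoveWithValues keepProfile = new RemoveWithValues();
            keepProfile.setAttributeIndex(String.valueOf(profileAttr + 1)); // Weka uses 1-based indices
            keepProfile.setNominalIndices(String.valueOf(p + 1));
            keepProfile.setInvertSelection(true);                // keep, rather than drop, profile p
            keepProfile.setInputFormat(data);
            Instances subset = Filter.useFilter(data, keepProfile);

            RandomForest rf = new RandomForest();
            rf.buildClassifier(subset);
            System.out.println("Profile " + p + ": trained on " + subset.numInstances() + " instances");
        }
    }
}
```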
  24. Conclusion
     • We introduced a new dataset (PAMS) that acquires data from multiple smartphone sensors.
     • We analyzed the performance of different classifiers on the dataset and found that random forest, KNN, and multilayer perceptron achieve promising results compared to the other classifiers.
     • We compared the precision of our dataset with WISDM using the random forest and multilayer perceptron classifiers. The results showed improvements in detecting all the activities.
