M1 GP presentation, 2012-05-11 (ver. 2)



  1. PBN: Towards Practical Activity Recognition Using Smartphone-based Body Sensor Networks
     Original authors: Matt Keally, Gang Zhou, Guoliang Xing¹, Jianxin Wu², and Andrew Pyles (College of William and Mary; ¹Michigan State University; ²Nanyang Technological University). SenSys 2011.
     Presented by Nakagawa Keijiro, The University of Tokyo, Sezaki Lab.
  2. Outline
     - Related Work
     - Proposal
     - Abstract
     - Experimentation
     - The most important point
     - AdaBoost
     - Sensor Selection
     - Results
     - Future
  3. Related Work: Body Sensor Networks
     - Athletic Performance
     - Health Care
     - Activity Recognition
     [Diagram: on-body sensors connected over a LAN.]
  4. Related Work
     - No mobile or on-body aggregator: (Ganti, MobiSys '06), (Lorincz, SenSys '09), (Zappi, EWSN '08)
     - Use of backend servers: (Miluzzo, MobiSys '10), (Miluzzo, SenSys '08)
     - Single sensor modality or separate classifier per modality: (Azizyan, MobiCom '09), (Kim, SenSys '10), (Lu, SenSys '10)
     - Do not provide online training: (Wang, MobiSys '09), (Rachuri, UbiComp '10)
  5. Proposal: PBN (Practical Body Networking)
  6. Abstract
     - Activity recognition
     - Hardware: TinyOS + Android (connected over USB)
     - Software: Android application
  7. Abstract
     [PBN architecture diagram: sensor nodes sample data and send aggregated data over 802.15.4 to a base station node attached by USB to the phone; the phone runs the TinyOS comm. stack, GUI, ground truth management, sensor selection, retraining detection, and activity classification.]
  8. Experimentation
     - 2 subjects, 2 weeks
     - Android phone: 3-axis accelerometer, WiFi/GPS localization
     - 5 IRIS sensor motes: 2-axis accelerometer, light, temperature, acoustic, RSSI
     - Node placement: 0 BS/Phone, 1 L. Wrist, 2 R. Wrist, 3 L. Ankle, 4 R. Ankle, 5 Head
  9. Abstract
     [Same PBN architecture diagram as slide 7.]
  10. The most important point
     - AdaBoost (Activity Classification)
     - Sensor Selection
  11. AdaBoost
     Ensemble Learning: AdaBoost.M2 (Freund, JCSS '97)
     - Lightweight and accurate
     - Maximizes training accuracy for all activities
     - Many classifiers (GMM, HMM) are more demanding
     Iteratively train an ensemble of weak classifiers:
     - Training observations are weighted by misclassifications
     - At each iteration:
       - Train a Naive Bayes classifier for each sensor
       - Choose the classifier with the least weighted error
       - Update the observation weights
     The ensemble makes decisions based on the weighted decisions of each weak classifier.
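     As a rough illustration of the last point, the ensemble decision can be read as a weighted vote over the weak classifiers' outputs. A minimal Python sketch (hypothetical names and data structures, not the authors' implementation):

```python
def ensemble_decide(weak_classifiers, alphas, observation, activities):
    """Weighted vote: each weak classifier contributes its per-activity
    probabilities, scaled by the weight it earned during boosting."""
    scores = {a: 0.0 for a in activities}
    for clf, alpha in zip(weak_classifiers, alphas):
        probs = clf(observation)           # assumed: returns {activity: probability}
        for a in activities:
            scores[a] += alpha * probs.get(a, 0.0)
    return max(scores, key=scores.get)     # activity with the highest weighted score
```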
  12. Sensor Selection
     - Identify both helpful and redundant sensors
     - Train fewer weak classifiers per AdaBoost iteration
     - Bonus: use even fewer sensors
     [Diagram: raw data correlation between sensors.]
  13. Sensor Selection
     Goal: determine the sensors that AdaBoost chooses, using correlation.
     Find the correlation of each pair of sensors selected by AdaBoost; use the average correlation as a threshold for choosing sensors.
     [Diagram: AdaBoost splits the full sensor set (2,MIC; 3,TEMP; 1,ACC; 3,MIC; 1,LIGHT; 2,TEMP) into selected and unused sensors.]
  14. Sensor Selection
     [Same diagram; first pairwise correlation highlighted: correlation(2,TEMP; 1,LIGHT).]
  15. Sensor Selection
     [Same diagram; next pairwise correlation highlighted: correlation(3,MIC; 3,TEMP).]
  16. Sensor Selection
     [Same diagram; the remaining pairs follow. Set the threshold α based on the average correlation: α = μcorr + σcorr.]
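     A small numpy sketch of the threshold step on slides 13-16, assuming each AdaBoost-selected sensor's training data is available as a 1-D array (illustrative only; sensor keys such as "2,TEMP" are just labels):

```python
import itertools
import numpy as np

def correlation_threshold(selected_obs, n=1.0):
    """alpha = mean + n * std of the absolute pairwise correlations
    among the sensors that AdaBoost selected (n = 1 matches slide 16)."""
    corrs = [abs(np.corrcoef(a, b)[0, 1])
             for a, b in itertools.combinations(selected_obs.values(), 2)]
    return float(np.mean(corrs) + n * np.std(corrs))

# e.g. alpha = correlation_threshold({"2,TEMP": t2, "1,LIGHT": l1, "3,MIC": m3})
```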
  17. Sensor Selection
     [Diagram: building the selected set from all sensors. correlation(1,ACC; 1,LIGHT) ≤ α, so both sensors are added to the selected set.]
  18. Sensor Selection
     [correlation(2,TEMP; 1,ACC) > α and acc(2,TEMP) > acc(1,ACC), so 2,TEMP is selected and 1,ACC is marked unused.]
  19. Sensor Selection
     [correlation(1,ACC; 3,TEMP) ≤ α; 1,ACC is already marked unused, so only 3,TEMP is added to the selected set.]
  20. Sensor Selection
     [Final state: 1,LIGHT, 2,TEMP, and 3,TEMP are selected as input to AdaBoost; 1,ACC is unused.]
  21. Results
     With "AdaBoost+SS" (sensor selection), excluding redundant sensors improves accuracy by about 15% over the other approaches.
  22. Conclusion
     - PBN: Towards practical BSN daily activity recognition
     - PBN provides:
       - Strong classification performance
       - Retraining detection to reduce invasiveness
       - Identification of redundant resources
     - Future Work:
       - Extensive usability study
       - Improve phone energy usage
  23. Thank you for listening.
  24. Appendix
  25. Kullback–Leibler divergence
     A measure, in probability theory and information theory, of the difference between two probability distributions.
     When P and Q are discrete probability distributions, the KL divergence of P with respect to Q is defined as:
       D_KL(P ‖ Q) = Σ_i P(i) log( P(i) / Q(i) )
     where P(i) and Q(i) are the probabilities that a value drawn according to P or Q equals i.
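     A short Python sketch of the discrete definition above (P and Q as arrays over the same support; terms with P(i) = 0 are taken as 0, and Q is assumed nonzero where P is nonzero):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # 0 * log(0 / q) is treated as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# e.g. kl_divergence([0.5, 0.5], [0.9, 0.1]) ≈ 0.51 (natural log)
```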
  26. AdaBoost training algorithm
     Input: max iterations T; training observation vector O_j for each sensor s_j ∈ S; ground truth labels for the observations
     Output: set of weak classifiers H
     Initialize observation weights D_1 to 1/n for all observations
     for t = 1 to T do
       for each sensor s_j ∈ S do
         Train weak classifier h_{t,j} using observations O_j and weights D_t
         Get weighted error ε_{t,j} for h_{t,j} using the labels [8]
       end for
       Add the h_{t,j} with least error ε_t to H
       Set D_{t+1} using D_t and the misclassifications made by h_t [8]
     end for
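     A simplified Python sketch of the loop above, using Gaussian Naive Bayes (scikit-learn) as the per-sensor weak learner. It follows the classic AdaBoost weight update rather than the full AdaBoost.M2 per-label weighting, so it is an illustrative sketch of the structure, not a faithful reimplementation of the paper's algorithm:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

def train_ensemble(sensor_obs, labels, T=100):
    """sensor_obs: dict sensor_id -> (n_obs, n_features) array of that sensor's
    training observations; labels: (n_obs,) ground-truth activity labels."""
    n = len(labels)
    D = np.full(n, 1.0 / n)                       # observation weights D_1 = 1/n
    H = []                                        # ensemble of (sensor, classifier, weight)
    for t in range(T):
        best = None
        for s, X in sensor_obs.items():           # one weak classifier per sensor
            clf = GaussianNB().fit(X, labels, sample_weight=D)
            miss = clf.predict(X) != labels
            err = float(np.sum(D[miss]))          # weighted error of this classifier
            if best is None or err < best[0]:
                best = (err, s, clf, miss)
        err, s, clf, miss = best                  # keep the least-error classifier
        err = float(np.clip(err, 1e-10, 1 - 1e-10))
        beta = err / (1.0 - err)
        H.append((s, clf, np.log(1.0 / beta)))    # classifier weight in the ensemble
        D[~miss] *= beta                          # down-weight correctly classified obs
        D /= D.sum()                              # renormalize to get D_{t+1}
    return H
```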
  27. Raw correlation threshold for sensor selection
     Input: set of sensors S selected by AdaBoost; training observations O for all sensors; multiplier n
     Output: sensor selection threshold α
     R = ∅   // set of correlation coefficients
     for all combinations of sensors s_i and s_j in S do
       Compute correlation coefficient r = |r_{O_i,O_j}|
       R = R ∪ {r}
     end for
     // compute threshold as avg + (n * std. dev.) of R
     α = μ_R + n·σ_R
  28. Sensor selection using raw correlation
     Input: set of all sensors S; training observations O for all sensors; threshold α
     Output: selected sensors S* to give as input to AdaBoost
     S* = ∅; E = ∅   // E is the set of sensors we exclude
     for all combinations of sensors s_i and s_j in S do
       Compute correlation coefficient r = |r_{O_i,O_j}|
       if r < α then
         if s_i ∉ E then S* = S* ∪ {s_i}
         if s_j ∉ E then S* = S* ∪ {s_j}
       else if r ≥ α and acc(s_i) > acc(s_j) then
         // use accuracy to decide which to add to S*
         if s_i ∉ E then S* = S* ∪ {s_i}
         E = E ∪ {s_j}; S* = S* \ {s_j}
       else
         if s_j ∉ E then S* = S* ∪ {s_j}
         E = E ∪ {s_i}; S* = S* \ {s_i}
       end if
     end for
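     The same selection rule as a Python sketch (acc() from the pseudocode is assumed to be a per-sensor training accuracy supplied by the caller; illustrative only):

```python
import itertools
import numpy as np

def select_sensors(obs, acc, alpha):
    """obs: dict sensor_id -> 1-D training array; acc: dict sensor_id -> accuracy;
    alpha: threshold from the raw-correlation-threshold algorithm above."""
    selected, excluded = set(), set()
    for a, b in itertools.combinations(obs, 2):
        r = abs(np.corrcoef(obs[a], obs[b])[0, 1])
        if r < alpha:                              # weakly correlated: keep both
            for s in (a, b):
                if s not in excluded:
                    selected.add(s)
        else:                                      # redundant pair: keep the more accurate one
            keep, drop = (a, b) if acc[a] > acc[b] else (b, a)
            if keep not in excluded:
                selected.add(keep)
            excluded.add(drop)
            selected.discard(drop)
    return selected                                # S* to feed to AdaBoost
```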
  29. Personal Sensing Applications
     - Body Sensor Networks
       - Athletic Performance
       - Health Care
       - Activity Recognition
     [Diagram: heart rate monitor and pulse oximeter reporting to a mobile phone aggregator.]
  30. A Practical Solution to Activity Recognition
     - Portable
     - Entirely user controlled
     - Computationally lightweight
     - Accurate
     [Diagram: complementary strengths of on-body sensors and the phone: +Sensing Accuracy, +User Interface, +Energy Efficiency, +Computational Power, +Additional Sensors.]
  31. Challenges to Practical Activity Recognition
     - User-friendly: hardware configuration, software configuration
     - Accurate classification: classify difficult activities in the presence of dynamics
     - Efficient classification: computation and energy efficiency
     - Less reliance on ground truth: labeling sensor data is invasive
  32. PBN: Practical Body Networking
     - TinyOS-based motes + Android phone
     - Lightweight activity recognition appropriate for motes and phones
     - Retraining detection to reduce invasiveness
     - Identify redundant sensors to reduce training costs
     - Classify difficult activities with nearly 90% accuracy
  33. Hardware: TinyOS + Android
     - IRIS on-body motes, TelosB base station, G1 phone
     - Enable USB host mode support in the Android kernel
     - Android device manager modifications
     - TinyOS JNI compiled for Android
  34. Software: Android Application
     [Screenshots: sensor configuration, runtime control and feedback, ground truth logging.]
  35. Data Collection Setup
     - Classify typical daily activities, postures, and environment
     - Previous work (Lester et al.) identifies some activities as hard to classify
     - Classification categories:
       Environment: Indoors, Outdoors
       Posture: Cycling, Lying Down, Sitting, Standing, Walking
       Activity: Cleaning, Cycling, Driving, Eating, Meeting, Reading, Walking, Watching TV, Working
  36. PBN Architecture
     [Same PBN architecture diagram as slide 7.]
  37. Retraining Detection
     - Body sensor network dynamics affect accuracy during runtime:
       - Changing physical location
       - User biomechanics
       - Variable sensor orientation
       - Background noise
     - How to detect that retraining is needed without asking for ground truth?
       - Constantly nagging the user for ground truth is annoying
       - Perform with limited initial training data
       - Maintain high accuracy
  38. Retraining Detection
     - Measure the discriminative power of each sensor with K-L divergence:
       - Quantify the difference between sensor reading distributions
     - Retraining detection with K-L divergence:
       - Compare training data to runtime data for each sensor
  39. Retraining Detection
     - Training: compute "one vs. rest" K-L divergence for each sensor and activity
     [Example: ground truth activities Walking, Driving, Working. For each sensor (1,LIGHT; 1,ACC; 2,MIC; ...), partition the training data and compute D_KL(T_walking, T_other), i.e. the Walking training distribution vs. the {Driving, Working} training distribution.]
  40. Retraining Detection
     - Runtime: at each interval, sensors compare runtime data to training data for the current classified activity
     [Example: current AdaBoost classified activity is Walking. For each sensor, compute D_KL(R_walking, T_walking), i.e. the Walking runtime distribution vs. the Walking training distribution.]
  41. Retraining Detection
     - Runtime: at each interval, sensors compare runtime data to training data for the current classified activity
     - Each individual sensor determines retraining is needed when:
       D_KL(R_walking, T_walking) > D_KL(T_walking, T_other)
       (intra-activity divergence exceeds inter-activity divergence)
  42. Retraining Detection
     - Runtime: at each interval, sensors compare runtime data to training data for the current classified activity
     - Each individual sensor determines whether retraining is needed
     - The ensemble retrains when a weighted majority of sensors demand retraining
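     A compact sketch of the retraining trigger on slides 38-42, assuming the K-L divergences are computed over discretized sensor-reading distributions as in the appendix, and that each sensor's vote is weighted by the AdaBoost weight of its weak classifiers (the exact weighting scheme here is an assumption):

```python
def sensor_wants_retraining(d_runtime_vs_train, d_train_vs_other):
    """A sensor demands retraining when its intra-activity divergence
    D_KL(R_activity, T_activity) exceeds its inter-activity divergence
    D_KL(T_activity, T_other)."""
    return d_runtime_vs_train > d_train_vs_other

def ensemble_wants_retraining(votes, sensor_weights):
    """votes: dict sensor_id -> bool from sensor_wants_retraining;
    sensor_weights: dict sensor_id -> weight.
    The ensemble retrains when a weighted majority of sensors demand it."""
    total = sum(sensor_weights.values())
    demanding = sum(w for s, w in sensor_weights.items() if votes.get(s, False))
    return demanding > 0.5 * total
```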
  43. Ground Truth Management
     - Retraining: how much new labeled data to collect?
       - Capture changes in BSN dynamics
       - Too much labeling is intrusive
     - Balance the number of observations per activity
       - Loose balance hurts classification accuracy
       - Restrictive balance prevents adding new data
     - Balance multiplier: each activity has no more than δ times the average
     - Balance enforcement: random replacement
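     A small sketch of the balance rule, assuming labeled observations are buffered per activity and δ caps each activity at δ times the per-activity average (the δ value and the exact random-replacement policy shown are assumptions for illustration):

```python
import random

def add_labeled_observation(buffers, activity, obs, delta=1.5):
    """buffers: dict activity -> list of labeled observations.
    No activity may hold more than delta times the average count per activity;
    when the cap is exceeded, drop a random observation (random replacement)."""
    buffers.setdefault(activity, []).append(obs)
    avg = sum(len(v) for v in buffers.values()) / len(buffers)
    cap = max(int(delta * avg), 1)
    bucket = buffers[activity]
    while len(bucket) > cap:
        bucket.pop(random.randrange(len(bucket)))
    return buffers
```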
  44. Sensor Selection
     - AdaBoost training can be computationally demanding:
       - Train a weak classifier for each sensor at each iteration
       - > 100 iterations to achieve maximum accuracy
     - Can we give only the most helpful sensors to AdaBoost?
       - Identify both helpful and redundant sensors
       - Train fewer weak classifiers per AdaBoost iteration
       - Bonus: use even fewer sensors
  45. Sensor Selection
     Choosing sensors with slight correlation yields the highest accuracy.
  46. Evaluation Setup
     - Classify typical daily activities, postures, and environment
     - 2 subjects over 2 weeks
     - Classification categories:
       Environment: Indoors, Outdoors
       Posture: Cycling, Lying Down, Sitting, Standing, Walking
       Activity: Cleaning, Cycling, Driving, Eating, Meeting, Reading, Walking, Watching TV, Working
  47. Classification Performance
  48. Classification Performance
  49. Classification Performance
  50. Retraining Performance
  51. Sensor Selection Performance
  52. Application Performance
     - Power benchmarks
     - Training overhead
     - Battery life
     Mode                       | CPU  | Memory  | Power
     Idle (No PBN)              | <1%  | 4.30 MB | 360.59 mW
     Sampling (WiFi)            | 19%  | 8.16 MB | 517.74 mW
     Sampling (GPS)             | 21%  | 8.47 MB | 711.74 mW
     Sampling (WiFi) + Train    | 100% | 9.48 MB | 601.02 mW
     Sampling (WiFi) + Classify | 21%  | 9.45 MB | 513.57 mW
