Robust Expert Systems for more Flexible Real-World Activity Recognition

Granada, Friday, April 25, 2014
Presented by: Oresti Baños
Supervised by: Miguel Damas, Héctor Pomares and Ignacio Rojas
Department of Computer Architecture and Computer Technology,
CITIC-UGR, University of Granada, SPAIN
Human Activity
• Why is identifying human activity interesting?
• Application areas: health, abnormal behavior detection, proactive assistance, labour risk prevention, wellness, sports, gaming
Activity Recognition (AR)
• Activity recognition concept:
  "Recognize the actions and goals of one or more agents from a series of observations on the agents' actions and the environmental conditions"
• Activity recognition process: Phenomena (human activity, body motion) → Measurement (sensing: ambient/wearables) → Processing (data conditioning and knowledge inference) → Recognized activity
Wearable Activity Recognition
• Wearable activity recognition systems are ready!
  – "The first system capable of fully recognizing your daily routine." AtlasWearables (2014)
  – "The simplest way to understand your day and night." Jawbone Up (2014)
  – "The best activity tracker on the market." Fitbit Force (2014)
  – "The device that tracks your active life and measures all kinds of activities." Nike Fuel (2014)
• But… do wearable activity recognition systems meet people's expectations?
Challenges for Real-World Activity Recognition
• Actively investigated:
  – Reliability
  – Simplicity
  – Latency
• Barely addressed:
  – Privacy
  – Fault-tolerance
  – Usability
  – Unobtrusiveness
  – Fashionability
  – Self-configuration
  – Auto-adaptation
  – Evolvability
Thesis Motivation and Objectives
• Motivation:
  "Create more advanced systems capable of handling real-world AR issues, as well as incorporating more intelligent capabilities to transform experimental prototypes into actual usable applications"
• Objectives:
  – O1: "Investigate the tolerance of standard AR systems to unforeseen sensor failures and faults, as well as contribute with an alternate approach to cope with these technological anomalies" → Fault-tolerance
  – O2: "Research the robustness of standard AR systems to unforeseen variations in the sensor deployment, as well as contribute with an alternate approach to cope with these practical anomalies" → Usability, unobtrusiveness
  – O3: "Study the capacity of standard AR systems to support unforeseen changes in the sensor network, as well as contribute with an alternate approach to cope with these topological variations" → Self-configuration, auto-adaptation, evolvability
Activity Recognition Process
• Phenomena (human activity, body motion) → Measurement (sensing: ambient/wearables) → Processing (data conditioning and knowledge inference) → Recognized activity
• How does it work exactly?
Activity Recognition Chain (ARC)
• The activity recognition process is realized as the Activity Recognition Chain (ARC)
[Figure: the stages of the Activity Recognition Chain, built up step by step]
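To make the chain concrete, the following is a minimal sketch of a single-sensor ARC in Python. It assumes a one-dimensional stream of acceleration samples with one activity label per window; the sampling rate, feature set and KNN classifier mirror the experimental setups used later in the thesis, but all names and parameter values are illustrative rather than the thesis implementation.

```python
# Minimal sketch of a single-sensor Activity Recognition Chain (SARC).
# Assumes a 1D array of acceleration samples and one activity label per window;
# the sampling rate, window length, features and classifier are assumptions.
import numpy as np
from scipy.stats import kurtosis
from sklearn.neighbors import KNeighborsClassifier

FS = 50              # assumed sampling rate (Hz)
WINDOW = 6 * FS      # 6-second sliding window, as in the experimental setup

def segment(signal, window=WINDOW):
    """Split a signal into non-overlapping windows of fixed length."""
    n = len(signal) // window
    return signal[:n * window].reshape(n, window)

def features(windows):
    """Per-window statistical features: mean, std, kurtosis, mean-crossing rate."""
    mean = windows.mean(axis=1)
    std = windows.std(axis=1)
    kurt = kurtosis(windows, axis=1)
    centered = windows - mean[:, None]
    mcr = (np.diff(np.sign(centered), axis=1) != 0).mean(axis=1)
    return np.column_stack([mean, std, kurt, mcr])

def train_arc(signal, labels):
    """Train the reasoning stage of the chain (labels: one per window)."""
    X = features(segment(signal))
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(X, labels)
    return clf

def recognize(clf, signal):
    """Run new sensor data through the same chain to obtain activity decisions."""
    return clf.predict(features(segment(signal)))
```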
Tolerance of AR Systems to Sensor Faults and Failures
Objective: "Investigate the tolerance of standard AR systems to unforeseen sensor failures and faults, as well as contribute with an alternate approach to cope with these technological anomalies"
Problem Statement
• Are standard activity recognition systems prepared to cope with sensor technological anomalies?
• Is it possible to keep the systems functioning under the effects of sensor errors?
[Figure: sensor errors and their signal effects propagating through the activity recognition process — body motion sensing → signal processing and reasoning → recognition of activities]
Sensor Technological Anomalies
• Faults: overheating, environmental changes, decalibration
• Failures: outages, breakdowns, disconnection, battery depletion
Sensor Technological Anomalies in AR: Related Work
• Detection of sensor anomalies
  – Sensor query (Rost06)
  – Neighborhood data correlation
    • Signal level (Yao10)
    • Feature level (Ramanathan09)
    • Reasoning (Rajasegarar07, Ganeriwal08)
• Counteraction of sensor anomalies
  – Data imputation (Uchida13)
  – Sensor fusion (Sagha13)

S. Rost and H. Balakrishnan. Memento: A health monitoring system for wireless sensor networks. In 3rd Annual IEEE Communications Society Conference on Sensor and Ad Hoc Communications and Networks, volume 2, pp. 575-584, 2006.
Y. Yao, A. Sharma, L. Golubchik, and R. Govindan. Online anomaly detection for sensor systems: A simple and efficient approach. Performance Evaluation, 67(11):1059-1075, November 2010.
N. Ramanathan, T. Schoellhammer, E. Kohler, K. Whitehouse, T. Harmon, and D. Estrin. Suelo: human-assisted sensing for exploratory soil monitoring studies. In Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems, pp. 197-210, 2009.
S. Rajasegarar, C. Leckie, M. Palaniswami, and J. C. Bezdek. Quarter sphere based distributed anomaly detection in wireless sensor networks. In IEEE International Conference on Communications, pp. 3864-3869, June 2007.
S. Ganeriwal, L. K. Balzano, and M. B. Srivastava. Reputation-based framework for high integrity sensor networks. ACM Transactions on Sensor Networks, 4(3):1-37, June 2008.
R. Uchida, H. Horino, and R. Ohmura. Improving fault tolerance of wearable sensor-based activity recognition techniques. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, pp. 633-644, 2013.
H. Sagha, H. Bayati, J. del R. Millan, and R. Chavarriaga. On-line anomaly detection and resilience in classifier ensembles. Pattern Recognition Letters, 34(15):1916-1927, 2013.
Sensor Failures in Classic AR Systems
• Single-sensor ARC (SARC)
  – If the sensor fails, the complete system fails
  – Solution: use more sensors for redundancy (multi-sensor ARC or MARC)
Sensor Failures in Standard AR Systems
• Feature fusion multi-sensor ARC (FFMARC)
  – If a sensor fails, the complete system fails
  – Solution: independent ARCs + decision fusion
• Decision fusion multi-sensor ARC (DFMARC)
  – If a sensor fails, the system is still capable of functioning
  – But… is it still capable of recognition?
• Hierarchical decision (HD)
  – Information from some sensors is more valuable than from others (e.g., a given body part for a certain activity) → ranking of decisions
  – Decisions are mainly made at the top of the ranking (recognition relies on one or a few sensors) → problem when top-ranked sensors become unavailable
• Majority voting (MV)
  – Equality scheme (all sensors have the same importance) → fairness, decisiveness
  – A plurality of weak decisors may prevail over the rest → tyranny of the majority
[Figure: decision fusion of the per-sensor class supports, c_j = φ(c_1j, c_2j, …, c_Mj) for j = 1, …, k]
A Novel Method: Hierarchical Weighted Classifier (HWC)
• N activities & M sensors
[Figure: three-level architecture — activity level (base classifiers C_11, …, C_MN with weights α_mn, β_mn), sensor level (sensor classifiers S_1, …, S_M with weights γ_mn, δ_mn), network level (sensor fusion Ψ yielding the final decision)]
• Activity level (base classifier). Each base classifier C_mn is weighted by its sensitivity α_mn and specificity β_mn for activity n on sensor m:
$$\alpha_{mn} = \frac{TP_{mn}}{TP_{mn} + FN_{mn}}, \qquad \beta_{mn} = \frac{TN_{mn}}{TN_{mn} + FP_{mn}}$$
The weighted decision of C_mn for a sample x_m^k is
$$WD_{mn}(x_m^k) = \begin{cases} \alpha_{mn}, & x_m^k \text{ classified as } q \\ 0, & x_m^k \text{ not classified as } q \end{cases} \ \forall q = n, \qquad WD_{mn}(x_m^k) = \begin{cases} \beta_{mn}, & x_m^k \text{ not classified as } q \\ 0, & x_m^k \text{ classified as } q \end{cases} \ \forall q \neq n$$
• Sensor level (sensor classifier). The sensor-level decision aggregates the weighted decisions of the base classifiers:
$$O_m(x_m^k) = \sum_{n=1}^{N} WD_{mn}(x_m^k), \qquad q_m(x_m^k) = \arg\max_q O_m(x_m^k)$$
Each sensor classifier is in turn weighted through its per-activity sensitivities γ_m and specificities δ_m:
$$\gamma_m = (\gamma_{m1}, \ldots, \gamma_{mN}) = \left( \frac{TP_{m1}}{TP_{m1} + FN_{m1}}, \ldots, \frac{TP_{mN}}{TP_{mN} + FN_{mN}} \right), \qquad \delta_m = (\delta_{m1}, \ldots, \delta_{mN}) = \left( \frac{TN_{m1}}{TN_{m1} + FP_{m1}}, \ldots, \frac{TN_{mN}}{TN_{mN} + FP_{mN}} \right), \quad \forall n = 1, \ldots, N$$
$$WD_{mn}(q_m(x_m^k)) = \begin{cases} \gamma_{mn}, & q_m(x_m^k) = n \\ \delta_{mn}, & q_m(x_m^k) \neq n \end{cases}$$
• Network level (sensor fusion). The final decision fuses the weighted sensor-level decisions across the M sensors:
$$O(x^k) = O(x_1^k, x_2^k, \ldots, x_M^k) = \sum_{p=1}^{M} WD_p(q_p(x_p^k)), \qquad q = \arg\max_q O(x^k)$$
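As a minimal sketch of the network-level fusion just described (not the full HWC implementation), the snippet below turns each sensor's class decision into a weighted support vector using its per-class sensitivity (γ) and specificity (δ), following the displayed equations, and sums the supports across sensors. Array names, shapes and the example numbers are illustrative assumptions.

```python
# Sketch of the HWC network-level fusion: each sensor m emits a class decision
# q_m, which contributes gamma[m, n] to the support of class n if q_m == n and
# delta[m, n] otherwise; the class with maximum summed support wins.
import numpy as np

def hwc_fuse(sensor_decisions, gamma, delta):
    """Fuse per-sensor class decisions into a network-level decision.

    sensor_decisions : (M,) int array, decision q_m of each of the M sensors
    gamma, delta     : (M, N) arrays, per-sensor per-class sensitivity/specificity
    """
    M, N = gamma.shape
    support = np.zeros(N)
    for m, q_m in enumerate(sensor_decisions):
        for n in range(N):
            # WD_mn = gamma_mn if the sensor decided class n, delta_mn otherwise
            support[n] += gamma[m, n] if q_m == n else delta[m, n]
    return int(np.argmax(support))

# Example with 3 sensors and 4 activities. A failed sensor can simply be left
# out of the sums (drop its entry and its gamma/delta rows), which is one
# plausible way such a scheme keeps operating with missing sensors.
gamma = np.array([[0.9, 0.8, 0.7, 0.6],
                  [0.6, 0.9, 0.8, 0.7],
                  [0.7, 0.6, 0.9, 0.8]])
delta = np.full((3, 4), 0.5)
print(hwc_fuse(np.array([1, 1, 2]), gamma, delta))   # -> 1
```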
Evaluation of the Tolerance to Sensor Technological Anomalies
• Model validation
  – Performance in ideal circumstances
  – Tolerance to sensor failures
  – Tolerance to sensor faults
• Benchmark dataset: MIT Activities of Daily Living Dataset*
  – 9 activities
  – 5 biaxial accelerometers
  – 20 subjects (17-48 years old)
  – Out-of-lab
• Experimental setup: 5 biaxial accelerometers (limbs and trunk) → low-pass elliptic filter (Fc = 20 Hz) → 6-second sliding window → features (mean, STD, kurtosis, MCR, …) → DT, NB, KNN, SVM as base classifiers
* http://architecture.mit.edu/house_n/data/Accelerometer/BaoIntille.htm
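A hedged sketch of the signal conditioning step in this setup: a low-pass elliptic filter with a 20 Hz cutoff applied to each accelerometer channel before windowing. The slide does not state the sampling rate, filter order or ripple/attenuation values, so those below are assumptions.

```python
# Low-pass elliptic filtering of a raw acceleration channel (Fc = 20 Hz, as in
# the setup above). Sampling rate, order and ripple values are assumptions.
import numpy as np
from scipy.signal import ellip, filtfilt

FS = 50    # assumed sampling rate (Hz)
FC = 20    # cutoff frequency (Hz), from the experimental setup

def lowpass(signal, order=4, rp=0.01, rs=80):
    """Zero-phase elliptic low-pass filter (rp: passband ripple dB, rs: stopband attenuation dB)."""
    b, a = ellip(order, rp, rs, FC / (FS / 2), btype='low')
    return filtfilt(b, a, signal)

# Example: condition a synthetic channel before windowing and feature extraction
raw = np.random.randn(10 * FS)
clean = lowpass(raw)
```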
• Performance in ideal circumstances:
  Evaluated models: SARC (≡ S), FFMARC (≡ FF), HWC
  Parameters: feature sets of 1, 5, 10, 20 features; base classifiers: DT, NB, KNN, SVM
  Evaluation procedure: 10-fold CV, 100 iterations
  [Figure: accuracy of SARC, FFMARC and HWC under ideal conditions]
• Tolerance to sensor failures:
  Evaluated model: HWC
  Parameters: feature set of 10 features; classifier: KNN
  Evaluation procedure: 10-fold CV, 100 iterations
  Legend: H (hip), W (wrist), A (arm), K (ankle), T (thigh)
  Baseline accuracy (ideal conditions) = 96.34%
  [Figure: HWC accuracy for all combinations of 1, 2, 3 and 4 missing sensors]
• Tolerance to sensor faults (fault modeled as dynamic range shortening):
  Evaluated models: SARC, FFMARC, HWC
  Parameters: feature set of 10 features; classifier: KNN
  Evaluation procedure: 10-fold CV, 100 iterations
  [Figure: ideal case vs. dynamic range shortening of the acceleration signal]

  Accuracy (%) vs. number of faulty sensors — new dynamic range = 30% of the original → [-3g, 3g]
  AR model      | 0    | 1    | 2    | 3     | 4     | 5
  SARC (hip)    | 82±5 | 66±4 | -    | -     | -     | -
  SARC (wrist)  | 88±5 | 54±6 | -    | -     | -     | -
  SARC (arm)    | 80±3 | 58±7 | -    | -     | -     | -
  SARC (ankle)  | 83±4 | 58±8 | -    | -     | -     | -
  SARC (thigh)  | 89±2 | 72±4 | -    | -     | -     | -
  FFMARC        | 97±2 | 88±4 | 76±5 | 61±8  | 42±11 | 39±13
  HD            | 90±3 | 85±4 | 80±9 | 68±13 | 59±16 | 53±20
  MV            | 82±6 | 79±5 | 67±7 | 43±10 | 36±14 | 31±19
  HWC           | 96±2 | 96±2 | 93±3 | 86±5  | 73±8  | 65±14

  Accuracy (%) vs. number of faulty sensors — new dynamic range = 10% of the original → [-1g, 1g]
  AR model      | 0    | 1     | 2     | 3     | 4     | 5
  SARC (hip)    | 82±5 | 21±11 | -     | -     | -     | -
  SARC (wrist)  | 88±5 | 18±9  | -     | -     | -     | -
  SARC (arm)    | 80±3 | 26±14 | -     | -     | -     | -
  SARC (ankle)  | 83±4 | 21±7  | -     | -     | -     | -
  SARC (thigh)  | 89±2 | 20±6  | -     | -     | -     | -
  FFMARC        | 97±2 | 70±5  | 41±8  | 17±15 | 21±11 | 18±9
  HD            | 90±3 | 80±6  | 59±13 | 42±12 | 30±17 | 21±16
  MV            | 82±6 | 77±6  | 46±11 | 38±10 | 27±13 | 26±8
  HWC           | 96±2 | 94±2  | 87±6  | 53±2  | 27±17 | 25±19
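A brief sketch of the fault model underlying this table, assuming "dynamic range shortening" means saturating the signal at the reduced limits; the slide gives 30% → [-3g, 3g] and 10% → [-1g, 1g], which implies an original range of roughly ±10g.

```python
# Emulating a sensor fault by shortening the dynamic range of an accelerometer,
# here assumed to be implemented as clipping (saturation) at the reduced limits.
import numpy as np

def shorten_dynamic_range(signal, fraction, full_range_g=10.0):
    """Clip an acceleration signal (in g) to a fraction of its original range."""
    limit = fraction * full_range_g
    return np.clip(signal, -limit, limit)

# Example: degrade a signal to 30% and 10% of the assumed ±10g range
acc = np.random.uniform(-10, 10, size=1000)
acc_30 = shorten_dynamic_range(acc, 0.30)   # -> [-3g, 3g]
acc_10 = shorten_dynamic_range(acc, 0.10)   # -> [-1g, 1g]
```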
Conclusions
• Assuming a lifelong invariant sensor setup is unrealistic and may lead to malfunctioning of the activity recognition system
• Body-worn sensors are subject to faults (signal degradation) and failures (absence of signal) normally unforeseen at design time and runtime
• Classic activity recognition approaches (SARC, FFMARC) are not capable of dealing with sensor failures and are of limited utility under the effect of sensor faults
• The proposed alternate model (HWC) renders performance similar to standard activity recognition models in ideal conditions, proves to be robust to sensor failures and shows a relevant tolerance to sensor faults
Robustness of AR Systems to Sensor Deployment Variations
Objective: "Research the robustness of standard AR systems to unforeseen variations in the sensor deployment, as well as contribute with an alternate approach to cope with these practical anomalies"
Problem Statement
• Are activity recognition systems flexible enough to allow users to wear the sensors on their own?
• Is it possible to keep the systems functioning under the effects of sensor displacement?
[Figure: sensor deployment changes affecting the activity recognition process — body motion sensing → signal processing and reasoning → recognition of activities]
Sensor Displacement
• Categories of sensor displacement
  – Static: position changes remain static across the execution of many activity instances, e.g., when sensors are attached with a displacement each day
  – Dynamic: effect of loose fitting of the sensors, e.g., when embedded into clothes
• Sensor displacement → new sensor position → signal space change
• Sensor displacement effects depend on
  – Original/end position and body part
  – Activity/gestures/movements performed
  – Sensor modality
• Sensor displacement = rotation (angular displacement) + translation (linear displacement)
Sensor Displacement Effects
• Changes in the signal space propagate through the activity recognition chain (e.g., variations in the feature space)
[Figure: feature-space clusters for ideal vs. self-placement — LC_IDEAL = LC_SELF, whereas RC_SELF ≠ RC_IDEAL]
Sensor Displacement in AR: Related Work
• Features invariant to sensor displacement
  – Heuristics (Kunze08)
  – Genetic algorithm for feature selection (Förster09a)
• Feature distribution adaptation
  – Covariate shift unsupervised adaptation (Bayati11)
  – Online-supervised user-based calibration (Förster09b)
• Classification (dis)similarity
  – Output classifiers correlation (Sagha11)

K. Kunze and P. Lukowicz. Dealing with sensor displacement in motion-based onbody activity recognition systems. In 10th International Conference on Ubiquitous Computing, pp. 20-29, 2008.
K. Förster, P. Brem, D. Roggen, and G. Tröster. Evolving discriminative features robust to sensor displacement for activity recognition in body area sensor networks. In 5th International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP), pp. 43-48, 2009.
H. Bayati, J. del R. Millán, and R. Chavarriaga. Unsupervised adaptation to on-body sensor displacement in acceleration-based activity recognition. In 15th Annual International Symposium on Wearable Computers (ISWC), pp. 71-78, June 2011.
K. Förster, D. Roggen, and G. Tröster. Unsupervised classifier self-calibration through repeated context occurrences: Is there robustness against sensor displacement to gain? In Proc. 13th IEEE Int. Symposium on Wearable Computers (ISWC), pp. 77-84, 2009.
H. Sagha, J. del R. Millán, and R. Chavarriaga. Detecting and rectifying anomalies in opportunistic sensor networks. In 8th Int. Conf. on Networked Sensing Systems, pp. 162-167, 2011.
Approaches to Investigate Sensor Displacement
• Synthetically modeled sensor displacement
• Realistic sensor displacement
Synthetically Modeled Sensor Displacement
• Sensor rotation → rotational noise (RN)
• Sensor translation → additive noise (AN)
• Rotational noise (with c = cos, s = sin and Euler angles φ, θ, ψ):
$$M_{RN} = \begin{pmatrix} c\theta\,c\psi & -c\phi\,s\psi + s\phi\,s\theta\,c\psi & s\phi\,s\psi + c\phi\,s\theta\,c\psi \\ c\theta\,s\psi & c\phi\,c\psi + s\phi\,s\theta\,s\psi & -s\phi\,c\psi + c\phi\,s\theta\,s\psi \\ -s\theta & s\phi\,c\theta & c\phi\,c\theta \end{pmatrix}, \qquad \begin{pmatrix} x_{rot} \\ y_{rot} \\ z_{rot} \end{pmatrix} = M_{RN} \begin{pmatrix} x_{raw} \\ y_{raw} \\ z_{raw} \end{pmatrix}$$
• Additive noise:
$$\begin{pmatrix} x_{tr} \\ y_{tr} \\ z_{tr} \end{pmatrix} = T_{AN} + \begin{pmatrix} x_{raw} \\ y_{raw} \\ z_{raw} \end{pmatrix}, \qquad T_{AN} \sim \mathcal{N}(\mu_{AN}, \sigma_{AN}^2), \ \mu_{AN} = 0$$
• Examples:
[Figure: acceleration signals for walking and sitting — original vs. rotational noise (RN = 15°, RN = 90°) and additive noise (AN = 0.1g, AN = 0.5g)]
Proposed in: H. Sagha, J. del R. Millán, and R. Chavarriaga. Detecting and rectifying anomalies in opportunistic sensor networks. In 8th Int. Conf. on Networked Sensing Systems, pp. 162-167, 2011.
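A minimal sketch of the synthetic displacement model above: rotational noise applies the Euler-angle rotation matrix M_RN to the triaxial signal, and additive noise adds a zero-mean Gaussian offset T_AN. Whether the offset is drawn once per signal or per sample is not explicit on the slide; the sketch draws it once per axis, and all numeric values are illustrative.

```python
# Synthetic sensor displacement: rotational noise (3D rotation of the axes)
# and additive noise (zero-mean Gaussian offset). Values are examples only.
import numpy as np

def rotation_matrix(phi, theta, psi):
    """Rotation matrix M_RN for Euler angles phi, theta, psi (radians)."""
    c, s = np.cos, np.sin
    return np.array([
        [c(theta)*c(psi), -c(phi)*s(psi) + s(phi)*s(theta)*c(psi),  s(phi)*s(psi) + c(phi)*s(theta)*c(psi)],
        [c(theta)*s(psi),  c(phi)*c(psi) + s(phi)*s(theta)*s(psi), -s(phi)*c(psi) + c(phi)*s(theta)*s(psi)],
        [-s(theta),        s(phi)*c(theta),                         c(phi)*c(theta)],
    ])

def rotational_noise(xyz, phi, theta, psi):
    """Apply M_RN to a (T, 3) array of raw triaxial acceleration samples."""
    return xyz @ rotation_matrix(phi, theta, psi).T

def additive_noise(xyz, sigma):
    """Add a zero-mean Gaussian offset (one draw per axis) to the signal."""
    return xyz + np.random.normal(0.0, sigma, size=3)

# Example: emulate a 15 degree rotation and a 0.1 g translation offset
raw = np.random.randn(300, 3)
rotated = rotational_noise(raw, np.deg2rad(15), 0, 0)
translated = additive_noise(raw, 0.1)
```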
Evaluation of the Robustness to Sensor Displacement (Synthetic)
• Benchmark dataset: MIT Activities of Daily Living Dataset*
  – 9 activities
  – 5 biaxial accelerometers
  – 20 subjects (17-48 years old)
  – Out-of-lab
• Experimental setup: 5 biaxial accelerometers (limbs and trunk) → low-pass elliptic filter (Fc = 20 Hz) → 6-second sliding window → 10-feature set → KNN as base classifier
• Evaluated models: SARC (single sensor), FFMARC (multi-sensor), HWC (multi-sensor)
* http://architecture.mit.edu/house_n/data/Accelerometer/BaoIntille.htm
• Performance drop under the effects of sensor rotation and translation:
[Figure: per-sensor accuracy drops of SARC (single sensor), FFMARC (multi-sensor) and HWC (multi-sensor) under increasing rotational and additive noise; annotated drops range from 1% to 75%]
Realistic Sensor Displacement
• No dataset exists for studying the effects of sensor displacement! → NEW DATASET (REALDISP)*
• Observe:
  – Variability introduced with respect to the ideal setup when the sensors are self-placed by the users
  – Effects of large sensor displacements (extreme de-positioning)
• Scenarios: ideal-placement, self-placement, induced-displacement
* Freely available at: www.ugr.es/~oresti/datasets
REALDISP Dataset: Study Setup
• Cardio-fitness room
• 9 IMUs (9 DoF) → ACC, GYR, MAG
• Laptop → data storage and labeling*
• Camera → offline data validation
• 17 volunteers (22-37 years old)
* Annotation tool: http://crnt.sourceforge.net/CRN_Toolbox/Home.html
REALDISP Dataset: Activity Set
• Activities intended for:
  – Body-general motion: translation | jumps | fitness
  – Body-part-specific motion: trunk | upper extremities | lower extremities
Evaluation of the Robustness to Sensor Displacement (Realistic)
• Experimental setup: 9 triaxial accelerometers (all limbs and trunk) → no preprocessing (raw data) → 6-second sliding window → feature sets FS1 = mean; FS2 = mean, std; FS3 = mean, std, max, min, MCR → DT, KNN, NB as base classifiers
• Studies:
  – AR systems: SARC (single sensor), FFMARC (multi-sensor), HWC (multi-sensor)
  – Settings: ideal-placement, self-placement, induced-displacement
  – Scenarios: 10 activities, 20 activities, 33 activities (all)
• Evaluation procedure: 10-fold CV, 100 iterations
[Figure: accuracy of SARC (single sensor), FFMARC (multi-sensor) and HWC (multi-sensor) for the ideal-placement, self-placement and induced-displacement settings; annotated performance drops range from 3% to 50%]
Conclusions
• Classic activity-aware systems assume a predefined sensor deployment that remains unchanged during runtime, which is not a lifelike assumption
• Body-worn inertial sensors are subject to deployment changes (displacement) in real-world contexts, potentially leading to signal variations with respect to ideal patterns
• Activity recognition systems prove to be more sensitive to sensor rotations than to translations, especially when the sensors are located on body parts of reduced mobility
• Standard models (SARC, FFMARC) suffer a critical performance worsening when the sensors are largely depositioned or self-placed by the users
• The HWC significantly outperforms the tolerance of standard activity recognition models (by up to 30%), effectively showing outstanding capabilities to assimilate the changes introduced during the self-placement of the sensors and to moderately overcome the situation of largely depositioned sensors
Supporting AR System Network Changes: Instruction of Newcomer Sensors
Objective: "Study the capacity of standard AR systems to support unforeseen changes in the sensor network, as well as contribute with an alternate approach to cope with these topological variations"
Problem Statement
• Activity recognition system design: collect a training dataset → train and test the model → the AR system is "ready"
• Do we need to collect a new dataset each time the sensor topology changes?
• Is it possible to leverage the knowledge of a functional system to instruct a system to operate on a newcomer sensor?
Infrastructure Changes: Newcomer Sensors
• Sensor replacement (repair/upgrade)
• Sensor addition (redundancy)
• Sensor discovery (opportunistic use)
→ Transfer learning
Instruction of Newcomer Sensors
• Classic approach: collection of a new dataset for each possible scenario
  Limitations:
  - Predefined setup and deployments
  - System designer involvement
  - User/s involvement
• Transfer learning (teacher → learner): "Mechanism, ability or means to recognize and apply knowledge and skills learned in previous tasks or domains to novel tasks or domains"
Transfer Learning in AR: Related Work
• Transfer between wearable sensors
  – Translation of locomotion recognition capabilities (Calatroni11)
    • Model parameters
    • Labels
• Transfer between ambient sensors
  – Translation among smart homes through meta-featuring (van Kasteren10)
    • Common meta-feature space
• Limitations
  – Operation over long time scales
  – Incomplete transfer
  – Difficult transfer across modalities

A. Calatroni, D. Roggen, and G. Tröster, "Automatic transfer of activity recognition capabilities between body-worn motion sensors: Training newcomers to recognize locomotion," in Proc. 8th Int. Conf. on Networked Sensing Systems, 2011.
T. van Kasteren, G. Englebienne, and B. Kröse, "Transferring knowledge of activity recognition across sensor networks," in Proc. 8th Int. Conf. on Pervasive Computing, 2010.
Multimodal Transfer Methods
• System identification (signal level): learn a mapping Ψ_{A→B}(t) between sensor domain A and sensor domain B
• Transfer methods (reasoning level):
  – Transfer of activity templates (patterns + labels)
  – Transfer of activity models (features + labels, classification models)
[Figure: example signals from sensor domain A (acceleration, g) and sensor domain B (position, m)]
Transfer of Activity Templates
• Transfer of the recognition capabilities of an existing source system (S) that operates on activity templates (patterns) to an untrained target system (T) that lacks these capabilities
(1) Both systems coexist during a certain period of time
(2) A mapping function between the source and target domains is discovered through system identification (MIMO model):
$$\Psi_{S \to T}(t): X_S(t) \to \hat{X}_T(t) \approx X_T(t)$$
(3) The activity templates are translated from the source domain to the target domain
(4) Once the templates have been translated, the target system is ready for activity detection → instruction completed!
[Figure: source system S (position signals with templates labeled L1-L3) and target system T (acceleration signals), shown at the signal and reasoning levels before and after the template translation]
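To make steps (2) and (3) concrete, here is a hedged sketch of the system-identification and translation stages. It assumes the mapping is a linear MIMO finite-impulse-response model (3 inputs, 3 outputs, 10-tap delay, as in the evaluation setup later in this section) fitted by least squares on a short stretch of coexistence data; the thesis does not prescribe this exact estimator, so treat it as one plausible implementation.

```python
# Sketch of the template-transfer pipeline: (2) identify a MIMO FIR mapping
# Psi_{S->T} from coexistence data by least squares, (3) use it to translate
# labeled source activity templates into the target domain. The 3x3 / 10-tap
# structure follows the evaluation setup; the estimator itself is an assumption.
import numpy as np

TAPS = 10  # tap delay of the MIMO model

def build_regressors(x_src, taps=TAPS):
    """Stack delayed copies of the 3-channel source signal into a (T-taps+1, 3*taps) matrix."""
    T, _ = x_src.shape
    rows = []
    for t in range(taps - 1, T):
        rows.append(x_src[t - taps + 1:t + 1][::-1].ravel())   # most recent sample first
    return np.asarray(rows)

def identify_mapping(x_src, x_tgt, taps=TAPS):
    """Least-squares fit of W so that regressors(x_src) @ W approximates x_tgt."""
    X = build_regressors(x_src, taps)
    Y = x_tgt[taps - 1:]
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W                                  # shape (3*taps, 3)

def translate(x_src, W, taps=TAPS):
    """Apply the learned mapping to a source-domain signal (e.g., a template)."""
    return build_regressors(x_src, taps) @ W

# (2) learn the mapping from ~100 coexistence samples (about 3.3 s at 30 Hz)
coexist_src = np.random.randn(100, 3)
coexist_tgt = np.random.randn(100, 3)
W = identify_mapping(coexist_src, coexist_tgt)

# (3) translate a labeled source template into the target domain
template_src = np.random.randn(60, 3)
template_tgt = translate(template_src, W)     # reused with the original label
```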
Transfer of Activity Models
• Transfer of the recognition capabilities of an existing source system (S) that operates on activity models (features + classification model) to an untrained target system (T) that lacks these capabilities
(1) Both systems coexist during a certain period of time
(2) A mapping function between the target and source domains is discovered through system identification (MIMO model):
$$\Psi_{T \to S}(t): X_T(t) \to \hat{X}_S(t) \approx X_S(t)$$
(3) The source activity models are translated to the target domain so that both systems use the same activity models; these activity models also define the target activity recognition system
(4) The target system continuously translates its signals into the source domain to operate on the transferred recognition system; from then on it is ready for activity detection → instruction completed!
[Figure: source system S (position signals) and target system T (acceleration signals) at the signal and reasoning levels, with the target signals mapped into the source domain]
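A hedged sketch of step (4) of the model transfer: once Ψ_{T→S} has been identified, the target system keeps translating its incoming windows into the source domain and feeds them to the unchanged source-domain recognizer. The mapping here is a plain linear projection and the classifier a KNN on max/min features, loosely mirroring the evaluation setup; both are illustrative stand-ins, not the thesis implementation.

```python
# Online operation after model transfer: target-domain windows are mapped into
# the source domain with the identified mapping Psi_{T->S} and classified by
# the source system's own (unchanged) activity model. W_t2s, the toy training
# data and the feature set are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def features(window):
    """Per-axis max and min, as in the evaluation's feature set FS = max, min."""
    return np.concatenate([window.max(axis=0), window.min(axis=0)])

# Source activity model trained beforehand on source-domain data (toy example)
rng = np.random.default_rng(0)
train_windows = [rng.normal(loc=c, size=(30, 3)) for c in (0.0, 1.0, 2.0) for _ in range(10)]
train_labels = [c for c in (0, 1, 2) for _ in range(10)]
source_model = KNeighborsClassifier(n_neighbors=3)
source_model.fit([features(w) for w in train_windows], train_labels)

# Identified target-to-source mapping (a fixed 3x3 matrix here, for brevity)
W_t2s = np.eye(3)

def recognize_target_window(window_tgt):
    """Translate a target-domain window into the source domain, then classify."""
    window_src_hat = window_tgt @ W_t2s
    return source_model.predict([features(window_src_hat)])[0]

print(recognize_target_window(rng.normal(loc=1.0, size=(30, 3))))
```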
Evaluation of Multimodal Transfer
• Model validation
  – Transfer between IMU and IMU (identical-domain transfer)
  – Transfer between Kinect and IMU (cross-domain transfer)
[Figure: transfer of activity templates and transfer of activity models illustrated with position (Kinect) and acceleration (IMU) signals]
Multimodal Kinect-IMU Dataset: Study Setup
• MTx XSENS IMUs
  - 3D ACC + (3D GYR, 3D MAG, 4D QUA)
  - Sampling rate 30 Hz
  - Xsens data logger → http://crnt.sourceforge.net/CRN_Toolbox/References.html
• Microsoft Kinect
  - RGB cam + IR cam + IR LED
  - Depth map (0.5-6 m)
  - 15-joint skeleton tracking, 3D position
  - Tracking range (1.2-3.5 m)
  - Sampling rate 30 Hz
  - Kinect data logger → http://code.google.com/p/qtkinectwrapper/
* Freely available at: www.ugr.es/~oresti/datasets
Multimodal Kinect-IMU Dataset: Scenarios
• Geometric gestures (HCI): 48 instances per gesture
• Idle (background): ~5 min of data
Other scenarios were also collected as part of this dataset (more info at www.ugr.es/~oresti/datasets)
Transfer between IMU and IMU
• Analyzed transfers — Transfer of Activity Templates and Activity Models from:
  – RLA (3D acceleration) to RUA (3D acceleration)
  – RUA (3D acceleration) to RLA (3D acceleration)
  – RUA (3D acceleration) to BACK (3D acceleration)
  – BACK (3D acceleration) to RUA (3D acceleration)
  – RLA (3D acceleration) to BACK (3D acceleration)
  – BACK (3D acceleration) to RLA (3D acceleration)
Evaluation of Transfer between IMU and IMU
• Mapping:
  – Model → MIMO 3x3 mapping with 10-tap delay
  – Types:
    • Problem-domain mapping (PDM)
    • Gesture-specific mapping (GSM)
    • Unrelated-domain mapping (UDM)
  – Learning → 100 samples (~3.3 s)
• Activity recognition model: triaxial acceleration (IMU) → no preprocessing (raw data) → instance-based segmentation → FS = max, min → KNN (standard classifier)
• Transfer of Activity Templates:
[Figure: recognition accuracy for template transfer between lower arm (LA), upper arm (UA) and back (B), comparing baseline source (BS), baseline target (BT), problem-domain mapping (PDM), gesture-specific mapping (GSM) and unrelated-domain mapping (UDM); PDM and GSM stay within a few percent of the baselines, while UDM drops are markedly larger]
• Transfer of Activity Models:
[Figure: recognition accuracy for model transfer between lower arm (LA), upper arm (UA) and back (B), for BS, BT, PDM, GSM and UDM]
Transfer between Kinect and IMU
• Analyzed transfers
  – Transfer of Activity Templates (Kinect to IMU):
    • HAND (3D position) → RLA (3D acceleration)
    • HAND (3D position) → RUA (3D acceleration)
    • HAND (3D position) → BACK (3D acceleration)
  – Transfer of Activity Models (IMU to Kinect):
    • RLA (3D acceleration) → HAND (3D position)
    • RUA (3D acceleration) → HAND (3D position)
    • BACK (3D acceleration) → HAND (3D position)
Evaluation of Transfer between Kinect and IMU
• Mapping:
  – Model → MIMO 3x3 mapping with 10-tap delay
  – Types:
    • Problem-domain mapping (PDM)
    • Gesture-specific mapping (GSM)
    • Unrelated-domain mapping (UDM)
  – Learning → 100 samples (~3.3 s)
• Activity recognition model: triaxial acceleration (IMU) / triaxial position (Kinect) → no preprocessing (raw data) → instance-based segmentation → FS = max, min → KNN (standard classifier)
• Transfer of Activity Templates (from Kinect to IMU) and Transfer of Activity Models (from IMU to Kinect):
[Figure: recognition accuracy for transfers between the Kinect hand position and the IMUs on the right lower arm (RLA), right upper arm (RUA) and back (BACK), for BS, BT, PDM, GSM and UDM; PDM and GSM remain within a few percent of the baselines, whereas UDM drops are far larger]
• Influence of the amount of mapping-learning data (30 samples = 1 s), shown for Kinect to IMU (RLA) and IMU (RLA) to Kinect with feature sets FS1 = mean and FS2 = max, min
[Figure: accuracy as a function of the number of samples used to learn the mapping]
Conclusions
• Classical training procedures are not practical for instructing newcomer sensors in dynamically varying and evolvable activity recognition setups
• A novel multimodal transfer learning model is proposed to translate the recognition capabilities of an existing system to a new untrained system, at runtime and without expert or user intervention
• As little as a single gesture (≈3 seconds) of data is enough to learn a mapping model that captures the underlying relation between systems of identical or different modality
• The transfer between IMUs across close-by limbs achieves a recognition accuracy superior to 97% (>2% below baseline), and 95% (>4% below baseline) for the transfer between Kinect and IMU, independently of the direction of the transfer
• Low-variance data unrelated to the activities of interest can also be used to learn a mapping, albeit with more data
Conclusions and future work
Contributions
• Identification of the requirements and challenges posed by AR systems in real-world conditions
• Evaluation of the tolerance of standard AR systems to sensor technological anomalies, particularly sensor failures and faults
• Definition and development of a novel model, the so-called HWC, to overcome the effects of sensor failures and faults. Evaluation of the robustness of the proposed HWC model to the effects of sensor failures and faults
• Evaluation of the tolerance of standard AR systems to sensor deployment variations, particularly static and dynamic sensor displacements
• Evaluation of the robustness of the proposed HWC model to the effects of sensor displacements
• Definition, development and validation of a novel multimodal transfer learning method that operates at runtime, with low overhead and without user or system designer intervention
• Collection and curation of an innovative benchmark dataset to investigate the effects of sensor displacement, introducing the concepts of ideal-placement, self-placement and induced-displacement. The dataset includes a wide range of physical activities, sensor modalities and participants. Apart from investigating sensor displacement, it lends itself to benchmarking activity recognition techniques in ideal conditions. The dataset is publicly available to the research community at http://www.ugr.es/~oresti/datasets
• Collection and curation of a novel multimodal dataset to investigate transfer learning among ambient sensing and wearable sensing systems. The dataset can also be used for gesture spotting and continuous activity recognition. It is publicly available to the research community at http://www.ugr.es/~oresti/datasets
Selected Publications
• International Journals (SCI-indexed)
– Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I.: Dealing with the effects of sensor displacement in wearable activity recognition. Sensors, MDPI (2014) [Under review]
– Banos, O., Damas, M., Guillen, A., Herrera, L. J., Pomares, H., Rojas, I.: Multi-sensor fusion based on asymmetric decision weighting for robust activity recognition. Neural Processing Letters, Springer (2014) [Under review]
– Banos, O., Galvez, J. M., Damas, M., Pomares, H., Rojas, I.: Window size impact in activity recognition. Sensors, MDPI, vol. 14, no. 4, pp. 6474-6499 (2014)
– Banos, O., Damas, M., Pomares, H., Rojas, F., Delgado-Marquez, B., Valenzuela, O.: Human activity recognition based on a sensor weighting hierarchical classifier. Soft Computing, Springer, vol. 17, pp. 333-343 (2013)
– Banos, O., Damas, M., Pomares, H., Rojas, I.: On the Use of Sensor Fusion to Reduce the Impact of Rotational and Additive Noise in Human Activity Recognition. Sensors, MDPI, vol. 12, no. 6, pp. 8039-8054 (2012)
– Banos, O., Damas, M., Pomares, H., Prieto, A., Rojas, I.: Daily Living Activity Recognition based on Statistical Feature Quality Group Selection. Expert Systems with Applications, Elsevier, vol. 39, no. 9, pp. 8013-8021 (2012)
117
INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
Selected Publications
• Book chapters
– Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I., Amft, O.: Evaluation of inertial sensor displacement effects in activity recognition systems. Science and Supercomputing in Europe (Information & Communication Technologies), HPC-Europe 2 (2013) ISBN: 978-84-338-5400-1
• Conference papers
– Banos, O., Damas, M., Pomares, H., Rojas, I.: Handling displacement effects in on-body sensor-based activity recognition. In: Proceedings of the 5th International Work-conference on Ambient Assisted Living and Active Ageing (IWAAL 2013), San Jose, Costa Rica, December 2-6 (2013) [BEST PAPER AWARD]
– Banos, O., Damas, M., Pomares, H., Rojas, I.: Activity recognition based on a multi-sensor meta-classifier. In: Proceedings of the 2013 International Work Conference on Neural Networks (IWANN 2013), Tenerife, June 12-14 (2013)
– Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I., Amft, O.: A benchmark dataset to evaluate sensor displacement in activity recognition. In: Proceedings of the 14th International Conference on Ubiquitous Computing (Ubicomp 2012), Pittsburgh, USA, September 5-8 (2012)
118
INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
Selected Publications
• Conference papers (cont.)
– Banos, O., Calatroni, A., Damas, M., Pomares, H., Rojas, I., Troester, G., Sagha, H., Millan, J. del R., Chavarriaga, R., Roggen, D.: Kinect=IMU? Learning MIMO Signal Mappings to Automatically Translate Activity Recognition Systems Across Sensor Modalities. In: Proceedings of the 16th Annual International Symposium on Wearable Computers (ISWC 2012), Newcastle, United Kingdom, June 18-22 (2012)
– Banos, O., Damas, M., Pomares, H., Rojas, I.: Human multisource activity recognition for AAL problems. In: Proceedings of the 5th International Symposium on Ubiquitous Computing and Ambient Intelligence (UCAmI 2011), Riviera Maya, Mexico, December 5-9 (2011)
– Banos, O., Damas, M., Pomares, H., Rojas, I.: Recognition of Human Physical Activity based on a novel Hierarchical Weighted Classification scheme. In: Proceedings of the 2011 International Joint Conference on Neural Networks (IJCNN 2011), IEEE, San Jose, California, July 31-August 5 (2011)
– Banos, O., Pomares, H., Rojas, I.: Ambient Living Activity Recognition based on Feature-set Ranking Using Intelligent Systems. In: Proceedings of the 2010 International Joint Conference on Neural Networks (IJCNN 2010), IEEE, Barcelona, July 18-23 (2010)
119
INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
Future Work
• Collection of new large standard datasets
• Dynamic reconfiguration of the HWC
• Self-adaptive HWC
• Tolerance to other sensor technological and topological anomalies
• Multiple trainers and complex modalities in transfer learning
• Integration in commercial systems and end-user applications
120
INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
Thank you all!
121
Robust Expert Systems for more Flexible Real-World Activity Recognition

  • 1. Robust Expert Systems for more Flexible Real-World Activity Recognition Granada, Friday, April 25, 2014 Presented by: Oresti BaΓ±os Supervised by: Miguel Damas, HΓ©ctor Pomares and Ignacio Rojas Department of Computer Architecture and Computer Technology, CITIC-UGR, University of Granada, SPAIN
  • 2. Human Activity 2 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 3. Health Abnormal behavior detection Proactive Assistance Labour risk prevention Wellness Sports Gaming Human Activity β€’ Why is identifying human activity interesting? 3 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 4. Activity Recognition (AR) β€’ Activity recognition concept β€œRecognize the actions and goals of one or more agents from a series of observations on the agents' actions and the environmental conditions” β€’ Activity recognition process 4 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Phenomena Human activity (body motion) Measurement Sensing (ambient/wearables) Processing Data adequation and knowledge inference Recognized Activity
  • 5. Wearable Activity Recognition 5 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS β€’ Wearable activity recognition systems are ready! The first system capable of fully recognize your daily routine. AtlasWearables (2014) The simplest way to understand your day and night. Jawbone Up (2014) The best activity tracker on the market. Fitbit Force (2014) The device that tracks your active life and measures all kind of activities. Nike Fuel (2014)
  • 6. Wearable Activity Recognition 6 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS β€’ But… do wearable activity recognition systems meet people’s expectance?
  • 7. Challenges for Real-World Activity Recognition β€’ Actively investigated: – Reliability – Simplicity – Latency β€’ Barely addressed: – Privacy – Fault-tolerance – Usability – Unobtrusiveness – Fashionability – Self-configuration – Auto-adaptation – Evolvability 7 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 8. Challenges for Real-World Activity Recognition 8 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS   β€’ Actively investigated: – Reliability – Simplicity – Latency β€’ Barely addressed: – Privacy – Fault-tolerance – Usability – Unobtrusiveness – Fashionability – Self-configuration – Auto-adaptation – Evolvability
  • 9. Thesis Motivation and Objectives β€’ Motivation: β€œCreate more advanced systems capable of handling real-world AR issues as well as to incorporate more intelligent capabilities to transform experimental prototypes into actual usable applications” β€’ Objectives: – O1: β€œInvestigate the tolerance of standard AR systems to unforeseen sensor failures and faults, as well as contribute with an alternate approach to cope with these technological anomalies” οƒ  Fault-tolerance – O2: β€œResearch the robustness of standard AR systems to unforeseen variations in the sensor deployment, as well as contribute with an alternate approach to cope with these practical anomalies” οƒ  Usability, Unobtrusiveness – O3: β€œStudy the capacity of standard AR systems to support unforeseen changes in the sensor network, as well as contribute with an alternate approach to cope with these topological variations” οƒ  Self-configuration, auto-adaptation, evolvability 9 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 10. Activity Recognition Process 10 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Phenomena Human activity (body motion) Measurement Sensing (ambient/wearables) Processing Data adequation and knowledge inference Recognized Activity How does it work exactly?
  • 11. Activity Recognition Process 11 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS The Activity Recognition Chain (ARC) Phenomena Human activity (body motion) Measurement Sensing (ambient/wearables) Processing Data adequation and knowledge inference Recognized Activity
  • 12. Activity Recognition Chain (ARC) 12 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 13. Activity Recognition Chain (ARC) 13 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 14. Activity Recognition Chain (ARC) 14 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 15. Activity Recognition Chain (ARC) 15 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 16. Activity Recognition Chain (ARC) 16 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 17. Activity Recognition Chain (ARC) 17 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 18. Activity Recognition Chain (ARC) 18 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 19. Tolerance of AR systems to sensor faults and failures INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Objective: β€œInvestigate the tolerance of standard AR systems to unforeseen sensor failures and faults, as well as contribute with an alternate approach to cope with these technological anomalies”
  • 20. Problem Statement 20 SENSOR ERRORS Are standard activity recognition systems prepared to cope with sensor technological anomalies? Is it possible to keep the systems functioning under the effects of sensor errors? Activity recognition process INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Body motion sensing Signal processing and reasoning Recognition of activities
  • 21. Signal effects Sensor Technological Anomalies 21 β€’ Faults (overheating, environmental changes, decalibration) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS β€’ Failures (outages, breakdowns, disconnection, battery depletion)
  • 22. Sensor Technological Anomalies in AR: Related Work β€’ Detection of sensor anomalies – Sensor query (Rost06) – Neighborhood data correlation β€’ Signal level (Yao10) β€’ Feature level (Ramanathan09) β€’ Reasoning (Rajasegarar07, Ganeriwal08) β€’ Counteraction of sensor anomalies – Data imputation (Uchida13) – Sensor fusion (Sagha13) S. Rost and H. Balakrishnan. Memento: A health monitoring system for wireless sensor networks. In 3rd Annual IEEE Communications Society on Sensor and Ad Hoc Communications and Networks, volume 2, pp. 575-584, 2006. Y. Yao, A. Sharma, L. Golubchik, and R. Govindan. Online anomaly detection for sensor systems: A simple and efficient approach. Performance Evaluation, 67(11):1059-1075, November 2010. N. Ramanathan, T. Schoellhammer, E. Kohler, K. Whitehouse, T. Harmon, and D. Estrin. Suelo: human-assisted sensing for exploratory soil monitoring studies. In Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems, pp. 197-210, 2009. S. Rajasegarar, C. Leckie, M. Palaniswami, and J. C. Bezdek. Quarter sphere based distributed anomaly detection in wireless sensor networks. In IEEE International Conference on Communications, pp. 3864-3869, June 2007. S. Ganeriwal, L. K. Balzano, and M. B. Srivastava. Reputationbased framework for high integrity sensor networks. ACM Transaction on Sensor Networks, 4(3):1-37, June 2008. R. Uchida, H. Horino, and R. Ohmura. Improving fault tolerance of wearable wearable sensor-based activity recognition techniques. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, pp. 633-644, 2013. H. Sagha, H. Bayati, J. del R. Millan, and R. Chavarriaga. On-line anomaly detection and resilience in classifier ensembles. Pattern Recognition Letters, 34(15):1916-1927, 2013 22 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 23. Sensor Failures in Classic AR Systems β€’ Single-sensor ARC (SARC) 23 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 24. β€’ Single-sensor ARC (SARC) 24 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS If the sensor fails, the complete system fails Solution Use more sensors for redundancy (multi-sensor ARC or MARC) Sensor Failures in Standard AR Systems
  • 25. β€’ Feature fusion multi-sensor ARC (FFMARC) 25 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Sensor Failures in Standard AR Systems
  • 26. β€’ Feature fusion multi-sensor ARC (FFMARC) 26 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS If a sensor fails, the complete system fails Solution Independent ARCs + decision fusion Sensor Failures in Standard AR Systems
  • 27. β€’ Decision fusion multi-sensor ARC (DFMARC) 27 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Sensor Failures in Standard AR Systems
  • 28. β€’ Decision fusion multi-sensor ARC (DFMARC) 28 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS If a sensor fails, the system is still capable of functioning but… Is it capable of recognition? Sensor Failures in Standard AR Systems
  • 29. β€’ Hierarchical decision (HD) – Information from some sensors more valuable than from others (e.g., body part for a certain activity) οƒ  Ranking of decisions – Decisions mainly made on top (recognition relies on a sensor or few sensors) οƒ  Problem when top-ranked sensors get unavailable β€’ Majority voting (MV) – Equality scheme (all sensors have the same importance) οƒ  Fairness, decisiveness – A plurality of weak decisors may prevail over the rest οƒ  Tyranny of the majority 29 Sensor Failures in Standard AR Systems INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS … c11,c12,…,c1k c1=Ο†(c11,c21,…, cM1), c2=Ο†(c12,c22,…, cM2), … ck=Ο†(c1k,c2k,…, cMk) c21,c22,…, c2k cM1,cM2,…,cMk … … … … … … c11,c12,…,c1k c1=Ο†(c11,c21,…, cM1), c2=Ο†(c12,c22,…, cM2), … ck=Ο†(c1k,c2k,…, cMk) c21,c22,…, c2k cM1,cM2,…,cMk …
  • 30. A Novel Method: Hierarchical Weighted Classifier 30 Sensor M Sensor 2 SM S2 S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ¨ C21 C22 C2N Ξ¨ CM1 CM2 CMN Ξ¨ Decision Activity level (base classifier) Sensor level (sensor classifier) Network level (sensor fusion) Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N Ξ±21 Ξ²21 Ξ±22 Ξ²22 Ξ±2N Ξ²2N Ξ±M1 Ξ²M1 Ξ±M2 Ξ²M2 Ξ±MN Ξ²MN Ξ³11,…,1N Ξ΄11,…,1N Ξ³21,…,2N Ξ΄21,…,2N Ξ³M1,…,MN Ξ΄M1,…,MN Sensor 1 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 31. 31 Sensor M Sensor 2 SM S2 S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ¨ C21 C22 C2N Ξ¨ CM1 CM2 CMN Ξ¨ Decision Activity level (base classifier) Sensor level (sensor classifier) Network level (sensor fusion) Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N Ξ±21 Ξ²21 Ξ±22 Ξ²22 Ξ±2N Ξ²2N Ξ±M1 Ξ²M1 Ξ±M2 Ξ²M2 Ξ±MN Ξ²MN Ξ³11,…,1N Ξ΄11,…,1N Ξ³21,…,2N Ξ΄21,…,2N Ξ³M1,…,MN Ξ΄M1,…,MN Sensor 1 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS A Novel Method: Hierarchical Weighted Classifier N activities & M sensors
  • 32. 32 Sensor M Sensor 2 SM S2 S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ¨ C21 C22 C2N Ξ¨ CM1 CM2 CMN Ξ¨ Decision Activity level (base classifier) Sensor level (sensor classifier) Network level (sensor fusion) Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N Ξ±21 Ξ²21 Ξ±22 Ξ²22 Ξ±2N Ξ²2N Ξ±M1 Ξ²M1 Ξ±M2 Ξ²M2 Ξ±MN Ξ²MN Ξ³11,…,1N Ξ΄11,…,1N Ξ³21,…,2N Ξ΄21,…,2N Ξ³M1,…,MN Ξ΄M1,…,MN Sensor 1 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS A Novel Method: Hierarchical Weighted Classifier N activities & M sensors S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N π‘ž π‘š π‘₯ π‘š π‘˜ = π‘Žπ‘Ÿπ‘”π‘šπ‘Žπ‘₯ π‘ž 𝑂 π‘š π‘₯ π‘š π‘˜ 𝑂 π‘š π‘₯ π‘š π‘˜ = π‘Šπ· π‘šπ‘› π‘₯ π‘š π‘˜ 𝑁 𝑛=1 𝛼 π‘šπ‘› = 𝑇𝑃 π‘šπ‘› 𝑇𝑃 π‘šπ‘› + 𝐹𝑁 π‘šπ‘› 𝛽 π‘šπ‘› = 𝑇𝑁 π‘šπ‘› 𝑇𝑁 π‘šπ‘› + 𝐹𝑃 π‘šπ‘› π‘Šπ· π‘šπ‘› π‘₯ π‘š π‘˜ = 𝛼 π‘šπ‘›, π‘₯ π‘š π‘˜ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž 0, π‘₯ π‘š π‘˜ π‘›π‘œπ‘‘ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž βˆ€π‘ž = 𝑛 𝛽 π‘šπ‘›, π‘₯ π‘š π‘˜ π‘›π‘œπ‘‘ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž 0, π‘₯ π‘š π‘˜ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž βˆ€π‘ž β‰  𝑛
  • 33. 33 Sensor M Sensor 2 SM S2 S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ¨ C21 C22 C2N Ξ¨ CM1 CM2 CMN Ξ¨ Decision Activity level (base classifier) Sensor level (sensor classifier) Network level (sensor fusion) Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N Ξ±21 Ξ²21 Ξ±22 Ξ²22 Ξ±2N Ξ²2N Ξ±M1 Ξ²M1 Ξ±M2 Ξ²M2 Ξ±MN Ξ²MN Ξ³11,…,1N Ξ΄11,…,1N Ξ³21,…,2N Ξ΄21,…,2N Ξ³M1,…,MN Ξ΄M1,…,MN Sensor 1 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS A Novel Method: Hierarchical Weighted Classifier N activities & M sensors S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N S1 Ξ³11,…,1N Ξ΄11,…,1N π‘Šπ· π‘šπ‘› π‘ž π‘š π‘₯ π‘š π‘˜ = 𝛾 π‘šπ‘›, π‘ž π‘š π‘₯ π‘š π‘˜ = 𝑛 𝛿 π‘šπ‘›, π‘ž π‘š π‘₯ π‘š π‘˜ β‰  𝑛 𝛾 π‘š = 𝛾 π‘š1, 𝛾 π‘š2,… , 𝛾 π‘šπ‘› = 𝑇𝑃 π‘š1 𝑇𝑃 π‘š1 + 𝐹𝑁 π‘š1 , 𝑇𝑃 π‘š2 𝑇𝑃 π‘š2 + 𝐹𝑁 π‘š2 , … , π‘‡π‘ƒπ‘šπ‘› π‘‡π‘ƒπ‘šπ‘› + 𝐹𝑁 π‘šπ‘› π‘ž π‘š π‘₯ π‘š π‘˜ = π‘Žπ‘Ÿπ‘”π‘šπ‘Žπ‘₯ π‘ž 𝑂 π‘š π‘₯ π‘š π‘˜ 𝑂 π‘š π‘₯ π‘š π‘˜ = π‘Šπ· π‘šπ‘› π‘₯ π‘š π‘˜ 𝑁 𝑛=1 𝛼 π‘šπ‘› = 𝑇𝑃 π‘šπ‘› 𝑇𝑃 π‘šπ‘› + 𝐹𝑁 π‘šπ‘› 𝛽 π‘šπ‘› = 𝑇𝑁 π‘šπ‘› 𝑇𝑁 π‘šπ‘› + 𝐹𝑃 π‘šπ‘› π‘Šπ· π‘šπ‘› π‘₯ π‘š π‘˜ = 𝛼 π‘šπ‘›, π‘₯ π‘š π‘˜ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž 0, π‘₯ π‘š π‘˜ π‘›π‘œπ‘‘ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž βˆ€π‘ž = 𝑛 𝛽 π‘šπ‘›, π‘₯ π‘š π‘˜ π‘›π‘œπ‘‘ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž 0, π‘₯ π‘š π‘˜ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž βˆ€π‘ž β‰  𝑛 𝛿 π‘š = 𝛿 π‘š1, 𝛿 π‘š2,… , 𝛿 π‘šπ‘› = 𝑇𝑁 π‘š1 𝑇𝑁 π‘š1 + 𝐹𝑃 π‘š1 , 𝑇𝑁 π‘š2 𝑇𝑁 π‘š2 + 𝐹𝑃 π‘š2 , … , 𝑇𝑁 π‘šπ‘› 𝑇𝑁 π‘šπ‘› + πΉπ‘ƒπ‘šπ‘› βˆ€π‘› = 1, … , 𝑁
  • 34. 34 Sensor M Sensor 2 SM S2 S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ¨ C21 C22 C2N Ξ¨ CM1 CM2 CMN Ξ¨ Decision Activity level (base classifier) Sensor level (sensor classifier) Network level (sensor fusion) Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N Ξ±21 Ξ²21 Ξ±22 Ξ²22 Ξ±2N Ξ²2N Ξ±M1 Ξ²M1 Ξ±M2 Ξ²M2 Ξ±MN Ξ²MN Ξ³11,…,1N Ξ΄11,…,1N Ξ³21,…,2N Ξ΄21,…,2N Ξ³M1,…,MN Ξ΄M1,…,MN Sensor 1 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS A Novel Method: Hierarchical Weighted Classifier S1 Ξ±11 Ξ¨ C12 C1N C11 Ξ²11 Ξ±12 Ξ²12 Ξ±1N Ξ²1N S1 Ξ³11,…,1N Ξ΄11,…,1N Ξ¨ Decision N activities & M sensors π‘ž = π‘Žπ‘Ÿπ‘”π‘šπ‘Žπ‘₯ π‘ž 𝑂 π‘₯ π‘š 𝑂 π‘₯ π‘š = 𝑂 π‘₯1 π‘˜ , π‘₯2 π‘˜ , … , π‘₯ 𝑀 π‘˜ = π‘Šπ· 𝑝 π‘ž 𝑝 π‘₯ 𝑝 π‘˜ 𝑀 𝑝=1 π‘Šπ· π‘šπ‘› π‘ž π‘š π‘₯ π‘š π‘˜ = 𝛾 π‘šπ‘›, π‘ž π‘š π‘₯ π‘š π‘˜ = 𝑛 𝛿 π‘šπ‘›, π‘ž π‘š π‘₯ π‘š π‘˜ β‰  𝑛 𝛾 π‘š = 𝛾 π‘š1, 𝛾 π‘š2,… , 𝛾 π‘šπ‘› = 𝑇𝑃 π‘š1 𝑇𝑃 π‘š1 + 𝐹𝑁 π‘š1 , 𝑇𝑃 π‘š2 𝑇𝑃 π‘š2 + 𝐹𝑁 π‘š2 , … , π‘‡π‘ƒπ‘šπ‘› π‘‡π‘ƒπ‘šπ‘› + 𝐹𝑁 π‘šπ‘› π‘ž π‘š π‘₯ π‘š π‘˜ = π‘Žπ‘Ÿπ‘”π‘šπ‘Žπ‘₯ π‘ž 𝑂 π‘š π‘₯ π‘š π‘˜ 𝑂 π‘š π‘₯ π‘š π‘˜ = π‘Šπ· π‘šπ‘› π‘₯ π‘š π‘˜ 𝑁 𝑛=1 𝛼 π‘šπ‘› = 𝑇𝑃 π‘šπ‘› 𝑇𝑃 π‘šπ‘› + 𝐹𝑁 π‘šπ‘› 𝛽 π‘šπ‘› = 𝑇𝑁 π‘šπ‘› 𝑇𝑁 π‘šπ‘› + 𝐹𝑃 π‘šπ‘› π‘Šπ· π‘šπ‘› π‘₯ π‘š π‘˜ = 𝛼 π‘šπ‘›, π‘₯ π‘š π‘˜ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž 0, π‘₯ π‘š π‘˜ π‘›π‘œπ‘‘ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž βˆ€π‘ž = 𝑛 𝛽 π‘šπ‘›, π‘₯ π‘š π‘˜ π‘›π‘œπ‘‘ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž 0, π‘₯ π‘š π‘˜ π‘π‘™π‘Žπ‘ π‘ π‘–π‘“π‘–π‘’π‘‘ π‘Žπ‘  π‘ž βˆ€π‘ž β‰  𝑛 𝛿 π‘š = 𝛿 π‘š1, 𝛿 π‘š2,… , 𝛿 π‘šπ‘› = 𝑇𝑁 π‘š1 𝑇𝑁 π‘š1 + 𝐹𝑃 π‘š1 , 𝑇𝑁 π‘š2 𝑇𝑁 π‘š2 + 𝐹𝑃 π‘š2 , … , 𝑇𝑁 π‘šπ‘› 𝑇𝑁 π‘šπ‘› + πΉπ‘ƒπ‘šπ‘› βˆ€π‘› = 1, … , 𝑁
  • 35. Evaluation of the Tolerance to Sensor Technological Anomalies β€’ Model validation – Performance in ideal circumstances – Tolerance to sensor failures – Tolerance to sensor faults 35 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 36. β€’ Benchmark dataset: – MIT Activities of Daily Living Dataset* β€’ 9 acts β€’ 5 biaxial accelerometers β€’ 20 subjects (17-48 years old) β€’ Out-of-lab β€’ Experimental setup: 36 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 5 biaxial accelerometers (limbs and trunk) LP Elliptic Filter (Fc = 20Hz) 6 seconds sliding window Mean, STD, kurtosis, MCR, (...) DT, NB, KNN, SVM (as base classifiers) Evaluation of the Tolerance to Sensor Technological Anomalies * http://architecture.mit.edu/house_n/data/Accelerometer/BaoIntille.htm
  • 37. β€’ Performance in ideal circumstances: 37 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies Evaluated models: - SARC (≑ S) - FFMARC (≑ FF) - HWC Parameters: - Feature sets: 1, 5, 10, 20 feat. - Base classifiers: DT, NB, KNN, SVM Evaluation procedure: - 10-fold CV - 100 iterations
  • 38. β€’ Performance in ideal circumstances: 38 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies Evaluated models: - SARC (≑ S) - FFMARC (≑ FF) - HWC Parameters: - Feature sets: 1, 5, 10, 20 feat. - Base classifiers: DT, NB, KNN, SVM Evaluation procedure: - 10-fold CV - 100 iterations
  • 39. β€’ Performance in ideal circumstances: 39 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies Evaluated models: - SARC (≑ S) - FFMARC (≑ FF) - HWC Parameters: - Feature sets: 1, 5, 10, 20 feat. - Base classifiers: DT, NB, KNN, SVM Evaluation procedure: - 10-fold CV - 100 iterations
  • 40. β€’ Tolerance to sensor failures: 40 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies Evaluated model: - HWC Parameters: - Feature set: 10 feat. - Classifier: KNN Evaluation procedure: - 10-fold CV - 100 iterations Legend: H (hip), W (wrist), A (arm), K (ankle), T (thigh) Baseline Accuracy (Ideal conditions) = 96.34%
  • 41. β€’ Tolerance to sensor failures: 41 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies 1 missing sensor Evaluated model: - HWC Parameters: - Feature set: 10 feat. - Classifier: KNN Evaluation procedure: - 10-fold CV - 100 iterations Legend: H (hip), W (wrist), A (arm), K (ankle), T (thigh) Baseline Accuracy (Ideal conditions) = 96.34%
  • 42. β€’ Tolerance to sensor failures: 42 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies 1 missing sensor 2 missing sensors Evaluated model: - HWC Parameters: - Feature set: 10 feat. - Classifier: KNN Evaluation procedure: - 10-fold CV - 100 iterations Legend: H (hip), W (wrist), A (arm), K (ankle), T (thigh) Baseline Accuracy (Ideal conditions) = 96.34%
  • 43. β€’ Tolerance to sensor failures: 43 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies 1 missing sensor 2 missing sensors 3 missing sensors 4 missing sensors Evaluated model: - HWC Parameters: - Feature set: 10 feat. - Classifier: KNN Evaluation procedure: - 10-fold CV - 100 iterations Legend: H (hip), W (wrist), A (arm), K (ankle), T (thigh) Baseline Accuracy (Ideal conditions) = 96.34%
  • 44. β€’ Tolerance to sensor faults: 44 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies Ideal case Dynamic range shortening
  • 45. β€’ Tolerance to sensor faults: 45 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Evaluation of the Tolerance to Sensor Technological Anomalies Evaluated models: - SARC - FFMARC - HWC Parameters: - Feature set: 10 feat. - Classifier: KNN Evaluation procedure: - 10-fold CV - 100 iterations AR model/ #faulty sensors 0 1 2 3 4 5 New dynamic range= 30% original dynamic range οƒ  [-3g,3g] SARC (hip) 82Β±5 66Β±4 - - - - SARC (wrist) 88Β±5 54Β±6 - - - - SARC (arm) 80Β±3 58Β±7 - - - - SARC (ankle) 83Β±4 58Β±8 - - - - SARC (thigh) 89Β±2 72Β±4 - - - - FFMARC 97Β±2 88Β±4 76Β±5 61Β±8 42Β±11 39Β±13 HD 90Β±3 85Β±4 80Β±9 68Β±13 59Β±16 53Β±20 MV 82Β±6 79Β±5 67Β±7 43Β±10 36Β±14 31Β±19 HWC 96Β±2 96Β±2 93Β±3 86Β±5 73Β±8 65Β±14 New dynamic range= 10% original dynamic range οƒ  [-1g,1g] SARC (hip) 82Β±5 21Β±11 - - - - SARC (wrist) 88Β±5 18Β±9 - - - - SARC (arm) 80Β±3 26Β±14 - - - - SARC (ankle) 83Β±4 21Β±7 - - - - SARC (thigh) 89Β±2 20Β±6 - - - - FFMARC 97Β±2 70Β±5 41Β±8 17Β±15 21Β±11 18Β±9 HD 90Β±3 80Β±6 59Β±13 42Β±12 30Β±17 21Β±16 MV 82Β±6 77Β±6 46Β±11 38Β±10 27Β±13 26Β±8 HWC 96Β±2 94Β±2 87Β±6 53Β±2 27Β±17 25Β±19
  • 46. Conclusions β€’ Assuming a lifelong invariant sensor setup is unrealistic and may lead to a malfunctioning of the activity recognition system β€’ Body-worn sensors are subject to faults (signal degradation) and failures (absence of signal) normally unforeseen at design and runtime β€’ Classic activity recognition approaches (SARC, FFMARC) are not capable of dealing with sensor failures and are of limited utility under the effect of sensor faults β€’ The proposed alternate model (HWC) renders similar performance to standard activity recognition models in ideal conditions, proves to be robust to sensor failures and a relevant tolerance to sensor faults 46 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 47. Robustness of AR systems to sensor deployment variations INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Objective: β€œResearch the robustness of standard AR systems to unforeseen variations in the sensor deployment, as well as contribute with an alternate approach to cope with these practical anomalies”
  • 48. Problem Statement 48 SENSOR DEPLOYMENT CHANGES Are activity recognition systems flexible enough to allow users to wear the sensors on their own? Is it possible to keep the systems functioning under the effects of sensor displacement? Activity recognition process Body motion sensing Signal processing and reasoning Recognition of activities INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 49. Sensor Displacement β€’ Categories of sensor displacement – Static: position changes can remain static across the execution of many activity instances, e.g. when sensors are attached with a displacement each day – Dynamic: effect of loose fitting of the sensors, e.g. when embedded into clothes β€’ Sensor displacement οƒ  new sensor position οƒ  signal space change β€’ Sensor displacement effects depends on – Original/end position and body part – Activity/gestures/movements performed – Sensor modality 49 Sensor displacement = rotation + translation (angular displacement) (linear displacement) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 50. Sensor Displacement Effects Changes in the signal space propagates through the activity recognition chain (e.g., variations in the feature space) RCIDEAL LCIDEAL= LCSELF 50 RCSELF β‰  RCIDEAL INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 51. Sensor Displacement in AR: Related Work β€’ Features invariant to sensor displacement – Heuristics (Kunze08) – Genetic algorithm for feature selection (FΓΆrster09a) β€’ Feature distribution adaptation – Covariate shift unsupervised adaptation (Bayati09) – Online-supervised user-based calibration (FΓΆrster09b) β€’ Classification (dis)similarity – Output classifiers correlation (Sagha11) K. Kunze and P. Lukowicz. Dealing with sensor displacement in motion-based onbody activity recognition systems. In 10th international conference on Ubiquitous computing, pp. 20–29, 2008. K. FΓΆrster, P. Brem, D. Roggen, and G. TrΓΆster. Evolving discriminative features robust to sensor displacement for activity recognition in body area sensor networks. In Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP), 2009 5th International Conference on, pp. 43–48, 2009. H. Bayati, J. del R Millan, and R. Chavarriaga. Unsupervised adaptation to on-body sensor displacement in acceleration-based activity recognition. In Wearable Computers (ISWC), 2011 15th Annual International Symposium on, pp. 71–78, June 2011. K. FΓΆrster, D. Roggen, and G. TrΓΆster. Unsupervised classifier self-calibration through repeated context occurrences: Is there robustness against sensor displacement to gain? In Proc. 13th IEEE Int. Symposium on Wearable Computers (ISWC), pp. 77–84, 2009. H. Sagha, J. R. del MillΓ‘n, and R. Chavarriaga. Detecting and rectifying anomalies in Opportunistic sensor networks 8th Int. Conf. on Networked Sensing Systems, pp. 162-–167, 2011 51 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 52. Approaches to Investigate on Sensor Displacement Synthetically Modeled Sensor Displacement Realistic Sensor Displacement INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 52
  • 53. Approaches to Investigate on Sensor Displacement Synthetically Modeled Sensor Displacement INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 53
  • 54. Synthetically Modeled Sensor Displacement β€’ Sensor rotation οƒ  Rotational noise (RN) β€’ Sensor translation οƒ  Additive noise (AN) β€’ Examples: 54 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 𝑀 𝑅𝑁 = 𝑐 πœƒ 𝑐 πœ“ βˆ’ 𝑐 πœ™ 𝑠 πœ“ + 𝑠 πœ™ 𝑠 πœƒ 𝑐(πœ“) 𝑠 πœ™ 𝑠 πœ“ + 𝑐 πœ™ 𝑠 πœƒ 𝑐(πœ“) 𝑐 πœƒ 𝑠 πœ“ 𝑐 πœ™ 𝑐 πœ“ + 𝑠 πœ™ 𝑠 πœƒ 𝑠(πœ“) βˆ’π‘  πœ™ 𝑐 πœ“ + 𝑐 πœ™ 𝑠 πœƒ 𝑠(πœ“) βˆ’ 𝑠 πœƒ 𝑠 πœ™ 𝑐 πœƒ 𝑐 πœ™ 𝑐 πœƒ π‘₯ π‘Ÿπ‘œπ‘‘ 𝑦 π‘Ÿπ‘œπ‘‘ 𝑧 π‘Ÿπ‘œπ‘‘ = 𝑀 𝑅𝑁 Γ— π‘₯ π‘Ÿπ‘Žπ‘€ π‘¦π‘Ÿπ‘Žπ‘€ 𝑧 π‘Ÿπ‘Žπ‘€ π‘₯π‘‘π‘Ÿ π‘¦π‘‘π‘Ÿ π‘§π‘‘π‘Ÿ = 𝑇𝐴𝑁 + π‘₯ π‘Ÿπ‘Žπ‘€ π‘¦π‘Ÿπ‘Žπ‘€ 𝑧 π‘Ÿπ‘Žπ‘€ 𝑇𝐴𝑁 = πœ‡ 𝐴𝑁 + πœŽπ΄π‘ 2 + π‘Ÿπ‘Žπ‘›π‘‘_π‘›π‘œπ‘Ÿπ‘šπ‘Žπ‘™_π‘‘π‘–π‘ π‘‘π‘Ÿπ‘–π‘π‘’π‘‘π‘–π‘œπ‘› (πœ‡ 𝐴𝑁=0) 0 1 2 3 -2 0 2 Acceleration(g) Time (s) Original 0 1 2 3 -2 0 2 Time (s) RN =15ΒΊ 0 1 2 3 -2 0 2 Time (s) RN =90ΒΊ 0 1 2 3 -2 0 2 Time (s) AN =0.1g 0 1 2 3 -2 0 2 Time (s) AN =0.5g Original RN =15ΒΊ RN =90ΒΊ AN =0.1g AN =0.5g 0 1 2 3 -2 0 2 Acceleration(g) Time (s) Original 0 1 2 3 -2 0 2 Time (s) RN =15ΒΊ 0 1 2 3 -2 0 2 Time (s) RN =90ΒΊ 0 1 2 3 -2 0 2 Time (s) AN =0.1g 0 1 2 3 -2 0 2 Time (s) AN =0.5g 0 1 2 3 -2 0 2 Acceleration(g) Time (s) Original 0 1 2 3 -2 0 2 Time (s) RN =15ΒΊ 0 1 2 3 -2 0 2 Time (s) RN =90ΒΊ 0 1 2 3 -2 0 2 Time (s) AN =0.1g 0 1 2 3 -2 0 2 Time (s) AN =0.5g Walking Sitting Proposed in: H. Sagha, J. R. del MillΓ‘n, and R. Chavarriaga. Detecting and rectifying anomalies in Opportunistic sensor networks. 8th Int. Conf. on Networked Sensing Systems, pp. 162 – 167, 2011
  • 55. β€’ Benchmark dataset: – MIT Activities of Daily Living Dataset* β€’ 9 acts β€’ 5 biaxial accelerometers β€’ 20 subjects (17-48 years old) β€’ Out-of-lab β€’ Experimental setup: 55 Evaluation of the Robustness to Sensor Displacement (Synthetic) * http://architecture.mit.edu/house_n/data/Accelerometer/BaoIntille.htm INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 5 biaxial accelerometers (limbs and trunk) LP Elliptic Filter (Fc = 20Hz) 6 seconds sliding window 10 feat. set KNN (as base classifier)
  • 56. HWC (multi-sensor)FFMARC (multi-sensor)SARC (single sensor) β€’ Performance drop under the effects of sensor rotation and translation: 56 Evaluation of the Robustness to Sensor Displacement (Synthetic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS RotationTranslation
  • 57. β€’ Performance drop under the effects of sensor rotation and translation: HWC (multi-sensor)FFMARC (multi-sensor)SARC (single sensor) 57 Evaluation of the Robustness to Sensor Displacement (Synthetic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 30% RotationTranslation 23% 3% 8% 8% 20% 15% 25% 5% 15% 1% 4%
  • 58. HWC (multi-sensor)FFMARC (multi-sensor)SARC (single sensor) β€’ Performance drop under the effects of sensor rotation and translation: 58 Evaluation of the Robustness to Sensor Displacement (Synthetic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 70% RotationTranslation 50% 20% 55% 6% 15%3% 8% 8% 20% 15% 45% 75% 5% 15% 43% 30% 1% 4% 4% 10%25% 30% 23%
  • 59. Approaches to Investigate on Sensor Displacement Realistic Sensor Displacement INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 59
  • 60. β€’ No dataset for studying the effects of sensor displacement! οƒ  β€’ Observe – Variability introduced with respect to the ideal setup when the sensors are self-placed by the users – Effects of large sensor displacements (extreme de-positioning) β€’ Scenarios – Ideal-placement – Self-placement – Induced-displacement Implementing Realistic Sensor Displacement Ideal Self Induced 60 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS NEW DATASET* (REALDISP) *Freely available at: www.ugr.es/~oresti/datasets
  • 61. REALDISP Dataset: Study Setup β€’ Cardio-fitness room β€’ 9 IMUs (9DoF) οƒ  ACC, GYR, MAG β€’ Laptop οƒ  data storage and labeling* β€’ Camera οƒ  offline data validation β€’ 17 volunteers (22-37 years old) *Annotation tool: http://crnt.sourceforge.net/CRN_Toolbox/Home.html 61 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 62. REALDISP Dataset: Activity Set β€’ Activities intended for: – Body-general motion: Translation | Jumps | Fitness – Body-part-specific motion: Trunk | Upper-extremities | Lower-extremities 62 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 63. β€’ Experimental setup: β€’ Studies: – AR systems: SARC, FFMARC, HWC – Settings: Ideal-placement, Self-placement, Induced-displacement – Scenarios: 10 activities, 20 activities, 33 activities (all) β€’ Evaluation procedure – 10-fold CV, 100 iterations 63 Evaluation of the Robustness to Sensor Displacement (Realistic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 9 triaxial accelerometers (all limbs and trunk) No preprocessing (raw data) 6 seconds sliding window FS1=mean FS2=mean,std FS3=mean,std, max,min,mcr DT, KNN, NB (as base classifiers)
  • 64. HWC (multi-sensor)FFMARC (multi-sensor)SARC (single sensor) 64 Evaluation of the Robustness to Sensor Displacement (Realistic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS IdealSelfInduced
  • 65. HWC (multi-sensor)FFMARC (multi-sensor)SARC (single sensor) 65 Evaluation of the Robustness to Sensor Displacement (Realistic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS IdealSelfInduced 13% 25% 25% 45% 3% 13%
  • 66. HWC (multi-sensor)FFMARC (multi-sensor)SARC (single sensor) 66 Evaluation of the Robustness to Sensor Displacement (Realistic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS IdealSelfInduced 13% 25% 15% 50% 25% 45% 30% 45% 3% 13% 5% 15%
  • 67. HWC (multi-sensor)FFMARC (multi-sensor)SARC (single sensor) 67 Evaluation of the Robustness to Sensor Displacement (Realistic) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS IdealSelfInduced 13% 25% 15% 50% 50% 15%25% 45% 30% 45% 45% 30% 3% 13% 5% 15% 5% 25%
  • 68. Conclusions β€’ Classic activity-aware systems assume a predefined sensor deployment that further remains unchanged during runtime, which are not lifelike assumptions β€’ Body-worn inertial sensors are subject to deployment changes (displacement) in real-world contexts, potentially leading to signal variations with respect to ideal patterns β€’ Activity recognition systems proves to be more sensitive to sensor rotations than translations, specially when located on body parts of reduced mobility β€’ Standard models (SARC,FFMARC) suffer from a critical performance worsening when the sensors are largely depositioned or self-placed by the users β€’ The HWC significantly outperforms the tolerance of standard activity recognition models (up to 30%), effectively showing outstanding capabilities to assimilate the changes introduced during the self-placement of the sensors and to moderately overcome the situation of largely depositioned sensors 68 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 69. Supporting AR systems network changes: instruction of newcomer sensors INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Objective: β€œStudy the capacity of standard AR systems to support unforeseen changes in the sensor network, as well as contribute with an alternate approach to cope with these topological variations”
  • 70. Problem Statement 70 SENSOR INFRASTRUCTURE CHANGES INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Collect a training dataset Train and test the model The AR system is β€œready” Do we need to collecta new dataset each time the sensor topology changes? Is it possible to leverage the knowledge of a functional system to instructa system to operate on a newcomer sensor? Activity recognition system design
  • 71. Infraestructure Changes: Newcomer Sensors 71 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Sensor replacement (repair/upgrade) Sensor addition (redundancy) Sensor discovery (opportunistic use)
  • 72. Transfer learning Instruction of Newcomer Sensors 72 Teacher INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Classic approach Limitations: - Predefined setup and deployments - System designer involvement - User/s involvement Learner β€œMechanism, ability or means to recognize and apply knowledge and skills learned in previous tasks or domains to novel tasks or domains” Collection of a new dataset for each possible scenario
  • 73. Transfer Learning in AR: Related Work β€’ Transfer between wearable sensors – Translation of locomotion recognition capabilities (Calatroni11) β€’ Model parameters β€’ Labels β€’ Transfer between ambient sensors – Translation among smart homes through meta-featuring (van Kasteren10) β€’ Common meta-feature space β€’ Limitations – Long time scales operation – Incomplete transfer – Difficult transfer across modalities A. Calatroni, D. Roggen, and G. TrΓΆster, β€œAutomatic transfer of activity recognition capabilities between body-worn motion sensors: Training newcomers to recognize locomotion,” in Proc. 8th Int Conf on Networked Sensing Systems, 2011. T. van Kasteren, G. Englebienne, and B. KrΓΆse, β€œTransferring knowledge of activity recognition across sensor networks,” in Proc. 8th Int. Conf on Pervasive Computing, 2010. INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 73
  • 74. Multimodal Transfer Methods β€’ System identification (signal level) β€’ Transfer methods (reasoning level) Ψ𝐴→𝐡 𝑑 Sensor Domain A Sensor Domain B0 1 2 3 -0.5 0 0.5 1 1.5 Time (s) Acceleration(G) 0 1 2 3 -1 0 1 2 Time (s) Position(m) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of activity models (features + labels, classification models) Transfer of activity templates (patterns + labels) 74
  • 75. Transfer of Activity Templates 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) β€’ Transfer of the recognition capabilities of an existing source system (S) that operates on activity templates (patterns) to an untrained target system (T) that lacks from these capabilities INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 75
  • 76. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) Coexistence… (T) 0 20 40 -1 0 1 2 Time (s) Position(m) 0 20 40 -1 0 1 2 Time (s)Acceleration(G) Transfer of Activity Templates INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS (1) Both systems coexists during a certain period of time 76
  • 77. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS (2) A mapping function between source and target domains is discovered through system identification (MIMO model) Transfer of Activity Templates Ψ𝑆→𝑇 𝑑 : 𝑋𝑆(𝑑) β†’ 𝑋 𝑇(𝑑) β‰ˆ 𝑋 𝑇(𝑑) 77
  • 78. System S (source domain) System T (target domain) Signal level Reasoning level 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) Ψ𝑆→𝑇 𝑑 : 𝑋𝑆(𝑑) β†’ 𝑋 𝑇(𝑑) β‰ˆ 𝑋 𝑇(𝑑) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS (3) The activity templates are translated from source to target domain Transfer of Activity Templates 78
  • 79. Signal level Reasoning level 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) Ψ𝑆→𝑇 𝑑 : 𝑋𝑆(𝑑) β†’ 𝑋 𝑇(𝑑) β‰ˆ 𝑋 𝑇(𝑑) 0 1 2 3 -1 0 1 2 Time (s) Position(m) X Y Z INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Templates (3) The activity templates are translated from source to target domain System S (source domain) System T (target domain) L3 79
  • 80. System S (source domain) System T (target domain) Signal level Reasoning level 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) Ψ𝑆→𝑇 𝑑 : 𝑋𝑆(𝑑) β†’ 𝑋 𝑇(𝑑) β‰ˆ 𝑋 𝑇(𝑑) 0 1 2 3 -0.5 0 0.5 1 1.5 Time (s) Acceleration(G) ^X ^Y ^Z INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Templates (3) The activity templates are translated from source to target domain 0 1 2 3 -1 0 1 2 Time (s) Position(m) X Y Z L3 L3 80
  • 81. System S (source domain) System T (target domain) Signal level Reasoning level L1 L2 L3 Ψ𝑆→𝑇 𝑑 : 𝑋𝑆(𝑑) β†’ 𝑋 𝑇(𝑑) β‰ˆ 𝑋 𝑇(𝑑) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Templates β€’ System identification – 3) The activity templates are translated from source to target domain 0 0.5 1 1.5 Acceleration(G) X Y Z ^X ^Y ^Z 81
  • 82. System S (source domain) System T (target domain) Signal level Reasoning level 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) Ψ𝑆→𝑇 𝑑 : 𝑋𝑆(𝑑) β†’ 𝑋 𝑇(𝑑) β‰ˆ 𝑋 𝑇(𝑑) 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z L1 L2 L3 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Templates (4) Once the templates have been translated, the target system is ready for activity detection 82
  • 83. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level L1 L2 L3 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z L1 L2 L3 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Templates (4) Once the templates have been translated, the target system is ready for activity detection Instruction completed! 83
  • 84. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Models β€’ Transfer of the recognition capabilities of an existing source system (S) that operates on activity models (features + classification model) to an untrained target system (T) that lacks from these capabilities 84
  • 85. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level Coexistence… (T) 0 20 40 -1 0 1 2 Time (s) Position(m) 0 20 40 -1 0 1 2 Time (s) Acceleration(G) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Models (1) Both systems coexists during a certain period of time 85
  • 86. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level Ξ¨ 𝑇→𝑆 𝑑 : 𝑋 𝑇(𝑑) β†’ 𝑋𝑆(𝑑) β‰ˆ 𝑋𝑆(𝑑) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Models (2) A mapping function between target and source domains is discovered through system identification (MIMO model) 86
  • 87. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level Ξ¨ 𝑇→𝑆 𝑑 : 𝑋 𝑇(𝑑) β†’ 𝑋𝑆(𝑑) β‰ˆ 𝑋𝑆(𝑑) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Models (3) The source activity models are translated to the target domain so both use the same activity models 87
  • 88. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level Ξ¨ 𝑇→𝑆 𝑑 : 𝑋 𝑇(𝑑) β†’ 𝑋𝑆(𝑑) β‰ˆ 𝑋𝑆(𝑑) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Models (3) The source activity models are translated to the target domain so both use the same activity models; these activity models also define the target activity recognition system 88
  • 89. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level Ξ¨ 𝑇→𝑆 𝑑 : 𝑋 𝑇(𝑑) β†’ 𝑋𝑆(𝑑) β‰ˆ 𝑋𝑆(𝑑) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Models (4) The target system continuously translate its signals into the source domain to operate on the transferred recognition system 0 1 2 3 4 -0.5 0 0.5 1 1.5 X Y Z 89
  • 90. 𝑋𝑆(𝑑) 𝑋 𝑇(𝑑) System S (source domain) System T (target domain) Signal level Reasoning level Ξ¨ 𝑇→𝑆 𝑑 : 𝑋 𝑇(𝑑) β†’ 𝑋𝑆(𝑑) β‰ˆ 𝑋𝑆(𝑑) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Transfer of Activity Models (4) The target system continuously translate its signals into the source domain to operate on the transferred recognition system; since then it is ready for activity detection Instruction completed! 0 1 2 3 4 -1 0 1 2 ^X ^Y ^Z 90
  • 91. Evaluation of Multimodal Transfer β€’ Models validation – Transfer between IMU and IMU (Identical Domain Transfer) – Transfer between Kinect and IMU (Cross Domain Transfer) 91 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) L1 L2 L3 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 6 -1 0 1 2 Time (s) Position(m) X Y Z 0 2 4 -1 0 1 2 Time (s) Position(m) X Y Z 0 1 2 3 -1 0 1 2 Time (s) Position(m) 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z L1 L2 L3 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 5 10 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z 0 2 4 -1 0 1 2 Time (s) Acceleration(G) ^X ^Y ^Z Transfer of Activity Templates Transfer of Activity Models 0 1 2 3 4 -1 0 1 2 ^X ^Y ^Z
  • 92. Multimodal Kinect-IMU Dataset: Study Setup INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS *Freely available at: www.ugr.es/~oresti/datasets 92
  • 93. Multimodal Kinect-IMU Dataset: Study Setup INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS MTx XSENS IMUs - 3D ACC + (3D GYR, 3D MAG, 4D QUA) - Sampling rate 30Hz Applications 93Xsens data logger οƒ  http://crnt.sourceforge.net/CRN_Toolbox/References.html
  • 94. Multimodal Kinect-IMU Dataset: Study Setup INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS MICROSOFT KINECT - RGB cam + IR cam + IR led - Depth map (0.5-6m) - 15 joints skeleton tracking - 3D position - Tracking range (1.2-3.5m) - Sampling rate 30Hz Applications 94 Kinect data logger οƒ  http://code.google.com/p/qtkinectwrapper/
  • 95. Multimodal Kinect-IMU Dataset: Scenarios Geometric Gestures (HCI) 48 instances per gesture Other scenarios were also collected as part of this dataset (more info at www.ugr.es/~oresti/datasets) 95
  • 96. Multimodal Kinect-IMU Dataset: Scenarios Geometric Gestures (HCI) Idle (Background) ~5 min of data48 instances per gesture Other scenarios were also collected as part of this dataset (more info at www.ugr.es/~oresti/datasets) 96
  • 97. Transfer between IMU and IMU β€’ Analyzed transfers – Transfer of Activity Templates and Activity Models from: β€’ RLA (3D acceleration) to RUA (3D acceleration) β€’ RUA (3D acceleration) to RLA (3D acceleration) β€’ RUA (3D acceleration) to BACK (3D acceleration) β€’ BACK (3D acceleration) to RUA (3D acceleration) β€’ RLA (3D acceleration) to BACK (3D acceleration) β€’ BACK (3D acceleration) to RLA (3D acceleration) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 97
  • 98. Evaluation of Transfer between IMU and IMU β€’ Mapping: – Model οƒ  MIMO3x3 mapping with 10 tap delay – Types β€’ Problem-domain mapping (PDM) β€’ Gesture-specific mapping (GSM) β€’ Unrelated-domain mapping (UDM) – Learning οƒ  100 samples (~3.3s) β€’ Activity recognition model: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS Triaxial acceleration (IMU) No preprocessing (raw data) Instance based segmentation FS=max,min KNN (standard classifier) 98
  • 99. Evaluation of Transfer between IMU and IMU β€’ Transfer of Activity Templates: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 99BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping LA=lower arm UA=upper arm B=back
  • 100. Evaluation of Transfer between IMU and IMU β€’ Transfer of Activity Templates: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 100BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping LA=lower arm UA=upper arm B=back <1% <3%
  • 101. Evaluation of Transfer between IMU and IMU β€’ Transfer of Activity Templates: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 101BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping LA=lower arm UA=upper arm B=back <1% <3% 35% 15% 17% 28% 35% 28% 10% 10%
  • 102. Evaluation of Transfer between IMU and IMU β€’ Transfer of Activity Templates: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 102BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping LA=lower arm UA=upper arm B=back <1% <3% 12% 20% 35% 55% 15% 17% 28% 35% 60% 28% 30% 50% 10% 10%
  • 103. Evaluation of Transfer between IMU and IMU β€’ Transfer of Activity Models: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 103BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping LA=lower arm UA=upper arm B=back
  • 104. Evaluation of Transfer between IMU and IMU β€’ Transfer of Activity Models: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 104BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping LA=lower arm UA=upper arm B=back
  • 105. Transfer between Kinect and IMU β€’ Analyzed transfers – Transfer of Activity Templates (Kinect to IMU) : β€’ HAND (3D position) οƒ  RLA (3D acceleration) β€’ HAND (3D position) οƒ  RUA (3D acceleration) β€’ HAND (3D position) οƒ  BACK (3D acceleration) – Transfer of Activity Models (IMU to Kinect): β€’ RLA (3D acceleration) οƒ  HAND (3D position) β€’ RUA (3D acceleration) οƒ  HAND (3D position) β€’ BACK (3D acceleration) οƒ  HAND (3D position) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 105
  • 106. Evaluation of Transfer between Kinect and IMU β€’ Mapping: – Model οƒ  MIMO3x3 mapping with 10 tap delay – Types β€’ Problem-domain mapping (PDM) β€’ Gesture-specific mapping (GSM) β€’ Unrelated-domain mapping (UDM) – Learning οƒ  100 samples (~3.3s) β€’ Activity recognition model: INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 106 Triaxial acceleration (IMU) / Triaxial position (KINECT) No preprocessing (raw data) Instance based segmentation FS=max,min KNN (standard classifier)
  • 107. Evaluation of Transfer between Kinect and IMU β€’ Transfer of Activity Templates (From Kinect to IMU) β€’ Transfer of Activity Models (From IMU to Kinect) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 107BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping RLA= right lower arm RUA= right upper arm BACK=back KINECT=hand
  • 108. Evaluation of Transfer between Kinect and IMU β€’ Transfer of Activity Templates (From Kinect to IMU) β€’ Transfer of Activity Models (From IMU to Kinect) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 108BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping RLA= right lower arm RUA= right upper arm BACK=back KINECT=hand <4% <4% <8%<6%
  • 109. Evaluation of Transfer between Kinect and IMU β€’ Transfer of Activity Templates (From Kinect to IMU) β€’ Transfer of Activity Models (From IMU to Kinect) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 109BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping RLA= right lower arm RUA= right upper arm BACK=back KINECT=hand <4% <4% <8%<6% 30% 45% 35% 60%
  • 110. Evaluation of Transfer between Kinect and IMU β€’ Transfer of Activity Templates (From Kinect to IMU) β€’ Transfer of Activity Models (From IMU to Kinect) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS 110BS=baseline source | BT=baseline target | PDM=problem-domain mapping | GSM=gesture-specific mapping | UDM=unrelated-domain mapping RLA= right lower arm RUA= right upper arm BACK=back KINECT=hand <4% 35% 50% <4% <8%<6% 30% 45% 35% 60% 55% 30%35%35%
  • 111. Evaluation of Transfer between Kinect and IMU β€’ Transfer of Activity Templates (From Kinect to IMU) β€’ Transfer of Activity Models (From IMU to Kinect) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS From Kinect to IMU (RLA) From IMU (RLA) to Kinect FS1=mean FS2=max,min 11130 samples = 1 s
  • 112. Evaluation of Transfer between Kinect and IMU β€’ Transfer of Activity Templates (From Kinect to IMU) β€’ Transfer of Activity Models (From IMU to Kinect) INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS From Kinect to IMU (RLA) From IMU (RLA) to Kinect FS1=mean FS2=max,min 11230 samples = 1 s 25% 20%
  • 113. Conclusions β€’ Classical training procedures are not practical to instruct newcomer sensors in dynamically varying and evolvable activity recognition setups β€’ A novel multimodal transfer learning model is proposed to translate the recognition capabilities of an existing system to a new untrained system, at runtime and without expert or user intervention β€’ As few as a single gesture (β‰ˆ3 seconds) of data is enough to learn a mapping model that captures the underlying relation between systems of identical or different modality β€’ The transfer between IMUs across close-by limbs achieves a recognition accuracy superior to 97% (>2% below baseline), and 95% (>4% below baseline) for the transfer between Kinect and IMU, independently of the direction of the transfer β€’ Low-variance data unrelated to the activities of interest can be also used to learn a mapping, albeit with more data 113 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 114. Conclusions and future work INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 115. Contributions β€’ Identification of the requirements and challenges posed by AR systems in real- world conditions β€’ Evaluation of the tolerance of standard AR systems to sensor technological anomalies, particularly sensor failures and faults β€’ Definition and development of a novel model, so-called HWC, to overcome the effects of sensor failures and faults. Evaluation of the robustness of the proposed HWC model to the effects of sensor failures and faults β€’ Evaluation of the tolerance of standard AR systems to sensor deployment variations, particularly static and dynamic sensor displacements β€’ Evaluation of the robustness of the proposed HWC model to the effects of sensor displacements β€’ Definition, development and validation of a novel multimodal transfer learning method that operates at runtime, with low overhead and without user or system designer intervention 115 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 116. Contributions β€’ Collection and curation of an innovative benchmark dataset to investigate the effects of sensor displacement, introducing the concept of ideal-placement, self-placement and induced-displacement. This dataset includes a wide range of physical activities, sensor modalities and participants. Apart from investigating sensor displacement, the dataset lend itself for benchmarking activity recognition techniques in ideal conditions. The dataset is publicly available to the research community at http://www.ugr.es/~oresti/datasets β€’ Collection and curation of a novel multimodal dataset to investigate transfer learning among ambient sensing and wearable sensing systems. The dataset could be also used for gesture spotting and continuous activity recognition. The dataset is publicly available to the research community at http://www.ugr.es/~oresti/datasets 116 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 117. Selected Publications β€’ International Journals (SCI-indexed) – Banos, O., Toth M. A., Damas, M., Pomares, H., Rojas, I. Dealing with the effects of sensor displacement in wearable activity recognition. Sensors, MDPI (2014) [Under review] – Banos, O., Damas, M., Guillen, A., Herrera, L.J., Pomares, H., Rojas, I. Multi-sensor fusion based on asymmetric decision weighting for robust activity recognition. Neural Processing Letters, Springer (2014) [Under review] – Banos, O., Galvez, J. M., Damas, M., Pomares, H., Rojas, I. Window size impact in activity recognition. Sensors, MDPI, vol. 14, no. 4, pp. 6474-6499 (2014) – Banos, O., Damas, M., Pomares, H., Rojas, F., Delgado-Marquez, B., Valenzuela, O. Human activity recognition based on a sensor weighting hierarchical classifier. Soft Computing, Springer, vol. 17, pp. 333-343 (2013) – Banos, O., Damas, M., Pomares, H., Rojas, I. On the Use of Sensor Fusion to Reduce the Impact of Rotational and Additive Noise in Human Activity Recognition. Sensors, MDPI, vol. 12, no. 6, pp. 8039-8054 (2012) – Banos, O., Damas, M., Pomares, H., Prieto, A., Rojas, I.: Daily Living Activity Recognition based on Statistical Feature Quality Group Selection. Expert Systems with Applications, Elsevier, vol. 39, no. 9, pp. 8013-8021 (2012) 117 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 118. Selected Publications β€’ Book chapters – Banos, O., Toth M. A., Damas, M., Pomares, H., Rojas, I. Amft, O.: Evaluation of inertial sensor displacement effects in activity recognition systems. Science and Supercomputing in Europe (Information & Communication Technologies), HPC-Europe 2 (2013) ISBN: 978-84-338-5400-1 β€’ Conference papers – Banos, O., Damas, M., Pomares, H., Rojas, I.: Handling displacement effects in on-body sensor- based activity recognition. In: Proceedings of the 5th International Work-conference on Ambient Assisted Living an Active Ageing (IWAAL 2013), San Jose, Costa Rica, December 2-6, (2013) [BEST PAPER AWARD] – Banos, O., Damas, M., Pomares, H., Rojas, I.: Activity recognition based on a multi-sensor meta-classifier. In: Proceedings of the 2013 International Work Conference on Neural Networks (IWANN 2013), Tenerife, June 12-14, (2013) – Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I., Amft, O.: A benchmark dataset to evaluate sensor displacement in activity recognition. In: Proceedings of the 14th International Conference on Ubiquitous Computing (Ubicomp 2012), Pittsburgh, USA, September 5-8, (2012) 118 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 119. Selected Publications β€’ Conference papers (cont.) – Banos, O., Calatroni, A., Damas, M., Pomares, H., Rojas, I., Troester, G., Sagha, H., Millan, J. del R., Chavarriaga, R., Roggen, D.: Kinect=IMU? Learning MIMO Signal Mappings to Automatically Translate Activity Recognition Systems Across Sensor Modalities. In: Proceedings of the 16th annual International Symposium on Wearable Computers (ISWC 2012), Newcastle, United Kingdom, June 18-22 (2012) – Banos, O., Damas, M., Pomares, H., Rojas, I.: Human multisource activity recognition for AAL problems. In: Proceedings of the 5th International Symposium on Ubiquitous Computing and Ambient Intelligence (UCAmI 2011), Riviera Maya, Mexico, December 5-9, (2011) – Banos, O., Damas, M., Pomares, H., Rojas, I.: Recognition of Human Physical Activity based on a novel Hierarchical Weighted Classification scheme. In: Proceedings of the 2011 International Joint Conference on Neural Networks (IJCNN 2011), IEEE, San Jose, California, July 31-August 5, (2011) – Banos, O., Pomares, H., Rojas, I.: Ambient Living Activity Recognition based on Feature-set Ranking Using Intelligent Systems. In: Proceedings of the 2010 International Joint Conference on Neural Networks (IJCNN 2010), IEEE, Barcelona, July 18-23, (2010) 119 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS
  • 120. Future Work β€’ Collection of new large standard datasets β€’ Dynamic reconfiguration of the HWC β€’ Self-adaptive HWC β€’ Tolerance to other sensor technological and topological anomalies β€’ Multiple trainers and complex modalities in transfer learning β€’ Integration in commercial systems and end-user applications 120 INTRODUCTION TECHNOLOGICAL ANOMALIES DEPLOYMENT VARIATIONS NETWORK CHANGES CONCLUSIONS