Kinect=IMU? Learning MIMO Signal
Mappings to Automatically Translate Activity
Recognition Systems Across Sensor
Modalities
ISWC 2012, Newcastle (UK)
Oresti Baños1, Alberto Calatroni2, Miguel Damas1, Héctor Pomares1,
Ignacio Rojas1, Hesam Sagha3, José del R. Millán3,
Gerhard Tröster2, Ricardo Chavarriaga3, and Daniel Roggen2
1Department of Computer Architecture and Computer Technology, CITIC-UGR, University of Granada, SPAIN
2Wearable Computing Laboratory, ETH Zurich, SWITZERLAND
3CNBI, Center for Neuroprosthetics, École Polytechnique Fédérale de Lausanne, SWITZERLAND
FET-Open Grant #225938
Problem statement
• Scenario
[Figure sequence: the application scenario is introduced graphically over several slides.]
Transfer learning in AR
• Concept of transfer learning
– Origin in ML: “Need for lifelong machine learning methods that retain and reuse
previously learned knowledge” NIPS-95 workshop on “Learning to Learn”
– Mechanism, ability or means to recognize and apply knowledge and skills
learned in previous tasks or domains to novel tasks or domains
• Intended for
– Continuity of context-awareness across different sensing environments
– Network topology redundancy
– Collective and individual knowledge enhancement
• Advantages
– Knowledge may be conserved
– Less labeled supervision is needed (ideally no additional recordings)
– ‘Online’ process
– Possibly heterogeneous
Transfer learning in AR: related work
• Selected contributions
– On-body sensors ::: Calatroni et al. (2011)
• Model parameters
• Labels
– Ambient sensors ::: van Kasteren et al. (2010)
• Common meta-feature space
• Limitations
– Operation over long time scales
– Possibly incomplete transfer
– Difficult transfer across modalities
A. Calatroni, D. Roggen, and G. Tröster, “Automatic transfer of activity recognition capabilities between body-worn motion sensors: Training newcomers to recognize locomotion,” in Proc. 8th Int. Conf. on Networked Sensing Systems, 2011.
T. van Kasteren, G. Englebienne, and B. Kröse, “Transferring knowledge of activity recognition across sensor networks,” in Proc. 8th Int. Conf. on Pervasive Computing, 2010, pp. 283–300.
Translation setup (Kinect ↔ IMU)
Skeleton Tracking System
(Kinect)
Body-worn Inertial Measurement Unit
(Xsens)
Translation setup (Kinect ↔ IMU)
Skeleton Tracking System
(Kinect)
– RGB camera, IR LED, IR camera
– Depth map
– 15-joint skeleton
– 3D joint coordinates (POS, in mm)
– Tracking range: 1.2–3.5 m
Body-worn Inertial Measurement Unit
(Xsens)
– Accurate 3D orientation
– Several modalities (ACC, GYR, MAG)
(The position-to-acceleration relation between the two modalities is sketched below.)
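Why a Kinect ↔ IMU mapping is plausible at all: the Kinect delivers joint positions (mm) while the Xsens delivers acceleration (G), and the two are related, to a first approximation, by double differentiation over time (gravity and the differing reference frames must also be absorbed by the learned mapping). A minimal numpy sketch of that intuition, using a hypothetical sampling rate and a synthetic trajectory rather than the original recordings:

```python
import numpy as np

# Hypothetical sampling rate and synthetic hand trajectory (illustrative assumptions,
# not the original recordings).
FS = 30.0                                      # Hz, a typical Kinect skeleton rate
t = np.arange(0.0, 3.0, 1.0 / FS)              # ~3 s of data
pos_mm = 300.0 * np.sin(2 * np.pi * 0.5 * t)   # hand position along one axis, in mm

pos_m = pos_mm / 1000.0                        # Kinect units: mm -> m
vel = np.gradient(pos_m, 1.0 / FS)             # first time derivative: velocity (m/s)
acc = np.gradient(vel, 1.0 / FS)               # second time derivative: acceleration (m/s^2)
acc_g = acc / 9.81                             # IMU units: m/s^2 -> G

print(acc_g.max(), acc_g.min())
```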
Translation setup (Kinect ↔ IMU)
Kinect (Position) vs. IMU (Acceleration)
[Figure: example traces of the same gesture as Kinect position (X, Y, Z, in m) and IMU acceleration (X, Y, Z, in G) over ~3 s.]
Translation method
• System identification (signal level)
• Translation architectures (classification level)
– Template translation
– Signal translation
(Both architectures are contrasted in the sketch below.)
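To make the two architectures concrete, here is a minimal, hypothetical Python sketch; the function names (learn_mapping, apply_mapping) and the synthetic data are illustrative assumptions, not the authors' code. Template translation moves the source system's templates into the target domain, while signal translation moves incoming target signals back into the source domain so the existing source classifier can be reused.

```python
import numpy as np

def learn_mapping(x_in, x_out):
    """Hypothetical signal-level mapping: least-squares fit of x_out ≈ x_in @ B."""
    B, *_ = np.linalg.lstsq(x_in, x_out, rcond=None)
    return B

def apply_mapping(B, x):
    return x @ B

# Paired signals recorded while both sensors coexist (synthetic stand-ins).
rng = np.random.default_rng(0)
x_src = rng.normal(size=(100, 3))               # e.g. Kinect hand position
x_tgt = x_src @ rng.normal(size=(3, 3))         # e.g. IMU acceleration

# Template translation: translate the source gesture templates into the target
# domain and recognize incoming target signals with a target-side classifier.
B_fwd = learn_mapping(x_src, x_tgt)
src_templates = [rng.normal(size=(20, 3)) for _ in range(5)]
tgt_templates = [apply_mapping(B_fwd, tpl) for tpl in src_templates]

# Signal translation: keep the existing source-side classifier and translate
# incoming target signals back into the source domain before classifying them.
B_rev = learn_mapping(x_tgt, x_src)
incoming_tgt = rng.normal(size=(20, 3))
translated_to_src = apply_mapping(B_rev, incoming_tgt)
```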
Translation: Kinect to IMU (signal mapping)
Kinect (Position) → IMU (Acceleration)
System S (source domain): $X_S(t)$; System T (target domain): $X_T(t)$
Signal level / Classification level
– Coexistence period (T): both systems observe the same movements at the same time, providing the paired signals from which the mapping is learned
– $\Psi_{S \to T}(t): X_S(t) \to \hat{X}_T(t) \approx X_T(t)$
[Figures: gesture templates (L1, L2, L3) as Kinect position traces (X, Y, Z, in m) at the classification level; ~40 s of coexisting Kinect position and IMU acceleration streams at the signal level.]
Signal mapping
• Linear MIMO mapping
– Definition
• $\Psi_{S \to T}(t) \propto B(l)$, so that $\hat{X}_T(t) = B(l)\,X_S(t)$
• $B(l) = \begin{bmatrix} b_{11}(l) & b_{12}(l) & \cdots & b_{1M}(l) \\ b_{21}(l) & b_{22}(l) & \cdots & b_{2M}(l) \\ \vdots & \vdots & & \vdots \\ b_{N1}(l) & b_{N2}(l) & \cdots & b_{NM}(l) \end{bmatrix}$
• $b_{ik}(l) = b_{ik}^{(0)} l^{-s_{ik}} + b_{ik}^{(1)} l^{-s_{ik}-1} + \cdots + b_{ik}^{(q)} l^{-s_{ik}-q}$, where $l^{-p} x(t) = x(t-p)$
– Transformations modeled:
• Scaling: $b_{ik}^{(r)} = K_{ik}$ for $r = 0 \wedge i = k$; $0$ for $r > 0$
• Rotation: $b_{ik}^{(r)} = R_{ik}$ for $r = 0$; $0$ for $r > 0$
• Differentiation of order $h$: $b_{ik}^{(r)} = H_{ik}^{(r)}$ for $r \le h$; $0$ for $r > h$
– The polynomial coefficients are estimated by a least-squares (LS) method (a fitting sketch follows below)
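A minimal numpy sketch of how such a delayed (FIR) MIMO mapping can be identified by ordinary least squares, assuming the 10-tap configuration mentioned later in the evaluation; the variable and function names are illustrative, not the authors' implementation:

```python
import numpy as np

def build_lagged_regressor(x_src, q):
    """Stack q+1 delayed copies of each source channel (zero-padded at the start):
    phi[t] = [x(t), x(t-1), ..., x(t-q)]."""
    n, m = x_src.shape
    phi = np.zeros((n, m * (q + 1)))
    for p in range(q + 1):
        phi[p:, p * m:(p + 1) * m] = x_src[:n - p, :]
    return phi

def fit_mimo_mapping(x_src, x_tgt, q=10):
    """Least-squares estimate of the FIR coefficient matrix B (column block per target channel)."""
    phi = build_lagged_regressor(x_src, q)
    B, *_ = np.linalg.lstsq(phi, x_tgt, rcond=None)
    return B

def apply_mimo_mapping(B, x_src, q=10):
    return build_lagged_regressor(x_src, q) @ B

# Synthetic coexistence data (stand-ins for Kinect position and IMU acceleration).
rng = np.random.default_rng(0)
x_src = rng.normal(size=(200, 3))
true_B = rng.normal(size=(3 * 11, 3))
x_tgt = build_lagged_regressor(x_src, 10) @ true_B + 0.01 * rng.normal(size=(200, 3))

B = fit_mimo_mapping(x_src, x_tgt, q=10)
x_tgt_hat = apply_mimo_mapping(B, x_src, q=10)
print(np.mean((x_tgt - x_tgt_hat) ** 2))   # reconstruction error of the learned mapping
```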
Kinect to IMU (template translation)
Kinect (Position) → IMU (Acceleration)
System S (source domain): $X_S(t)$; System T (target domain): $X_T(t)$
Signal level / Classification level
– The learned mapping $\Psi_{S \to T}(t): X_S(t) \to \hat{X}_T(t) \approx X_T(t)$ is applied to the source gesture templates (L1, L2, L3), turning Kinect position templates into estimated acceleration templates for the target IMU
[Figures: Kinect position templates (X, Y, Z, in m) for each gesture class and their translated acceleration counterparts (^X, ^Y, ^Z, in G); measured vs. estimated acceleration overlaid for one gesture.]
IMU to Kinect (signal mapping and signal translation)
IMU (Acceleration) ↔ Kinect (Position)
System S (source domain): $X_S(t)$; System T (target domain): $X_T(t)$
Signal level / Classification level
– Coexistence period (T): paired IMU and Kinect streams are again recorded simultaneously, and the reverse mapping is learned
– $\Psi_{T \to S}(t): X_T(t) \to \hat{X}_S(t) \approx X_S(t)$
[Figures: ~40 s of coexisting Kinect position and IMU acceleration streams; a Kinect position trace (X, Y, Z, in m) and a translated acceleration estimate (^X, ^Y, ^Z, in G) over ~3 s.]
Experimental setup
Kinect → http://code.google.com/p/qtkinectwrapper/
Xsens → http://crnt.sourceforge.net/CRN_Toolbox/References.html
Dataset
• Two scenarios
– Geometric gestures (HCI): 5 gestures, 48 instances per gesture
– Idle (background): ~5 min of data
Evaluation
• Analyzed transfers
– Kinect (position):
• HAND
– IMUs (acceleration):
• RIGHT LOWER ARM (RLA)
• RIGHT UPPER ARM (RUA)
• BACK
Evaluation
• Model
– MIMO mapping with 10-tap delay
• Mapping domains
– Problem-domain mapping (PDM)
– Gesture-specific mapping (GSM)
– Unrelated-domain mapping (UDM)
• Results
– Mapping learning: 100 samples (~3.3 s)
– Mapping testing: the remaining, unused instances
– Random selection repeated 20 times in an outer CV process (protocol sketched below)
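A sketch of this evaluation protocol under simplifying assumptions (a memoryless linear mapping instead of the deck's 10-tap model, and synthetic stand-in signals): 100 coexistence samples are drawn to learn the mapping, the remainder is used for testing, and the random selection is repeated 20 times.

```python
import numpy as np

# Synthetic stand-ins for the coexisting streams (illustrative only).
rng = np.random.default_rng(0)
x_src = rng.normal(size=(9000, 3))                                            # e.g. Kinect hand position
x_tgt = x_src @ rng.normal(size=(3, 3)) + 0.01 * rng.normal(size=(9000, 3))   # e.g. IMU acceleration

N_LEARN, N_REPS = 100, 20      # 100 mapping-learning samples (~3.3 s), 20 random repetitions
errors = []
for _ in range(N_REPS):
    start = int(rng.integers(0, len(x_src) - N_LEARN))     # random contiguous learning block
    idx = np.arange(start, start + N_LEARN)
    B, *_ = np.linalg.lstsq(x_src[idx], x_tgt[idx], rcond=None)

    mask = np.ones(len(x_src), dtype=bool)                 # test on all remaining samples
    mask[idx] = False
    errors.append(np.mean((x_tgt[mask] - x_src[mask] @ B) ** 2))

print(np.mean(errors), np.std(errors))
```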
Translation accuracy
• Model
– 3-NN classifier, feature set (FS) = max. & min. per axis
– 5-fold cross validation
– 100 repetitions
(a classification sketch follows below)
• Results
[Figure: recognition accuracy (%) for transfers from Kinect to RLA, RUA and BACK and from RLA, RUA and BACK to Kinect, comparing the source baseline (BS), target baseline (BT) and the PDM, GSM and UDM mappings.]
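For reference, a minimal sketch of the classification stage as described on this slide (3-NN on max./min. features per axis with 5-fold cross validation), using synthetic gesture windows in place of the real translated data; the helper names are illustrative:

```python
import numpy as np

def max_min_features(window):
    """FS = max. & min. per axis, computed over one segmented gesture window."""
    return np.concatenate([window.max(axis=0), window.min(axis=0)])

def knn_predict(train_x, train_y, test_x, k=3):
    """Plain k-NN with Euclidean distance and majority vote."""
    d = np.linalg.norm(test_x[:, None, :] - train_x[None, :, :], axis=2)
    nn = np.argsort(d, axis=1)[:, :k]
    return np.array([np.bincount(v).argmax() for v in train_y[nn]])

# Synthetic gesture windows (stand-ins for translated acceleration data): 5 classes, 48 instances each.
rng = np.random.default_rng(0)
windows = [rng.normal(loc=c, size=(30, 3)) for c in range(5) for _ in range(48)]
labels = np.repeat(np.arange(5), 48)
feats = np.array([max_min_features(w) for w in windows])

# 5-fold cross validation, as on the slide (one shuffled repetition shown here).
order = rng.permutation(len(feats))
folds = np.array_split(order, 5)
accs = []
for i in range(5):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(5) if j != i])
    pred = knn_predict(feats[train_idx], labels[train_idx], feats[test_idx], k=3)
    accs.append(np.mean(pred == labels[test_idx]))
print(np.mean(accs))
```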
Translation accuracy
• Model
– 3-NN, FS1 = mean, FS2 = max. & min.
– 5-fold cross validation
– 100 repetitions
• Results (UDM)
[Figure: accuracy (%) vs. number of mapping-learning samples (100 to 9k), from Kinect to IMU (RLA) and from IMU (RLA) to Kinect, comparing feature sets FS1 and FS2 for the source baseline (BS), target baseline (BT) and translated system (T).]
Encountered limitations
• General model challenges/limitations
– Not all mappings may be feasible (e.g., temperature → gyroscope?)
• Kinect ↔ IMU challenges/limitations
– Different frames of reference (IMU: local vs. Kinect: world)
– Occlusions
– Subject out of tracking range
– Torsions
Conclusions and future work
• Transfer system based on
– MIMO mapping model
– Template/Signal translation
• Mapping learned from as little as a single gesture instance (~3 seconds of data)
• Successful translation across sensor modalities, Kinect ↔ IMU (accuracy only 4% and 8% below the respective baselines)
• Next steps
– Analyze the effect of data loss (occlusions, anomalies, etc.)
– Deeper characterization of the MIMO model (e.g., the order ‘q’)
– Alternative mapping models: ARMA, TDNN, LS-SVM
– Combination of sensors (homogeneous/heterogeneous)
– Test in more complex setups/real-world situations
Thank you for your attention.
Questions?
Oresti Baños Legrán
Dep. Computer Architecture & Computer Technology
Faculty of Computer & Electrical Engineering (ETSIIT)
University of Granada, Granada (SPAIN)
Email: oresti@atc.ugr.es
Phone: +34 958 241 516
Fax: +34 958 248 993
Work supported in part by the FP7 project OPPORTUNITY under FET-Open grant number 225938, the Spanish CICYT Project TIN2007-60587,
Junta de Andalucia Projects P07-TIC-02768 and P07-TIC-02906, the CENIT project AmIVital and the FPU Spanish grant AP2009-2244