Intelligent Robotics and Wearables
Dzmitry Tsetserukou
Assistant Professor
Space Robotics Lab
Wearable Media Lab
The best way to predict the future is to invent it.
Alan Kay
Outline 
1. Research Background 
2. Teleoperation system TeleTA and robotic 
hand ExoPower 
3. NAVIgoid: Mobile Robot Navigation with Haptic Vision
4. LinkTouch: Wearable Fingertip Haptic 
Device 
5. iFeel_IM! Communication system 
Susumu Tachi (born January 1, 1946) is a professor at the Graduate School of Media Design, Keio University, and Professor Emeritus of the University of Tokyo.
Prof. Tachi is also a Founding Director of the Robotics Society of Japan (RSJ), a Fellow of the Society of Instrument and Control Engineers (SICE), and the Founding President of the Virtual Reality Society of Japan (VRSJ).
Projects: invisible cloaking, Twister, Telesar IV
Teleoperation system TeleTA
LinkTouch: true 2D display
NAVIgoid: Robot Navigation with a High Level of Immersion
Robot motion control by torso tilt; tactile, visual, and auditory feedback.

System components:
• Mobile robot: PeopleBot by MobileRobots
• LRF: UHG-08LX by HOKUYO
• Robot-side PC: Intel Core i7 640M (2.8 GHz), 8 GB RAM
• Belt interface worn by the user

[Figure: (a) human operator bending forward, wearing the belt interface with flex sensors and a single tactor; (b) mobile robot PeopleBot with the LRF, near a wall and an obstacle.]
GOAL: development of a new type of tactile interface and telepresence robotic system that makes the operator feel "embodied".

NOVELTY:
• engaging the user in teleoperation
• a high level of immersion through proprioceptive telepresence
• tactile feedback
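A minimal sketch of the torso-tilt control idea, assuming a linear mapping from normalized flex-sensor readings to velocity commands (the function name, gains, and dead zone below are illustrative assumptions, not the NAVIgoid implementation):

```python
# Hypothetical sketch: mapping belt flex-sensor readings to robot
# velocity commands for torso-tilt control. Gains and dead zone are
# assumed values, not taken from the NAVIgoid system.

def torso_tilt_to_velocity(flex_front, flex_back, flex_left, flex_right,
                           k_lin=0.5, k_ang=0.8, dead_zone=0.05):
    """Convert normalized flex-sensor readings (0..1) into
    (linear, angular) velocity commands in m/s and rad/s."""
    pitch = flex_front - flex_back   # > 0: user bends forward
    roll = flex_right - flex_left    # > 0: user leans right

    # Ignore small tilts so the robot stays still at a neutral stance.
    if abs(pitch) < dead_zone:
        pitch = 0.0
    if abs(roll) < dead_zone:
        roll = 0.0

    v = k_lin * pitch    # bend forward -> drive forward
    w = -k_ang * roll    # lean right -> turn right (negative yaw rate)
    return v, w

# Example: slight forward bend, no lateral lean.
print(torso_tilt_to_velocity(0.4, 0.1, 0.2, 0.2))  # ~ (0.15, 0.0)
```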
[Figure: belt interface layout around the user's waist: an elastic band carrying flex sensors I–IV (front, back, left, right), a geomagnetic sensor, and an accelerometer; each ProInterface module consists of a lug, accelerometer, contact slider, flex sensor, vibration motor, and holder.]
The ProInterface (Proprioception-controlled Interface) detects the trunk stance through the flex sensors. The tactors vibrate to produce tactile stimuli indicating:
• stationary obstacle: distance, direction, shape
• moving obstacle: distance, location, motion direction
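As a hedged sketch of how such stimuli could be generated, the snippet below picks a tactor from the obstacle bearing and scales vibration with proximity (the tactor count, range limit, and intensity law are assumptions, not the ProInterface firmware):

```python
import math

# Hypothetical sketch: driving a belt tactor from one obstacle's
# range and bearing. Layout and intensity law are assumed.

N_TACTORS = 8        # tactors spaced evenly around the waist (assumed)
MAX_RANGE = 4.0      # metres beyond which nothing is signalled (assumed)

def obstacle_to_tactor(distance, bearing_rad):
    """Return (tactor index, vibration duty 0..1) for one obstacle."""
    if distance >= MAX_RANGE:
        return None, 0.0
    # Bearing 0 = straight ahead; wrap into [0, 2*pi) and pick the
    # tactor whose sector contains the obstacle direction.
    sector = (bearing_rad % (2 * math.pi)) / (2 * math.pi)
    idx = int(round(sector * N_TACTORS)) % N_TACTORS
    # Closer obstacles vibrate harder.
    duty = 1.0 - distance / MAX_RANGE
    return idx, duty

print(obstacle_to_tactor(1.0, math.pi / 2))  # obstacle on the left: (2, 0.75)
```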
Tracking of Moving Obstacle 
[Figure: ten consecutive frames (226th–235th) showing the robot, a newly detected obstacle, the moving obstacle, and the velocity vector of the moving obstacle.]
• The object is tracked using a Kalman filter (KF).

State vector (position and speed):
$\mathbf{x}_{o,t} = (\mathbf{p}_{o,t}^{T}, \mathbf{v}_{o,t}^{T})^{T} = (p_{x,t},\, p_{y,t},\, v_{x,t},\, v_{y,t})^{T}$

Object motion model (motion prediction), with process noise $\mathbf{q}_{t}$:
$\begin{pmatrix} \mathbf{p}_{o,t+1} \\ \mathbf{v}_{o,t+1} \end{pmatrix} = \begin{pmatrix} \mathbf{p}_{o,t} + \Delta t \,\mathbf{v}_{o,t} \\ \mathbf{v}_{o,t} \end{pmatrix} + \mathbf{q}_{t}, \qquad \mathbf{q}_{t} \sim N(\mathbf{0}, \Sigma_{q})$
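A minimal runnable sketch of this constant-velocity Kalman filter; the noise covariances and scan period below are illustrative assumptions, not the values used on the robot:

```python
import numpy as np

# Constant-velocity KF for obstacle tracking, x = (px, py, vx, vy)^T.
# Q, R, and dt are assumed for illustration.

dt = 0.1                                        # LRF scan period [s]
F = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])   # motion model
H = np.hstack([np.eye(2), np.zeros((2, 2))])    # LRF measures position only
Q = 0.01 * np.eye(4)                            # process noise (assumed)
R = 0.05 * np.eye(2)                            # measurement noise (assumed)

def kf_step(x, P, z):
    """One predict/update cycle; z is the measured obstacle position."""
    # Predict: propagate the state with the constant-velocity model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct with the LRF position measurement.
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

x = np.array([1.0, 2.0, 0.0, 0.0])           # initial state
P = np.eye(4)
x, P = kf_step(x, P, np.array([1.05, 2.1]))  # one tracked frame
```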
Cloud swarm robotics
LinkTouch
$\begin{bmatrix} F_t \\ F_n \end{bmatrix} = \begin{bmatrix} S_{12} C_2 & -S_{43} C_3 \\ -S_{12} S_2 & S_{43} S_3 \end{bmatrix} \begin{bmatrix} \tau_1 / l_2 \\ \tau_2 / l_5 \end{bmatrix}$

where $F_t$ and $F_n$ are the tangential and normal forces at the contact point and $\tau_1$, $\tau_2$ are the motor torques.
Operation principle of PulseTouch 
LinkTouch: true 2D display 
Inverted five-bar linkage mechanism 
[Figure: inverted five-bar linkage with contact point C, tangential force Ft, normal force Fn, and links l1–l5.]
Specifications:
Motor: Faulhaber DC motor 0615 4,5 S
Gear: GH Series 06/1, ratio 64:1
Encoder: HXM3-64
Dimensions (W)×(H)×(D) [mm]: 26.1 × 32 × 38.5
Link lengths l1, l2, l3, l4, l5 [mm]: 18, 25, 10, 10, 25
Weight [g]: 13.5
Normal force [N]: 0.58
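A short numeric sketch of the torque-to-force mapping above, assuming the usual five-bar shorthand S12 = sin(q1 + q2), C2 = cos(q2), and so on; the joint angles and torques below are placeholders, not LinkTouch data, while l2 and l5 come from the table:

```python
import numpy as np

# Torque-to-force mapping for the five-bar linkage, assuming
# Sij = sin(qi + qj), Ci = cos(qi). Angles/torques are placeholders.

l2, l5 = 0.025, 0.025   # drive link lengths [m], from the table above

def contact_force(tau1, tau2, q1, q2, q3, q4):
    """Tangential/normal force [N] at the contact point from motor torques."""
    S12, S2, S43 = np.sin(q1 + q2), np.sin(q2), np.sin(q4 + q3)
    C2, C3, S3 = np.cos(q2), np.cos(q3), np.sin(q3)
    J = np.array([[S12 * C2, -S43 * C3],
                  [-S12 * S2,  S43 * S3]])
    return J @ np.array([tau1 / l2, tau2 / l5])

Ft, Fn = contact_force(0.002, 0.002, 0.5, 0.8, 0.8, 0.5)
print(f"Ft = {Ft:.3f} N, Fn = {Fn:.3f} N")
```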
iFeel_IM! 
Communication system with rich emotional 
and haptic channels
iFeel_IM!: communication system with 
emotional and haptic channels 
The philosophy behind iFeel_IM! (intelligent system for Feeling enhancement powered by affect-sensitive Instant Messenger) is
"I feel [therefore] I am!"
D. Tsetserukou, et al. Enhancing mediated interpersonal communication through affective haptics, 
INTETAIN 2009, pp. 246-251.
Architecture of iFeel_IM! 
[Block diagram: on each user's PC, chat text from the 3D world Second Life is written to a chat log file and sent to the web-based interface of the Affect Analysis Model, which returns an emotion and its intensity; the haptic devices controller then drives the devices HaptiHeart, HaptiHug, HaptiTickler, HaptiButterfly, HaptiTemper, and HaptiShiver through a D/A converter and driver box.]
Affect Analysis Model: recognition of emotions from text 
Stage I: symbolic cue analysis module. The sentence is tested for emoticons, abbreviations, acronyms, interjections, "?" and "!" marks, repeated punctuation, and capital letters. If an emoticon or emotion-relevant abbreviation is found ("yes"), the resulting emotion state is estimated directly; otherwise ("no"), the sentence is pre-processed for the parser.
Stage II: syntactical structure analysis module. The Stanford Parser output is processed, drawing on the Affect database.
Stage III: word-level analysis module.
Stage IV: phrase-level analysis module.
Stage V: sentence-level analysis module. The output is the sentence annotated by emotion (emotion category: intensity).
Emotions: Anger, Disgust, Fear, Guilt, Interest, Joy, Sadness, Shame, Surprise
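A toy sketch of the Stage I symbolic cue test; the emoticon lexicon and cue names below are placeholders, not the Affect database:

```python
import re

# Illustrative Stage I check: scan a chat sentence for emoticons,
# repeated punctuation, and capitalized words. Lexicon is a placeholder.

EMOTICONS = {":)": "joy", ":D": "joy", ":(": "sadness", ":O": "surprise"}

def symbolic_cues(sentence):
    cues = {}
    for emo, category in EMOTICONS.items():
        if emo in sentence:
            cues["emoticon"] = category
    if re.search(r"[!?]{2,}", sentence):
        cues["repeated_punctuation"] = True
    if re.search(r"\b[A-Z]{2,}\b", sentence):
        cues["capital_letters"] = True
    return cues

print(symbolic_cues("I PASSED the exam!!! :D"))
# -> {'emoticon': 'joy', 'repeated_punctuation': True, 'capital_letters': True}
```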
Website: https://sites.google.com/site/dzmitrytsetserukou/
Email: d.tsetserukou@skoltech.ru
Thank you for your attention!

D. Tsetserukou. Skoltech 2014
