EMPATHIC GLASSES:
SHARING REMOTE GAZE AND EMOTION
Mark Billinghurst
mark.billinghurst@unisa.edu.au
February 18th 2016
Christchurch, New Zealand
AR Smart Glasses Are Coming
Microsoft HoloLens
DAQRI Smart Helmet
Meta SpaceGlasses
Ideal for Remote Collaboration
•  SmartGlasses are a natural platform for remote collaboration
•  Taskspace vs. Talking Head conferencing
[Photos: Local User; Remote Expert]
Lessons From Previous Research
•  Sharing communication cues improves task performance
•  Shared remote view, Remote pointing
•  Natural gesture support
•  AR can improve collaboration further
•  Use of HMDs, world-stabilized AR cues, shared virtual models
Communication Model
•  Communication is a mixture of Explicit and Implicit cues
•  Explicit
•  Conscious cues – speech, gesture, object manipulation, etc.
•  Implicit
•  Unconscious cues – gaze, emotion, facial expression, etc.
Research Questions
•  Will supporting both Explicit and Implicit communication
cues improve remote collaboration?
•  Can we use technology to provide better communication
than face-to-face collaboration?
Gaze and Video Conferencing
• Gaze tracking
•  Implicit communication cue
•  Shows intent
• Task space collaboration
•  HMD + camera + gaze tracker
• Expected Results
•  Gaze cues reduce the need for explicit communication
•  Allow the remote collaborator to respond faster
Equipment
•  Custom eye-tracker
•  Head mounted camera
•  Head mounted display
Experiment Set Up
•  Lego assembly
•  Two assembly areas
•  Remote expert
Experiment Design
• 4 conditions varying eye-tracking/pointer support
• 13 pairs of subjects
• Measures
•  Performance time
•  Likert scale results, Ranking results, User preference (ranking analysis sketch below)
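The deck reports a significant ranking effect later without showing the analysis. As a hedged illustration only, ranking data from a four-condition within-subjects design like this one is commonly compared with a non-parametric Friedman test; the sketch below assumes scipy, and the rank values are placeholders, not the study data.

    # Minimal sketch: comparing per-condition rankings with a Friedman test.
    # The rank values below are illustrative placeholders, not the study data.
    from scipy import stats

    # One list per condition (None, Pointer, Eye-tracking, Both);
    # each entry is one pair's rank for that condition, 1 (best) to 4 (worst).
    none_ = [4, 3, 4, 4, 3]
    pointer = [2, 2, 1, 2, 2]
    eye = [3, 4, 3, 3, 4]
    both = [1, 1, 2, 1, 1]

    stat, p = stats.friedmanchisquare(none_, pointer, eye, both)
    print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")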
Task Performance
•  [Chart: Performance Time (seconds) per condition]
Ranking
•  [Chart: Median Ranking Values per condition]
Key Results
•  Both the pointer and eye-tracking visual cues
helped participants to perform significantly faster
• The pointer cue significantly improved perceived
quality of collaboration and co-presence
•  Eye-tracking improved collaboration quality and the
sense of being focused for the local users, and
enjoyment for the remote users
• The Both condition was ranked best for user
experience, while the None condition was ranked worst.
Empathic Glasses
•  Combines eye-tracking, a display, and facial expression sensing
•  Implicit cues – eye gaze, facial expression
•  Hardware: Pupil Labs eye-tracker + Epson BT-200 + AffectiveWear (gaze capture sketch below)
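A minimal sketch of how gaze data can be read from the Pupil Labs tracker: Pupil Capture exposes a network API where a ZMQ request to Pupil Remote returns the subscription port, and gaze messages arrive msgpack-encoded. The port and field names follow Pupil Labs' documented defaults, but treat the details here as assumptions rather than the system's actual code.

    # Sketch: subscribe to gaze data from Pupil Capture via Pupil Remote.
    # Assumes Pupil Capture is running locally with Pupil Remote on port 50020.
    import msgpack
    import zmq

    ctx = zmq.Context()
    remote = ctx.socket(zmq.REQ)
    remote.connect("tcp://127.0.0.1:50020")   # Pupil Remote default port
    remote.send_string("SUB_PORT")            # ask where data is published
    sub_port = remote.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.subscribe("gaze.")                    # all gaze topics

    while True:
        topic, payload = sub.recv_multipart()
        gaze = msgpack.loads(payload, raw=False)
        # norm_pos: gaze point in normalized scene-camera coordinates
        print(topic.decode(), gaze["norm_pos"], gaze["confidence"])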
AffectiveWear – Emotion Glasses
•  Photo sensors to recognize expression
•  User calibration
•  Machine learning
•  Recognizes 8 facial expressions (classification sketch after the reference)
Masai, K., Sugiura, Y., Suzuki, K., Shimamura, S., Kunze, K., Ogata, M., ... & Sugimoto, M. (2015,
September). AffectiveWear: towards recognizing affect in real life. In Proceedings of the 2015 ACM
International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM
International Symposium on Wearable Computers (pp. 357-360). ACM.
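Going by the bullets above (photo sensors, per-user calibration, machine learning), the recognition pipeline can be sketched as below. This is not the AffectiveWear implementation: the sensor count, expression labels, and data are placeholder assumptions, and scikit-learn stands in for whatever classifier the authors used.

    # Hypothetical sketch of photo-sensor expression recognition.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # 8 assumed expression labels (the paper's exact set may differ).
    EXPRESSIONS = ["neutral", "smile", "frown", "surprise",
                   "anger", "wink_left", "wink_right", "mouth_open"]

    # Per-user calibration: each sample is one vector of photo-reflective
    # sensor intensities recorded while the user holds a known expression.
    X_calib = np.random.rand(160, 16)       # 160 samples x 16 sensors (placeholder)
    y_calib = np.repeat(np.arange(8), 20)   # 20 calibration samples per expression

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X_calib, y_calib)

    frame = np.random.rand(1, 16)           # one live sensor frame (placeholder)
    print(EXPRESSIONS[int(clf.predict(frame)[0])])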
Integrated System
• Local User
• Video camera
• Eye-tracking
• Face expression
• Remote Helper
• Remote pointing
System Diagram
•  Two monitors on the Remote User side
•  Scene + Emotion Display (data-flow sketch below)
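To make the diagram concrete, here is a hypothetical sketch of the local-to-remote data path: the local user's machine streams gaze and expression updates to the remote helper's displays. The transport (JSON over UDP), the address, and the message fields are all assumptions, not the system's actual protocol.

    # Hypothetical sketch: local user's machine streaming implicit cues
    # (gaze + expression) to the remote helper. JSON over UDP is assumed.
    import json
    import socket
    import time

    REMOTE = ("192.0.2.10", 9000)   # remote helper's machine (placeholder)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_update(gaze_xy, expression):
        """Send one combined implicit-cue update to the remote helper."""
        msg = {
            "t": time.time(),          # capture timestamp
            "gaze": gaze_xy,           # normalized (x, y) in the scene image
            "expression": expression,  # label from the expression classifier
        }
        sock.sendto(json.dumps(msg).encode(), REMOTE)

    send_update((0.42, 0.55), "smile")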
Pilot User Evaluation
•  Shared block picture building – 5 minutes
•  “Make a house”
•  4 Conditions
•  V: Video
•  P: Video + remote pointer
•  E: Video + expression
•  A: All cues (V+P+E)
•  5 pairs of subjects
Measures
•  Pictures made
•  Likert scale questions (1 = disagree, 7 = agree) [No sig. diff.]
•  “I felt connected with my partner”
•  Ranking of conditions (1 = best, 4 = worst) [Sig. diff.]
•  “Which condition did you communicate best with your partner in?”
Ranking Results
"I ranked the (A) condition best, because I could easily
point to communicate, and when I needed it I could check
the facial expression to make sure I was being understood.”
Q2: Communication Q3: Understanding partner
HMD – Local User Computer – Remote User
Behavior Observation
• Video Only
•  Remote users talked more
•  Described block colour/shape
• Pointing Conditions (A, P)
•  Deictic language – “move that over there”
• Expression Conditions (E, A)
•  Looked at expression less than 20% of the time
Lessons Learned
• Pointing really helps in remote collaboration
•  Makes remote user feel more connected
• Gaze looks promising
•  Shows the context of what a person is talking about
• More work needed on emotion/expression cues
• Limitations
•  Limited implicit cues
•  Two separate displays
•  Task was a poor emotional trigger
•  AffectiveWear needs improvement
Future Work
•  Add additional physiological monitoring
•  Empatica E4 wristband
•  Provide physiological feedback (stream-reading sketch below)
•  Integrated eye-tracking/display
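A hypothetical sketch of consuming a line-based physiological stream (e.g. skin conductance and heart rate) to drive such feedback; the host, port, and line format are assumptions, not the Empatica E4's actual streaming interface.

    # Hypothetical sketch: reading a line-based physiological data stream.
    # Host, port, and the "<signal> <timestamp> <value>" format are assumed.
    import socket

    def read_physio(host="127.0.0.1", port=28000):
        """Yield (signal, timestamp, value) tuples from a streaming server."""
        with socket.create_connection((host, port)) as s:
            buf = b""
            while True:
                data = s.recv(1024)
                if not data:           # server closed the connection
                    return
                buf += data
                while b"\n" in buf:
                    line, buf = buf.split(b"\n", 1)
                    signal, ts, value = line.decode().split()
                    yield signal, float(ts), float(value)

    for signal, ts, value in read_physio():
        if signal == "gsr":            # skin conductance, assumed label
            print(f"GSR at {ts:.2f}: {value:.3f} uS")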
Emotion Classification
• Classify emotions from physiological data
•  8 emotional states (mapping sketch below)
• Collaboration with Sensaura
•  http://www.sensauratech.com
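Sensaura's classifier is proprietary, so as a stand-in, here is one common circumplex-style approach: estimate valence and arousal from the physiological features, then map the point to one of 8 sectors. The labels, their order, and the sector layout are assumptions for illustration only.

    # Illustrative sketch: map a (valence, arousal) estimate to one of
    # 8 emotion labels laid out around a circumplex. Labels are assumed.
    import math

    LABELS = ["happy", "excited", "tense", "angry",
              "sad", "bored", "calm", "relaxed"]

    def classify_emotion(valence: float, arousal: float) -> str:
        """Map a point in [-1, 1]^2 to one of 8 45-degree sectors."""
        angle = math.degrees(math.atan2(arousal, valence)) % 360
        return LABELS[int(angle // 45)]

    # e.g. positive valence + moderate arousal lands in the "happy" sector
    print(classify_emotion(0.7, 0.5))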
Conclusions
• Wearable systems ideal for task space conferencing
•  Remote shared view, remote pointing
• AR can enhance remote collaboration
•  Virtual annotations, content sharing
• Additional sensors can be used for additional cues
•  Support Explicit + Implicit communication
• Significant areas for future research
•  Capturing/sharing physiological cues
•  Representing cognitive/emotional state
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
