Presentation given by Mark Billinghurst on research into Empathic Glasses, combining Augmented Reality, Wearable Computers, Emotion Sensing, and Remote Collaboration. Given on February 18th, 2016.
1. EMPATHIC GLASSES:
SHARING REMOTE GAZE AND EMOTION
Mark Billinghurst
mark.billinghurst@unisa.edu.au
February 18th 2016
Christchurch, New Zealand
2. AR Smart Glasses Are Coming
Microsoft HoloLens
DAQRI Smart Helmet
Meta SpaceGlasses
3. Ideal for Remote Collaboration
• Smart glasses are a natural platform for remote collaboration
• Task space vs. talking-head conferencing
[Figure: local user and remote expert]
4. Lessons From Previous Research
• Sharing communication cues improves task performance
• Shared remote view, Remote pointing
• Natural gesture support
• AR can improve collaboration further
• Use of HMDs, world-stabilized AR cues, shared virtual models
5. Communication Model
• Communication mixture of Explicit and Implicit cues
• Explicit
• Conscious cues – speech, gesture, object manipulation, etc.
• Implicit
• Unconscious cues – gaze, emotion, facial expression, etc.
6. Research Questions
• Will supporting both Explicit and Implicit communication cues improve remote collaboration?
• Can we use technology to provide better communication than face-to-face collaboration?
7. Gaze and Video Conferencing
• Gaze tracking
• Implicit communication cue
• Shows intent
• Task space collaboration
• HMD + camera + gaze tracker (see the overlay sketch below)
• Expected Results
• Gaze cues reduce need for communication
• Allow remote collaborator to respond faster
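As a concrete illustration of how such a gaze cue could be shared, here is a minimal sketch that overlays the local user's gaze point on the outgoing task-space video. It assumes OpenCV and an eye tracker that reports normalized (x, y) gaze coordinates; draw_gaze_cue and the sample values are hypothetical, not the actual study system.

```python
# Minimal sketch (assumption, not the study's actual system): overlay the
# local user's gaze point on the outgoing task-space video so the remote
# collaborator can see where the local user is looking.
import cv2
import numpy as np

def draw_gaze_cue(frame, gaze_xy):
    """Draw a ring at the gaze point; gaze_xy is normalized to [0, 1]."""
    h, w = frame.shape[:2]
    px, py = int(gaze_xy[0] * w), int(gaze_xy[1] * h)
    cv2.circle(frame, (px, py), 18, (0, 255, 255), 3)  # yellow ring
    return frame

# Example: one video frame with a gaze sample near the centre.
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame
shared_frame = draw_gaze_cue(frame, (0.52, 0.47))
```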
14. Key Results
• Both the pointer and eye-tracking visual cues helped participants perform the task significantly faster
• The pointer cue significantly improved the perceived quality of collaboration and co-presence
• Eye-tracking improved collaboration quality and the sense of being focused for local users, and enjoyment for remote users
• The Both condition was ranked best for user experience, while the None condition was ranked worst.
15. Empathic Glasses
• Combine eye-tracking, display, and facial expression sensing (see the fusion sketch below)
• Implicit cues – eye gaze, facial expression
Pupil Labs + Epson BT-200 + AffectiveWear
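One way to realize this combination is to fuse the two implicit cue streams into a single timestamped message sent alongside the shared video. The following is a minimal sketch under that assumption; ImplicitCues, encode_cues, and the expression labels are illustrative names, not the prototype's actual protocol.

```python
# Hypothetical fusion step: bundle the two implicit cues (gaze sample from
# the eye tracker, expression label from the emotion glasses) into one
# timestamped message for the remote side. All names are illustrative.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ImplicitCues:
    timestamp: float   # capture time, seconds since epoch
    gaze_x: float      # normalized gaze point, 0..1
    gaze_y: float
    expression: str    # e.g. "smile", "frown", "neutral"

def encode_cues(gaze_xy, expression):
    """Serialize one fused cue sample for the network link."""
    cues = ImplicitCues(time.time(), gaze_xy[0], gaze_xy[1], expression)
    return json.dumps(asdict(cues))

# Example: one sample as it might be streamed alongside the video.
print(encode_cues((0.41, 0.63), "smile"))
```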
16. AffectiveWear – Emotion Glasses
• Photo sensors to recognize expression
• User calibration
• Machine learning
• Recognizing 8 face expressions
Masai, K., Sugiura, Y., Suzuki, K., Shimamura, S., Kunze, K., Ogata, M., ... & Sugimoto, M. (2015, September). AffectiveWear: Towards recognizing affect in real life. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers (pp. 357-360). ACM.
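The pipeline the slide outlines (sensor readings → per-user calibration → machine learning → expression label) could look roughly like this sketch. The sensor count, the four stand-in labels, the random calibration data, and the SVM choice are all assumptions for illustration; the actual AffectiveWear classifier may differ.

```python
# Hypothetical sketch of the recognition pipeline: per-user calibration
# samples from the photo-reflective sensors train a simple classifier that
# then labels new sensor readings with an expression.
import numpy as np
from sklearn.svm import SVC

N_SENSORS = 8  # assumed number of photo-reflective sensors on the frame

# Calibration: the user poses each expression while sensor vectors are logged.
rng = np.random.default_rng(0)
labels = ["neutral", "smile", "frown", "surprise"]      # stand-in label set
X_calib = rng.normal(size=(40, N_SENSORS))              # stand-in sensor data
y_calib = np.repeat(labels, 10)                         # 10 samples per pose

clf = SVC(kernel="rbf").fit(X_calib, y_calib)

# Runtime: each new sensor vector is mapped to an expression label.
new_sample = rng.normal(size=(1, N_SENSORS))
print(clf.predict(new_sample)[0])
```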
19. Pilot User Evaluation
• Shared block picture building – 5 minutes
• “Make a house”
• 4 Conditions:
• V: Video
• P: Video + remote pointer
• E: Video + expression
• A: All conditions (V + P + E)
• 5 pairs of subjects
20. Measures
• Pictures made
• Likert scale questions (1 = disagree, 7 = agree) [No sig. diff.]
• “I felt connected with my partner”
• Ranking of conditions (1 = best, 4 = worst) [Sig. diff.; see the analysis sketch below]
• “Which condition did you communicate best with your partner in?”
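For reference, a significant difference in rank data like this is often checked with a nonparametric test such as the Friedman test. The sketch below shows the mechanics with made-up ranks; these are not the study's data.

```python
# Hypothetical analysis sketch: Friedman test over per-participant rankings
# of the four conditions (V, P, E, A). The rank values are invented.
from scipy.stats import friedmanchisquare

# One rank (1 = best, 4 = worst) per participant per condition.
ranks_V = [3, 4, 3, 4, 3, 4, 3, 4, 4, 3]
ranks_P = [2, 2, 2, 1, 2, 2, 2, 2, 1, 2]
ranks_E = [4, 3, 4, 3, 4, 3, 4, 3, 3, 4]
ranks_A = [1, 1, 1, 2, 1, 1, 1, 1, 2, 1]

stat, p = friedmanchisquare(ranks_V, ranks_P, ranks_E, ranks_A)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
```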
21. Ranking Results
"I ranked the (A) condition best, because I could easily
point to communicate, and when I needed it I could check
the facial expression to make sure I was being understood.”
[Charts: Q2 Communication and Q3 Understanding partner rankings; HMD = local user, computer = remote user]
22. Behavior Observation
• Video Only
• Remote users talked more
• Described block colour/shape
• Pointing Conditions (A, P)
• Deictic language – “move that over there”
• Expression Conditions (E,A)
• Looked at the expression cue less than 20% of the time
23. Lessons Learned
• Pointing really helps in remote collaboration
• Makes remote user feel more connected
• Gaze looks promising
• Shows the context of what a person is talking about
• More work needed on emotion/expression cues
• Limitations
• Limited implicit cues
• Two separate displays
• Task was a poor emotional trigger
• AffectiveWear needs improvement
24. Future Work
• Add additional physiological monitoring
• Empatica E4
• Provide physiological feedback
• Integrated eye-tracking/display
26. Conclusions
• Wearable systems ideal for task space conferencing
• Remote shared view, remote pointing
• AR can enhance remote collaboration
• Virtual annotations, content sharing
• Additional sensors can be used for additional cues
• Support Explicit + Implicit communication
• Significant areas for future research
• Capturing/sharing physiological cues
• Representing cognitive/emotional state