A keynote speech given by Mark Billinghurst at the Centre for Design and New Media at IIIT-Delhi on June 16th, 2022. This presentation is about how Empathic Computing can be used to develop for the entire range of the Metaverse.
9. Limitations with Current Technology
•Lack of spatial cues
• Person blends with background
•Poor communication cues
• Limited gaze, gesture, non-verbal communication
•Separation of task/communication space
• Can’t see person and workspace at same time
10. Connecting at a Distance with AR/VR
• Restore spatial cues
• Sharing non-verbal cues
• Creating shared spaces
22. Communication Trends
• 1. Experience Capture
• Move from sharing faces to sharing places
• 2. Natural Collaboration
• Faster networks support more natural collaboration
• 3. Implicit Understanding
• Systems that recognize behaviour and emotion
29. Changing Perspective
• View from remote user’s perspective
• Wearable Teleconferencing
• audio, video, pointing
• send task space video
• CamNet (1992)
• British Telecom
• Similar CMU study (1996)
• cut performance time in half
32. Changing Perspective - Empathy Glasses
• Combine eye tracking, display, and face-expression sensing
• Implicit cues – eye gaze, face expression
Pupil Labs (eye tracking) + Epson BT-200 (display) + AffectiveWear (face expression)
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
33. Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
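The eye-gaze pointer above maps the tracker's normalized gaze position onto the shared video frame. A minimal sketch of that mapping (a hypothetical helper, not the system's actual code):

```python
def gaze_to_pixel(gaze_norm, frame_size):
    """Map normalized gaze coordinates (0..1, origin top-left) from an
    eye tracker to pixel coordinates on the remote collaborator's video frame."""
    x, y = gaze_norm
    w, h = frame_size
    # Clamp so the pointer stays on screen even if the tracker drifts.
    x = min(max(x, 0.0), 1.0)
    y = min(max(y, 0.0), 1.0)
    return int(round(x * (w - 1))), int(round(y * (h - 1)))
```

The clamped result can then be drawn as a pointer overlay on the remote view, giving the collaborator an implicit cue about where the local user is looking.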
35. Adding in Sensor Input
• Using sensors to enhance collaboration
• Sharing heart rate
• Gaze cues
• Face expression
36. Shared Sphere – 360 Video Sharing
[Diagram: host user streams shared live 360 video to a remote guest user]
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
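In a shared 360 panorama like Shared Sphere, showing each user where the other is looking requires mapping a viewing direction onto the equirectangular video. A minimal sketch of that projection (an illustrative helper, assuming yaw in [-180, 180] and pitch in [-90, 90] degrees):

```python
def view_to_equirect(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction to pixel coordinates on an
    equirectangular 360 panorama of size width x height."""
    # Yaw wraps horizontally around the panorama; pitch maps top-to-bottom.
    x = int(((yaw_deg / 360.0 + 0.5) * width) % width)
    y = int((0.5 - pitch_deg / 180.0) * height)
    return x, min(y, height - 1)
```

Drawing a marker at this pixel on each frame is one way to visualize the guest's view direction inside the host's live panorama.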
38. Connecting between Spaces
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues
39. 3D Live Scene Capture
• Use cluster of RGBD sensors
• Fuse together 3D point cloud
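Fusing the output of an RGBD sensor cluster amounts to transforming each sensor's local point cloud into a common world frame using calibrated extrinsics, then merging. A minimal numpy sketch (function name and interface are illustrative):

```python
import numpy as np

def fuse_point_clouds(clouds, extrinsics):
    """Fuse per-sensor point clouds into a single world-frame cloud.

    clouds     -- list of (N_i, 3) arrays in each sensor's local frame
    extrinsics -- list of 4x4 sensor-to-world transforms (from calibration)
    """
    fused = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        fused.append((homo @ T.T)[:, :3])                # apply rigid transform
    return np.vstack(fused)
```

A real capture pipeline would also filter overlapping points and run surface reconstruction, but the core fusion step is this rigid-transform-and-merge.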
45. View Sharing Evolution
• Increased immersion
• Improved scene understanding
• Better collaboration
2D → 360 → 3D
46. Sharing: Virtual Communication Cues (2019)
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
47. Sharing Virtual Communication Cues
• Collaboration between AR and VR
• Gaze Visualization Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
49. Results
• Predictions
• Eye/Head pointing better than no cues
• Eye/head pointing could reduce need for pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze gave a greater mutual gaze rate
• Head-gaze rated greater ease of use than baseline
• All cues provide higher co-presence than baseline
• Pointing gestures reduced in cue conditions
• But
• No difference between head-gaze and eye-gaze
53. Sharing VR Experiences
• HTC Vive HMD
• Empathic glove
• Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of
sharing physiological states of players in a collaborative virtual reality
gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
56. Changing the Other Person’s Heartrate?
• Follow-on study: Artificially changing and sharing heartrate (-20%, 0%, +20%)
• Key findings
• Manipulated heart rate affects perceived valence and arousal levels of another person
• No change in actual heartrate, but trend towards significance (p = 0.08)
• Significant environment effect – active has higher HR than passive
A. Dey, H. Chen, A. Hayati, M. Billinghurst and R. W. Lindeman, "Sharing Manipulated Heart
Rate Feedback in Collaborative Virtual Environments," 2019 IEEE International Symposium
on Mixed and Augmented Reality (ISMAR), Beijing, China, 2019, pp. 248-257.
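The three conditions in the follow-on study scale the heart rate shown to the partner by a fixed offset before sharing. A trivial sketch of that manipulation (names are illustrative, not the study's code):

```python
def shared_heart_rate(true_hr_bpm, offset):
    """Heart rate value displayed to the partner, offset by the
    condition factor: -0.2 (-20%), 0.0 (unmodified), or +0.2 (+20%)."""
    return true_hr_bpm * (1.0 + offset)
```

The finding above is that this displayed value, not the wearer's actual heart rate, drives the partner's perceived valence and arousal.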
57. Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive
avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13).
[Figure: collaborating normally vs. collaborator out of view]
58. Mini-Me Communication Cues in MR
• When the user loses sight of the collaborator, a Mini-Me avatar appears
• Miniature avatar in real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
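Redirecting the remote user's gaze onto the Mini-Me means orienting the miniature avatar's head toward the object the remote user is actually looking at. A minimal sketch of the look-at computation (an illustrative helper, assuming 3D positions in a shared coordinate frame):

```python
import numpy as np

def look_at_direction(avatar_pos, target_pos):
    """Unit vector the Mini-Me avatar's head should face so its
    redirected gaze points at the shared target object."""
    v = np.asarray(target_pos, float) - np.asarray(avatar_pos, float)
    n = np.linalg.norm(v)
    # Fall back to a default forward direction if target coincides with avatar.
    return v / n if n > 0 else np.array([0.0, 0.0, 1.0])
```

The same idea applies to redirected pointing gestures: the avatar's arm is re-aimed along the vector from the avatar to the remote user's actual pointing target.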
60. User Study (16 participants)
• Collaboration between user in AR, expert in VR
• Hololens, HTC Vive
• Two tasks:
• (1) asymmetric, (2) symmetric
• Key findings
• Mini-Me significantly improved performance time (task1)
• Mini-Me significantly improved Social Presence scores
• 63% (task 2) – 75% (task 1) of users preferred Mini-Me
“The ability to see the small avatar … enhanced the speed of solving the task”
61. Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
62. Sensor Enhanced HMDs
• HP Omnicept: eye tracking, heart rate, pupillometry, and face camera
• Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
67. Brain Synchronization in VR
Gumilar, I., Sareen, E., Bell, R., Stone, A., Hayati, A., Mao, J., ... & Billinghurst, M. (2021). A comparative study on inter-
brain synchrony in real and virtual environments using hyperscanning. Computers & Graphics, 94, 62-75.
71. NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV (phase-locking value) to calculate synchronization
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
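The phase-locking value used in NeuralDrum measures how consistently two phase signals hold their relative offset: 1.0 means perfectly locked, values near 0 mean no consistent relationship. A minimal numpy sketch, assuming instantaneous phases have already been extracted from the band-pass-filtered EEG (in practice via a Hilbert transform, e.g. scipy.signal.hilbert):

```python
import numpy as np

def plv(phase_a, phase_b):
    """Phase-locking value between two phase time series (radians):
    |mean(exp(i * (phi_a - phi_b)))|, in [0, 1]."""
    diff = np.asarray(phase_a) - np.asarray(phase_b)
    return float(np.abs(np.mean(np.exp(1j * diff))))
```

In the experience, this score would be recomputed over a sliding window and mapped to the intensity of the shared graphics effects, so players literally see their brains synchronizing.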
72. Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
81. Possible Research Directions
• Lifelogging to VR
• Bringing real world actions into VR, VR to experience lifelogging data
• AR to Lifelogging
• Using AR to view lifelogging data in everyday life, Sharing physiological data
• Mirror Worlds to VR
• VR copy of the real world, Mirroring real world collaboration in VR
• AR to Mirror Worlds
• Visualizing the past in place, Asymmetric collaboration
• And more..