This document discusses empathic computing and its relationship to the metaverse. It defines key elements of the metaverse like virtual worlds, augmented reality, mirror worlds, and lifelogging. Research on the metaverse is still fragmented across these areas. The document outlines a vision for empathic computing systems that allow sharing experiences, emotions, and environments through technologies like virtual reality, augmented reality, and sensor data. Examples are given of research projects exploring collaborative VR experiences and AR/VR systems for remote collaboration and communication. The goal is for technology to support more natural and implicit understanding between people.
6. A Definition
• Real-time, 3D, interactive, social, persistent (mostly)
– John Riccitiello, CEO of Unity Technologies, AWE 2022
7. An AI Definition – Google Bard
The metaverse is a vast and complex concept, and it is still too early to say exactly
what it will look like or how it will be used. However, it is clear that the metaverse has
the potential to be a major technological and social development.
8. A Better Definition
• Neal Stephenson’s “Snow Crash” (1992)
• The Metaverse is the convergence of:
• 1) virtually enhanced physical reality
• 2) physically persistent virtual space
• A persistent virtual environment centred on The Street
• 15 million avatars on The Street at any one time
11. What is the Metaverse Research Landscape Today?
• Survey of Scopus papers
• ~1900 papers found with Metaverse in abstract/keywords
• ~600 papers analyzed to date
• Further analysis
• Look for publications in AR, VR, MirrorWorlds (MW), LifeLogging (LL)
• Look for research across boundaries
• Application analysis
• Most popular application domains
19. Lessons Learned
• Research Strengths
• Most Metaverse research is VR related (36%)
• Strong connections between AR/VR (16%)
• Strong connections between MW/VR (11%)
• Research Opportunities
• Opportunities across boundaries – only 1% of papers in AR/LL, 0% in MW/LL
• Opportunities to combine more than two quadrants – 0% in AR/MW/LL
• Opportunities for research combining all four elements
• Broadening the application space – industry, finance, etc.
20. Possible Research Directions
• Lifelogging to VR
• Bringing real world actions into VR, VR to experience lifelogging data
• AR to Lifelogging
• Using AR to view lifelogging data in everyday life, Sharing physiological data
• Mirror Worlds to VR
• VR copy of the real world, Mirroring real world collaboration in VR
• AR to Mirror Worlds
• Visualizing the past in place, Asymmetric collaboration
• And more…
22. Communication Trends
• 1. Experience Capture
• Move from sharing faces to sharing places
• 2. Natural Collaboration
• Faster networks support more natural collaboration
• 3. Implicit Understanding
• Systems that recognize behaviour and emotion
25. Empathic Computing Vision
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
26. Key Elements of Empathic Systems
•Understanding
• Emotion Recognition, physiological sensors
•Experiencing
• Content/Environment capture, VR, Mirror Worlds
•Sharing
• Communication cues, AR
28. Sharing Emotional VR Experiences
• HTC Vive HMD
• Empathic glove
• Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of
sharing physiological states of players in a collaborative virtual reality
gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
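One way to share a physiological state like the heart rate captured by the Empatica E4 is to map it onto a normalized visual cue on the partner's avatar. The sketch below is illustrative only: the rest/max bounds and the mapping are assumptions, not values from the Dey et al. study.

```python
# Hedged sketch: mapping a partner's heart rate (e.g., streamed from
# an Empatica E4 wristband) onto a 0..1 intensity that could drive an
# avatar's visual "pulse" in a collaborative VR scene.
# The rest_bpm/max_bpm defaults are illustrative assumptions.

def heart_rate_to_intensity(bpm, rest_bpm=60.0, max_bpm=120.0):
    """Clamp heart rate into [rest_bpm, max_bpm] and rescale to [0, 1]."""
    clamped = min(max(bpm, rest_bpm), max_bpm)
    return (clamped - rest_bpm) / (max_bpm - rest_bpm)

print(heart_rate_to_intensity(60))   # 0.0 - resting
print(heart_rate_to_intensity(90))   # 0.5 - moderately aroused
print(heart_rate_to_intensity(150))  # 1.0 - clamped at the maximum
```

In practice the raw signal would be smoothed before display so the shared cue does not flicker with every beat.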
31. Example: AR/VR/MW Collaboration
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues
34. Empathic Tele-Existence
• Covering the entire Metaverse
• AR, VR, Lifelogging, Mirror Worlds
• Transforming collaboration
• Observer to participant
• Feeling of doing things together
• Supporting Implicit collaboration
35. NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV to calculate synchronization
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
36. Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
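The PLV-based synchronization measure used in NeuralDrum can be sketched as follows: the phase-locking value between two signals is the magnitude of the mean complex phase difference. The synthetic signals and sampling setup here are stand-ins; a real system would stream band-passed EEG from the OpenBCI electrodes.

```python
# Sketch of phase-locking value (PLV) computation between two EEG
# channels, as used in NeuralDrum to quantify brain synchronicity.
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two equal-length signals: |mean(exp(i*(phi_x - phi_y)))|."""
    phi_x = np.angle(hilbert(x))   # instantaneous phase of signal x
    phi_y = np.angle(hilbert(y))   # instantaneous phase of signal y
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Two phase-locked 10 Hz sinusoids give a PLV near 1.0
t = np.linspace(0, 2, 512, endpoint=False)
locked = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                             np.sin(2 * np.pi * 10 * t + 0.5))

# Independent white noise gives a PLV much closer to 0
rng = np.random.default_rng(0)
noisy = phase_locking_value(rng.standard_normal(512),
                            rng.standard_normal(512))
```

A PLV near 1 means the two players' band-limited brain signals keep a constant phase relationship; in NeuralDrum, higher values drive stronger graphics effects and immersion.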
40. Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M.
(2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the
2018 CHI conference on human factors in computing systems (pp. 1-13).
(Figure: collaborating normally vs. collaborator out of view)
41. Mini-Me Communication Cues in MR
• When the user loses sight of their collaborator, a Mini-Me avatar appears
• Miniature avatar embedded in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze and gestures
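The trigger condition above can be sketched as a simple field-of-view test: show the miniature avatar only when the full-size collaborator falls outside the viewer's visual cone. The vector representation and FOV value here are illustrative assumptions, not details of the Mini-Me implementation.

```python
# Minimal sketch of the Mini-Me trigger logic: spawn the miniature
# avatar when the remote collaborator's avatar is outside the local
# user's field of view. The 90-degree FOV is an assumed default.
import math

def in_field_of_view(eye, forward, target, fov_deg=90.0):
    """True if target lies within the FOV cone around the unit vector forward."""
    to_target = [t - e for t, e in zip(target, eye)]
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0:
        return True  # target coincides with the eye position
    cos_angle = sum(f * c for f, c in zip(forward, to_target)) / norm
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= fov_deg / 2

def should_show_mini_me(eye, forward, collaborator_pos):
    # Show the miniature avatar only when the full-size one is unseen.
    return not in_field_of_view(eye, forward, collaborator_pos)

# Collaborator directly ahead: full avatar visible, no Mini-Me
print(should_show_mini_me((0, 0, 0), (0, 0, 1), (0, 0, 5)))   # False
# Collaborator behind the user: Mini-Me appears
print(should_show_mini_me((0, 0, 0), (0, 0, 1), (0, 0, -5)))  # True
```

A production system would also test occlusion, not just the viewing angle, since a collaborator hidden behind real-world geometry is equally out of sight.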
43. Conclusions
• The Metaverse is overhyped
• Long-term impact could be more significant than imagined
• We need a broader Metaverse definition and taxonomy
• Many research opportunities in crossing boundaries
• Need to focus on high-level visions
• Empathic Computing encompasses the entire Metaverse