Invited guest lecture by Mark Billinghurst, given at the MIT Media Laboratory on November 21st, 2023, as part of Professor Hiroshi Ishii's class on Tangible Media.
8. The Metaverse: A Definition?
AWE 2022 John Riccitiello
ex-CEO, Unity
Real-time, 3D, Interactive, Social, Persistent
9. • The Metaverse Roadmap (2007)
• The Metaverse is the convergence of:
• 1) virtually enhanced physical reality
• 2) physically persistent virtual space
A Better Definition
11. • Simulations of external space/content
• Capturing and sharing surroundings
• Photorealistic content
• Digital twins
Matterport, Google Street View, Soul Machines
Mirror Worlds
12. • Measuring user’s internal state
• Capturing physiological cues
• Recording everyday life
• Augmenting humans
Apple, Shimmer, OpenBCI
LifeLogging
15. Milgram's Mixed Reality Continuum
Real World → Augmented Reality → Virtual Reality → Virtual World
Mixed Reality: "...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino (1994), A Taxonomy of Mixed Reality Visual Displays
Internet of Things
16. Example: MagicBook (2000)
Reality Augmented Reality Virtual Reality
• Seamlessly transition between Reality and Virtual Reality
• Support ego-centric and exo-centric collaboration
19. • Survey of Scopus papers
• ~1900 papers found with Metaverse in abstract/keywords
• Further analysis
• Publications in AR, VR, MirrorWorlds (MW), LifeLogging (LL)
• Look for research across boundaries
• Application analysis
• Most popular application domains
What is the Metaverse Research Landscape Like?
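The survey method above (tagging each abstract with the quadrant keywords it mentions, then counting single-topic papers and cross-quadrant pairs) can be sketched as follows; the mini-corpus and keyword lists are illustrative stand-ins, not the actual Scopus data:

```python
from collections import Counter
from itertools import combinations

# Hypothetical mini-corpus standing in for Scopus abstracts.
ABSTRACTS = [
    "augmented reality and virtual reality for remote collaboration",
    "virtual reality training simulation",
    "lifelogging with wearable sensors and augmented reality",
    "a mirror worlds digital twin of a city in virtual reality",
]

TOPICS = {
    "AR": "augmented reality",
    "VR": "virtual reality",
    "MW": "mirror worlds",
    "LL": "lifelogging",
}

def tag_topics(text):
    """Return the set of quadrant labels whose keyword appears in the text."""
    return {label for label, kw in TOPICS.items() if kw in text.lower()}

single = Counter()   # papers mentioning exactly one quadrant
pairs = Counter()    # quadrant pairs co-occurring in one paper
for abstract in ABSTRACTS:
    tags = tag_topics(abstract)
    if len(tags) == 1:
        single[next(iter(tags))] += 1
    for pair in combinations(sorted(tags), 2):
        pairs[pair] += 1

print(single)
print(pairs)
```

Dividing the pair counts by the number of multi-topic papers gives the boundary-crossing percentages reported on the following slides.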
20. • Of the papers mentioning AR, VR, MW, LL
• 66% only mentioned one topic
• VR the most popular research topic (72%)
Most Research in One Quadrant
21. • Key Findings
• Most research combines AR and VR (AR/VR – 67%)
• Little research involving LL (7% of two-quadrant papers)
• < 2% research combined AR, VR, MW, LL
Research Crossing Boundaries
AR/VR 67% · MW/VR 23% · VR/LL 5% · AR/MW 3% · AR/LL 2% · MW/LL 0%
23. Key Lessons Learned
• Research Strengths
• Most Metaverse research VR related
• Strong connections between AR/VR
• Strong connections between MW/VR
• Research Opportunities
• Opportunities across boundaries - 2% papers in AR/LL, 0% in MW/LL
• Opportunities for research combining all elements
• Broadening application space – industry, finance, etc.
26. “Empathy is Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another.”
Alfred Adler
27. Understanding
Emotion Recognition - Life Logging
Experiencing
Content/Environment capture - VR, Mirror Worlds
Sharing
Enhanced communication cues - AR
Key Elements of Empathic Systems
29. 2. Experiencing: Virtual Reality
"Virtual reality offers a whole different
medium to tell stories that really connect
people and create an empathic connection."
Nonny de la Peña
http://www.emblematicgroup.com/
30. 3: Sharing: Empathic Computing
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
31. Example: Sharing Communication Cues
• Measuring non-verbal cues
• Gaze, face expression, heart rate
• Sharing in Augmented Reality
• Collaborative AR experiences
32. Empathy Glasses
• Combines eye-tracking, display, and facial expression recognition
• Implicit cues – eye gaze, facial expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
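As a sketch of how such implicit cues might be packaged for transmission to a remote collaborator, a minimal message format could look like the following; the field names and values are hypothetical, not from the Empathy Glasses implementation:

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical cue message; fields are illustrative assumptions.
@dataclass
class CueMessage:
    sender: str
    gaze_xy: tuple        # normalised gaze point on the shared video frame
    expression: str       # e.g. "neutral", "smile" from a classifier
    heart_rate_bpm: float
    timestamp: float

def encode(msg):
    """Serialise a cue message for sending over the network."""
    return json.dumps(asdict(msg))

msg = CueMessage("worker", (0.42, 0.65), "smile", 72.0, time.time())
packet = encode(msg)
decoded = json.loads(packet)
print(decoded["expression"])  # smile
```

Keeping implicit cues in one small timestamped record makes it easy to overlay them on the remote user's AR view in sync with the video stream.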
33. Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
35. Shared Sphere – 360 Video Sharing
• Host user shares live 360 video with guest user
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
37. 3D Live Scene Capture
• Use a cluster of RGBD sensors
• Fuse output into a single 3D point cloud
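A minimal sketch of the fusion step, assuming each sensor's pose is known from prior calibration (the poses and points below are toy values, not from any real capture rig):

```python
import numpy as np

def to_world(points, pose):
    """Transform an (N, 3) point cloud by a 4x4 sensor-to-world pose."""
    homog = np.hstack([points, np.ones((len(points), 1))])  # (N, 4)
    return (homog @ pose.T)[:, :3]

def fuse(clouds_and_poses):
    """Concatenate all sensor clouds after mapping them into the world frame."""
    return np.vstack([to_world(pts, pose) for pts, pose in clouds_and_poses])

# Two toy sensors: one at the origin, one translated 1 m along x.
pose_a = np.eye(4)
pose_b = np.eye(4)
pose_b[0, 3] = 1.0
cloud_a = np.array([[0.0, 0.0, 2.0]])
cloud_b = np.array([[0.0, 0.0, 2.0]])

fused = fuse([(cloud_a, pose_a), (cloud_b, pose_b)])
print(fused)  # world coordinates (0, 0, 2) and (1, 0, 2)
```

A real pipeline would also filter overlapping points and run surface reconstruction, but the core of multi-sensor capture is this calibrated transform-and-merge.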
44. Example: AR/VR/MW Collaboration
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues
45. • Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
Sharing Virtual Communication Cues (2019)
48. Results
• Predictions
• Eye/head pointing better than no cues
• Eye/head pointing could reduce the need for hand pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze gave a greater mutual gaze rate
• Head-gaze rated greater ease of use than baseline
• All cues provided higher co-presence than baseline
• Pointing gestures were reduced in cue conditions
• But
• No difference between head-gaze and eye-gaze
49. Sharing Gaze Cues
How sharing gaze behavioural cues can improve remote collaboration in a Mixed Reality environment.
➔ Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system
➔ Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared between a local host (AR) and a remote collaborator (VR).
Jing, A., May, K., Matthews, B., Lee, G., & Billinghurst, M. (2022). The Impact of Sharing Gaze Behaviours in Collaborative Mixed Reality.
Proceedings of the ACM on Human-Computer Interaction, 6(CSCW2), 1-27.
50. System Design
➔ 360 Panoramic Camera + Mixed Reality View
➔ Combination of HoloLens 2 + Vive Pro Eye
➔ 4 gaze behavioural visualisations:
browse, focus, mutual, fixated circle
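As an illustration of the kind of logic that could drive such visualisations, the sketch below labels gaze by dwell time (browse/focus/fixated) and flags mutual gaze when both collaborators dwell on the same target; the thresholds are assumptions, not values from the eyemR-Vis paper:

```python
# Dwell-time thresholds in seconds; illustrative assumptions only.
BROWSE_MAX_S = 0.3  # short glances
FOCUS_MAX_S = 1.0   # sustained attention; anything longer counts as fixated

def classify_dwell(dwell_s):
    """Map a dwell duration on a target to a gaze behaviour label."""
    if dwell_s < BROWSE_MAX_S:
        return "browse"
    if dwell_s < FOCUS_MAX_S:
        return "focus"
    return "fixated"

def is_mutual(host_target, guest_target):
    """Mutual gaze: both users currently dwell on the same object."""
    return host_target is not None and host_target == guest_target

print(classify_dwell(0.1))          # browse
print(classify_dwell(1.5))          # fixated
print(is_mutual("valve", "valve"))  # True
```

Each label would then select a different spatial visualisation (e.g. a fixated circle) rendered in both the AR and VR views.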
52. • Bi-directional gaze with behaviours (BWB) significantly improved performance
• Behaviour visualisations stimulate frequent joint attention
• Sharing gaze cues made communication easier and more effective
Results
Joint gaze frequency
53. Sharing Gesture Cues
• What type of gesture cues should be shared in AR/VR collaboration?
Kim, S., Lee, G., Huang, W., Kim, H., Woo, W., & Billinghurst, M. (2019). Evaluating the combination of visual communication cues for HMD-based mixed reality remote collaboration. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-13).
Augmented Reality
Virtual Reality
54. Communication Cues
• Four different cues used
• (1) Hands Only (HO), (2) Hands + Pointer (HP)
• (3) Hands + Sketch (HS), (4) Hands + Pointer + Sketch (HPS)
• Three experimental tasks
• Lego assembly, Tangram puzzle, Origami folding
55. Key Results
• Task completion time
• Sketch cues enabled users to complete tasks significantly faster (task dep.)
• Adding pointing didn’t improve task completion time
• Co-Presence
• Adding pointing and sketch cues didn’t improve feeling of co-presence
• User Preference
• Users overwhelmingly preferred the Hands + Pointer + Sketch condition; Hands Only ranked last
Task completion time
“sketching is pretty useful for describing actions that was difficult with words and could express more details.”
58. On the Shoulder of the Giant
Piumsomboon, T., Lee, G. A., Irlitti, A., Ens, B., Thomas, B. H., & Billinghurst, M. (2019, May). On the shoulder of the giant: A
multi-scale mixed reality collaboration with 360 video sharing and tangible interaction. In Proceedings of the 2019 CHI (pp. 1-17).
59. Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April).
Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI (pp. 1-13).
Collaborating / Collaborator out of View
60. Mini-Me Communication Cues in MR
• When the user loses sight of their collaborator, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
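A toy sketch of the trigger logic, assuming a simple horizontal field-of-view test; all numbers are illustrative, not from the Mini-Me system:

```python
import math

HALF_FOV_DEG = 45.0  # assumed half field of view

def in_view(user_pos, user_heading_deg, target_pos):
    """True if the target is within the user's horizontal FOV."""
    dx, dz = target_pos[0] - user_pos[0], target_pos[1] - user_pos[1]
    bearing = math.degrees(math.atan2(dx, dz))
    delta = (bearing - user_heading_deg + 180) % 360 - 180  # wrap to [-180, 180]
    return abs(delta) <= HALF_FOV_DEG

def mini_me_position(user_pos, user_heading_deg, dist=0.5):
    """Spawn point for the miniature avatar: dist metres ahead of the user."""
    rad = math.radians(user_heading_deg)
    return (user_pos[0] + dist * math.sin(rad), user_pos[1] + dist * math.cos(rad))

user, heading = (0.0, 0.0), 0.0            # looking along +z
print(in_view(user, heading, (0.0, 2.0)))  # True: collaborator ahead
print(in_view(user, heading, (0.0, -2.0))) # False: behind, so show Mini-Me
print(mini_me_position(user, heading))     # (0.0, 0.5)
```

When the test fails, the system would spawn the miniature avatar at the computed position and redirect its gaze and gestures toward the shared objects.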
62. Results from User Evaluation
• Collaboration between user in AR, expert in VR
• HoloLens, HTC Vive
• Two tasks
• Asymmetric, symmetric collaboration
• Significant performance improvement
• 20% faster with Mini-Me
• Social Presence
• Higher sense of Presence
• Users preferred
• People felt the task was easier to complete
• 60-75% preference
“I feel like I am
talking to my
partner”
63. • Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Technology Trends
64. • Advanced displays
• Real time space capture
• Natural gesture interaction
• Robust eye-tracking
• Emotion sensing/sharing
Empathic
Tele-Existence
67. NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV (phase locking value) to calculate synchronization
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
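The PLV computation itself is compact; below is a minimal sketch on synthetic phase data (a real pipeline would band-pass filter the EEG and extract instantaneous phase with a Hilbert transform first):

```python
import numpy as np

np.random.seed(0)  # deterministic synthetic data

def plv(phase_a, phase_b):
    """PLV = |mean(exp(i * phase difference))|, in [0, 1]."""
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

t = np.linspace(0, 1, 500)
phase_1 = 2 * np.pi * 10 * t                 # 10 Hz oscillation
phase_2 = phase_1 + 0.2                      # constant lag: phase-locked
phase_3 = phase_1 + np.random.uniform(0, 2 * np.pi, t.size)  # random offsets

print(round(plv(phase_1, phase_2), 3))  # 1.0 : perfectly locked
print(plv(phase_1, phase_3) < 0.5)      # True : random phase, low PLV
```

In NeuralDrum, a rising PLV between the two players' EEG signals drives stronger graphics effects, making synchrony directly perceivable.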
68. • HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Set Up
70. NeuralDrum – Brain Synchronisation in XR
Poor Player Good Player
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
71. • HTC Vive HMD
• Empathic glove
• Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of
sharing physiological states of players in a collaborative virtual reality
gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
Being in the Body of Another
74. Results:
• Game experience had significant impact
• In zombie game, sharing heart rate
• Enabled players to understand emotional state of partner
• Helped players to be significantly more attentive
• Subjects had a preference for sharing heart rate
• But no significant difference on feeling connected
Understanding emotional state
75. Empathic Tele-Existence
• Based on Empathic Computing
• Creating shared understanding
• Covering the entire Metaverse
• AR, VR, Lifelogging, Mirror Worlds
• Transforming collaboration
• Observer to participant
• Feeling of doing things together
• Supporting Implicit collaboration
76. Conclusions
• We need a broader Metaverse definition and taxonomy
• Including AR, VR, Mirror Worlds, Life Logging
• Current research is narrowly focused
• But many research opportunities exist in crossing boundaries
• Empathic Computing encompasses the entire Metaverse
• Transforming face to face and remote collaboration