METAVERSE
Mark Billinghurst
mark.billinghurst@auckland.ac.nz
November 22nd 2022
Opportunities for Research
Schmetaverse!!
The Metaverse is Hot!
Google Searches on the Metaverse
• 5-year search volume to October 2022
• Metaverse (blue) vs. Virtual Reality (red)
Publications about the Metaverse
• Going vertical!
A Definition
• Real-time, 3D, Interactive, Social, Persistent
AWE 2022: John Riccitiello, CEO of Unity Technologies
Meta’s Metaverse
Ready Player One
Back to 1994: Young Me in Seattle
Virtual Reality was HOT! … in 1995 …
My First VR Experience
1500 Polygons!
$250,000
My First VR Experience …
https://www.youtube.com/watch?v=pAC5SeNH8jw
In Five to Ten Years …
• VR becoming commonplace
• VR devices less than $1000
• Large scale networked worlds
• Billion-dollar VR industry
• April 2007, Computerworld
• VR voted 7th on list of the 21 biggest technology flops
What Happened After That
• Growth of internet
• Natural user interfaces
• Increase in graphics performance
• Explosion in mobile phones
• Pokemon Go!
Increase in VR Research
• Papers grew over fivefold
Social VR Today
Rec Room
- 30 million users
Fortnite
- 24 million DAU
- 28 million at the Travis Scott concert
A Better Definition
• Neal Stephenson’s “Snow Crash” (1992)
• The Metaverse is the convergence of:
• 1) virtually enhanced physical reality
• 2) physically persistent virtual space
A persistent virtual environment around The Street
15 million avatars on The Street at any one time
Metaverse Taxonomy
• Four Key Components
• Virtual Worlds
• Augmented Reality
• Mirror Worlds
• Lifelogging
• Metaverse Roadmap
• http://metaverseroadmap.org/
Mirror Worlds
• Simulations of external space/content
• Capturing and sharing surroundings
• Photorealistic content
• Digital twins
Matterport, Deep Mirror, Google Street View
Soul Machines
Lifelogging
• Measuring user’s internal state
• Capturing physiological cues
• Recording everyday life
• Augmenting humans
Apple, Fitbit, Shimmer
OpenBCI
Metaverse Activities
• Sensing, Immersing, Augmenting, Capturing
Expanded Research Opportunities
Possible Research Directions
• Lifelogging to VR
• Bringing real-world actions into VR; using VR to experience lifelogging data
• AR to Lifelogging
• Using AR to view lifelogging data in everyday life; sharing physiological data
• Mirror Worlds to VR
• Creating a VR copy of the real world; mirroring real-world collaboration in VR
• AR to Mirror Worlds
• Visualizing the past in place; asymmetric collaboration
• And more…
Vision-Based Research (Hiroshi Ishii)
Communication Trends
• 1. Experience Capture
• Move from sharing faces to sharing places
• 2. Natural Collaboration
• Faster networks support more natural collaboration
• 3. Implicit Understanding
• Systems that recognize behaviour and emotion
Empathic Computing lies at the intersection of Experience Capture, Natural Collaboration, and Implicit Understanding.
Empathic Computing Vision
Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?
Example: Sharing Communication Cues
• Measuring non-verbal cues
• Gaze, facial expression, heart rate
• Sharing in Augmented Reality
• Collaborative AR experiences
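As a rough illustration (not the actual system described in the following slides), here is a minimal sketch of how such cues could be packaged and streamed to a remote collaborator's AR view; the field names, host address, and JSON-over-UDP format are all hypothetical assumptions:

    # Hypothetical sketch: packaging non-verbal cues (gaze, facial expression,
    # heart rate) and streaming them to a remote AR collaborator as JSON over UDP.
    import json
    import socket
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class CueSample:
        timestamp: float        # seconds since epoch
        gaze_x: float           # normalized gaze point on the shared view (0..1)
        gaze_y: float
        expression: str         # e.g. "neutral", "smile", "frown"
        heart_rate_bpm: float   # from a wearable heart-rate sensor

    def send_cues(sample: CueSample, host: str = "192.168.1.50", port: int = 9000) -> None:
        """Serialize one cue sample and send it to the remote collaborator's viewer."""
        packet = json.dumps(asdict(sample)).encode("utf-8")
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(packet, (host, port))

    # Example: one stubbed sensor reading sent to the (hypothetical) remote viewer.
    if __name__ == "__main__":
        sample = CueSample(time.time(), 0.42, 0.61, "smile", 72.0)
        send_cues(sample)

In a real system the gaze, expression, and heart-rate values would come from an eye tracker, expression sensor, and wearable, and the receiver would render them as overlays in the collaborator's AR view.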
Empathy Glasses
• Combines eye tracking, display, and facial expression sensing
• Implicit cues – eye gaze, facial expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Facial expression display
• Implicit cues for remote collaboration
https://youtu.be/xwU7M1xaFJ8
Example: AR/VR Collaboration
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues
Sharing Virtual Communication Cues
• AR/VR displays
• Gesture input (Leap Motion)
• Room scale tracking
• Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
https://youtu.be/j8GdnSS3nAY
Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real-time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Sensor-Enhanced HMDs
• HP Omnicept: eye tracking, heart rate, pupillometry, and face camera
• Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
Empathic Tele-Existence
• Based on Empathic Computing
• Creating shared understanding
• Covering the entire Metaverse
• AR, VR, Lifelogging, Mirror Worlds
• Transforming collaboration
• Observer to participant
• Feeling of doing things together
• Supporting implicit collaboration
NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use phase locking value (PLV) to calculate synchronization (see the sketch after this slide)
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
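For reference, here is a minimal sketch of the standard phase locking value (PLV) calculation between two band-pass-filtered EEG channels, using numpy and scipy's Hilbert transform; it illustrates the formula in general, and is not the project's actual code:

    # Sketch: phase locking value (PLV) between two EEG channels.
    # PLV = | mean over time of exp(i * (phase1 - phase2)) |,
    # ranging from 0 (no phase synchrony) to 1 (perfect synchrony).
    import numpy as np
    from scipy.signal import hilbert

    def plv(signal_a: np.ndarray, signal_b: np.ndarray) -> float:
        """Compute PLV between two band-pass-filtered EEG signals of equal length."""
        phase_a = np.angle(hilbert(signal_a))   # instantaneous phase of channel A
        phase_b = np.angle(hilbert(signal_b))   # instantaneous phase of channel B
        return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))

    # Example with synthetic data: two noisy signals sharing a 10 Hz component.
    if __name__ == "__main__":
        t = np.linspace(0, 2, 512)
        a = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
        b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.3 * np.random.randn(t.size)
        print(f"PLV = {plv(a, b):.2f}")   # high value -> strong phase synchrony

In NeuralDrum a higher synchronization between the two players drives stronger visual effects and immersion, so a value like this can be mapped directly to a graphics parameter.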
Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
https://www.youtube.com/watch?v=aG261GfiR90
Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
Comparison: poor player vs. good player
Conclusions
• The Metaverse is overhyped
• Long-term impact could be more significant than imagined
• We need a broader Metaverse definition and taxonomy
• Many research opportunities in crossing boundaries
• Need to focus on high-level visions
• Empathic Computing encompasses the Metaverse
Delivering the Entire Metaverse
www.empathiccomputing.org
@marknb00
mark.billinghurst@auckland.ac.nz

ISS2022 Keynote