Slides from my session at the Global XR Conference, November 2022.
Abstract:
Augmented and Virtual Reality technologies allow advanced interactions in three-dimensional space by projecting digital content into users' field of view using dedicated opaque and see-through head-mounted displays.
At the Global XR Conference 2022, I will show what I have learned building a software prototype that combines AR and VR to transmit three-dimensional video in real time.
Point clouds were captured using a mobile depth camera and transmitted to a Virtual Reality device (Meta Quest 2) using a WebSocket server hosted on the Microsoft Azure platform.
Come along: there will be Unity demos, and we will have some fun building a distributed system that integrates holograms into a Virtual Reality experience.
3. Agenda
Introduction to Mixed Reality and the Metaverse
Telepresence Using Point Clouds
Capturing and Transmitting Three-Dimensional Points
Visualising Point Clouds Using a Virtual Reality Device
Lessons Learned
5. PHYSICAL REALITY vs. VIRTUAL REALITY
PHYSICAL REALITY: the state of things as they “actually exist”, experienced through our human senses without any technology.
VIRTUAL REALITY: artificially created sensory experiences of people, environments and objects, which can include sight, touch, hearing and smell.
7. Physical - Digital Reality Spectrum
A spectrum ranging from PHYSICAL REALITY to DIGITAL REALITY, with MIXED REALITY (MR) in between.
AR today: HoloLens / Magic Leap / Meta Quest Pro / Nreal Light / RealWear HMT-1 / LYNX-R / Varjo
VR today: Meta Quest 2 / PICO 4 / HTC VIVE Cosmos / Valve Index / Varjo
*The term “mixed reality” was originally introduced in a 1994 paper by Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays.”
9. Designing Great VR Experiences
• Place Illusion
• Plausibility Illusion
• Embodiment Illusion
M. Slater, 2009, “Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments”
https://www.coursera.org/lecture/introduction-virtual-reality/introduction-to-plausibility-illusion-psi-K5PGj
12. Meta Presence Platform
• Interactions, Hands and Voice SDKs
• Social Presence and Movement SDK for Eye, Face and Body Tracking
• Mixed Reality (Passthrough, Spatial Anchors, Scene Understanding)
Source: Meta Connect - https://www.metaconnect.com/
15. The Metaverse
• The term “Metaverse” was introduced in 1992 by Neal Stephenson in the novel “Snow Crash”.
• Integration of multiple 3D environments is complex.
• Challenges include realism, ubiquity, interoperability and scalability (Dionisio et al., 2013).
17. Collaboration
• Meta Horizon Worlds
https://www.oculus.com/horizon-worlds
• Meta Horizon Workrooms
Collaboration experience for meetings
https://about.fb.com/news/2021/08/introducing-horizon-workrooms-remote-collaboration-reimagined/
• Microsoft Mesh (*)
Virtual collaboration / Holoportation
https://www.microsoft.com/en-us/mesh
(*) Source: https://techcommunity.microsoft.com/t5/mixed-reality-blog/microsoft-mesh-app-august-2021-update-new-features/ba-p/2746856
18. Telepresence
• Streaming point clouds to AR/VR devices.
• Requires a sensor like Kinect for capturing color and depth data.
• Captured data (e.g., arrays of Vector3[] / Color[]) can be transmitted over the network.
• Requires a PC to act as a server.
• A sample implementation is available here:
Kowalski, M.; Naruniec, J.; Daniluk, M., “LiveScan3D: A Fast and Inexpensive 3D Data Acquisition System for Multiple Kinect v2 Sensors,” in 3D Vision (3DV), 2015 International Conference on, Lyon, France, 2015.
https://github.com/MarekKowalski/LiveScan3D
• Other literature: S. Orts-Escolano et al., “Holoportation: Virtual 3D Teleportation in Real-Time,” 2016.
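As a minimal illustration of the bullet above about transmitting Vector3[] / Color[] data, the sketch below (plain Python, not the talk's Unity/C# code; all names are hypothetical) packs one point-cloud frame into a flat byte array and unpacks it again, using 12 bytes of float32 position plus 3 bytes of RGB color per point:

```python
import struct

def pack_frame(points, colors):
    """Pack one point-cloud frame into a flat byte array.

    points: list of (x, y, z) floats; colors: list of (r, g, b) ints 0-255.
    Layout: uint32 point count, then per point x,y,z as little-endian
    float32 and r,g,b as uint8 (15 bytes per point, no padding).
    """
    assert len(points) == len(colors)
    buf = bytearray(struct.pack("<I", len(points)))
    for (x, y, z), (r, g, b) in zip(points, colors):
        buf += struct.pack("<fffBBB", x, y, z, r, g, b)
    return bytes(buf)

def unpack_frame(data):
    """Inverse of pack_frame: recover points and colors from bytes."""
    (count,) = struct.unpack_from("<I", data, 0)
    stride = struct.calcsize("<fffBBB")  # 15 bytes
    points, colors, offset = [], [], 4
    for _ in range(count):
        x, y, z, r, g, b = struct.unpack_from("<fffBBB", data, offset)
        points.append((x, y, z))
        colors.append((r, g, b))
        offset += stride
    return points, colors
```

The same fixed layout can be mirrored on the Unity side when reading the received byte array back into Vector3[] and Color[] buffers.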
22. What are Passthrough APIs?
https://developer.oculus.com/experimental/passthrough-api/
• Enable development of passthrough experiences on Quest.
• SDKs available for Voice and Hand manipulations.
• Passthrough APIs must be enabled as an experimental feature using adb:
adb shell setprop debug.oculus.experimentalEnabled 1
• Must be enabled in the OVR Manager script of the OVRCameraRig.
• Require an additional passthrough script.
51. • Point clouds are captured using the depth camera sensor of a mobile device;
• An array containing vertices and color information is transmitted in real time to a WebSocket server hosted on Azure;
• VR clients connect to the WebSocket server and receive the point cloud stream in real time.
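The relay logic of the architecture above can be sketched as follows. This is a simplified in-memory stand-in (plain Python with asyncio queues instead of real WebSocket connections; class and method names are hypothetical, not from the prototype): one capture source broadcasts frames, and each VR viewer keeps only the freshest frames so a slow client never falls behind the live stream.

```python
import asyncio

class PointCloudRelay:
    """One capture source broadcasts frames to many VR viewers (sketch)."""

    def __init__(self):
        self._viewers = set()

    def register_viewer(self):
        # Small bounded queue per viewer: if a viewer lags, old frames
        # are dropped so it always renders the freshest data.
        queue = asyncio.Queue(maxsize=2)
        self._viewers.add(queue)
        return queue

    async def broadcast(self, frame: bytes):
        for queue in self._viewers:
            if queue.full():
                queue.get_nowait()  # discard the oldest buffered frame
            queue.put_nowait(frame)

async def demo():
    relay = PointCloudRelay()
    viewer = relay.register_viewer()
    # Capture side pushes frames faster than the viewer reads them.
    for i in range(4):
        await relay.broadcast(b"frame-%d" % i)
    # Frames 0 and 1 were dropped; the viewer sees frame 2 next.
    return await viewer.get()
```

In the real prototype the queues would be replaced by per-connection WebSocket sends, but the drop-stale-frames policy is the part that keeps the stream feeling real-time.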
58. Key findings and takeaways:
• Avoid serialization/deserialization for real-time point cloud transmission (using byte arrays can improve performance).
• Reconstructing a high-fidelity hologram requires multiple cameras.
• Interestingly, data packets (avg. 900 kB) could be transmitted in near real-time.
• Cloud services (e.g., Azure Remote Rendering) provide point cloud file support to render high-quality 3D content on devices like HoloLens 2 or Meta Quest.
• Design the experience properly to maximise user immersion.
• Real-time audio can be implemented using protocols like WebRTC.
• Collaboration performed using AR/VR devices introduces a greater sense of presence.
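The first takeaway (byte arrays over text serialization) is easy to demonstrate with synthetic data. The sketch below (illustrative only; the point values are made up, not measurements from the prototype) compares the size of a 1,000-point frame serialized as JSON against the same frame packed as raw float32/uint8 bytes:

```python
import json
import struct

# 1,000 synthetic points: x, y, z as floats; r, g, b as 0-255 ints.
points = [
    (i * 0.001, i * 0.002, i * 0.003, i % 256, (i * 7) % 256, (i * 13) % 256)
    for i in range(1000)
]

# Text serialization: JSON string encoded as UTF-8.
json_bytes = json.dumps(points).encode("utf-8")

# Binary: 12 bytes of float32 position + 3 bytes of color per point.
packed = b"".join(struct.pack("<fffBBB", *p) for p in points)

# The packed form is a fixed 15 bytes per point; JSON is several
# times larger and also costs CPU time to parse on the receiving side.
print(len(json_bytes), len(packed))
```

At 15 bytes per point, a frame in the hundreds-of-kilobytes range (as reported above) corresponds to tens of thousands of points, which is why avoiding text serialization matters at this data rate.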
59. Summary
Introduction to Mixed Reality and the Metaverse
Intro to Telepresence Using Point Clouds
Capturing and Transmitting Three-Dimensional Points
Visualising Point Clouds Using a Virtual Reality Device
Lessons Learned
60. “Any sufficiently advanced technology is indistinguishable from magic”
Arthur C. Clarke, Profiles of the Future, 1962
61. References
Remote Telepresence using AR and VR
https://github.com/davidezordan/remote-telepresence-vr
D. Zordan, "Combining Augmented and Virtual Reality for Remote Collaboration in the Workplace,”
Master’s Dissertation, Wrexham Glyndŵr University, Wrexham, UK, 2022.
Unity XR Interaction Toolkit
https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@0.9/manual/index.html
Unity XR Interaction Toolkit examples
https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
University of London, Coursera, “Introduction to Virtual Reality”
https://www.coursera.org/lecture/introduction-virtual-reality/introduction-to-plausibility-illusion-psi-K5PGj
Oculus VR performance optimization guide
https://developer.oculus.com/documentation/pcsdk/latest/concepts/dg-performance-opt-guide
62. References
Unity: the Most Common Mistakes to Avoid
https://unity3d.com/how-to/unity-common-mistakes-to-avoid
Azure Remote Rendering
https://learn.microsoft.com/en-us/azure/remote-rendering
Oculus passthrough APIs
https://developer.oculus.com/experimental/passthrough-api/
Kowalski, M.; Naruniec, J.; Daniluk, M.: "LiveScan3D: A Fast and Inexpensive 3D Data Acquisition System for
Multiple Kinect v2 Sensors". in 3D Vision (3DV), 2015 International Conference on, Lyon, France, 2015
https://github.com/MarekKowalski/LiveScan3D
Displaying a Point Cloud Using Scene Depth
https://developer.apple.com/documentation/arkit/environmental_analysis/displaying_a_point_cloud_using_scene_depth
63. References
iPad LiDAR Depth Sample for Unity
https://github.com/TakashiYoshinaga/iPad-LiDAR-Depth-Sample-for-Unity
Prof. Mark Billinghurst - COMP 4010: Virtual Reality lectures – Introduction to XR
https://www.slideshare.net/marknb00/comp-4010-2010-lecture1introduction-to-xr
Prof. Mark Billinghurst - COMP 4010: Virtual Reality lectures – Research directions
https://www.slideshare.net/marknb00/comp4010-lecture12-research-directions
J. D. N. Dionisio, W. G. Burns III, and R. Gilbert, “3D Virtual Worlds and the Metaverse: Current Status and Future
Possibilities,” ACM Comput. Surv., vol. 45, no. 3, Jul. 2013, doi: 10.1145/2480741.2480751.
S. Orts-Escolano et al., “Holoportation: Virtual 3D Teleportation in Real-Time,” in Proceedings of the 29th Annual
Symposium on User Interface Software and Technology, 2016, pp. 741–754, doi: 10.1145/2984511.2984517.
M. Slater, “Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments,” Philos.
Trans. R. Soc. Lond. B. Biol. Sci., vol. 364, no. 1535, pp. 3549–3557, Dec. 2009, doi: 10.1098/rstb.2009.0138.