VBI: Lost Connection is a VR multiplayer escape room where two players, acting as VBI special agents, must cooperate inside Avery’s Virtual World to unfold the story and solve Adam Vogel’s mystery. The experience had to be designed for public spaces, including narrative elements and two-player enigmas, which led to specific challenges and solutions given a development time of two months. In this session we'll see how the experience was built with Unreal Engine 4, the best practices followed, the technical integration of Watson Visual Recognition, and the lessons learned.
Fabio Mosca - Developing a VR multiplayer escape room: behind the scenes of VBI - Lost Connection - Codemotion Milan 2018
1. Developing a VR
multiplayer escape
room: behind the scenes
of VBI - Lost Connection
Fabio Mosca
CTO @ AnotheReality
Milan | November 29-30, 2018
2. Fabio Mosca
Born in Bergamo, 1989.
• Computer Science engineer
• Videogame developer
• VR experimenter since 2014
• Founder and CTO of AnotheReality
• Founder of Virtual Reality Milan
Meetup (AperitiVR)
• XR Speaker, mentor and teacher
• Mastering expert songs in Beat Saber
3. AnotheReality
AnotheReality is an XR Dev Studio specialized in Location Based
Entertainment and Simulation & Training.
Developers of the training platform used by companies such as
Nokia, Hitachi, and General Electric, and of videogames / escape
rooms used by ASUS, IBM, and others.
Expertise in XR development since 2014, coming from the
videogame industry.
4. What we're going to see
- Planning the production based on requests and time
- Tools and software used
- Design approach, a slave to feasibility
- Art style definition
- Code implementation (Multiplayer, Interactions,
Watson Visual Recognition)
- Audio Production
- Dirty tricks
- Results (spoiler: we made it!)
7. VR coop escape room
Two agents of the Virtual Bureau of
Investigation will have to solve a mystery
inside the virtual world of Avery, an AI of
the year 2043
VBI: Lost Connection is a cooperative
escape room in VR, created for events
and showcasing how IBM Watson can be
used in a VR videogame
9. Request
Create a VR multiplayer escape room to be used in
events, showcasing what can be done with IBM
Watson
First event in 2 months
It should be integrated and coordinated with the narrative
universe of a mobile application, featuring a "very human" AI
to talk with, helping her recover her memories.
10. Underlying requirements
- Fixed play time for easy queuing
- 4m x 3m play area (each player)
- Playable by inexperienced users
- Related to AI, in a future setting
- Multiplayer can be local
- 2 MONTHS for creativity, design, development &
test is nothing.
18. Parallel production tracks
- Narrative (what will the AI say? what's the
ending?)
- Art (what does the AI look like?)
- Design (what is the AI going to do?)
- Code (I need to spawn the AI and play the
animations)
- Audio (I need those effects for when the AI
spawns)
- Production (making sure that all the tracks
work coherently together)
19. Level design in VR
34. Holographic memories
- Representing some of Vogel's actions
- They should tell an "indirect story"
- We want as little text as possible in VR
- They should give hints to players about
what to draw
35. Motion capture!
Cheap motion capture
using HTC Vive and
Ikinema Orion
No need to clean up
animations, as they will be
used for holograms (noise
and imperfections are not
a bug, but a feature!)
38. Watson visual recognition
You can draw in 3D in a special area. If the
drawing is recognized as one of the key
objects, it will spawn the real 3D model.
39. Watson visual recognition in UE4
- Realtime recognition of abstract shapes
- Variation between players' drawing styles
can be extreme
- Should not "teach" players how to draw
properly
- Find the best way to take a snapshot of
the 3D drawing
Two drawings of the same object: a photo camera
40. - No SDK available for UE4. Reference was
https://github.com/Perefin/watson-unreal-sdk-plugin (supports STT/TTS/Assistant)
- Implemented the Visual Recognition Classify
call
- Simple POST request with the image data
attached (multipart form, as submitted by an
HTML form; less than 80 KB per call)
- Automatic JSON-to-USTRUCT conversion
for response parsing
- Blueprint exposed
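The "automatic JSON-to-USTRUCT conversion" step amounts to flattening the nested Classify response into a typed structure. Here is a minimal Python sketch of that parsing (the project does the same in UE4 C++ with USTRUCTs); the field names follow Watson Visual Recognition v3's classify response format, and the sample payload is invented.

```python
import json
from dataclasses import dataclass

@dataclass
class ClassResult:
    """Mirrors the small struct the JSON response is parsed into."""
    name: str
    score: float

def parse_classify_response(body: str) -> list:
    """Flatten a Visual Recognition v3 classify response into class/score pairs.

    The v3 response nests results as images -> classifiers -> classes.
    """
    data = json.loads(body)
    results = []
    for image in data.get("images", []):
        for classifier in image.get("classifiers", []):
            for c in classifier.get("classes", []):
                results.append(ClassResult(c["class"], c["score"]))
    return results

# Invented sample payload in the v3 classify response shape:
sample = '''{"images": [{"classifiers": [{"classifier_id": "default",
  "classes": [{"class": "camera", "score": 0.83},
              {"class": "electronic device", "score": 0.71}]}]}]}'''
```

Once flattened like this, exposing the result list to Blueprints is straightforward, which is what makes the gameplay-side "spawn the matching object" logic so simple.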
43. - Drawings are composed of multiple ribbon
particle systems with particles spawned
based on movement delta (attached to the
brush/pen)
- When a player leaves the drawing volume,
their exit position is saved and used as the
snapshot camera’s direction
- One Scene Capture component renders
only drawings. Orthographic view for more
consistent results
- A single render target is used by the scene
capture to save the resulting drawing to
later classify
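Two of the steps above are plain vector math and can be sketched outside the engine. This is an assumed reconstruction in Python, not the project's code: movement-delta particle spawning, and aiming the snapshot camera from the player's exit point toward the drawing volume's center (in UE4 this would drive the orthographic Scene Capture component).

```python
import math

def should_spawn_particle(last_pos, current_pos, min_delta=1.0):
    """Movement-delta spawning: only emit a ribbon particle once the
    brush/pen has moved far enough from the last spawn point.
    `min_delta` is an invented threshold for illustration."""
    dx, dy, dz = (current_pos[i] - last_pos[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) >= min_delta

def snapshot_direction(exit_pos, volume_center):
    """Aim the capture camera along the player's exit point toward the
    drawing volume's center, so the snapshot roughly matches the
    player's last viewpoint of the drawing. Returns a unit vector."""
    dx, dy, dz = (volume_center[i] - exit_pos[i] for i in range(3))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0:
        return (1.0, 0.0, 0.0)  # degenerate case: fall back to a default axis
    return (dx / length, dy / length, dz / length)
```

An orthographic capture (rather than perspective) means the drawing's apparent size doesn't depend on how far from the volume the player happened to stand, which keeps the snapshots the classifier sees more consistent.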
47. VBI - Audio Production
- Voices through Watson Text to Speech +
filters
- Multiple musical layers
- Spatialization (HRTF) for immersive audio
- Mixing environment sounds, music,
effects, voices...
- Integrate all of this in the code
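The "multiple musical layers" idea can be sketched as per-layer volumes driven by a single intensity parameter, each layer fading in over its own range. This is a hedged illustration only: the layer names and ranges below are invented, and the real mixing happens inside UE4's audio system.

```python
# Invented layers for illustration: each fades in over its own
# (start, end) intensity range.
LAYERS = {"pads": (0.0, 0.2), "percussion": (0.3, 0.5), "lead": (0.6, 0.8)}

def layer_volumes(intensity):
    """Map a 0..1 game-intensity value to per-layer volumes (0..1).

    Below a layer's start it is silent, above its end it is at full
    volume, and in between it fades in linearly."""
    volumes = {}
    for name, (start, end) in LAYERS.items():
        if intensity <= start:
            volumes[name] = 0.0
        elif intensity >= end:
            volumes[name] = 1.0
        else:
            volumes[name] = (intensity - start) / (end - start)
    return volumes
```

Driving all layers from one parameter keeps the mix coherent as the escape room's tension rises, instead of toggling tracks on and off abruptly.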
53. Results
Successfully deployed at 10+
events since June 2018
Codemotion is one of them!
(Expo area, photo here for
reference)
Friendly staff are waiting to let you
try it ☺