
Building VR Applications For Google Cardboard

Slides showing how to use Unity to build Google Cardboard Virtual Reality applications. From a series of lectures given by Mark Billinghurst from the University of South Australia.



  1. 1. BUILDING VR APPLICATIONS FOR GOOGLE CARDBOARD Mark Billinghurst mark.billinghurst@unisa.edu.au January 20th 2017
  2. 2. Mark Billinghurst ▪ Director, Empathic Computing Lab University of South Australia ▪ Past Director of HIT Lab NZ, University of Canterbury ▪ PhD Univ. Washington ▪ Research on AR, mobile HCI, Collaborative Interfaces ▪ More than 300 papers in AR, VR, interface design
  3. 3. What You Will Learn • Definitions of VR, Brief History of VR • Introduction to Mobile VR/Google Cardboard • Introduction to Unity3D • Complete 7 projects • 1 Building a Unity Scene • 2 Immersive 360 Panorama • 3 Creating a 3D VR Scene • 4 Adding Movement • 5 Gaze based interaction • 6 Menu input • 7 Moving Menus • Cardboard interface design guidelines • Resources for learning more
  4. 4. Introduction to Virtual Reality
  5. 5. Virtual Reality Computer generated multi-sensory simulation of an artificial environment that is interactive and immersive.
  6. 6. What is Virtual Reality? Virtual reality is.. a computer technology that replicates an environment, real or imagined, and simulates a user's physical presence and environment to allow for user interaction. (Wikipedia) • Defining Characteristics • Environment simulation • Presence • Interaction
  7. 7. Key Technologies • Autonomy • Head tracking, body input • Intelligent systems • Interaction • User input devices, HCI • Presence • Graphics/audio/multisensory output • Multisensory displays • Visual, audio, haptic, olfactory, etc
  8. 8. Types of VR
  9. 9. Brief History of Virtual Reality https://immersivelifeblog.files.wordpress.com/2015/04/vr_history.jpg
  10. 10. Desktop VR - 1995 • Expensive - $150,000+ • 2 million polys/sec • VGA HMD – 30 Hz • Magnetic tracking
  11. 11. Desktop VR 2016 • Graphics Desktop • $1,500 USD • >4 Billion poly/sec • $600 HMD • 1080x1200, 90Hz • Optical tracking • Room scale
  12. 12. 2016 - Rise of Consumer HMDs: Oculus Rift, Sony Morpheus, HTC/Valve Vive
  13. 13. Google Cardboard - Mobile VR
  14. 14. Computer Based vs. Mobile VR
  15. 15. Mobile VR: Google Cardboard • Released 2014 (Google 20% project) • >5 million shipped/given away • Easy to use developer tools
  16. 16. Version 1.0 vs Version 2.0 • Version 1.0 – Android focused, magnetic switch, small phone • Version 2.0 – Touch input, iOS/Android, fits many phones
  17. 17. Many Different Cardboard Viewers
  18. 18. Multiple Mobile VR Viewers Available
  19. 19. • In 2016 – 46 million potential desktop VR users vs. 400 million mobile VR users • https://thoughts.ishuman.co/vr-will-be-mobile-11529fabf87c#.vfcjzy1vf
  20. 20. Mobile VR Applications
  21. 21. Types of VR Experiences • Immersive Spaces • 360 Panoramas/Movies • High visual quality • Limited interactivity • Changing viewpoint orientation • Immersive Experiences • 3D graphics • Lower visual quality • High interactivity • Movement in space • Interact with objects
  22. 22. Immersive Panorama • High quality 360 image or video surrounding user • User can turn head to see different views • Fixed position
  23. 23. Cardboard Camera (iOS/Android) • Capture 360 panoramas • Stitch together images on phone • View in VR on Cardboard
  24. 24. Example Panorama Applications • Within • http://with.in • High quality 360 VR content • New York Times VR Experience • NYTVR application • Documentary experiences • YouTube 360 Videos • Collection of 360 videos
  25. 25. Google Cardboard App • 7 default experiences • Earth: Fly on Google Earth • Tour Guide: Visit sites with guides • YouTube: Watch popular videos • Exhibit: Examine cultural artifacts • Photo Sphere: Immersive photos • Street View: Drive along a street • Windy Day: Interactive short story
  26. 26. Hundreds of Google Play Cardboard apps
  27. 27. Building VR Experiences
  28. 28. What You Need • Cardboard Viewer/VR Viewer • https://vr.google.com/cardboard/ • Smart phone • Android/iOS • Authoring Tools/SDK • Google VR SDK • Unity/Unreal game engine • Non programming tools • Content • 3D models, video, images, sounds
  29. 29. Software Tools • Low level SDKs • Need programming ability • Java, C#, C++, etc • Example: Google VR SDK (iOS, Android) • https://developers.google.com/vr/ • Game Engines • Powerful, need scripting ability • Unity - https://unity3d.com/ • Unreal - https://www.unrealengine.com/vr • Combine with VR plugins (HMDs, input devices) • Google VR Unity plugin
  30. 30. Unity 3D Game Editor
  31. 31. Tools for Non-Programmers • Focus on Design, ease of use • Visual Programming, content arrangement • Examples • Insta-VR – 360 panoramas • http://www.instavr.co/ • Vizor – VR on the Web • http://vizor.io/ • A-frame – HTML based • https://aframe.io/ • ENTiTi – Both AR and VR authoring • http://www.wakingapp.com/ • Eon Creator – Drag and drop tool for AR/VR • http://www.eonreality.com/eon-creator/
  32. 32. Google VR SDK for Unity Free Download https://developers.google.com/vr/unity/download/ Features: 1. Lens distortion correction 2. Head tracking 3. 3D calibration 4. Side-by-side rendering 5. Stereo geometry configuration 6. User input event handling 7. VR emulation mode, etc.. Unity Google VR SDK
  33. 33. INTRODUCTION TO UNITY
  34. 34. Unity Overview (see www.unity3d.com) • Created in 2005 • Tool for creating games and 2D/3D applications • Advanced graphics support • Support for multiplayer, analytics, performance, ads, etc • Cross Platform Game Engine • One of the most popular (> 1.5 million developers) • 27 platforms (iOS, Android, Windows, Mac, etc) • Multiple license models • Free for personal use/small business • Large developer community • Tutorials, support • User generated content/assets
  35. 35. SETUP
  36. 36. Download and Install (for Android) • Go to unity3d.com/download • Use Download Assistant – pick components you want • Make sure to install Android components • Also install Android studio (https://developer.android.com/studio/)
  37. 37. Getting Started • First time running Unity you’ll be asked to create a project • Specify project name and location • Can pick asset packages (pre-made content)
  38. 38. Unity Interface • Toolbar, Scene, Hierarchy, Project, Inspector
  39. 39. Customizable Interface
  40. 40. Building Scenes • Use GameObjects: • Containers that hold different components • Eg 3D model, texture, animation • Use Inspector • View and edit object properties and other settings • Use Scene View • Position objects, camera, lights, other GameObjects etc • Scripting • Adding interaction, user input, events, etc
  41. 41. GameObjects • Every object in Scene is a GameObject • GameObjects contain Components • Eg Transform Component, Camera Components • Clicking on object will show values in Inspector panel
  42. 42. Adding 3D Content • Create 3D asset using modeling package, or download • Fbx, Obj file format for 3D models • Add file to Assets folder in Project • When project opened 3D model added to Project View • Drag mesh from Project View into Hierarchy or Scene View • Creates a game object
  43. 43. Positioning/Scaling Objects • Click on object and choose transform
  44. 44. Unity Prefabs • When downloading assets, you often download Prefabs (blue squares) • Use by dragging and dropping into the scene hierarchy • Prefab is a way of storing a game object with properties and components already set • Prefab is a template from which you can create new object instances in the scene • Changes to a prefab asset will change all instances in the scene
  45. 45. Unity Asset Store • Download thousands models, scripts, animations, etc • https://www.assetstore.unity3d.com/
  46. 46. PROJECT 1: BUILDING A UNITY SCENE
  47. 47. Making a Simple Scene - Key Steps 1. Create New Project 2. Create Game Object 3. Moving main camera position 4. Adding lights 5. Adding more objects 6. Adding physics 7. Changing object materials 8. Adding script behaviour
  48. 48. Create Project • Create new folder and project
  49. 49. New Empty Project
  50. 50. Create GameObject • Load a Sphere into the scene • GameObject -> 3D Object -> Sphere
  51. 51. Moving main camera • Select Main Camera • Select translate icon • Move camera
  52. 52. Add Light • GameObject -> Light -> Directional Light • Use inspector to modify light properties (colour, intensity)
  53. 53. Add Physics • Select Sphere • Add Rigidbody component • Add Component -> Physics -> RigidBody • or Component -> Physics -> RigidBody • Modify inspector properties (mass, drag, etc)
  54. 54. Add More Objects • Add several cubes • GameObject -> 3D Object -> Cube • Move cube • Add Rigidbody component (uncheck gravity)
  55. 55. Add Material • Assets -> Create -> Material • Click Albedo colour box in inspector • Select colour • Drag asset onto object to apply
  56. 56. Add Script • Assets -> Create -> C# script • Edit script using Mono • Drag script onto Game Object
  57. 57. Example C# Script: GameObject Rotation
    using UnityEngine;
    using System.Collections;

    public class spin : MonoBehaviour {
        // Use this for initialization
        void Start () {
        }

        // Update is called once per frame
        void Update () {
            this.gameObject.transform.Rotate(Vector3.up * 10);
        }
    }
  58. 58. Scripting C# Unity 3D • void Awake(): • Is called when the first scene is loaded and the game object is active • void Start(): • Called on first frame update • void FixedUpdate(): • Called before physics calculations are made • void Update(): • Called every frame before rendering • void LateUpdate(): • Once per frame after update finished
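For reference, a minimal MonoBehaviour skeleton showing where each of these callbacks sits (the comments summarise the slide; the empty bodies are placeholders, not code from the lecture):

    using UnityEngine;

    public class LifecycleExample : MonoBehaviour {
        void Awake ()       { }   // called when the scene loads and the object is active, before Start
        void Start ()       { }   // called once, on the first frame update
        void FixedUpdate () { }   // called at the fixed timestep, before physics calculations
        void Update ()      { }   // called every frame, before rendering
        void LateUpdate ()  { }   // called once per frame, after all Update calls have finished
    }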
  59. 59. Final Spinning Cube Scene
  60. 60. PROJECT 2: IMMERSIVE 360 PANORAMA
  61. 61. Key Steps 1. Create a new project 2. Load the Google VR SDK 3. Load a panorama image asset 4. Create a Skymap 5. Add to VR scene 6. Deploy to mobile phone
  62. 62. New Project
  63. 63. Load Google VR SDK • Assets -> Import Package -> Custom Package • Navigate to GoogleVRForUnity.unitypackage • Uncheck iOS (for Android build)
  64. 64. Load Cardboard Main Camera • Drag GvrViewerMain prefab into Hierarchy • Assets -> GoogleVR -> Prefabs • Keep Main Camera
  65. 65. Panorama Image Asset • Find/create suitable panorama image • Ideally 2K or higher resolution image in cubemap layout • Google “Panorama Image Cubemap”
  66. 66. Capturing Panorama • Stitching photos together • Image Composite Editor (Microsoft) • AutoPano (Kolor) • Using 360 camera • Ricoh Theta-S • Fly360
  67. 67. Image Composite Editor (Microsoft) • Free panorama stitching tool • http://research.microsoft.com/en-us/um/redmond/projects/ice/
  68. 68. AutoPano (Kolor) • Finds overlapping images and stitches them together into panoramas • http://www.kolor.com/autopano/
  69. 69. Add Image Asset to Project • Assets -> Import Asset • Select desired image • In Inspector • Set Texture Type to Cubemap • Set mapping to Latitude-Longitude (Cylindrical) • Hit Apply button
  70. 70. Create Skybox Material • Assets -> Create -> Material • Name material - e.g. 'Sky' • Set Shader to Skybox -> Cubemap • Drag texture to cubemap
  71. 71. Create Skybox • Window -> Lighting • new window pops up • Drag Skybox material into the Skybox field
  72. 72. Panorama Image Appears in Unity
  73. 73. One Last Thing.. • Check Clear Flags on Camera is set to Skybox • Select Main Camera • Look at Camera in Inspector • Clear Flags -> Skybox
  74. 74. Test It Out • Hit play button • Use alt/option key + mouse to look around
  75. 75. Deploying to Phone (Android) 1. Plug phone into USB • Put phone into debug mode 2. Open Build Settings 3. Change Target platform to Android 4. Resolution and Presentation • Default Orientation -> Landscape Left 5. Under Player Settings • Edit Bundle Identifier – eg com.UniSA.cubeTest • Minimum API level 6. Build and Run • Select .apk file name
  76. 76. Setting Path to Android • You may need to tell Unity where the Android SDK is • Set the path: • Edit -> Preferences -> External Tools
  77. 77. Running on Phone • Droid@Screen View on Desktop
  78. 78. Making Immersive Movie • Create movie texture • Convert 360 video to .ogg or .mp4 file • Add video texture as asset • Make Sphere • Equirectangular UV mapping • Inward facing normals • Move camera to centre of sphere • Texture map video to sphere • Easy Movie Texture ($65) • Apply texture to 3D object • For 3D 360 video • Render two Spheres • http://bernieroehl.com/360stereoinunity/
  79. 79. PROJECT 3: CREATING A 3D VR SCENE
  80. 80. Key Steps 1. Creating a new project 2. Load Google VR SDK 3. Add GvrViewerMain to scene 4. Loading in 3D asset packages 5. Loading a SkyDome 6. Adding a plane floor
  81. 81. New Project • GvrViewerMain added to Hierarchy
  82. 82. Download Model Package • Magic Lamp from 3dFoin • Search on Asset store
  83. 83. Load Asset + Add to Scene • Assets -> Import Package -> Custom Package • Look for MagicLamp.unitypackage (If not installed already) • Drag MagicLamp_LOD0 prefab into Hierarchy • Assets -> MagicLamp -> MagicLamp_LOD0 • Position and rotate
  84. 84. Import SkySphere package • SkySphere Volume1 on Asset store
  85. 85. Add SkySphere to Scene • Drag Skyball_WithoutCap into Hierarchy • SkySphere_V1 -> Meshes • Rotate and Scale as needed (using Inspector)
  86. 86. Add Ground Plane • GameObject -> 3D Object -> Plane • Set Scale X to 3.0, Z to 3.0
  87. 87. Testing View • Use alt/option key plus mouse to rotate view
  88. 88. Adding More Assets • Load from Asset store – look for free assets
  89. 89. PROJECT 4: ADDING MOVEMENT
  90. 90. Moving Through VR Scenes • Move through looking • Look at target to turn on/off moving • Button/tapping screen • Being in a vehicle (e.g. Roller Coaster)
  91. 91. Adding Movement Through Looking Goal: Move in direction user is looking when button on VR display pressed or screen touched • Key Steps 1. Start with static scene 2. Create player body 3. Create movement script 4. Add movement script to player body
  92. 92. Key Steps 1. Create New Project 2. Import GoogleVRforUnity Package 3. Create objects in scene 4. Add player body 5. Include collision detection 6. Add player movement script
  93. 93. Create New Project • Include GoogleVRforUnity • Assets -> Import Package -> Custom Package
  94. 94. Add GvrViewerMain to Project • Drag GvrViewerMain into Hierarchy • from Assets -> GoogleVR -> Prefabs
  95. 95. Add Ground Plane and Objects • Create simple scene of Ground Plane and objects • GameObject -> 3D Object -> Plane/Cube/Sphere/Cylinder • Scale and position as you like, add materials • Add rigidbody components to objects (not plane) to enable collisions • Select object -> Add Component -> Rigidbody • Fix the position of an object: Constraints -> Freeze Position -> check x, y, z (and Freeze Rotation)
  96. 96. Add Player Body • Select Main Camera • Add Component->Mesh Filter • Click on circle icon on right -> Select Capsule mesh
  97. 97. Make the Body Visible • Select Main Camera • Add component -> Mesh Renderer • Create a material and drag onto capsule mesh
  98. 98. Add Collision Detection • Allow player to collide with objects • Select Main Camera • Add Component -> Capsule Collider • Add Component -> RigidBody • Fix player to ground • In RigidBody component • Uncheck “Use Gravity” • Uncheck “Is Kinematic” • Check Constraints -> Freeze Position -> Y axis
  99. 99. Add Movement Script • Select Main Camera • Create new script called PlayerMovement • Add component -> New Script • Key variables - speed, rigidbody:
    public float speed = 3.0f;
    Rigidbody rbody;
  • Define a FixedUpdate movement function (move in the direction the camera is looking):
    void FixedUpdate () {
        if (Input.touchCount > 0 || Input.GetMouseButton(0))
            rbody.MovePosition(transform.position + transform.forward * Time.deltaTime * speed);
    }
  100. 100. PlayerMovement Script
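The PlayerMovement script was shown only as a screenshot; a minimal sketch of what it might look like, combining the variables and FixedUpdate function from the previous slide (the Start method that caches the Rigidbody is an assumption):

    using UnityEngine;

    public class PlayerMovement : MonoBehaviour {
        public float speed = 3.0f;   // movement speed in units per second
        Rigidbody rbody;             // Rigidbody added to the Main Camera earlier

        void Start () {
            // assumed: cache the Rigidbody component added in the previous step
            rbody = GetComponent<Rigidbody>();
        }

        void FixedUpdate () {
            // move in the direction the camera is looking while the screen is
            // touched (mobile) or the left mouse button is held (editor)
            if (Input.touchCount > 0 || Input.GetMouseButton(0)) {
                rbody.MovePosition(transform.position + transform.forward * Time.deltaTime * speed);
            }
        }
    }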
  101. 101. Run Demo • Use left mouse button to move in direction looking • Button press/screen tap on mobile phone
  102. 102. Demo Problem • Wait! I'm bouncing off objects • Moving body hits fixed objects and gets negative velocity
  103. 103. Stopping Camera Motion • When camera collides it's given momentum • velocity and angular velocity • Need to set velocity and angular velocity to zero • In player movement script • Set rbody velocity components to zero
  104. 104. Revised PlayerMovement Script
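The revised script was also a screenshot; a sketch of the change described above, zeroing the velocity so collisions no longer push the player around (exactly where the original reset the velocity is an assumption):

    using UnityEngine;

    public class PlayerMovement : MonoBehaviour {
        public float speed = 3.0f;
        Rigidbody rbody;

        void Start () {
            rbody = GetComponent<Rigidbody>();
        }

        void FixedUpdate () {
            // cancel any momentum picked up from collisions so the player
            // stops at objects instead of bouncing off them
            rbody.velocity = Vector3.zero;
            rbody.angularVelocity = Vector3.zero;

            if (Input.touchCount > 0 || Input.GetMouseButton(0)) {
                rbody.MovePosition(transform.position + transform.forward * Time.deltaTime * speed);
            }
        }
    }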
  105. 105. Final Demo • Move in direction camera looking • Collide with objects and stop moving
  106. 106. PROJECT 5: GAZE INTERACTION
  107. 107. Gaze Interaction • Cause events to happen when looking at objects • E.g. look at a target to shoot at it
  108. 108. Key Steps 1. Begin with VR scene from Project 4 2. Add physics ray caster • Casts a ray from camera (gaze ray) 3. Add function to object to respond to gaze • E.g. when gaze ray hits target cause particle effect 4. Add event trigger to target object 5. Add event system to target object
  109. 109. Adding Physics Raycaster • Aim: To send a virtual ray from camera view • Process • Select Main Camera • Add GvrPointerPhysicsRaycaster Component to Main Camera • Add component -> GvrPointerPhysicsRaycaster
  110. 110. Add Gaze Function • Select target object (the cube model) • Add component -> new script • Call script CubeInteraction • Add OnGazeEnter(), OnGazeExit() public functions • Decide what happens when gaze enters/exits Cube model • Complete this later
  111. 111. Add Event Trigger • Select Target Object (Cube) • Add component • EventTrigger • Add New Event Type -> Pointer Enter • Add object to event • Hit the ‘+’ button • Drag Cube object to box under Runtime Only • Select Function to run • Select function list -> scroll to CubeInteraction -> OnGazeEnter • Repeat with a Pointer Exit event for OnGazeExit
  112. 112. Adding Event System • Need to use an Event System for the trigger to work • Looks for gaze events occurring on the Cube object • Add Event System to Hierarchy • GameObject -> UI -> Event System • Add GazeInputModule to the Event System • Add component -> Gaze Input Module
  113. 113. Add Collider to Object • Need to detect when target object is being looked at • Select target Object • Add Collider (eg Box) • Add component -> Box Collider • Adjust position and size of Collider if needed • Make sure it covers the target area
  114. 114. Making Gaze Point Visible • In current system can't see user's Gaze point • Add viewing reticle • Drag GvrReticlePointer prefab onto main camera • Assets -> GoogleVR -> Prefabs -> UI • Reticle changes shape when on active object • Change reticle material to make it more visible • Set color in GvrReticleMaterial (e.g. to Red)
  115. 115. Demo • Reticle changes shape when gazing at an object that responds to gaze events
  116. 116. Add Gaze Event • Add code to the gaze functions • Change cube colour when gazed at • Get initial cube material • Add code to gaze functions
  117. 117. Final CubeInteraction Script
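The CubeInteraction script appeared as a screenshot; a hedged sketch of what the gaze functions could look like, based on the behaviour described on the next slide (caching the Renderer and the original colour in Start is an assumption):

    using UnityEngine;

    public class CubeInteraction : MonoBehaviour {
        Renderer rend;          // Renderer of the cube this script is attached to
        Color originalColor;    // colour the cube starts with (white in the demo)

        void Start () {
            rend = GetComponent<Renderer>();
            originalColor = rend.material.color;
        }

        // called by the Event Trigger's Pointer Enter event
        public void OnGazeEnter () {
            rend.material.color = Color.blue;
        }

        // called by the Event Trigger's Pointer Exit event
        public void OnGazeExit () {
            rend.material.color = originalColor;
        }
    }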
  118. 118. Final Demo • Cube changes to blue colour when gazed at • Cube changes to white colour when gazed away from
  119. 119. PROJECT 6: MENU INTERACTION
  120. 120. Menu Placement • Different types of menu placement • Screen aligned - always visible on screen • World aligned - attached to object or location in VR scene • Camera aligned - moves with the user • This project shows a world aligned menu
  121. 121. Interacting with VR Menus • Touch input • Tap screen to select menu button • Suitable for handheld applications • Head/Gaze pointing • Look at menu button, click to select • Ideal for menus in VR display
  122. 122. Key Steps 1. Create New Scene and gaze support 2. Create User Interface menu object 3. Add buttons to user interface 4. Add button scripts 5. Add gaze interaction 6. Object interaction scripts 7. Make the menu disappear and reappear
  123. 123. Create New Scene • Create scene with cube and plane • Add materials • Import GoogleVRforUnity package • Drag GvrViewerMain into project hierarchy
  124. 124. Setup Gaze Pointing • Drag GvrReticlePointer to Main Camera • Assets -> GoogleVR -> Prefabs -> UI • Add Gvr Pointer Physics Raycaster to Main Camera • Add component -> GvrPointerPhysicsRaycaster
  125. 125. Menu Functionality • Want to set up a menu that changes cube colour • Menu fixed in space • Located near the object which it affects • Two buttons (white/blue) • Look at blue button to set cube colour to blue • Look at white button to set cube colour to white
  126. 126. Menu Implementation • Create a 2D canvas plane • Place canvas in VR scene where it is needed • Add buttons to the plane • Add scripts to the buttons • Triggered based on gaze input
  127. 127. Setting up Menu Canvas • Create Empty Object, name it UserInterface • Create an image object under UserInterface • Right click UserInterface -> UI -> Image • Set the canvas to world space • Move image until visible and resize • Change image colour
  128. 128. Menu Canvas
  129. 129. Add Buttons • Add two buttons to UI image • Colour one blue (Image script colour) • Remove button scripts • We'll add our own • Add sphere collider same size as button
  130. 130. Add Button Scripts • Create identical scripts for Blue and White buttons • Different names • BlueButton, WhiteButton • Include OnLook() Function • Gaze function
  131. 131. Blue Button Script
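The Blue Button script was shown as a screenshot; at this stage it only needs a public OnLook() function that the Event Trigger can call (the Debug.Log body is a placeholder assumption, the cube colour change is wired up later):

    using UnityEngine;

    public class BlueButton : MonoBehaviour {
        // called by the button's Event Trigger (Pointer Enter)
        public void OnLook () {
            Debug.Log("Blue button looked at");   // placeholder until the colour change is added
        }
    }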
  132. 132. Add Event Triggers • Add event triggers to each button • Add component -> Event Trigger • Event trigger type as Pointer Enter • Set target object as button • Set target function as OnLook() • Add Event System to Hierarchy • Add component Gaze Input Module
  133. 133. Testing • Reticle changes style over buttons
  134. 134. Add Cube Behaviour • Add new script to cube, CubeActions • Add component -> New Script • Script that can change cube colour • Define local materials, copy existing materials • Create functions that can change colours • SetColorWhite(), SetColorBlue()
  135. 135. CubeActions Script
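The CubeActions script was shown as a screenshot; a sketch of the colour-changing functions described on the previous slide (exposing the two materials as public fields set in the Inspector is an assumption):

    using UnityEngine;

    public class CubeActions : MonoBehaviour {
        public Material whiteMaterial;   // assumed: assigned in the Inspector
        public Material blueMaterial;    // assumed: assigned in the Inspector
        Renderer rend;

        void Start () {
            rend = GetComponent<Renderer>();
        }

        public void SetColorWhite () {
            rend.material = whiteMaterial;
        }

        public void SetColorBlue () {
            rend.material = blueMaterial;
        }
    }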
  136. 136. Add Gaze Behaviour • Edit button scripts to add cube colour changing • Add public CubeActions object • public CubeActions m_cube; • Call set colour function in OnLook function • m_cube.SetColorBlue(); • Drag Cube object to script form
  137. 137. Final BlueButton Script • White button same, but use m_cube.SetColorWhite();
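A sketch of the final BlueButton script with the public CubeActions reference described on the previous slide:

    using UnityEngine;

    public class BlueButton : MonoBehaviour {
        public CubeActions m_cube;   // drag the Cube object onto this field in the Inspector

        // called by the button's Event Trigger (Pointer Enter)
        public void OnLook () {
            m_cube.SetColorBlue();
        }
    }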
  138. 138. Testing It Out • Cube changes colour depending on button looked at
  139. 139. Making the Menu Disappear • Don't want menu visible all the time • Right click with mouse to appear/disappear • Double tap with VR headset to appear/disappear • Create menu script • ToggleMenu function - turns menu on and off • Note: Add script to User Interface object • Add the menu image as an argument
  140. 140. Menu Script
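The menu script was shown as a screenshot; a minimal sketch of a ToggleMenu script attached to the UserInterface object (the class name MenuScript and the right-mouse-button check in Update are assumptions; on the headset the toggle would be driven by the tap input instead):

    using UnityEngine;

    public class MenuScript : MonoBehaviour {
        public GameObject menuImage;   // drag the menu Image object here in the Inspector

        void Update () {
            // right mouse button toggles the menu in the editor;
            // assumed: a double tap would be mapped to this on the headset
            if (Input.GetMouseButtonDown(1)) {
                ToggleMenu();
            }
        }

        public void ToggleMenu () {
            // turn the menu image on or off
            menuImage.SetActive(!menuImage.activeSelf);
        }
    }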
  141. 141. Testing it Out
  142. 142. PROJECT 7: MOVING MENU
  143. 143. Moving a Menu with the User • World aligned menus good for actions on objects • e.g. select to change colour • However you may want to move a menu with the user • e.g. menu for user navigation • This project shows how to add a menu to the camera • Menu moves with the user as they move through the VR scene
  144. 144. Key Steps 1. Start with scene from Project 6 2. Create canvas object 3. Add button to canvas 4. Create player 5. Add player movement script 6. Add script for canvas movement
  145. 145. User Experience • Have a walk button on the ground • When player looks down they can toggle button on and off • Look at walk button, click to toggle walking on and off
  146. 146. Create MoveButton Canvas • Create canvas object • UI->Canvas • Set render mode to world space • Resize and reposition • Put flat on plane, a little in front of camera
  147. 147. Add Image to Canvas • Create image on canvas • Right click canvas • UI -> image • Set image to transparent • Set image size to smaller than canvas
  148. 148. Add Button to Image • Right click image • UI -> button • Resize and move to fill image • Set colour and pressed colour • Set text to “Walk” • Expand button to see text object
  149. 149. Create Player Object • Create empty object • Rename it Player • Create empty child • Rename it LocalTrans • Move Canvas under LocalTrans • Move Main Camera under Player
  150. 150. Add PlayerMove Script • Add script to Main Camera • ToggleWalk function that toggles walking • If walking on then move camera
  151. 151. PlayerMove Script
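The PlayerMove script was a screenshot; a hedged sketch of the ToggleWalk behaviour described two slides earlier (keeping the movement level by zeroing the y component is an assumption):

    using UnityEngine;

    public class PlayerMove : MonoBehaviour {
        public float speed = 3.0f;   // walking speed in units per second
        bool walking = false;

        // wired to the Walk button's On Click () event
        public void ToggleWalk () {
            walking = !walking;
        }

        void Update () {
            if (walking) {
                // move in the direction the camera is looking, staying level
                Vector3 forward = transform.forward;
                forward.y = 0.0f;
                transform.position += forward.normalized * speed * Time.deltaTime;
            }
        }
    }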
  152. 152. Connect Player Moving to Button • Select Button Object • In the Button Script On Click () Action • Set target object as Main Camera • Set target function as ToggleWalk • PlayerMove -> ToggleWalk
  153. 153. Event System • Make sure project has an event system • Add at same level as Player • GameObject -> UI -> EventSystem • Add Gaze Input Module component • Add Component -> Gaze Input Module • Remove the Standalone Input Module script • or deactivate it by unchecking its checkbox
  154. 154. Testing • Look at Walk button and click • Player moves, but the button doesn't!
  155. 155. Moving Menu with Camera • Add a script to the LocalTrans object • CanvasMovement script • Script does the following: • finds the current camera position • sets LocalTrans to that position • rotates LocalTrans about y axis the same as camera • Outcome: • Menu moves with camera. • User can look down to click on button
  156. 156. CanvasMovement Script
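The CanvasMovement script was shown as a screenshot; a sketch that follows the three steps listed above (the public camera field and the use of LateUpdate are assumptions):

    using UnityEngine;

    public class CanvasMovement : MonoBehaviour {
        public Transform cameraTransform;   // drag the Main Camera here in the Inspector

        void LateUpdate () {
            // follow the current camera position
            transform.position = cameraTransform.position;

            // match only the camera's rotation about the y axis,
            // so the menu stays level and in front of the user
            float yaw = cameraTransform.rotation.eulerAngles.y;
            transform.rotation = Quaternion.Euler(0.0f, yaw, 0.0f);
        }
    }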
  157. 157. Final Result • Menu follows camera/player • Note: You may have to experiment with different canvas position and scale settings for it to appear
  158. 158. DESIGN GUIDELINES
  159. 159. Google Design Guidelines • Google’s Guidelines for good VR experiences: • Physiological Considerations • Interactive Patterns • Setup • Controls • Feedback • Display Reticle • From http://www.google.com/design/spec-vr/designing-for-google-cardboard/a-new-dimension.html
  160. 160. Physiological Considerations • Factors to Consider • Head tracking • User control of movement • Use constant velocity • Grounding with fixed objects • Brightness changes
  161. 161. Interactive Patterns - Setup • Setup factors to consider: • Entering and exiting • Headset adaptation • Full Screen mode • API calls • Indicating VR apps
  162. 162. System Control • Issuing a command to change system state or mode • Examples • Launching application • Changing system settings • Opening a file • Etc. • Key points • Make commands visible to user • Support easy selection
  163. 163. Example: GearVR Interface • 2D Interface in 3D Environment • Head pointing and click to select
  164. 164. Interactive Patterns - Controls • Use fuse buttons (gaze and hold) for selection in VR
  165. 165. Interactive Patterns - Feedback • Use audio and haptic feedback • Reduce visual overload • Audio alerts • 3D spatial sound • Phone vibrations
  166. 166. Interactive Patterns - Display Reticle • Easier for users to target objects with a display reticle • Can display reticle only when near target object • Highlight objects (e.g. with light source) that user can target
  167. 167. Use Ray-casting technique • “Laser pointer” attached to virtual hand or gaze • First object intersected by ray may be selected • User only needs to control 2 DOFs • Proven to perform well for remote selection • Variants: • Cone casting • Snap-to-object rays
  168. 168. Gaze Directed Steering • Move in the direction that you are looking • Very intuitive, natural navigation • Can be used on simple HMDs (e.g. Google Cardboard) • But: Can’t look in a different direction while moving
  169. 169. Cardboard Design Lab Application • Use Cardboard Design Lab app to explore design ideas
  170. 170. Cardboard Design Lab Video https://www.youtube.com/watch?v=2Uf-ru2Ndvc
  171. 171. RESOURCES
  172. 172. Books • Unity Virtual Reality Projects • Jonathan Linowes • Holistic Game Development with Unity • Penny de Byl
  173. 173. User Experiences for VR Website • www.uxofvr.com
  174. 174. Useful Resources • Google Cardboard main page • https://www.google.com/get/cardboard/ • Developer Website • https://vr.google.com/cardboard/developers/ • Building a VR app for Cardboard • http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/ • Creating VR game for Cardboard • http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/ • Moving in VR space • http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/
  175. 175. Resources • Unity Main site • http://www.unity3d.com/ • Holistic Development with Unity • http://holistic3d.com • Official Unity Tutorials • http://unity3d.com/learn/tutorials • Unity Coder Blog • http://unitycoder.com
  176. 176. www.empathiccomputing.org @marknb00 mark.billinghurst@unisa.edu.au
