Learn more about VR controllers and the XR Input Mapping system in our VR Mini-Degree: https://academy.zenva.com/product/unity-game-development-mini-degree/?zva_src=slideshare-unitesydney2019
VR, the ability to experience and be present in worlds that don't exist, is certainly a super exciting field.
Using hand controllers is one of the main ways in which we can interact with these virtual worlds.
But the VR ecosystem is very fragmented. Until very recently, your options as a developer were to focus on just one platform using its native SDK, or to embark on a cross-platform hell where you were at the mercy of frameworks that were not always well maintained.
But that's all changed for good in Unity 2019.1, thanks to the new XR input mapping, which gives you an easy way to access all the buttons on all the main controllers.
In this presentation we are going to introduce the new XR input mapping system, and we are going to develop four basic controller interactions that will be useful in a wide variety of scenarios:
- grabbing
- using
- throwing
- selecting
11. Agenda
✅ Headset and controller tracking
— XR Input Mapping
— Common interactions
— Grabbing
— Throwing
12. XR Input Mapping
— Available in 2019.1
— Standard set of Feature Usages
— Supports all major XR platforms
13. Hello World XR Button
In Update()
— Get the device (specifying left / right hand)
— Get the Feature Usage value
— Check that the value is true ("pressed")
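The three steps above can be sketched in a small MonoBehaviour. This is a minimal example assuming the Unity 2019.1+ XR input API (`InputDevices`, `CommonUsages`); the class name is illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class ButtonPressDetector : MonoBehaviour
{
    void Update()
    {
        // Step 1: get the device, specifying the hand
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        // Steps 2 and 3: read the primaryButton Feature Usage and check it's pressed
        if (device.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed) && pressed)
        {
            Debug.Log("Primary button pressed");
        }
    }
}
```

`TryGetFeatureValue` returns false when the device is not connected or doesn't support the usage, so the check degrades gracefully across platforms.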
Hello everyone, Pablo here. I'm a certified Unity instructor and founder of Zenva, an online academy that teaches coding and game development.
Let’s start with tracking.
Let me ask you a question: what happens when you are playing a game and you move your laptop? The answer is, of course, nothing. The camera in a normal game is controlled by the keyboard, the gamepad, or by the game itself in, let's say, a cutscene.
In VR, on the other hand, the camera's movement and rotation are controlled by the player. The same thing happens with hand-tracked controllers.
You can never impose rotation or movement on any of these; it would feel like someone grabbing your head and forcing you to look around.
In VR applications we create a container object, which we can call an XR rig. When you need to move the player, for example if they hop into a vehicle or a lift, what you move is the entire rig.
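A minimal sketch of that idea, assuming a plain container GameObject as the rig (the class and field names here are hypothetical, not part of any Unity API):

```csharp
using UnityEngine;

// Moves the whole XR rig, never the tracked camera or controllers directly.
public class RigMover : MonoBehaviour
{
    public Transform xrRig; // the container object; the camera is a child of it

    public void TeleportTo(Vector3 destination)
    {
        // Repositioning the rig root preserves the head and hand
        // offsets that the tracking system writes inside it.
        xrRig.position = destination;
    }
}
```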
There are two main types of VR experiences, and of course these definitions are not set in stone. We have stationary experiences, where the location of the floor is not necessarily detected or known, so we use a floor offset to set that height; and we have room-scale experiences, where the floor is detected by the tracking system.
In Unity, the setup can be the same in both cases. We just use containing objects, the head is the camera, and we use a component named Tracked Pose Driver to track the position and rotation of these devices.
The Tracked Pose Driver is available in the Package Manager under XR Legacy Input Helpers. And don't be put off by the word "legacy" here; there is no replacement for the time being, so this is the only thing you can use for now.
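You would normally configure the Tracked Pose Driver in the Inspector, but as a sketch, here is what the same setup looks like from code, assuming the XR Legacy Input Helpers package is installed (it provides the `UnityEngine.SpatialTracking` namespace):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking; // from the XR Legacy Input Helpers package

// Attach to a controller object inside the XR rig to have its
// transform follow the tracked left-hand controller.
public class LeftControllerSetup : MonoBehaviour
{
    void Awake()
    {
        var driver = gameObject.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRController,
                             TrackedPoseDriver.TrackedPose.LeftPose);
        driver.trackingType = TrackedPoseDriver.TrackingType.RotationAndPosition;
    }
}
```

The camera child of the rig gets the same component with the pose source set to the head instead.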
The controllers also need a 3D model so that you can see them on your application.
That’s it in regards to tracking. Next, the XR input mapping is the part that has been significantly improved in 2019.1.
The XR input mapping system defines a set of standard Feature Usages that you can use in your code, and that translate to different buttons on the main VR and AR controller platforms.
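For example, a grab interaction can be written once against the standard `gripButton` usage, which maps to the grip/squeeze control on each platform's controller. A minimal sketch (the class name and the grab logic placeholder are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class GrabDetector : MonoBehaviour
{
    InputDevice hand;

    void Start()
    {
        // Bind to the left-hand controller once
        hand = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
    }

    void Update()
    {
        // gripButton is a standard Feature Usage: Unity resolves it to the
        // right physical button on each supported platform.
        if (hand.TryGetFeatureValue(CommonUsages.gripButton, out bool gripping) && gripping)
        {
            // ...grab logic here, e.g. parent a nearby object to this hand
        }
    }
}
```

The same pattern covers the other interactions in this talk: using maps naturally to `triggerButton`, and throwing reads `deviceVelocity` when the grip is released.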