In these slides, you will learn the basics to help you get started with ray tracing in Unity 2019.3.
Speakers:
Soner Sen - Unity
Kate McFadden - Unity
Dany Ayoub - Unity
5. DXR Overview
API designed to leverage hardware-accelerated ray tracing
Why trace rays?
— Off-screen rendering (reflection, refraction)
— Algorithms that call for raycasting
Two major concepts to be concerned with:
— Ray Tracing Acceleration Structures:
What are we drawing?
— Ray Tracing Shaders:
How should we draw things?
7. Acceleration Structure
— New Unity class: RayTracingAccelerationStructure
— Manually or automatically managed
– Manual: AddInstance(), UpdateInstanceTransform()
— Specify layer masks to filter GameObjects to be added
— Call BuildRayTracingAccelerationStructure once a frame
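As a rough sketch of the manual path, the setup might look like the following. This is a hedged example, not official sample code: the API names follow the 2019.3 preview, and the exact settings fields, the "RayTraced" layer, and whether you build via the structure itself or a command buffer are assumptions to verify against the current docs.

```csharp
// Hypothetical sketch of a manually managed acceleration structure (2019.3 preview API).
using UnityEngine;
using UnityEngine.Experimental.Rendering;

public class ManualASExample : MonoBehaviour
{
    RayTracingAccelerationStructure accelStruct;
    public Renderer[] tracedRenderers;

    void Start()
    {
        var settings = new RayTracingAccelerationStructure.RASSettings();
        settings.managementMode = RayTracingAccelerationStructure.ManagementMode.Manual;
        settings.layerMask = LayerMask.GetMask("RayTraced"); // assumed layer; filters eligible GameObjects

        accelStruct = new RayTracingAccelerationStructure(settings);

        // Manual mode: instances must be added explicitly
        foreach (var r in tracedRenderers)
            accelStruct.AddInstance(r);
    }

    void Update()
    {
        // Keep transforms in sync, then rebuild/update once a frame
        foreach (var r in tracedRenderers)
            accelStruct.UpdateInstanceTransform(r);
        accelStruct.Build(); // or CommandBuffer.BuildRayTracingAccelerationStructure(accelStruct)
    }
}
```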
8. Acceleration Structure
— New Renderer setting: RayTracingMode
— In the order of least to most expensive
– Off
– Static
– DynamicTransform
– DynamicGeometry (WIP)
9. Ray Tracing Shaders
— Raytrace shaders
– Raygeneration: First shader executed during dispatch
– Miss: Executes only if ray fails to intersect any object
— Surface shaders
– ClosestHit: Executes for the intersection nearest to ray origin
– AnyHit: Executes for every intersection
— Callable Shaders
11. Ray Tracing Shader API
— New shader type: RayTracingShader
– Extension is .raytrace
— CommandBuffer API additions
– CommandBuffer.SetRayTracingShaderPass
– CommandBuffer.SetRayTracingAccelerationStructure
– CommandBuffer.SetRayTracing*Param
– e.g. SetRayTracingMatrixParam, SetRayTracingIntParam, etc.
– CommandBuffer.DispatchRays
– Analogous bindings also available from the RayTracingShader class itself, for immediate execution
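Put together, a dispatch through the CommandBuffer API might be sketched like this. The method names are from the 2019.3 preview API; `rtShader`, the `"MyRaytracePass"` pass name, and the property names (`accelerationStructure`, `outputRT`) are assumptions for illustration.

```csharp
// Hypothetical dispatch sketch: pass name, then bindings, then DispatchRays.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Experimental.Rendering;

public static class RayDispatchExample
{
    public static void Dispatch(CommandBuffer cmd,
                                RayTracingShader rtShader, // imported from a .raytrace file
                                RayTracingAccelerationStructure accelStruct,
                                RenderTexture output)
    {
        // 1. Pass name must match the hit-shader pass name in the scene's surface shaders
        cmd.SetRayTracingShaderPass(rtShader, "MyRaytracePass");

        // 2. Bind the acceleration structure and any other properties
        cmd.SetRayTracingAccelerationStructure(rtShader, "accelerationStructure", accelStruct);
        cmd.SetRayTracingTextureParam(rtShader, "outputRT", output);

        // 3. Dispatch for a given width, height, and depth, naming the ray generation shader
        cmd.DispatchRays(rtShader, "FullResRayGen", (uint)output.width, (uint)output.height, 1);
    }
}
```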
12. Ray Tracing Shader Authoring
// Dispatch shader: Defines at minimum a ray generation shader, often also a miss shader.
// This is the shader that is dispatched, as one would a compute shader,
// for a given ray traced pass.
struct RayPayload { float4 color; uint2 launchIdx; }; // User-defined

[shader("raygeneration")]
void FullResRayGen()
{
    uint2 launchIdx = DispatchRaysIndex().xy;      // DXR intrinsic
    uint2 launchDim = DispatchRaysDimensions().xy; // DXR intrinsic
    float2 ndcCoords = (launchIdx / float2(launchDim.x - 1, launchDim.y - 1)) * 2 - float2(1, 1);
    float3 viewDirection = normalize(float3(ndcCoords.x * aspectRatio, ndcCoords.y, -1));
    RayDesc ray; // DXR defined
    ray.Origin = float3(camera_IV[0][3], camera_IV[1][3], camera_IV[2][3]);
    ray.Direction = normalize(mul((float3x3)camera_IV, viewDirection)); // rotate into world space
    ray.TMin = 0;
    ray.TMax = 1e20f;
    RayPayload payload;
    payload.color = float4(0, 0, 0, 0);
    payload.launchIdx = launchIdx;
    TraceRay(accelerationStructure, 0, 0xFF, 0, 1, 0, ray, payload); // DXR intrinsic
}
14. Surface Shader Authoring for Ray Tracing
// Material/Surface shader: Hit shaders should be defined as a pass in a shader used for a
// material in the scene.
Shader "FlatColor"
{
    SubShader { Pass { CGPROGRAM
        #pragma vertex vert
        #pragma fragment frag
        v2f vert (appdata v) { v2f o; o.vertex = UnityObjectToClipPos(v.vertex); return o; }
        fixed4 frag (v2f i) : SV_Target { return albedo; }
        ENDCG
    } }
    SubShader { Pass {
        Name "DefaultRTPass" // Pass name must match that specified by SetShaderPass()
        HLSLPROGRAM
        #pragma raytracing test
        struct AttributeData { float2 barycentrics; }; // User-defined
        [shader("closesthit")]
        void ClosestHitMain(inout RayPayload payload : SV_RayPayload,
                            AttributeData attribs : SV_IntersectionAttributes)
        { // A trivial hit shader that populates a bound RT with albedo of hit object
            payload.color = albedo;
            outputRT[payload.launchIdx] = albedo;
        }
        ENDHLSL
    } }
}
15. Requirements for Ray Tracing
— Windows 10 v1809+
— Unity 2019.3b1+
— DXR capable graphics card with latest drivers
Unity > Edit > Project Settings > Player:
— Select DX12 as Windows Graphics API
16. State of Ray Tracing in Unity
— Ray tracing API is pipeline-agnostic
– Officially supported only for HDRP
– HDRP is the only pipeline that currently implements features using ray tracing
– Users can still use the public C# API to build their own features!
— Unsupported in 19.3:
– Intersection shaders
– Animated meshes
– Procedural geometry
17. HDRP Ray Tracing Architecture
— Primary rays for most effects are computed from depth/normal buffers
— Cluster-based lighting added to HDRP for ray tracing
— Render graph here is simplified and omits many HDRP stages
20. Want to learn more?
Project:
Unity Lighting Fundamentals
Tutorial:
Post-Processing Effects (2018.x+)
Tutorial:
HDRP Material Properties
Subscribe now for a 3-month free trial at unity.com/learn-premium with code: UNITECPH19
*Learn Premium is included in Unity Plus and Pro licenses. Just sign in with your Unity ID.
Editor's Notes
Hello everyone. My name is Soner, and I'm a lead developer at Unity.
Today Kate, Dany, and I will be going over the details of ray tracing in Unity 2019.3. We aim to give you enough background to start your own ray tracing projects.
Our talk will start with me explaining the technical details of ray tracing in Unity. I will go over DXR, the acceleration structure, and the ray tracing shader API. I will continue with the integration into the High Definition Render Pipeline. Then Kate and Dany will take over to give you some examples and a demo of how to get started with ray tracing in Unity.
Let's start with explaining how ray tracing works in Unity 2019.3. Disclaimer: ray tracing is still in preview as of today, September 26th. The core should not change, but there may be slight changes in the future.
I’ll start with an introduction to DXR and its integration with Unity.
As many of you may already know, DXR is a DX12 feature designed to leverage the hardware acceleration some graphics cards provide for ray tracing.
For real time rendering, we generally do this as a complement to rasterization rather than a replacement.
For example when rendering reflections or refractions that capture objects obscured from camera view, like in the screenshots to the right
Or as a direct upgrade for compute shaders in algorithms that call for raycasting, (like participating media/translucents, caustics, ambient occlusion or dynamic diffuse global illumination)
To start with, I want to go over two main concepts:
(verbatim)
(verbatim)
The acceleration structure captures the scene we want to render in 3D space. This is the data structure against which we cast our rays, it is illustrated on the right here for the cornell box shown on the left.
We begin by defining one bottom-level acceleration structure for each unique mesh, in this case a sphere, a prism, and a plane.
Then in the next level up of this diagram, instances are constructed by pointing to an associated BLAS, including relevant transform and material information
Instances are then grouped together in a Bounding Volume Hierarchy by the driver, finally producing the top-level acceleration structure
This is exposed in Unity 2019.3 through the RayTracingAccelerationStructure class. It can be managed automatically or manually.
If managed automatically, all eligible GameObjects are added to the AS as they are added to the scene.
If managed manually, the user must explicitly add GameObjects to the AS using AddInstance and update transforms using UpdateInstanceTransform.
You can specify layer masks on creation to filter which GameObjects may be added to your acceleration structure.
You need to call BuildRayTracingAccelerationStructure once a frame.
This command checks for updates to transforms in automatically managed structures, then either rebuilds or updates the AS on the GPU as needed.
Aside from layer masks, GameObjects are also filtered based on RayTracingMode, a new setting that’s been added to the Renderer
RayTracingMode lets the acceleration structure know how frequently an instance will be updated, and in what way
If RayTracingMode is Off, it's trivially cheap, since the object won't be ray traced.
If RayTracingMode is Static, the acceleration structure expects that it will never be updated in any way.
For DynamicTransform, the acceleration structure will allow updates of the renderer's transform only.
While for DynamicGeometry, both the renderer's transform and its vertex and index buffers may be updated.
Currently only Static and DynamicTransform are supported for ray tracing
DynamicGeometry is next on our list!
Next I want to go over ray tracing shaders, and I find it helpful to group them into two categories based on their use in Unity: raytrace shaders and surface shaders.
Raytrace shaders are generally defined in a .raytrace file on which you then call Dispatch, much like compute shaders
(verbatim)
(verbatim)
Surface shaders, on the other hand, are defined in the shaders used in materials applied to objects in the scene
(verbatim)x3
Note that a material with a closesthit shader may also have an anyhit shader for the same pass, since these hit shaders are executed at different points in the control flow.
Callable shaders also exist in DXR and are supported in Unity 2019.3. I won't go into detail on them here, but they're basically function calls.
So to illustrate these shaders in action,
We start with the RayGen shader, which casts all these rays into the acceleration structure
In this case the rays originate from the camera, but you can cast them however you want.
One of those rays doesn’t hit anything in the scene, and so executes the miss shader
Next we have a ray that intersects two objects, both with a ClosestHit shader. In this case, even if the hit on the cylinder gets registered first, only the sphere executes its ClosestHit shader, since ClosestHit shaders don’t execute until acceleration structure traversal is complete and the actual closest hit has been found.
In contrast, we have this ray that intersects both a ClosestHit and an AnyHit shader. Now if the prism gets intersection tested first, it will execute its AnyHit shader before acceleration structure traversal is complete. This means that when we later find the ClosestHit shader on the occluding cylinder, we end up discarding the results of the AnyHit shader.
So AnyHit shaders can be more expensive than ClosestHit shaders, but the reason you’d like to use them is for situations like these two transparent objects down here, where you would want to add surface information to the ray payload from every object hit for some simple alpha blend transparency.
AnyHit shaders can be used at lower cost by calling AcceptHitAndEndSearch, for example when testing for visibility (shadows, AO).
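That visibility pattern could be sketched roughly as follows, in HLSL. This is a hedged illustration, not code from the talk: the `ShadowPayload` struct, `AttributeData`, and shader name are assumptions.

```hlsl
// Hypothetical anyhit shader for a visibility/shadow ray: any hit at all
// means the light is occluded, so traversal can stop immediately.
struct ShadowPayload { float visibility; }; // user-defined; assumed here

[shader("anyhit")]
void ShadowAnyHit(inout ShadowPayload payload : SV_RayPayload,
                  AttributeData attribs : SV_IntersectionAttributes)
{
    payload.visibility = 0; // occluded
    AcceptHitAndEndSearch(); // DXR intrinsic: accept this hit and end the search early
}
```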
Technically an intersection shader is executed for every geometry; the built-in intersection shader is for triangle intersection.
Here’s how that fits into Unity. If you save your raygeneration and miss shaders in a .raytrace file, it will get imported as the new RayTracingShader class we have in 2019.3.
Then, the order of operations for dispatching a ray tracing shader goes as follows:
Specify the pass name you’re using for this dispatch. This needs to match the pass name used for your hit shaders in your surface shaders!
Bind the acceleration structure you’re tracing against, and any other properties like matrices, textures, etc.
Finally, call DispatchRays for a given dispatch width, height, and depth.
(verbatim)
So here’s a basic example of a ray generation shader. In red we have DXR-specific features, and in blue we have properties that are declared and bound elsewhere.
Firstly you have to declare what kind of ray tracing shader you’re authoring
You need to populate a description of the ray you are casting, and then you call TraceRay() to cast that ray into the acceleration structure.
This ray generation shader assumes the dispatch dimensions match the screen size, and from that it calculates the direction of a ray originating from the camera through each pixel on the screen.
// The rays cast in this ray generation shader are a little off (for brevity); you'd want to fudge the ray direction to shoot through the center of the pixel rather than the edge.
In the same .raytrace shader, you usually also see a miss shader.
Notice that the shader type is declared as “miss”.
The SV_RayPayload semantic is available to all shaders that are executed as a result of a TraceRay() call. RayPayload is entirely user-defined; the larger the payload, the more expensive it is. Try to keep the payload as small as possible.
This miss shader simply samples a sky cubemap based on the direction of the ray.
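Such a miss shader might look like the sketch below. The cubemap and sampler bindings are assumptions; the deck does not show this listing.

```hlsl
// Hypothetical miss shader: sample a sky cubemap along the ray direction.
TextureCube<float4> skyTexture;   // assumed binding
SamplerState sampler_skyTexture;  // assumed binding

[shader("miss")]
void SkyMiss(inout RayPayload payload : SV_RayPayload)
{
    float3 dir = WorldRayDirection(); // DXR intrinsic: direction of the current ray
    payload.color = skyTexture.SampleLevel(sampler_skyTexture, dir, 0);
}
```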
More detailed information on DXR callbacks and shader authoring may be found at the link to slides from Chris Wyman’s intro to DXR talk.
As for surface shaders, here’s an example of a Unity shader that has both a pass for rasterization and a pass for ray tracing.
You’ll notice that the ray tracing shader must be written in HLSL,
must include an additional ray tracing pragma,
and must include a pass name, which must be the pass name set to the raytrace shader prior to dispatch.
The SV_IntersectionAttributes semantic is available to hit shaders. AttributeData is also user defined, but usually it includes barycentrics, which define the position in a given triangle the ray intersects (in terms of distance from each vertex)
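Since DXR reports only two of the three barycentric weights, a hit shader typically reconstructs the third before interpolating vertex attributes. A small sketch (helper names are my own, not from the talk):

```hlsl
// Hypothetical helpers: recover full barycentric weights from the two
// values DXR reports, then interpolate a per-vertex attribute.
float3 FullBarycentrics(float2 b)
{
    return float3(1 - b.x - b.y, b.x, b.y);
}

float3 InterpolateNormal(float3 n0, float3 n1, float3 n2, float2 barycentrics)
{
    float3 w = FullBarycentrics(barycentrics);
    return normalize(n0 * w.x + n1 * w.y + n2 * w.z);
}
```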
This closest hit shader does the same thing as the rasterization shader, simply returning the albedo set on the material.
Now that you have some background, here’s what you need to get started with Unity Ray tracing.
(win10 verbatim)
(unity verbatim)
(So our implementation technically isn't vendor-specific, because DXR isn't technically vendor-specific, but) you will need a graphics card that supports DXR for now.
As DXR is a DX12 feature, don't forget to set your graphics API in the Player settings window to DX12.
Ok, everything’s set up, and you’re ready to trace rays! The Ray tracing API works on any graphics pipeline, but...
We only officially support it for HDRP, which takes care of everything for you to provide state of the art ray-traced effects in a handy UI inspector
If you’re feeling adventurous, you’re still welcome to author your own ray tracing features using the API described before.
I also want to call out a few features that aren’t yet supported for ray tracing in Unity:
(verbatim)
Here is how it fits into the High Definition Rendering Pipeline
Shown in green here we have the ray traced effects supported by HDRP, and roughly where they fit into the HDRP render graph.
Note that this graph is condensed and many unrelated features are omitted
For these six ray tracing passes, we skip the primary eye raycast, instead we cast rays using the depth and normal GBuffers
An exception to this is transparents
transparents are handled after opaques have already been fully rendered, but before postprocessing
Acceleration structures are built and updated in immediate mode, asynchronously with the render graph shown here which is serialized on a rendering command buffer.
Now let's take a look at these effects more closely. Kate and Dany, the stage is yours.
If you want to continue learning, check out our learning platform at learn.unity.com.
You’ll find free learning resources, and you can sign up for a 30-day trial of Learn Premium to access live sessions with Unity experts, and the best learning resources updated for the latest Unity release.