2. The good old days
- Send vertex data to the GPU, and specify settings such as colour and fog.
- Many limitations, and little low-level control
- Some APIs had “high level” concepts, such as sprites
PlayStation 1 “PSYQ” SDK
OpenGL (fixed function pipeline)
3. Programmable shading pipeline
- More control through the use of shaders
- Vertex shader: modify vertex positions
- Fragment shader: modify output colour
- Newer features: Tessellation shaders and geometry shaders
- Allows you to add screen-space effects by first rendering scene to texture
OpenGL 2.0 shading pipeline
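As a sketch of the two main programmable stages, a minimal Unity-style Cg vertex/fragment pair might look like this (the uniform name `_Tint` is illustrative, not from the slides):

```hlsl
#include "UnityCG.cginc"

struct appdata {
    float4 vertex : POSITION;   // object-space position
};

struct v2f {
    float4 pos : SV_POSITION;   // clip-space position
};

fixed4 _Tint;                   // illustrative per-material uniform

// Vertex shader: modifies/transforms vertex positions
v2f vert(appdata v) {
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    return o;
}

// Fragment shader: decides the output colour
fixed4 frag(v2f i) : SV_Target {
    return _Tint;
}
```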
5. Vertex
- Wikipedia: “a point where two or more curves, lines, or edges meet”
- Usually: A point (and its position, normal, texCoord, etc.) in a triangle
● Position
● Normal
● Texture coordinate
● Tangent / Bitangent
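In HLSL/Cg these per-vertex attributes are typically grouped in a vertex input struct; a hedged sketch:

```hlsl
// Sketch of a vertex input struct carrying the attributes above.
struct appdata {
    float4 vertex  : POSITION;   // position
    float3 normal  : NORMAL;     // normal
    float2 uv      : TEXCOORD0;  // texture coordinate
    float4 tangent : TANGENT;    // tangent (w commonly stores the bitangent sign)
};
```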
6. Vertex buffer
- Buffered vertex data on the GPU
- Vertex data is created on the CPU and then uploaded to the video device
Vertex layout
- Order of the vertex attributes/components (position, normal, texcoord)
- Each attribute can have its own buffer (slow) or share one buffer
- One buffer per attribute: (VVVV) (NNNN) (CCCC)
- Blocks in a batch: (VVVVNNNNCCCC)
- Interleaved: (VNCVNCVNCVNC) (“stride” = byte offset between consecutive vertices in the buffer)
7. Uniforms / shader constants
- OpenGL: “Uniform” ≈ DirectX: “Shader constant”
- Per-material data sent to shaders
- Vertex data is per vertex - uniforms are per material
- Examples: material properties (colour, smoothness, specular
reflectiveness), light sources, cross-section plane
- In Unity, these are called “properties”, and you can set their values using
Material.SetFloat(...) / Material.SetInt(...), etc.
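In a Unity shader, such per-material data is declared in the ShaderLab Properties block and mirrored as uniforms in the CGPROGRAM. A sketch with illustrative property names (`_Color`, `_Smoothness`):

```hlsl
// ShaderLab Properties block (editable in the material inspector):
// Properties {
//     _Color ("Colour", Color) = (1,1,1,1)
//     _Smoothness ("Smoothness", Range(0,1)) = 0.5
// }

// Matching uniforms inside the CGPROGRAM block, settable from C# with
// material.SetColor("_Color", ...) / material.SetFloat("_Smoothness", ...):
fixed4 _Color;
float _Smoothness;
```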
8. Rendering
Create vertex data (array of vertices)
Create vertex buffer (send vertices to GPU)
Bind vertex buffer and index buffer, and draw
From Ming3D: https://github.com/mlavik1/Ming3D
9. Problems
- Many rendering APIs: OpenGL, DirectX, Vulkan, GNM (PS4), Metal
- Each rendering API has its own shader language
- GLSL (OpenGL), HLSL (DirectX)
- Need to support several rendering APIs and shader languages, and in
some cases several versions of them
Solution: make your own shader language and convert it to GLSL, HLSL, etc.
Unity has its own shader language (based on Nvidia’s Cg)
16. Shader semantic
- MSDN: “A semantic is a string attached to a shader input or output that
conveys information about the intended use of a parameter”
- Unity needs to know which attributes in the vertex layout are position,
normal, etc. (so it can buffer your mesh correctly)
- Some rendering APIs require semantics on all input/output data
- Vertex input/output: POSITION, TEXCOORD0, TEXCOORD1, NORMAL,
COLOR, TANGENT
- Fragment shader output: SV_Target
- Multiple render targets: SV_Target0, SV_Target1, ...
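A sketch of how semantics annotate shader inputs and outputs, including a fragment shader writing to multiple render targets (struct and field names are illustrative):

```hlsl
struct v2f {
    float4 pos : SV_POSITION;  // clip-space position (required output)
    float2 uv  : TEXCOORD0;    // interpolated texture coordinate
    float3 nrm : NORMAL;       // interpolated normal
};

// Single render target: the return value carries the SV_Target semantic
fixed4 frag(v2f i) : SV_Target {
    return fixed4(i.uv, 0, 1);
}

// Multiple render targets: one SV_TargetN semantic per output
struct FragOutput {
    fixed4 albedo : SV_Target0;
    fixed4 normal : SV_Target1;
};
```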
18. Including
- You can split a shader into several files by putting common functions in a
.cginc file
- Unity’s standard shader functions are in:
- UnityStandardCore.cginc
- UnityStandardCoreForward.cginc
- UnityStandardShadow.cginc
- UnityStandardMeta.cginc
Unity shader includes location:
Program Files\Unity\Editor\Data\CGIncludes
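Including shared code then looks like this sketch (the file name `MyCommon.cginc` is illustrative):

```hlsl
// MyCommon.cginc -- common helper functions shared between shaders
// fixed4 ApplyTint(fixed4 colour) { ... }

// In each shader's CGPROGRAM block:
#include "UnityCG.cginc"   // Unity's built-in includes
#include "MyCommon.cginc"  // your own common functions
```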
19. Multi-compile shader program variants
- If you want to enable/disable a set of features in a shader without passing a
boolean uniform and checking its value, you can use multi-compile program
variants
1. Add this after CGPROGRAM: #pragma multi_compile __ YOUR_DEFINE
2. Use #if YOUR_DEFINE to conditionally enable/disable the feature
3. Enable feature with: material.EnableKeyword("YOUR_DEFINE");
4. Disable feature with: material.DisableKeyword("YOUR_DEFINE");
This will create two versions of the shader: one where the “YOUR_DEFINE”
preprocessor definition is defined, and one where it is not.
The #if-check is done at compile time (or when the shader is converted)
Use shader_feature instead of multi_compile for keywords that will only be set on the material (variants no material uses can then be stripped from builds)
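The shader side of steps 1–2 might look like this sketch (the keyword name and colours are illustrative):

```hlsl
#pragma multi_compile __ YOUR_DEFINE

fixed4 frag() : SV_Target {
#if YOUR_DEFINE
    // variant compiled with the keyword enabled
    return fixed4(1, 0, 0, 1);
#else
    // variant compiled with the keyword disabled
    return fixed4(0, 0, 0, 1);
#endif
}
```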
21. Debugging
- Unity has a RenderDoc integration
- RenderDoc allows you to capture a frame, see all rendering API calls,
visualise the input/output of each shader pass, visualise textures, inspect
material properties (uniforms / shader constants) and much more.
- See: https://docs.unity3d.com/Manual/RenderDocIntegration.html
- Alternatively, use the Visual Studio shader debugger, which allows you to
add breakpoints, step through code and more:
https://docs.unity3d.com/Manual/SL-DebuggingD3D11ShadersWithVS.html
22. 1. Download RenderDoc: https://renderdoc.org/builds
2. Include #pragma enable_d3d11_debug_symbols in your shader’s
CGPROGRAM block, if you want to see property names and more.
3. Right-click on “Game” tab and load RenderDoc
4. While in-game, capture a frame