Paris Master Class 2011 - 07 Dynamic Global Illumination


  1. Dynamic Global Illumination – Wolfgang Engel, Confetti Special Effects Inc., Carlsbad – Paris Master Class
  2. Agenda
     • Requirements for Real-Time GI
     • Ambient Cubes
     • Diffuse Cube Mapping
     • Screen-Space Ambient Occlusion
     • Screen-Space Global Illumination
     • Reflective Shadow Maps
     • Splatting Indirect Illumination (SII)
  3. Agenda
  4. Agenda
  5. Agenda
  6. Requirements for Real-Time GI
     • Pre-calculated lighting disadvantages:
       – hard to mimic a 24-hour cycle
       – storing the light or radiosity maps on disk or even on DVD / Blu-ray requires a lot of memory
       – streaming the light or radiosity maps from disk or hard drive to the GPU consumes valuable memory bandwidth
       – geometry with light maps or radiosity maps is no longer destructible (this is a reason to avoid any solution with pre-computed maps)
       – while the environment is lit nicely, it is hard to light characters consistently with this environment
  7. Requirements for Real-Time GI
     • Next-gen game requirements:
       – as little pre-calculation as possible
       – as little memory consumption as possible -> as much as possible is calculated on the fly
       – 24-hour sun movement
       – destructible geometry
  8. Ambient Cube
     • Ambient cube (as used in Half-Life 2, http://www2.ati.com/developer/gdc/D3DTutorial10_Half-Life2_Shading.pdf)
       -> interpolate between six color values (see the sketch below)
     • The six colors are stored spatially in the level data:
       – they represent the ambient light flowing through that volume of space
       – the application looks this up for each model to determine the six ambient colors to use for a given model
     • HL2: local lights that aren’t important enough to go directly into the vertex shader are also added to the ambient cube
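     A minimal HLSL sketch of the six-color interpolation described on slide 8; the array name g_AmbientCube and the face ordering (+X, -X, +Y, -Y, +Z, -Z) are assumptions for illustration, not the Half-Life 2 implementation.

       // Six ambient colors for the volume the model sits in; +X, -X, +Y, -Y, +Z, -Z ordering assumed.
       float3 g_AmbientCube[ 6 ];

       // Blend the six face colors using the squared components of the world-space normal
       // as weights; the weights sum to 1 for a unit-length normal.
       float3 EvaluateAmbientCube( float3 worldNormal )
       {
          float3 nSquared = worldNormal * worldNormal;
          // pick the -X / -Y / -Z entries when the corresponding normal component is negative
          int3 isNegative = ( worldNormal < 0.0f );
          return nSquared.x * g_AmbientCube[ isNegative.x ]
               + nSquared.y * g_AmbientCube[ isNegative.y + 2 ]
               + nSquared.z * g_AmbientCube[ isNegative.z + 4 ];
       }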
  9. Diffuse Cube Mapping
     • [Brennan]
     • Steps (see the sketch below):
       – render the environment into a cube map, or use a static map
       – blur it
       – fetch it by indexing with the vertex normal
     • The vertex normal is used to index the cube map with the intention that it returns the total amount of light scattered from this direction
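     A minimal HLSL sketch of the cube-map lookup in the last step above; the resource names g_DiffuseCubeMap and g_CubeSampler are assumptions.

       TextureCube g_DiffuseCubeMap;    // the pre-blurred environment cube map (assumed name)
       SamplerState g_CubeSampler;

       // Index the blurred cube map with the world-space normal; the fetched value
       // approximates the total amount of light scattered from that direction.
       float3 DiffuseCubeLookup( float3 worldNormal )
       {
          return g_DiffuseCubeMap.Sample( g_CubeSampler, normalize( worldNormal ) ).rgb;
       }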
  10. Diffuse Cube Mapping (image: original vs. blurred cube map)
  11. Screen-Space Ambient Occlusion
  12. Screen-Space Ambient Occlusion
     • See [Kajalin] for the original approach
     • The original approach requires only the depth buffer
     • Uses a sphere-like sampling kernel
  13. Screen-Space Ambient Occlusion
     • Computes the amount of solid geometry around a point
     • The ratio between solid and empty space approximates the amount of occlusion:
       – P1 is located on a flat plane -> 0.5
       – P2 is located in a corner -> 0.75
       – P3 is located on an edge -> 0.25
  14. Screen-Space Ambient Occlusion
     • A very simplified version of the source code comes down to:

       // very simplified version …
       float fOcclusion = 0;
       float fPixelDepth = Depth.Sample( DepthSampler, IN.tex.xy ).r;
       for ( int i = 0; i < 8; i++ )
       {
          // vRotatedKernel.xy = IN.tex.xy plus the i-th kernel offset,
          // rotated per pixel by a noise value (computation omitted here)
          float fSampleDepth = Depth.Sample( DepthSampler, vRotatedKernel.xy ).r;
          // accumulate the depth difference as an occlusion estimate
          float fDelta = max( fSampleDepth - fPixelDepth, 0 );
          fOcclusion += fDelta;
       }
  15. Screen-Space Ambient Occlusion
     • Designing an SSAO kernel (see the sketch below):
       – distance attenuation -> not necessary if the density of the samples is higher closer to the center of the kernel
       – noise is necessary because even 16 taps spread out are not enough
       – running on a quarter-sized render target makes large kernels like 16 taps possible
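     A minimal HLSL sketch of the per-pixel noise rotation mentioned on slide 15; the kernel array, the noise texture, the tiling factor and all names are assumptions.

       static const int KERNEL_SIZE = 16;
       float2 g_Kernel[ KERNEL_SIZE ];   // precomputed offsets, denser towards the center of the kernel
       Texture2D NoiseTex;               // small tiling texture holding a random rotation as (cos, sin)
       SamplerState PointSampler;

       // Rotate the i-th kernel offset by a per-pixel random angle so that the
       // undersampling shows up as high-frequency noise instead of banding.
       float2 RotatedKernelOffset( float2 uv, int i )
       {
          float2 cs = NoiseTex.Sample( PointSampler, uv * 64.0f ).xy * 2.0f - 1.0f;
          float2 k = g_Kernel[ i ];
          return float2( k.x * cs.x - k.y * cs.y, k.x * cs.y + k.y * cs.x );
       }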
  16. Screen-Space Ambient Occlusion
  17. Screen-Space Global Illumination
  18. Screen-Space Global Illumination
     • Every pixel in the G-Buffer is considered a secondary light source -> point lights
     • The normal is considered the light direction
     • Radiant intensity emitted at point p in direction ω:
       I_p(ω) = Φ_p * max(0, n_p · ω)
       Φ_p – the flux defines the brightness of the pixel light
       n_p – normal at point p (spatial emission)
  19. Screen-Space Global Illumination
     • Irradiance at surface point x with normal n due to pixel light p is thus:
       E_p(x, n) = Φ_p * max(0, n_p · (x - x_p)) * max(0, n · (x_p - x)) / ||x - x_p||^4
       Φ_p – the flux defines the brightness of the pixel light
       n_p – normal at point p (spatial emission)
     • Evaluating indirect irradiance at a surface point x with normal n: sum E_p over all pixel lights
     (diagram: PTarget, PSampling, V, NormalSampling, NormalTarget)
  20. Screen-Space Global Illumination
     • PSampling contribution to PTarget (outgoing cosine):
       ISampling = NSampling · (-V)
     • PTarget contribution (incoming cosine):
       ITarget = NTarget · V
     • Attenuation: A = 1.0 / D²
     • Result (see the sketch below):
       Result = ISampling * ITarget * A * ColorScreenSpaceRT
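     A minimal HLSL sketch of the per-sample term on slide 20, with V pointing from the shaded (target) pixel to the sampling pixel as in the formulas above; all names are placeholders, and the small bias in the attenuation is an assumption to avoid division by zero.

       // Indirect light gathered from one G-Buffer sample; names are placeholders.
       float3 SSGISampleContribution( float3 targetPos, float3 targetNormal,
                                      float3 samplePos, float3 sampleNormal,
                                      float3 sampleColor )
       {
          float3 V  = samplePos - targetPos;          // V points from the target pixel to the sampling pixel
          float  D2 = dot( V, V );                    // squared distance
          V *= rsqrt( max( D2, 1e-4f ) );             // normalize

          float iSampling = saturate( dot( sampleNormal, -V ) );   // outgoing cosine: NSampling . -V
          float iTarget   = saturate( dot( targetNormal,  V ) );   // incoming cosine: NTarget . V
          float A = 1.0f / max( D2, 1e-4f );                       // distance attenuation 1 / D^2

          return iSampling * iTarget * A * sampleColor;
       }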
  21. Screen-Space Global Illumination
     • Issues:
       – all the issues from SSAO
       – does not consider occlusion for the indirect light sources -> color changes based on the viewing angle
       -> different with reflective shadow maps …
  22. Reflective Shadow Maps
     • Extended shadow map [Dachsbacher] – a Multiple Render Target (MRT) that holds:
       – depth buffer
       – normal
       – world-space position
       – flux
     • Each pixel in the MRT is considered a secondary light source; these local point lights are called pixel lights
     • Radiant intensity emitted at point p in direction ω:
       I_p(ω) = Φ_p * max(0, n_p · ω)
       Φ_p – the flux defines the brightness of the pixel light
       n_p – normal at point p (spatial emission)
  23. Reflective Shadow Maps
     • Irradiance at surface point x with normal n due to pixel light p is thus:
       E_p(x, n) = Φ_p * max(0, n_p · (x - x_p)) * max(0, n · (x_p - x)) / ||x - x_p||^4
       Φ_p – the flux defines the brightness of the pixel light
       n_p – normal at point p (spatial emission)
     • Evaluating indirect irradiance at a surface point x with normal n: sum the contributions E_p of all pixel lights
     • Does not consider occlusion for the indirect light sources -> color changes based on the viewing angle
  24. Reflective Shadow Maps
     • Challenge: a typical RSM is 512x512 -> evaluating all of those pixel lights is too expensive
     • Importance sampling -> evaluate about 400 samples
     • How to pick the sampling points (see the sketch below):
       1. uniformly distributed random numbers are applied to polar coordinates
       2. then the varying sample density is compensated for by weighting the resulting samples
       ξ1 – uniformly distributed random number
       ξ2 – uniformly distributed random number
     • Precompute the sampling pattern and re-use it for all indirect light computations
     • Still too expensive -> screen-space interpolation
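     A minimal HLSL sketch of the polar-coordinate sampling pattern described on slide 24, following [Dachsbacher]; the ξ1² weighting and the offset formula come from that paper, everything else (names, structure, rMax being given in shadow-map texture space) is an assumption.

       #define PI 3.14159265f

       struct RSMSample
       {
          float2 uvOffset;   // offset from the pixel's shadow-map coordinate
          float  weight;     // compensates for the higher sample density near the center
       };

       // xi1, xi2 are uniformly distributed random numbers in [0,1),
       // rMax is the maximum sampling radius in shadow-map texture space.
       RSMSample MakeRSMSample( float xi1, float xi2, float rMax )
       {
          RSMSample s;
          s.uvOffset = rMax * xi1 * float2( sin( 2.0f * PI * xi2 ), cos( 2.0f * PI * xi2 ) );
          s.weight   = xi1 * xi1;   // density compensation
          return s;
       }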
  25. Reflective Shadow Maps
     • Screen-space interpolation:
       1. pass – compute the indirect illumination for a low-resolution image in screen space
       2. pass – render at full resolution and check whether we can interpolate from the four surrounding low-res samples
     • A low-res sample is regarded as suitable for interpolation (see the sketch below) if
       – the sample’s normal is similar to the pixel’s normal,
       – and if its world-space location is close to the pixel’s location
     • Each sample’s contribution is weighted by the factors used for bi-linear interpolation
       – if not all of the four samples are used -> normalize
       – if three or four samples are considered suitable -> interpolate between those
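     A minimal HLSL sketch of the suitability test on slide 25; the threshold values and all names are assumptions and scene dependent.

       // A low-res sample may be used for interpolation if its normal is similar to the
       // pixel's normal and its world-space position is close to the pixel's position.
       bool IsSuitableForInterpolation( float3 pixelNormal, float3 pixelPos,
                                        float3 sampleNormal, float3 samplePos )
       {
          const float normalThreshold   = 0.9f;   // cosine of the maximum allowed normal deviation
          const float distanceThreshold = 0.5f;   // maximum world-space distance (scene dependent)

          bool normalsSimilar = dot( pixelNormal, sampleNormal ) > normalThreshold;
          bool positionsClose = length( pixelPos - samplePos ) < distanceThreshold;
          return normalsSimilar && positionsClose;
       }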
  26. Reflective Shadow Maps
     • How to pick important pixel lights:
       1. not visible in the shadow map: x is not directly illuminated -> not visible in the shadow map
       2. normals point away: x-1 and x-2 are relatively close, but since their normals point away from x they do not contribute
       3. same floor: x1 is very close but lies on the same floor -> won’t contribute
       4. 1 – 3 not applicable: x2 will contribute
       5. what is important: the distance between x and a pixel light x_p in the shadow map is a reasonable approximation for their distance in world space -> if the depth values with respect to the light source differ significantly, the world-space distance is much bigger and we thus overestimate the influence -> important indirect lights will always be close, and these must also be close in the shadow map
  27. Splatting Indirect Illumination
     • Built on RSM
     • Instead of iterating over all image pixels and gathering indirect light from the RSM:
       -> selects a subset of RSM pixels == pixel lights and distributes their light to the image pixels, using a splat in screen space
       -> the deferred lighting idea revisited
  28. Splatting Indirect Illumination
     • The Reflective Shadow Map stores:
       – flux – N.L * color
       – normal – just the world-space normal
       – position – offset in the negative direction of the normal
       – gloss value – stored instead of the specular component
  29. Splatting Indirect Illumination
     • How to create a Reflective Shadow Map:

       RESULT_RSM psCreateRSM( FRAGMENT_RSM fragment )
       {
          RESULT_RSM result;

          float3 worldSpacePos = fragment.wsPos.xyz;
          float3 lightVector = normalize( lightPosition - worldSpacePos );
          float3 normal = normalize( fragment.normal );

          // compute flux
          float4 flux = materialColor * saturate( dot( normal.xyz, lightVector ) );

          // we squeeze everything into two float quadruples, as vertex texture fetches are expensive!
          float squeezedFlux = encodeFlux( flux.xyz );

          float phongExponent = 1.0f;
          float3 mainLightDirection = normal;
          if ( dot( mainLightDirection, lightVector ) < 0.0f )
             normal *= -1.0f;

          result.color[ 0 ] = float4( mainLightDirection, squeezedFlux );

          // we move the position a little bit back in the direction of the normal;
          // this is done to avoid rendering problems along common boundaries of walls
          // -> the illumination integral has a singularity there
          result.color[ 1 ] = float4( worldSpacePos - normal * 0.2f, phongExponent );

          return result;
       }
  30. Splatting Indirect Illumination
     • How do we fetch the RSM?
     • Splat position: pre-calculated distribution of sample points in light space -> 3D Poisson disk distribution
     (image: left – sample points in light space / right – splats in the main view)
  31. Splatting Indirect Illumination
     • Pseudo code:
       1. fetch the pre-calculated distribution of sample points in light space from a texture
       2. calculate the size of the ellipsoid bounding volume in screen space (see the article for code)
  32. Splatting Indirect Illumination
       3. reduce the size of the splat based on
          – the size of the bounding volume
          – the distance between the camera and the pixel light
       (a rough sketch of steps 1 to 3 follows below)
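     A very rough HLSL sketch of steps 1 to 3 above, not the code from the article: one vertex per splat fetches its sample point and pixel-light position, and a splat size is derived from a world-space radius and the camera distance. Every resource name, the one-vertex-per-splat setup and the sizing heuristic are assumptions; a real implementation would expand the result into a screen-space quad or ellipsoid proxy per pixel light.

       struct SPLAT_VS_OUTPUT
       {
          float4 position  : SV_Position;
          float  splatSize : TEXCOORD0;   // consumed when expanding the splat geometry
       };

       Texture2D SamplePointsTex;        // pre-calculated sample points in light space (assumed name)
       Texture2D RSMPositionTex;         // RSM render target holding world-space positions (assumed name)
       SamplerState PointSampler;

       float3   cameraPosition;
       float    worldSpaceSplatRadius;   // radius of the ellipsoid bounding volume in world space
       float4x4 viewProjection;

       SPLAT_VS_OUTPUT vsSplat( uint splatIndex : SV_VertexID )
       {
          SPLAT_VS_OUTPUT output;

          // 1. fetch the pre-calculated sample point and the pixel light it selects from the RSM
          float2 rsmUV    = SamplePointsTex.Load( int3( splatIndex, 0, 0 ) ).xy;
          float3 lightPos = RSMPositionTex.SampleLevel( PointSampler, rsmUV, 0 ).xyz;

          // 2./3. derive a splat size: start from the world-space radius of the bounding
          //       volume and shrink it with the distance between camera and pixel light
          float distToCamera = length( cameraPosition - lightPos );
          float splatSize    = worldSpaceSplatRadius / max( distToCamera, 1.0f );

          output.position  = mul( float4( lightPos, 1.0f ), viewProjection );
          output.splatSize = splatSize;
          return output;
       }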
  33. Splatting Indirect Illumination
     • Going back to the original Reflective Shadow Map algorithm:
     • Irradiance at surface point x with normal n due to pixel light p is thus:
       E_p(x, n) = Φ_p * max(0, n_p · (x - x_p)) * max(0, n · (x_p - x)) / ||x - x_p||^4
       Φ_p – the flux defines the brightness of the pixel light
       n_p – normal at point p (spatial emission)
     • Evaluating indirect irradiance at a surface point x with normal n
     (diagram: PTarget, PSampling, V, NormalSampling, NormalTarget)
  34. Splatting Indirect Illumination
     • PSampling contribution to PTarget (outgoing cosine):
       ISampling = NSampling · (-V)
     • PTarget contribution (incoming cosine):
       ITarget = NTarget · V
     • Attenuation: A = 1.0 / (D² * Scale + Bias)
     • Result:
       Result = ISampling * ITarget * A * ColorScreenSpaceRT
  35. Splatting Indirect Illumination
     • Glossiness based on the outgoing cosine:
       float Gloss = pow( saturate( dot( NSampling, -V ) ), phongExp );
  36. Splatting Indirect Illumination
     • Let’s single-step through the splatting pixel shader:

       float4 psSI( FRAGMENT_SI fragment ) : COLOR
       {
          float4 result;

          // get surface info for this pixel
          float4 wsPos = tex2Dproj( DSSample0, fragment.pos2D );
          float4 normal = tex2Dproj( DSSample1, fragment.pos2D ) * 2.0f - 1.0f;

          // decode world space position (8 bit render target!)
          wsPos = ( wsPos - 0.5f ) * WSSIZE;

          // compute lighting
          float l2, lR, cosThetaI, cosThetaJ, Fij, phongExp;

          // lightPos comes from the RSM read in the vertex shader
          // lightPos – pixel light, wsPos – fragment position
          float3 R = fragment.lightPos - wsPos.xyz;   // R = vector from fragment to pixel light
          l2 = dot( R, R );                           // squared length of R (needed again later)
          R *= rsqrt( l2 );                           // normalize R

          // distance attenuation (there's a global scene scaling factor "...WS_SIZE...")
          lR = ( 1.0f / ( distBias + l2 * INV_WS_SIZE2_PI * 2 ) );
          // (continued on the next slide)
  37. Splatting Indirect Illumination

          // lightDir comes from the RSM read in the vertex shader; this is the world space
          // normal stored in the RSM of the pixel light == direction of the pixel light
          cosThetaI = saturate( dot( fragment.lightDir, -R ) );   // outgoing cosine
          phongExp = fragment.lightDir.w;

          // with a phong like energy attenuation and widening of the high intensity region
          if ( phongExp > 1.0f )
             cosThetaI = pow( cosThetaI, phongExp * l2 );

          // compare world-space normal at the fragment that gets rendered with R
          cosThetaJ = saturate( dot( normal.xyz, R ) );            // incoming cosine

          Fij = cosThetaI * cosThetaJ * lR;                        // putting everything together

       #ifdef SMOOTH_FADEOUT
          // screen-space position of the center of the splat
          float3 t1 = fragment.center2D.xyz / fragment.center2D.w;
          // screen-space position of the pixel we are currently shading
          float3 t2 = fragment.pos2D.xyz / fragment.pos2D.w;
          // lightFlux.w holds a fade-out value based on the distance between the camera and the
          // pixel light position; xy is based on screen space and z is based on the distance
          // between the camera and the pixel position
          float fadeOutFactor = saturate( 2 - 6.667 * length( t1.xy - t2.xy ) / fragment.lightFlux.w );
          Fij *= fadeOutFactor;
       #endif

          result = fragment.lightFlux * Fij;   // transfer energy!
          return result;
       }

     (diagram: green arrow – R, orange arrow – fragment normal, blue arrow – pixel light direction)
  38. Splatting Indirect Illumination
     • SII importance sampling -> doesn’t seem to work on modern hardware -> ignore it for now
  39. Conclusion
     • Screen-Space Global Illumination
       – allows generating the impression of color bleeding with the lowest possible amount of memory
       – does not require any transform
       – “fits” into a PostFX pipeline
       – has all the issues of SSAO + false colors
     • Reflective Shadow Maps + SII
       – higher quality than SSGI
       – “fits” into a Deferred Lighting engine and any Cascaded Shadow Map approach
       – requires one or more additional G-Buffers from the point of view of the light
       – more expensive
  40. References
     • [Brennan] Chris Brennan, “Diffuse Cube Mapping”, ShaderX, pp. 287–289
     • [Dachsbacher] Carsten Dachsbacher, Marc Stamminger, “Reflective Shadow Maps”, http://www.vis.uni-stuttgart.de/~dachsbcn/download/rsm.pdf, http://www.vis.uni-stuttgart.de/~dachsbcn/publications.html
     • [DachsbacherSii] Carsten Dachsbacher, Marc Stamminger, “Splatting Indirect Illumination”, http://www.vis.uni-stuttgart.de/~dachsbcn/download/sii.pdf
     • [Kajalin] Vladimir Kajalin, “Screen-Space Ambient Occlusion”, ShaderX7, pp. 413–424
     • [Loos] Bradford James Loos, Peter-Pike Sloan, “Volumetric Obscurance”, http://www.cs.utah.edu/~loos/publications/vo/vo.pdf
     • [Nichols] Greg Nichols, Chris Wyman, “Multiresolution Splatting for Indirect Illumination”, http://www.cs.uiowa.edu/~cwyman/publications/files/techreports/UICS-TR-08-04.pdf
     • [Ritschel] Tobias Ritschel, “Imperfect Shadow Maps for Efficient Computation of Indirect Illumination”, http://www.uni-koblenz.de/~ritschel/
     • [Zink] Jason Zink, “Screen Space Ambient Occlusion”, http://wiki.gamedev.net/index.php/D3DBook:Screen_Space_Ambient_Occlusion

Editor's Notes

  • Instead of computing the amount of occlusion produced by geometry hits through ray-tracing, we approximate it by the amount of solid geometry around a point.
  • ω – greek small letter omega
    Φ – greek capital letter Phi
    Irradiance – the amount of light or other radiant energy striking a given area of a surface; illumination
