Advanced Lighting Techniques Dan Baker (Meltdown 2005)


2. Advanced Lighting Techniques
   Dan Baker, Software Design Engineer, Microsoft Corporation

3. Review: The BRDF
   - Bidirectional Reflectance Distribution Function
     - Ratio of reflected radiance to incident irradiance

4. The BRDF
   - Incident and exitant directions are defined in the surface's local frame (a 4D function)

5. The BRDF
   - Isotropic BRDFs are 3D functions

6. The Reflection Equation
   (equation slide; the standard form: Lo(ωe) = ∫Ω f(ωi, ωe) Li(ωi) cos θi dωi)

7. The Reflection Equation

8. The BRDF
   - The laws of physics impose certain constraints on a BRDF
   - To be physically plausible, it must be:
     - Reciprocal
     - Energy-conserving
   - You are already using a BRDF, even if you don't know it:
     - Phong
     - Blinn-Phong
     - Diffuse
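The reciprocity constraint is easy to check numerically. A minimal Python sketch (not from the slides; the function and its parameters are illustrative): a half-angle lobe like Blinn-Phong depends on the two directions only through their half vector, so swapping them cannot change the result.

```python
import numpy as np

def blinn_phong(wi, we, n, ks=0.02, power=30.0):
    """Specular Blinn-Phong lobe: depends on wi and we only through
    the half vector, so it is reciprocal by construction."""
    h = wi + we
    h = h / np.linalg.norm(h)
    return ks * max(np.dot(n, h), 0.0) ** power

n = np.array([0.0, 0.0, 1.0])
wi = np.array([0.3, 0.1, 0.9]); wi /= np.linalg.norm(wi)
we = np.array([-0.2, 0.4, 0.8]); we /= np.linalg.norm(we)

# Reciprocity: swapping incident and exitant directions leaves the BRDF unchanged
assert abs(blinn_phong(wi, we, n) - blinn_phong(we, wi, n)) < 1e-12
```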
9. How Do We Evaluate a Lighting Model?
   - Real-time graphics technology is based on rasterization
   - Currently, our pipeline consists of triangles, which are processed by a vertex shader and then rasterized
   - The rasterization step produces pixels (aka fragments), which are then processed and written to a render target

10. A Modern Real-Time Graphics System
    (pipeline diagram: vertex/index buffers → Input Assembler → Vertex Shader → Setup/Rasterizer → Pixel Shader → Output Merger → render target and depth/stencil; samplers, textures, and constants feed the programmable stages; the rest is fixed function, all backed by texture memory)

11. Tomorrow's Graphics System
    (pipeline diagram: as above, with a Geometry Shader between the vertex shader and the rasterizer, plus stream-out to a stream buffer in memory)

12. Where We Have Control
    - A shader controls each processing point
    - With some restrictions, we can perform any computation we want
    - A reflectance model can be evaluated at any of these stages, or any combination
    - Historically, computations were done per vertex, but have steadily moved to per pixel
    - Soon, we will be able to shade per triangle

13. Frequency of Evaluation
    - Generally, vertex evaluation is the lowest frequency, while pixel evaluation is higher
    - Not the case at low resolution
    - Small (sliver) triangles are not rendered by real-time hardware
    - Geometry aliasing is avoided by not drawing sliver triangles

14. An Example BRDF
    In HLSL, the Blinn-Phong model looks like:

    // Ks, Kd, and Power come from constants or textures;
    // directions are in tangent space, so the normal is (0,0,1)
    float3 BlinnPhong(float3 L, float3 V)
    {
        const float3 N = float3(0, 0, 1);
        float3 H = normalize(V + L);
        return Ks * pow(saturate(dot(N, H)), Power) + Kd * saturate(dot(N, L));
    }

15. Gamma Space
    - BRDFs operate in linear space, not gamma space
    - Most albedo textures are authored in gamma space, so we must convert them to linear space in the shader (and convert the result back to gamma space)
    - The sRGB formats can convert gamma albedo textures to linear space; this gives more precision where it is needed and acts as a compression scheme
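The sRGB conversion mentioned above can be sketched on the CPU. This is the standard piecewise sRGB transfer function (a Python illustration, not shader code from the talk); the last line checks the deck's later claim that sqrt is a close approximation of the encode:

```python
def srgb_to_linear(c):
    """Exact sRGB decode (per the sRGB spec); c in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Exact sRGB encode, the inverse of the above."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Round trip is (numerically) the identity
for c in (0.0, 0.02, 0.18, 0.5, 1.0):
    assert abs(linear_to_srgb(srgb_to_linear(c)) - c) < 1e-9

# The shader's cheap approximation: gamma ~= sqrt(linear)
assert abs(linear_to_srgb(0.18) - 0.18 ** 0.5) < 0.05
```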
16. Gamma Correcting
    - If we render straight to the screen and the backbuffer isn't linear (the usual case), we need to convert back to gamma
    - sqrt is a close approximation

    float3 BlinnPhong(float3 L, float3 V)
    {
        const float3 N = float3(0, 0, 1);
        float3 H = normalize(V + L);
        float3 C = Ks * pow(saturate(dot(N, H)), Power) + Kd * saturate(dot(N, L));
        return sqrt(C);   // approximate linear-to-gamma conversion
    }

17. Dynamic Range
    - BRDFs can produce a wide range of color intensities
    - An 8-bit-per-channel backbuffer does a poor job of capturing that range
    - Information is easily lost, e.g. clamped to the maximum value, or stored with too little precision

18. Blooming/Tone Mapping
    - Bright pixels bleed into their neighbors
    - Exposure control
    - If these are used, the BRDF result is stored in a render target and an image-space filter is applied
    - Likely the de facto standard in the future

19. How Do We Evaluate a BRDF?
    - Direct evaluation: run an ALU program on the GPU
    - Texture evaluation: perform a texture lookup
    - A combination of the two: factor some terms into textures, compute the rest with ALU

20. How Do We Represent Material Variation?
    - Microvariation is important
    - We want more than just a single BRDF for a surface; the BRDF must change across the surface
    - Bump mapping is a coarse variation

21. Which Reflectance Model to Use?
    - Different reflectance models have different costs:
      - Data
      - ALU
      - Time to make content
      - Accuracy
    - Consider: there is a strong trend toward more ALU and less data

22. BRDF Costs, Minimal Surface Variation

    Model                       | Texture cost | ALU cost
    ----------------------------|--------------|----------
    Blinn-Phong direct          | 0            | 7
    Blinn-Phong factored        | 1            | 2
    Banks direct                | 0            | 12
    Banks factored              | 1            | 5
    Ashikhmin/Shirley           | 0            | 40
    Ashikhmin/Shirley factored  | 4            | 10
    Cook-Torrance               | 0            | 35
    Lafortune direct            | 0            | 10 + 5n
23. BRDF Costs With Surface Variation

    Model                       | Texture cost | ALU cost
    ----------------------------|--------------|---------------
    Blinn-Phong direct          | 1            | 15
    Blinn-Phong factored        | 2            | 10
    Banks direct                | 1            | 25
    Banks factored              | 2            | 18
    Ashikhmin/Shirley           | 2            | 50 (60)*
    Ashikhmin/Shirley factored  | 6            | 30
    Cook-Torrance               | 1            | 40
    Lafortune direct            | 2            | 30 + 5*lobes

24. Accuracy of Blinn-Phong
    (chart courtesy of Addy Ngan, Frédo Durand, and Wojciech Matusik)

25. Accuracy of Ashikhmin-Shirley
    (chart courtesy of Addy Ngan, Frédo Durand, and Wojciech Matusik)

26. Getting Variation
    - We could sample real materials to generate a BRDF map [McAllister SBRDF]
    - But variation is usually authored by an artist
    - Sometimes we get variation from an understanding of the mesogeometry
    - Sometimes it is done by changing the assumptions about the microgeometry

27. Bump Mapping

28. Pixel-Level Evaluation

29. Pixel-Level Evaluation, Shifted

30. Shifting Samples
    - Sample points change drastically as a triangle moves and changes size
    - If we use microvariation, e.g. loading data elements from a texture, the evaluation can differ significantly between frames
    - Texture filters solve this problem for linear data elements
    - Linear filtering handles temporal change; MIP mapping handles resolution change

31. But…
    - Texture filters are linear
    - BRDFs are usually non-linear
    - We get some image stability, but not great results
    - We must at least prevent radical shifts in the image
    - Often the variation must be mitigated
    - It would be nice to shade per texel instead

32. What Does a Sample Point Mean?
    What happens when a pixel is evaluated? Given a set of parameters interpolated from the triangle's three vertices, how do we go about generating a color?

33. Texture Filtering Review
    Bilinear filtering blends the four nearest texels, with weights given by the fractional sample position (A, B):
    Sample color = A*B*C0 + A*(1-B)*C1 + (1-A)*B*C2 + (1-A)*(1-B)*C3
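The bilinear blend above, as a small CPU-side sketch in Python (the texel layout and function name are illustrative):

```python
import numpy as np

def bilinear(tex, u, v):
    """Bilinearly filter a 2D texture (rows x cols) at continuous texel
    coordinates (u, v), where integer coordinates land on texel centers."""
    x0, y0 = int(np.floor(u)), int(np.floor(v))
    a, b = u - x0, v - y0                  # fractional position -> weights
    c00 = tex[y0,     x0]
    c10 = tex[y0,     x0 + 1]
    c01 = tex[y0 + 1, x0]
    c11 = tex[y0 + 1, x0 + 1]
    return ((1 - a) * (1 - b) * c00 + a * (1 - b) * c10 +
            (1 - a) * b * c01 + a * b * c11)

tex = np.array([[0.0, 1.0],
                [2.0, 3.0]])
assert bilinear(tex, 0.0, 0.0) == 0.0     # exactly on a texel
assert bilinear(tex, 0.5, 0.5) == 1.5     # average of all four neighbors
```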
34. Texture Swimming

35. What About Lower Resolutions?

36. A Common Hack: Level of Detail
    By lowering the amplitude of the displacement per MIP level, we effectively decrease to a less complex model as the object moves further away. This prevents aliasing, but isn't accurate.

37. Scale-Independent Lighting
    - Ideally, the size in pixels of an object on the screen should not affect its overall appearance
    - High-frequency detail should disappear
    - Global effects should stay the same
    - This is the reasoning behind MIP mapping
    - We shouldn't see drastic changes in the image as sample points change

38. Scale-Independent Lighting
    - A low-resolution rendering of an object should look like a scaled-down version of the high-resolution one

39. MIP Mapping

40. MIP Mapping for Diffuse
    - For the simple diffuse case, the lighting calculation can be approximately refactored
    - The normal integration can then be replaced by a MIP-aware texture filter
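Why the diffuse refactoring works: the Lambertian term is linear in the albedo, so averaging texels before lighting gives the same answer as lighting each texel and then averaging. A Python sketch, under the simplifying assumption of a constant normal across the patch:

```python
import numpy as np

rng = np.random.default_rng(0)
kd = rng.random(4)                          # four fine-level albedo texels
n = np.array([0.0, 0.0, 1.0])
l = np.array([0.3, 0.2, 0.93]); l /= np.linalg.norm(l)
ndotl = max(np.dot(n, l), 0.0)

lit_then_avg = np.mean(kd * ndotl)          # light each texel, then MIP-average
avg_then_lit = np.mean(kd) * ndotl          # MIP-average the albedo, then light

# Diffuse lighting is linear in albedo, so the two orders agree exactly
assert abs(lit_then_avg - avg_then_lit) < 1e-12
```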
41. Non-Linear Lighting Models
    Blinn-Phong isn't linear!
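A quick numerical illustration of that non-linearity (Python, values illustrative): once an exponent is involved, averaging lit results is not the same as lighting once with averaged parameters.

```python
import numpy as np

# Four texels with different specular powers (cf. the later MIP example slides)
powers = np.array([8.0, 45.0, 16.0, 25.0])
ndoth = 0.9                                  # a fixed half-angle cosine

avg_of_lit = np.mean(ndoth ** powers)        # correct: average the lit results
lit_of_avg = ndoth ** np.mean(powers)        # wrong: light with the averaged power

# The exponent makes the model non-linear, so the two differ noticeably
assert abs(avg_of_lit - lit_of_avg) > 0.01
```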
42. Texture Space Lighting
    - Rasterize using texture coordinates as positions
    - Equivalent to explicitly evaluating the summation
    - The object needs a texture atlas
    - The values in texture space are now colors, which are linear
    - Create a MIP chain explicitly or through the auto MIP-generation option
    - Render the object with this MIP chain

43. Texture Space Lighting
    Rasterize triangles using texture coordinates as positions. The left image is the normal sampled at each point; the right image is the computed lighting.

44. Texture Space Lighting
    We paste the texture onto the object.

45. TSL: A Case Study
    - Texture Space Lighting (TSL) allows us to have high-frequency lighting on fast-moving objects
    - We will only have the aliasing problems associated with standard filtering techniques
    - A roadway is a perfect candidate: subtle specular reflections with a high degree of motion

46. A Roadway: Snapshots
    We can see Moiré patterns on any filtered non-linear function. Additionally, temporal aliasing becomes problematic.

47. Using TSL
    The artifacts are largely mitigated if we render the non-linear function and MIP-reduce. We can also see more high-frequency detail.

48. How This Demo Works
    - Each road segment is rendered into a 1024x1024 render target in texture space; alpha is set to 0
    - AUTOMIPGEN is then used to do a simple MIP filter of the highest-level rendering
    - The model is then lit with the prelit, MIP-filtered texture and rendered with full anisotropic filtering
    - Art content is simple: just an albedo texture and a normal map with a gloss term
    - Everything is done in linear space

49. The Shader

    void lightPositional( ... )
    {
        // norm, light and half are in tangent space
        // power, Kd, Ks and norm come from textures
        ...
        // normalized Blinn-Phong
        float fS = pow(saturate(dot(norm, half)), power) * power;
        float fD = saturate(dot(norm, light)) * diffuseIntensity;
        vColorOut.rgb = fS * Ks + fD * Kd;
        vColorOut.a = 1;

        // put us in gamma space for the direct rendering only
        if (bTextureView)
            vColorOut = sqrt(vColorOut);
    }

50. Pasting the Texture onto the Scene

    void light_from_texture(float2 vTexCoord : TEXCOORD0,
                            out float4 vColorOut : COLOR0)
    {
        // this is done with full anisotropy turned on
        // and with SRGBTexture set to true
        vColorOut = tex2D(LightMapSampler, vTexCoord);

        // for atlased objects, we do not want to blend in
        // unrendered pixels
        if (vColorOut.a > 1e-5)
            vColorOut.rgb /= vColorOut.a;

        // put us in gamma space
        vColorOut = sqrt(vColorOut);
    }

51. Drawing Polygons on the Backside

    [emittype[triangle MyVType]]
    [maxvertexcount[3]]
    void ClipGeometryShader(triangle MyVType TextureTri[3])
    {
        float2 coord1 = project(TextureTri[0]);
        float2 coord2 = project(TextureTri[1]);
        float2 coord3 = project(TextureTri[2]);

        float3 Vec1 = float3(coord3 - coord1, 0);
        float3 Vec2 = float3(coord3 - coord2, 0);
        float Sign = cross(Vec1, Vec2).z;

        if (Sign > -EPSILON)
        {
            emit(TextureTri[0]);
            emit(TextureTri[1]);
            emit(TextureTri[2]);
            cut;
        }
    }

52. Problems with TSL
    - The main problem is performance
    - We can't render each object in the scene at a fixed resolution and scale down
    - We need a way to render the object at a reduced resolution that looks close to the higher-detail one when zoomed out
    - Basically, we need to know how to create non-linear MIP chains

53. MIP-Mapping Reflectance Models

54. MIP-Mapping Reflectance Models
    - Commonly thought of as the normal-map filtering problem
    - Some common techniques involve first MIP mapping the height map and then creating a normal map from it, or fading the normal to (0,0,1)
    - All of these techniques are grossly inaccurate; in fact, the normal is the least important aspect of a correct MIP filter
    - Only NDM (Olano and North) filters linearly

55. A Little Bit of Microfacet Theory
    - A surface is modeled as flat microfacets
      - Each microfacet is a Fresnel mirror
      - Most BRDFs are microfacet models
    - Normal Distribution Function (NDF) p(ω)
      - p(ω) dω is the fraction of facets whose normals lie in the solid angle dω around ω
      - ω is defined in the surface's local frame
      - There are various options for p(ω)

56. Microfacet Theory
    - For given ωi and ωe, only facets oriented to reflect ωi into ωe are active
    - These facets have normals at ωh

57. Microfacet Theory
    - The fraction of active microfacets is p(ωh) dω
    - Active facets have reflectance RF(αh)
    - αh is the angle between ωi (or ωe) and ωh

58. So What Does a MIP Map Mean, Anyway?
    - Non-linear lighting functions fundamentally represent microfacets
    - Specifically, they represent a distribution of microfacets
    - A MIP map is a lower-resolution image which best represents the higher-resolution one
    - But how does this work if we have a bunch of little mirrors?

59. A Simple Non-Linear Function
    The Blinn-Phong lighting model is one of the most common half-angle-based lighting functions. The version shown is normalized: the total emitted energy is close to constant for any power. The model is parameterized by N, Kd, Ks, and p; changing these values changes the properties of the underlying surface.
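The formula on this slide was an image and is not recoverable, but a common normalized half-angle Blinn-Phong form can be sketched in Python. The (p + 2)/(2π) factor used here is one standard normalization choice, assumed for illustration rather than taken from the slide:

```python
import numpy as np

def normalized_blinn_phong(n, l, v, kd, ks, p):
    """Diffuse term plus a half-angle specular lobe whose (p + 2)/(2*pi)
    factor keeps the lobe's total energy roughly constant as the power p
    changes. This normalization is an assumed, common choice."""
    h = l + v
    h = h / np.linalg.norm(h)
    spec = (p + 2) / (2 * np.pi) * max(np.dot(n, h), 0.0) ** p
    return kd / np.pi * max(np.dot(n, l), 0.0) + ks * spec

# The normalization keeps the lobe's hemisphere integral ~constant in p:
# integrating ((p+2)/(2*pi)) * cos(t)**p over the hemisphere gives (p+2)/(p+1)
for p in (10.0, 100.0):
    theta = np.linspace(0.0, np.pi / 2, 4000)
    f = (p + 2) / (2 * np.pi) * np.cos(theta) ** p * np.sin(theta)
    step = theta[1] - theta[0]
    integral = (f.sum() - 0.5 * (f[0] + f[-1])) * step * 2 * np.pi
    assert abs(integral - (p + 2) / (p + 1)) < 1e-3
```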
60. A Common Approach
    The image on the left is lit with MIP maps fading to flat, while the image on the right is what the object would look like if rendered at a higher resolution and scaled down. The objects do not look physically the same.

61. A More Correct MIP Map
    The images in the middle were rendered directly to the screen using a BRDF-approximating MIP map. The more we zoom out, the larger the difference between the correct and incorrect approaches.

62. Examining a MIP Level: Simple
    Texel powers 32, 32, 32, 32 → next MIP level: 32, the same thing.

63. Examining a MIP Level
    Texel powers 8, 45, 16, 25 → next MIP level: the average power (≈25)?

64. Examining a MIP Level
    Texel powers 32, 32, 32, 32 with varying normals → next MIP level: the average power and normal?

65. What About a Larger Patch and a Lower MIP?

66. Setting Up the Problem
    - Setting texture filtering aside for a moment, each MIP level should approximate the signal in the higher-level texture
    - For a BRDF model, this means the lower MIP levels should approximate the more detailed version
    - In other words, we want the BRDF at a different scale

67. A Simple BRDF
    - To start, consider the specular part of the normalized Blinn-Phong BRDF
    - For each texel there is a function F(V,L) which, given a view direction and a light direction, returns a color
    - Each texel has a different N, Ks, and Power to provide surface variation
    - N, Ks, and Power are the parameters of a Blinn-Phong lighting model

68. Reflectance at a Single Texel
    MIP level 0 holds four texels, e.g. (Ks=.02, P=30), (Ks=.02, P=30), (Ks=.02, P=24), (Ks=.01, P=30); MIP level 1 holds one texel (Ks=?, P=?, N=?). For a given light and view direction, how do we pick a Ks, a Power, and a Normal so that we get the same final color?

69. Solving for a Single Patch of Texels
    For a given patch of texels w wide and h tall, the coarse texel's response should match the average response of the w×h fine texels (equation slide).

70. We Care About All Pairs of L and V
    Create an error function for a single pair of L and V, then find the minimum error across all possible V and L (equation slide).

71. Creating a MIP Filter
    - We want to find values of N, k, and p such that E(N, k, p) is as small as possible
    - This is a non-linear optimization problem
    - There are many ways to fit non-linear problems; for this problem we chose BFGS, a quasi-Newton algorithm

72. Non-Linear Fitting
    - BFGS requires the partial derivatives of the error function with respect to each parameter we want to fit
    - The derivatives must be continuous over the space we are interested in
    - The double integral cannot be solved analytically, so it is evaluated numerically
    - A good way to express the normal is as an X, Y pair with an implied Z, but the error function must take care to keep the normal normalized

73. Giving a Good First Guess
    - Non-linear algorithms work well when there aren't many local minima and we have a reasonable starting guess
    - For Blinn-Phong, a seed of the averaged normal, the averaged Ks, and a low power (less than half that of the previous level) works well
    - But the power should not be near or less than 1!
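The fit can be sketched end to end on a toy patch. This Python stand-in replaces the slides' BFGS with a brute-force search over the power alone, uses sampled half angles instead of the double integral, and fixes the coarse normal at the average; all constants are illustrative. It reproduces the qualitative result quoted later: when the fine-level normals disagree, the best-fit exponent drops below the fine-level power.

```python
import numpy as np

# Two fine-level texels: same specular power, normals tilted +/- 10 degrees
p_fine = 50.0
t = np.radians(10.0)
normals = [np.array([np.sin(t), 0.0, np.cos(t)]),
           np.array([-np.sin(t), 0.0, np.cos(t)])]

# Sample half vectors along an arc; the target is the averaged fine response
angles = np.linspace(0.0, np.pi / 3, 64)
halves = [np.array([np.sin(a), 0.0, np.cos(a)]) for a in angles]
target = np.array([np.mean([max(np.dot(n, h), 0.0) ** p_fine for n in normals])
                   for h in halves])

# Coarse texel: seed/fix the normal at the average, fit only the power
n_avg = np.array([0.0, 0.0, 1.0])

def fit_error(p):
    coarse = np.array([max(np.dot(n_avg, h), 0.0) ** p for h in halves])
    return float(np.sum((coarse - target) ** 2))

candidates = np.arange(1.0, 101.0)       # brute force in place of BFGS
p_fit = candidates[np.argmin([fit_error(p) for p in candidates])]

# The fitted lobe is broader (lower exponent) than the fine-level lobes
assert p_fit < p_fine
```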
74. Non-Linear Fitting with Blinn-Phong
    With the Blinn-Phong BRDF, the error plot over (Ks, power) shows that the function is mostly smooth (the graph is log(1 + error)). As long as we take a guess in the low-error region, we should be fine.

75. Speeding It Up
    - Running BFGS with that error function took a few hours on a 512x512 texture
    - It gives a good fit, but it's still too slow
    - However, the function can be refactored in terms of the half angle
    - This reduces the dimensionality from 4 to 2!
    - Additionally, the summation in the middle of the integral is constant for a given H, so it can be precomputed and reused, given enough memory

76. Speeding It Up
    But we must use the half-angle distribution (equation slide).

77. Results
    - The resulting MIP level looks closer to the original detail
    - It turns out that the optimal normal is usually pretty close to the average normal; you can get away with minimizing just P and Ks
    - The power decreases rapidly for a rough surface: each MIP level has about 50-70% of the exponent of the previous level!
    - An object with a shiny, rough surface will appear duller as it gets further away

78. Dealing with Anisotropy
    - A simple Blinn-Phong BRDF doesn't allow anisotropic effects to be captured at lower MIP levels
    - A simplified, half-angle version of the Ashikhmin-Shirley BRDF can do this
    - We now have E(N, T, Ks, Pu, Pv); that is, we have two powers for the tangent frame, and the tangent frame might vary per pixel
    - Anisotropic objects, e.g. a record with grooves in it, should stay anisotropic

79. Why Ashikhmin-Shirley?
    - Convergence is generally better compared to other BRDFs
    - It is half-angle based, so MIP evaluation is still fast
    - It is data-light and computation-heavy
    - It can exactly represent Blinn-Phong

80. Anisotropy
    - For more complex BRDFs, take care to ensure the function is well behaved at limit points

81. A Side Note
    - Artists can create extremely high-resolution height maps, modeling the micro-geometry of a surface
    - Brush strokes, grooves, scratches, etc. would just appear in the high-res height map
    - A fitting algorithm could then capture the various aspects: roughness, anisotropy, etc.

82. Solution for High-Quality Rendering
    - Determine the object's maximum MIP level
    - Render from that MIP level downward, culling unseen polygons
    - Compute the lower MIP levels
    - Draw the object with the lit texture, using the best filter possible*
    - At lower MIP levels, once texel variation is minimal, disable TSL
    - *Optionally blend MIP transitions to eliminate any popping

83. System Considerations
    - We don't need a texture for each object, just a set of textures that we recycle
    - We can reduce the render resolution to adjust framerate

84. Final Thoughts on Fitting
    - More research remains to be done on BRDF fitting
    - Untackled issues: more anisotropic fitting, and we need to look at more data
    - Optimizing for bilinear reconstruction would also help direct rasterization, at the expense of removing some detail

85. More Thoughts
    - Even if you don't use texture space lighting, you should still fit non-linear functions to create MIP maps
    - It's free at runtime!
    - Texture space lighting is an easily scalable feature; it can be enabled on newer hardware
    - Because TSL is computed in point-sampling mode and pasted once, we can invest resources in a much better reconstruction filter, e.g. bicubic with better anisotropy
86. References: Background
    - Introduction to Modern Optics, 2nd ed., Fowles (Dover)
    - Principles of Digital Image Synthesis, Glassner (Morgan Kaufmann)
    - The Physics and Chemistry of Color, 2nd ed., Nassau (Wiley)
    - Geometric considerations and nomenclature for reflectance, Nicodemus et al. (NBS 1977)
    - A practitioners' assessment of light reflection models, Shirley et al. (Pacific Graphics 1997)
    - A survey of shading and reflectance models for computer graphics, Schlick (Computer Graphics Forum)
    - Illumination for computer generated pictures, Phong (Communications of the ACM)
    - Models of light reflection for computer synthesized pictures, Blinn (SIGGRAPH 1977)
    - A reflectance model for computer graphics, Cook and Torrance (ACM ToG)
    - Measuring and modeling anisotropic reflection, Ward (SIGGRAPH 1992)
    - An anisotropic Phong BRDF model, Ashikhmin and Shirley (JGT)
    - Non-linear approximation of reflectance functions, Lafortune et al. (SIGGRAPH 1997)
    - Generalization of Lambert's reflectance model, Oren and Nayar (SIGGRAPH 1994)
    - Predicting reflectance functions from complex surfaces, Westin et al. (SIGGRAPH 1992)
    - A microfacet-based BRDF generator, Ashikhmin et al. (SIGGRAPH 2000)
    - Illumination in diverse codimensions, Banks (SIGGRAPH 1994)
    - An accurate illumination model for objects coated with multilayer films, Hirayama et al. (Eurographics 2000)
    - Diffraction shaders, Stam (SIGGRAPH 1999)
    - Simulating diffraction, Stam, in GPU Gems (Addison-Wesley)
87. References: Techniques
    - Experimental validation of analytical BRDF models, Ngan et al. (SIGGRAPH 2004 Sketch)
    - Normal distribution functions and multiple surfaces, Fournier (Graphics Interface 1992)
    - From structure to reflectance, Fournier and Lalonde, in "Cloth Modeling and Animation" (A K Peters)
    - Efficient rendering of spatial bi-directional reflectance distribution functions, McAllister et al. (Graphics Hardware 2002)
    - An efficient representation for irradiance environment maps, Ramamoorthi and Hanrahan (SIGGRAPH 2001)
    - Precomputed radiance transfer for real-time rendering in dynamic, low-frequency lighting environments, Sloan et al. (SIGGRAPH 2002)
    - The plenoptic function and the elements of early vision, Adelson and Bergen, in Computational Models of Visual Processing (MIT Press)
    - Steerable illumination textures, Ashikhmin and Shirley (ACM ToG)
    - Polynomial texture maps, Malzbender et al. (SIGGRAPH 2001)
    - Interactive subsurface scattering for translucent meshes, Hao et al. (SI3D 2003)
    - Clustered principal components for precomputed radiance transfer, Sloan et al. (SIGGRAPH 2003)
    - All-frequency precomputed radiance transfer for glossy objects, Liu et al. (EGSR 2004)
    - Triple product wavelet integrals for all-frequency relighting, Ng et al. (SIGGRAPH 2004)
    - All-frequency shadows using non-linear wavelet lighting approximation, Ng et al. (SIGGRAPH 2003)
    - The irradiance volume, Greger et al. (IEEE CG&A, Mar. 1998)
    - Spherical harmonic gradients for mid-range illumination, Annen et al. (EGSR 2004)
    - Ambient occlusion fields, Kontkanen and Laine (SI3D 2005)
    - Normal distribution mapping, Olano and North (Tech report, 1996)
88. Acknowledgements
    - Shanon Drone, John Rapp, Jason Sandlin, and John Steed for demo help
    - Michael Ashikhmin, Henrik Wann Jensen, Kazufumi Kaneda, Eric Lafortune, David McAllister, Addy Ngan, Michael Oren, Kenneth Torrance, Gregory Ward, and Stephen Westin for permission to use images
    - Naty Hoffman for the use of his background slides
89. © 2005 Microsoft Corporation. All rights reserved. Microsoft is a registered trademark of Microsoft Corporation in the United States and/or other countries. The names of actual companies and products mentioned herein may be the trademarks of their respective owners.