
camera (describing how photons are gathered and displayed)

Materials emit photons, scatter them, and absorb them. That's what is happening in the scene. The final output of all these effects is captured by the camera system. And that's it!

Materials also exist independently of the actual scene setup. They take input, modify it, and output new values.

GI is very dependent on the scene itself: it is a consequence of the scene (the scene's geometry and the materials applied to it). Therefore, GI is an effect, and in real-time graphics we simulate effects. Path tracing is not a simulation; it is an evaluation of the processes that happen in the scene (emit, BRDF-modify, bounce). In real-time graphics, GI is one of the consequences of light transport. In reality, GI is the process itself, but because we want to simplify and simulate it, we have created different effects that can be separated and evaluated independently (shadows, AO, indirect illumination, ...), and they are all in fact parts of an overall bigger thing called global illumination.

GI is a set of algorithms that calculate how much, and what kind of, light arrives at a certain point in the scene. You can think of it as an irradiance calculation. It combines effects such as direct lighting, ambient occlusion, indirect illumination, shadowing, and caustics. GI algorithms are approaches for effectively and efficiently computing irradiance over the spherical domain, for every point in space. A GI algorithm can include all of the mentioned effects (shadows, AO, indirect illumination, ...) or only some of them, but its job is to compute the irradiance values that, combined with the material's properties, are used to evaluate the radiance forming the final rendered output, just before it is processed by the camera.
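The "irradiance calculation" view can be made concrete with a Monte Carlo estimate over the hemisphere. A minimal sketch (the `incoming_radiance` callback and uniform hemisphere sampling are illustrative assumptions, not from the talk):

```python
import math
import random

def irradiance_mc(incoming_radiance, n_samples=10000, seed=1):
    """Monte Carlo estimate of irradiance E = integral of L(w) * cos(theta)
    over the hemisphere, with uniform hemisphere sampling (pdf = 1 / (2*pi)).
    `incoming_radiance` is a hypothetical callback L(theta, phi)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        cos_theta = rng.random()            # uniform in [0, 1)
        phi = 2.0 * math.pi * rng.random()
        theta = math.acos(cos_theta)
        # estimator: L * cos(theta) / pdf, with pdf = 1 / (2*pi)
        total += incoming_radiance(theta, phi) * cos_theta * 2.0 * math.pi
    return total / n_samples

# sanity check: a constant environment L = 1 has irradiance exactly pi
E = irradiance_mc(lambda theta, phi: 1.0)
```

Real GI algorithms are, in effect, different strategies for making this per-point integral cheap enough to evaluate everywhere.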

In CG, we always simulate individual effects, while in reality they all happen simultaneously. We are just trying to mimic nature as best we can.

There are also some offline techniques that use geometric approximations, like surfels.

For most of the techniques we will only briefly cover the algorithm. This talk is not intended to give an in-depth explanation of each technique, but rather to provide insight into the different approaches, how people thought about the problem, and what strategies were developed.

SSDO: https://people.mpi-inf.mpg.de/~ritschel/Papers/SSDO.pdf

Deep G-buffer: http://graphics.cs.williams.edu/papers/DeepGBuffer14/

Surfels: http://graphics.pixar.com/library/PointBasedGlobalIlluminationForMovieProduction/

Paper: http://www.cs.cornell.edu/courses/cs6630/2012sp/slides/Boyadzhiev-Matzen-InstantRadiosity.pdf

Instant radiosity: http://www.bpeers.com/blog/?itemid=517


SH lighting: (paper) http://www.research.scea.com/gdc2003/spherical-harmonic-lighting.pdf, (slides) https://graphics.cg.uni-saarland.de/fileadmin/cguds/courses/ss15/ris/slides/RIS18Green.pdf

An efficient representation of irradiance maps: https://cseweb.ucsd.edu/~ravir/papers/envmap/envmap.pdf

Stupid SH tricks: http://www.ppsloan.org/publications/StupidSH36.pdf

BRDF shading using SH: http://www.ppsloan.org/publications/shbrdf_final17.pdf

http://codeflow.org/entries/2012/aug/25/webgl-deferred-irradiance-volumes/

Paper: http://www.cs.jhu.edu/~misha/ReadingSeminar/Papers/Sloan02.pdf

PRT course: http://www0.cs.ucl.ac.uk/staff/j.kautz/PRTCourse/

Slides: http://www.gamedev.net/topic/649604-what-on-earth-is-far-cry-3s-deferred-radiance-transfer/

More Crytek: http://www.crytek.com/cryengine/cryengine3/presentations/cascaded-light-propagation-volumes-for-real-time-indirect-illumination (this has awesome ppt slides)

Paper: http://www.jiapingwang.com/files/shadebot_sig13.pdf

Video: https://vimeo.com/70734281


abstract-algorithm.com

I shamelessly stole a bunch of pictures from different papers and presentation slides; I hope the authors don't mind. Thanks!

- 1. GLOBAL ILLUMINATION (BLACK) PHOTONS EVERYWHERE Dragan Okanovic @abstractalgo
- 2. PROBLEMS OF COMPUTER GRAPHICS generate digital imagery so it looks “real” only two problems: 1. materials: bsdf: brdf (diffuse, glossy, specular reflections) btdf (refraction & transmission) bssrdf (subsurface scattering) emission 2. camera: resolution + fov lens flare aberrations bokeh dof hdr & tonemapping bloom & glow motion blur anti-aliasing film grain ...
- 3. GLOBAL ILLUMINATION GI is a consequence of how photons are scattered around the scene GI is an effect, i.e. it doesn’t exist per se and is dependent on the scene In CG terminology, GI is a set of algorithms that compute (ir)radiance for any given point in space, in the spherical domain That computed irradiance is then used in combination with the material’s properties at that particular point in space for the final calculation of the radiance Radiance is used as the input to the camera system global illumination sub-effects: shadows ambient occlusion color bleed/indirect illumination caustics volumetric lighting
- 4. shadows check if the surface is lit directly ambient occlusion check how “occluded” the surface is and how hard it is for light to reach that point in space color bleed / indirect illumination is the reflected light strong enough that even diffuse surfaces “bleed” their color onto their surroundings (non-emitters behave like light sources) caustics is enough light reflected/refracted to create interesting bright patterns volumetric lighting how does participating media interact with light global illumination describes how light is scattered around the scene, how light is transported through the scene, and what interesting visual effects start appearing because of such light transport
- 5. sh ao sh + ind.illum. sh sh + vol. + ind.illum. sh + caustics + ind.illum. + ao sh + ind.illum. + ao
- 6. FORMULATION OF THE PROBLEM analytically calculate or approximate the irradiance over the sphere, for a certain point in space, in a converged state how much each point [A] contributes to every other [B] in the scene how much [A->B] influences point [A] how much does that influence [B] back .... recursive, but it can converge and reach a certain equilibrium [A->B] [[A->B]->A] [[[A->B]->A]->B] [all light bounces]
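The recursive [A->B] chain on this slide is exactly what the rendering equation expresses: outgoing radiance at a point is its emission plus the BRDF-weighted integral of incoming radiance over the hemisphere (standard notation, not taken from the slides):

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (n \cdot \omega_i)\, \mathrm{d}\omega_i
```

Since $L_i$ at one point is $L_o$ at another, the equation is recursive, and every algorithm on the next slide is some way of approximating its fixed point.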
- 7. ALGORITHMS pathtracing radiosity photon mapping RSM (reflective shadow maps) instant radiosity irradiance volumes LPV (light propagation volumes) deferred radiance transfer volumes SVOGI (sparse voxel octree GI, voxel cone tracing) RRF (radiance regression function) SSDO (screen-space directional occlusion) deep G-buffer surfels PRT and SH
- 8. PATHTRACING sample the hemisphere over the point with Monte Carlo for every sample, do the same thing recursively for each surface-light path interaction, we evaluate the incoming light against the bsdf of the material straightforward implementation of light bounces very computationally expensive, not real-time very good results, ground truth all effects
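The recursive estimator can be sketched in an idealized "furnace" setting, an assumption made purely for illustration: every bounce sees the same diffuse emitter, so the recursion collapses to a geometric series and no scene geometry is needed.

```python
def furnace_radiance(depth=0, emission=1.0, albedo=0.5, max_depth=16):
    # Each bounce in this idealized "furnace" scene sees the same diffuse
    # surface; cosine-weighted sampling cancels the cos(theta) and 1/pi of
    # the diffuse BRDF, so each bounce simply multiplies by the albedo.
    if depth >= max_depth:
        return 0.0
    return emission + albedo * furnace_radiance(depth + 1, emission,
                                                albedo, max_depth)

# geometric series emission * (1 - albedo^max_depth) / (1 - albedo),
# converging to emission / (1 - albedo) = 2.0 here
L = furnace_radiance()
```

A real path tracer replaces the fixed `albedo` with a BSDF evaluation at each hit point and the fixed `emission` with whatever the sampled ray actually hits, which is where all the cost comes from.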
- 9. RADIOSITY for each surface element (patch), calculate how well it can see all other patches (form factor) progressively recalculate the illumination of a certain patch from all other patches start with direct illumination injected and iterate until convergence (or good enough) not real-time only diffuse reflections can be precomputed and is view-independent
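The iteration on this slide is the linear system B = E + rho * F * B solved by relaxation. A toy two-patch illustration (the emissions, reflectances, and form factors are made-up numbers):

```python
# two diffuse patches facing each other (hypothetical scene)
E = [1.0, 0.0]            # emission: patch 0 is a light source
rho = [0.5, 0.8]          # diffuse reflectances
F = [[0.0, 0.4],          # F[i][j]: fraction of energy leaving i
     [0.4, 0.0]]          # that reaches j (form factors)

B = E[:]                  # start from direct emission...
for _ in range(100):      # ...and iterate (Jacobi-style) until converged
    B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(2))
         for i in range(2)]
```

The converged B holds the radiosity of every patch for any viewpoint, which is why the result can be baked; the expensive part in practice is computing the form factors between all patch pairs.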
- 10. REFLECTIVE SHADOW MAPS (RSM) generate an RSM from the light’s perspective: depth, position, normal, flux sample the RSM to approximate indirect lighting the idea is used in other more popular algorithms
- 11. INSTANT RADIOSITY ray trace from the light source into the scene for each hit, generate VPL and render the scene with it gather the results mix between sampling and radiosity not real-time
- 12. INSTANT RADIOSITY V2 don’t raytrace, but instead use RSM use RSM to approximate where to place VPLs deferred render with many lights
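Both variants ultimately shade each point by summing the contributions of the generated VPLs. A minimal gathering sketch (the data layout is hypothetical, VPLs are treated as isotropic point lights, and visibility/shadowing is ignored):

```python
import math

def shade_with_vpls(p, n, vpls):
    """Diffuse gathering at point p with unit normal n from virtual point
    lights; each VPL is ((x, y, z), flux). No visibility term here."""
    total = 0.0
    for pos, flux in vpls:
        d = [pos[i] - p[i] for i in range(3)]
        r2 = sum(c * c for c in d)
        r = math.sqrt(r2)
        cos_t = max(0.0, sum(n[i] * d[i] for i in range(3)) / r)
        # isotropic point light: intensity = flux / (4*pi), 1/r^2 falloff
        total += flux / (4.0 * math.pi) * cos_t / r2
    return total
```

In the deferred variant this loop becomes "render the scene once per VPL (or per VPL batch) as an ordinary point light", which is what makes many-light deferred rendering a natural fit.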
- 13. PHOTON MAPPING shoot photons from the light source into the scene gather nearby photons to calculate approximate radiance good results good for caustics not real-time
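The "gather nearby photons" step is a density estimate: sum the flux of photons landing near the shading point and divide by the gathering area. A toy 2D sketch (photon layout and the fixed gather radius are illustrative assumptions; real implementations use a kd-tree and k-nearest-neighbour queries):

```python
import math

def estimate_radiance(photons, p, radius, albedo=1.0):
    """Photon-map density estimate at 2D point p. Each photon is
    ((x, y), flux); photons inside `radius` contribute their flux,
    divided by the disc area, with a diffuse BRDF albedo/pi folded in."""
    flux = sum(phi for pos, phi in photons
               if (pos[0] - p[0]) ** 2 + (pos[1] - p[1]) ** 2
               <= radius * radius)
    return (albedo / math.pi) * flux / (math.pi * radius * radius)
```

Caustics work well precisely because photons concentrate: the density estimate turns a high local photon count directly into a bright patch.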
- 14. SPHERICAL HARMONICS “spherical Fourier decomposition” basis functions built from associated Legendre polynomials that can be added together to represent a function on the spherical domain
- 15. IRRADIANCE VOLUMES calculate lighting at the point in space and save in SH representation build grid of such SH values interpolate in space (trilinear) build acceleration structure for efficiency (octree)
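The trilinear interpolation step blends whole SH coefficient vectors, not scalars. A sketch under an assumed `corners[z][y][x] -> coefficient list` layout (the real structure would sit inside the octree mentioned above):

```python
def trilerp_sh(corners, fx, fy, fz):
    """Trilinearly blend the SH coefficient vectors stored at the 8 corners
    of a grid cell; fx, fy, fz are the fractional position inside the cell."""
    def lerp(a, b, t):
        return [ai + (bi - ai) * t for ai, bi in zip(a, b)]
    c00 = lerp(corners[0][0][0], corners[0][0][1], fx)
    c01 = lerp(corners[0][1][0], corners[0][1][1], fx)
    c10 = lerp(corners[1][0][0], corners[1][0][1], fx)
    c11 = lerp(corners[1][1][0], corners[1][1][1], fx)
    return lerp(lerp(c00, c01, fy), lerp(c10, c11, fy), fz)
```

Because SH coefficients combine linearly, interpolating them and then evaluating gives the same result as evaluating at each corner and interpolating the results, which is what makes this scheme cheap at runtime.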
- 16. PRECOMPUTED RADIANCE TRANSFER precomputed SH for a object that accounts for self- shadowing and self-interreflection independent of the lighting
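Because the transfer data is independent of the lighting, runtime relighting reduces to a dot product between the precomputed per-vertex transfer vector and the SH coefficients of the current environment light (a sketch of the diffuse, single-vector case; glossy PRT uses a transfer matrix instead):

```python
def shade_prt(transfer, light_sh):
    """PRT shading at a vertex: `transfer` is the precomputed SH transfer
    vector (self-shadowing and interreflection already folded in),
    `light_sh` is the SH projection of the current environment lighting."""
    return sum(t * l for t, l in zip(transfer, light_sh))
```

Swapping the environment light only changes `light_sh`, so a fully shadowed, interreflected object can be relit every frame for the cost of one dot product per shading point.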
- 17. PRT
- 18. DEFERRED RADIANCE TRANSFER VOLUMES bake manually/auto placed probes that hold PRT data create grid and inject PRT probes into it, interpolated between manually selected locations use local PRT probe * lighting to get the illumination data
- 19. [CASCADED] LIGHT PROPAGATION VOLUMES ([C]LPV) generate RSM generate VPLs using RSM inject VPL data into 3D grid of SH probes propagate light contribution within the grid Sample lit surface elements Grid initialization Light propagation in the grid Scene illumination with the grid
- 20. generate RSM generate VPLs using RSM inject VPL data into 3D grid propagate light contribution within the grid [diagram captions: a set of regularly sampled VPLs of the scene from the light position; discretize the initial VPL distribution by the regular grid and SH; propagate light iteratively going from one cell to another; cascaded grids]
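The propagation step can be caricatured in one dimension with scalar intensities; this is only a toy analogue (real LPVs propagate SH-encoded directional intensity to the six face neighbours of each 3D cell, per cascade):

```python
def propagate(grid, iterations, share=0.5):
    """Toy 1D analogue of the LPV propagation step: each iteration, every
    cell passes a `share` of its intensity to each of its two neighbours."""
    for _ in range(iterations):
        nxt = [0.0] * len(grid)
        for i, v in enumerate(grid):
            if i > 0:
                nxt[i - 1] += share * v
            if i + 1 < len(grid):
                nxt[i + 1] += share * v
        grid = nxt
    return grid
```

A few such iterations per frame are enough for short-range indirect light, which is why LPVs are fast but limited to low-frequency, nearby bounces.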
- 21. VOXEL CONE TRACING (SPARSE VOXEL OCTREE GI) rasterize the scene into a 3D texture generate mip levels and an octree for the textures sample with cone tracing
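The "cone tracing" step marches along the cone axis, sampling coarser mip levels as the cone widens and compositing front-to-back. A 1D sketch (the pyramid layout, step sizes, and level selection are simplifying assumptions; the real thing samples a 3D mip chain or octree):

```python
import math

def cone_trace(mips, origin, aperture, max_dist):
    """Front-to-back accumulation along a cone through a prefiltered 1D
    'voxel' pyramid. mips[0] is the finest level; each entry is
    (premultiplied color, alpha)."""
    color, occlusion = 0.0, 0.0
    t = 1.0  # start one voxel away to avoid self-sampling
    while t < max_dist and occlusion < 1.0:
        radius = aperture * t                      # cone widens with distance
        level = min(len(mips) - 1,
                    max(0, int(math.log2(max(radius, 1.0)))))
        cell = min(len(mips[level]) - 1, int((origin + t) / (2 ** level)))
        c, a = mips[level][cell]
        color += (1.0 - occlusion) * c             # front-to-back compositing
        occlusion += (1.0 - occlusion) * a
        t += max(radius, 1.0)                      # step grows with the cone
    return color, occlusion
```

Wide cones hit coarse mips after a few steps, which is what makes one trace per effect (AO, diffuse bounce, glossy reflection) affordable.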
- 22. VOXEL CONE TRACING (SPARSE VOXEL OCTREE GI) shadows AO specular reflections indirect illumination
- 23. RADIANCE REGRESSION FUNCTION (RRF) train a neural network on the scene (obtaining the RRF) use the RRF to evaluate lighting for a given point in space
- 24. RRF
- 25. THANKS!
