CEDEC 2018 - Towards Effortless Photorealism Through Real-Time Raytracing - Electronic Arts / DICE
Real-time raytracing holds the promise of simplifying rendering pipelines, eliminating artist-intensive workflows, and ultimately delivering photorealistic images. This talk by Tomasz Stachowiak provides a glimpse of the future through the lens of SEED's PICA PICA demo: a game made for artificial intelligence agents, with procedural level assembly, and no precomputation. We dive into technical details of several advanced rendering algorithms, and discuss how Microsoft's DirectX Raytracing technology allows for their intuitive implementation. Several challenges remain -- we will take a look at some of them, discuss how real-time raytracing fits in the spectrum of solutions, and start to plot the course towards robust and artist-friendly image synthesis.
Rendering Technologies from Crysis 3 (GDC 2013) - Tiago Sousa
This talk covers changes in CryENGINE 3 technology during 2012, with DX11-related topics such as the move to deferred rendering while maintaining backward compatibility in a multiplatform engine, massive vegetation rendering, and MSAA support and how to deal with its common visual artifacts, among other topics.
This session presents a detailed, programmer-oriented overview of our SPU-based shading system implemented in DICE's Frostbite 2 engine, and how it enables more visually rich environments in BATTLEFIELD 3 and better performance than traditional GPU-only renderers. We explain in detail how our SPU tile-based deferred shading system is implemented, and how it supports rich material variety, high-dynamic-range lighting, and large numbers of light sources of different types through an extensive set of culling, occlusion, and optimization techniques.
The presentation describes the physically based lighting pipeline of Killzone: Shadow Fall, a PlayStation 4 launch title. The talk covers the studio's transition to a new asset creation pipeline based on physical properties. Moreover, it describes the light rendering systems used in a new 3D engine built from the ground up for the upcoming PlayStation 4 hardware. A novel real-time lighting model simulating physically accurate area lights will be introduced, as well as a hybrid ray-traced / image-based reflection system.
We believe that physically based rendering is a viable way to optimize asset creation pipeline efficiency and quality. It also enables the rendering quality to reach a new level that is highly flexible depending on art direction requirements.
Talk by Fabien Christin from DICE at GDC 2016.
Designing a big city that players can explore by day and by night, while improving on the unique visual style of the first Mirror's Edge game, is no easy task.
In this talk, the tools and technology used to render Mirror's Edge: Catalyst will be discussed. From the physical sky to the reflection tech, the speakers will show how they tamed the new Frostbite 3 PBR engine to deliver realistic images with stylized visuals.
They will talk about the artistic and technical challenges they faced and how they tried to overcome them, from the simple light settings and Enlighten workflow to character shading and color grading.
Takeaway
Attendees will gain insight into the technical and artistic techniques used to create a dynamic time-of-day system with updating radiosity and reflections.
Intended Audience
This session is targeted to game artists, technical artists and graphics programmers who want to know more about Mirror's Edge: Catalyst rendering technology, lighting tools and shading tricks.
Past, Present and Future Challenges of Global Illumination in Games - Colin Barré-Brisebois
Global illumination (GI) has been an ongoing quest in games. The perpetual tug-of-war between visual quality and performance often forces developers to take the latest and greatest from academia and tailor it to push the boundaries of what has been realized in a game product. Many elements need to align for success, including image quality, performance, scalability, interactivity, ease of use, as well as game-specific and production challenges.
First we will paint a picture of the current state of global illumination in games, addressing how the state of the union compares to the latest and greatest research. We will then explore various GI challenges that game teams face from the art, engineering, pipelines and production perspective. The games industry lacks an ideal solution, so the goal here is to raise awareness by being transparent about the real problems in the field. Finally, we will talk about the future. This will be a call to arms, with the objective of uniting game developers and researchers on the same quest to evolve global illumination in games from being mostly static, or sometimes perceptually real-time, to fully real-time.
This talk provides additional details around the hybrid real-time rendering pipeline we developed at SEED for Project PICA PICA.
At Digital Dragons 2018, we presented how leveraging Microsoft's DirectX Raytracing enables intuitive implementations of advanced lighting effects, including soft shadows, reflections, refractions, and global illumination. We also dove into the unique challenges posed by each of those domains, discussed the tradeoffs, and evaluated where raytracing fits in the spectrum of solutions.
Talk by Graham Wihlidal (Frostbite Labs) at GDC 2017.
Checkerboard rendering is a relatively new technique, popularized recently by the introduction of the PlayStation 4 Pro. Many modern game engines are adding support for it right now, and in this talk, Graham will present an in-depth look at the new implementation in Frostbite, which is used in shipping titles like 'Battlefield 1' and 'Mass Effect Andromeda'. Despite being conceptually simple, checkerboard rendering requires deep integration into the post-processing chain (in particular temporal anti-aliasing and dynamic resolution scaling) and poses various challenges to existing effects. This presentation will cover the basics of checkerboard rendering, explain the impact on a game engine that powers a wide range of titles, and provide a detailed look at how the current implementation in Frostbite works, including topics like object id, alpha unrolling, gradient adjust, and a highly efficient depth resolve.
Course presentation at SIGGRAPH 2014 by Charles de Rousiers and Sébastien Lagarde at Electronic Arts about transitioning the Frostbite game engine to physically-based rendering.
Make sure to check out the 118-page course notes at: http://www.frostbite.com/2014/11/moving-frostbite-to-pbr/
During the last few months, we have revisited the concept of image quality in Frostbite. The core of our approach was to be as close as possible to a cinematic look. We used the concept of reference to evaluate the accuracy of produced images. Physically based rendering (PBR) was the natural way to achieve this. This talk covers all the different steps needed to switch a production engine to PBR, including the small details often bypassed in the literature.
The state of the art of real-time PBR techniques allowed us to achieve good overall results, but not without production issues. We present techniques for improving convolution times for image-based reflections, proper ambient occlusion handling, and coherent lighting units, which are mandatory for level editing.
Moreover, we have managed to reduce the quality gap, highlighted by our systematic reference comparison, in particular related to rough material handling, glossy screen space reflection, and area lighting.
The technical part of PBR is crucial for achieving good results, but represents only the tip of the iceberg. Frostbite has become the de facto high-end game engine within Electronic Arts and is now used by a large number of game teams. Moving all these game teams from old-fashioned lighting to PBR has required a lot of education, which has been done in parallel with the technical development. We have provided editing and validation tools to help the transition of art production. In addition, we have built a flexible material parametrisation framework to adapt to the various authoring tools and game teams' requirements.
SIGGRAPH 2018 - Full Rays Ahead! From Raster to Real-Time Raytracing - Electronic Arts / DICE
In this presentation, part of the "Introduction to DirectX Raytracing" course, Colin Barré-Brisebois of SEED discusses some of the challenges the team faced when going from raster to real-time raytracing for Project PICA PICA.
A Certain Slant of Light - Past, Present and Future Challenges of Global Illumination in Games - Electronic Arts / DICE
Global illumination (GI) has been an ongoing quest in games. The perpetual tug-of-war between visual quality and performance often forces developers to take the latest and greatest from academia and tailor it to push the boundaries of what has been realized in a game product. Many elements need to align for success, including image quality, performance, scalability, interactivity, ease of use, as well as game-specific and production challenges.
First we will paint a picture of the current state of global illumination in games, addressing how the state of the union compares to the latest and greatest research. We will then explore various GI challenges that game teams face from the art, engineering, pipelines and production perspective. The games industry lacks an ideal solution, so the goal here is to raise awareness by being transparent about the real problems in the field. Finally, we will talk about the future. This will be a call to arms, with the objective of uniting game developers and researchers on the same quest to evolve global illumination in games from being mostly static, or sometimes perceptually real-time, to fully real-time.
This presentation was given at SIGGRAPH 2017 by Colin Barré-Brisebois (EA SEED) as part of the Open Problems in Real-Time Rendering course.
This presentation gives an overview of the rendering techniques used in KILLZONE 2. We put the main focus on the lighting and shadowing techniques of our deferred shading engine and how we made them play nicely with anti-aliasing.
A Practical and Robust Bump-mapping Technique for Today’s GPUs (slides) - Mark Kilgard
I presented this on May 8, 2000 to the Stanford Shading Group in Palo Alto, California. The presentation explains how to use the then state-of-the-art NVIDIA register combiners of the GeForce 256 to implement per-pixel bump mapping, a technique that is now ubiquitous in most 3D computer games.
Volumetric Lighting for Many Lights in Lords of the Fallen - Benjamin Glatzel
In this session I’m going to give you an in-depth insight into the design and the implementation of the volumetric lighting system we’ve developed for ‘Lords of the Fallen’. The system allows the simulation of countless volumetric lighting effects in parallel while still being a feasible solution on next-gen consoles.
This presentation was held at the Digital Dragons 2014 conference.
Videos shown during the talk are available here: http://bglatzel.movingblocks.net/publications
The most important part of a modern PostFX pipeline is picking the right color model to support. With the right choice, the whole PostFX pipeline can use 32-bit render targets while still gaining improved color and luminance precision.
An illumination model, also called a lighting model and sometimes referred to as a shading model, is used to calculate the intensity of light that we should see at a given point on the surface of an object.
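As a concrete instance of such a model, here is a minimal Lambert-style sketch computing the intensity at a surface point from an ambient plus a diffuse term; the function name and the constants `ka` and `kd` are illustrative choices, not taken from any particular engine:

```python
def lambert_intensity(normal, light_dir, ka=0.1, kd=0.8):
    """Intensity at a surface point: an ambient term plus a diffuse
    (Lambert) term proportional to the cosine between the surface
    normal and the direction toward the light. Back-facing light
    contributes nothing (the cosine is clamped to zero)."""
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return ka + kd * ndotl
```

A specular term (as in Phong or Blinn-Phong) would be added on top of this in the same fashion.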
Caustic Object Construction Based on Multiple Caustic PatternsBudianto Tandianus
Presented at WSCG 2012 (http://www.wscg.cz/) in Plzeň, Czech Republic.
Published in the Journal of WSCG, Vol. 20, No. 1, pp. 37-46, ISSN 1213-6972, Union Agency, 2012.
Green Custard Friday Talk 17: Ray Tracing - Green Custard
In Green Custard's 17th Friday talk, Jonathan explores the subject of Ray Tracing.
Topics covered:
- What is ray tracing?
- How do we ray trace?
- Backwards ray tracing
- Shadow rays
- Object intersections
- Reflected rays
- Transmitted rays
- Local colour
- Ambient and diffuse
- Specular reflection
- Local colour formula
- Materials and textures
- Distributed ray tracing
- Global illumination
Green Custard is a custom software development consultancy. To discover more about their work and the team visit www.green-custard.com.
6. Requirement for Real-Time GI
• Pre-calculated lighting disadvantages
– hard to mimic a 24-hour cycle
– storing the light or radiosity maps on disk or even the DVD / Blu-ray requires a lot of memory
– streaming the light or radiosity maps from disk or hard drive through hardware to the GPU consumes valuable memory bandwidth
– geometry with light maps or radiosity maps is no longer destructible (a reason to avoid any solution with pre-computed maps)
– while the environment is lit nicely, it is hard to light characters in a way consistent with this environment
7. Requirement for Real-Time GI
• Next-gen game requirements:
– as little pre-calculation as possible
– as little memory consumption as possible
-> as much as possible is calculated on the fly
– 24-hour sun movement
– destructible geometry
8. Ambient Cube
• Ambient Cube (as used in Half-Life 2)
(http://www2.ati.com/developer/gdc/D3DTutorial10_Half-Life2_Shading.pdf)
-> interpolate between six color values
• The six colors are stored spatially in the level data:
– they represent the ambient light flowing through
that volume of space
– the application looks them up to determine the six
ambient colors to use for a given model
• HL2: local lights that aren’t important enough to go
directly into the vertex shader are also added to the
ambient cube
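The interpolation between the six colors can be sketched in a few lines. A minimal Python sketch, assuming the common data layout of six RGB colors ordered +X, -X, +Y, -Y, +Z, -Z; the squared normal components serve as blend weights, as in the Half-Life 2 approach:

```python
# Sketch of the ambient cube lookup (assumed layout: six RGB colors
# ordered +X, -X, +Y, -Y, +Z, -Z). The squared components of a unit
# normal sum to 1, so they can be used directly as blend weights.

def ambient_cube(normal, cube):
    nx, ny, nz = normal
    sq = (nx * nx, ny * ny, nz * nz)  # blend weights, sum to 1 for a unit normal
    # pick the +/- face per axis based on the sign of the component
    faces = (
        cube[0] if nx >= 0.0 else cube[1],
        cube[2] if ny >= 0.0 else cube[3],
        cube[4] if nz >= 0.0 else cube[5],
    )
    # blend the three selected face colors
    return tuple(
        sq[0] * faces[0][c] + sq[1] * faces[1][c] + sq[2] * faces[2][c]
        for c in range(3)
    )
```

Because the weights already sum to 1, the result needs no renormalization; a normal pointing straight up returns exactly the +Y color.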
9. Diffuse Cube Mapping
• [Brennan]
• Steps:
– render the environment into a cube map, or use a
static map
– blur it
– fetch it by indexing with the vertex normal
• The vertex normal indexes the cube map so that the
lookup returns the total amount of light scattered
from that direction
12. Screen-Space Ambient Occlusion
• See [Kajalin] for the original approach
• The original approach requires only the depth buffer
• Uses a sphere-like sampling kernel
13. Screen-Space Ambient Occlusion
• Computes the amount of solid geometry around a
point
• The ratio between solid and empty space
approximates the amount of occlusion
• Examples (fraction of the sampling sphere that is solid):
– P1 is located on a flat plane -> 0.5
– P2 is located in a corner -> 0.75
– P3 is located on an edge -> 0.25
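The solid/empty ratio can be checked numerically. A minimal Python sketch (illustrative only, not shader code) samples random points inside a sphere around P and counts how many fall into solid geometry:

```python
import random

# Monte-Carlo check of the solid/empty ratio idea: sample points inside a
# sphere around P and count how many fall into solid geometry.

def occlusion_ratio(is_solid, center, radius, n=20000, seed=1):
    rng = random.Random(seed)
    solid = 0
    for _ in range(n):
        # rejection-sample a point inside the unit sphere
        while True:
            x, y, z = rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1)
            if x * x + y * y + z * z <= 1.0:
                break
        p = (center[0] + radius * x, center[1] + radius * y, center[2] + radius * z)
        if is_solid(p):
            solid += 1
    return solid / n

# P1 on a flat plane (solid half-space z < 0) -> ratio near 0.5
flat = occlusion_ratio(lambda p: p[2] < 0.0, (0.0, 0.0, 0.0), 1.0)
# P2 in a right-angle corner (two solid half-spaces) -> ratio near 0.75
corner = occlusion_ratio(lambda p: p[2] < 0.0 or p[0] < 0.0, (0.0, 0.0, 0.0), 1.0)
```

For a point on a flat plane the ratio comes out near 0.5, and near 0.75 in a right-angle corner, matching the example values above.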
14. Screen-Space Ambient Occlusion
• A very simplified version of the source code comes
down to:
// very simplified version …
float fOcclusion = 0.0f; // accumulated occlusion
float fPixelDepth = Depth.Sample( DepthSampler, IN.tex.xy ).r;
for ( int i = 0; i < 8; i++ )
{
// vRotatedKernel: screen position of the i-th kernel tap,
// rotated by per-pixel noise (setup omitted here)
float fSampleDepth = Depth.Sample( DepthSampler, vRotatedKernel.xy ).r;
float fDelta = max( fSampleDepth - fPixelDepth, 0 );
fOcclusion += fDelta;
}
15. Screen-Space Ambient Occlusion
• Designing an SSAO kernel:
– distance attenuation
-> not necessary if the density of the samples is higher closer to the
center of the kernel
– noise is necessary because even 16 taps spread out are not enough
– running on a quarter-sized render target makes large kernels like
16 taps possible
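One common way to build such a kernel (a hedged sketch; the exact distribution in [Kajalin] may differ) is to pick uniform directions and bias the radius toward zero, for example by squaring a uniform random value:

```python
import math, random

# Sketch of an SSAO kernel whose sample density is higher near the center,
# so explicit distance attenuation can be skipped.

def make_kernel(num_taps=16, seed=7):
    rng = random.Random(seed)
    kernel = []
    for _ in range(num_taps):
        # uniform direction on the unit sphere
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        s = math.sqrt(1.0 - z * z)
        d = (s * math.cos(phi), s * math.sin(phi), z)
        r = rng.random() ** 2  # squaring biases samples toward the center
        kernel.append((d[0] * r, d[1] * r, d[2] * r))
    return kernel
```

The per-pixel noise from the slide would then rotate this precomputed kernel differently at each pixel.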
18. Screen-Space Global Illumination
• Every pixel in the G-Buffer is considered a secondary
light source -> point lights
• The normal is considered the light direction
• Radiant intensity emitted at point p in direction ω:
Ip(ω) = Φp · max{0, np · ω}
Φp – the flux; defines the brightness of the pixel light
np – normal at point p; gives the spatial emission
19. Screen-Space Global Illumination
• Irradiance at surface point x with normal n due to
pixel light p is thus:
Ep(x, n) = Φp · max{0, np · (x − xp)} · max{0, n · (xp − x)} / ‖x − xp‖⁴
Φp – the flux; defines the brightness of the pixel light
np – normal at point p; gives the spatial emission
• This is evaluated for the indirect irradiance at each
surface point x with normal n
[Figure: sampling point PSampling with NormalSampling and target point
PTarget with NormalTarget, connected by the vector V]
20. Screen-Space Global Illumination
• PSampling’s contribution to PTarget (outgoing cosine):
ISampling = NSampling · (−V)
• PTarget’s contribution (incoming cosine):
ITarget = NTarget · V
• Attenuation:
A = 1.0 / D²
• Result:
Result = ISampling * ITarget * A * ColorScreenSpaceRT
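Putting the four terms together, a minimal Python sketch (names are illustrative; V points from the target pixel to the sampling pixel, as in the figure):

```python
import math

# Per-sample SSGI transfer: outgoing cosine at the sampling pixel,
# incoming cosine at the target pixel, 1/D^2 distance attenuation.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ssgi_contribution(p_target, n_target, p_sampling, n_sampling, color):
    v = tuple(s - t for s, t in zip(p_sampling, p_target))  # target -> sampling
    d2 = dot(v, v)                                          # squared distance D^2
    inv_len = 1.0 / math.sqrt(d2)
    v = tuple(x * inv_len for x in v)
    i_sampling = max(0.0, dot(n_sampling, tuple(-x for x in v)))  # outgoing cosine
    i_target = max(0.0, dot(n_target, v))                         # incoming cosine
    a = 1.0 / d2                                                  # attenuation
    return tuple(i_sampling * i_target * a * c for c in color)
```

Two facing pixels one unit apart transfer the sampling pixel's full color; at twice the distance the transfer drops to a quarter.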
21. Screen-Space Global Illumination
• Issues
– all the issues from SSAO
– does not consider occlusion for the indirect light sources
-> color changes based on viewing angle
-> different with reflective shadow maps …
22. Reflective Shadow Maps
• Extended shadow map [Dachsbacher] – a Multiple-Render-Target (MRT)
that holds
– depth buffer
– normal
– world-space position
– flux
• Each pixel in the MRT is considered a secondary light source, a local
point light
-> called a pixel light
• Radiant intensity emitted at point p in direction ω:
Ip(ω) = Φp · max{0, np · ω}
Φp – the flux; defines the brightness of the pixel light
np – normal at point p; gives the spatial emission
23. Reflective Shadow Maps
• Irradiance at surface point x with normal n due to
pixel light p is thus:
Ep(x, n) = Φp · max{0, np · (x − xp)} · max{0, n · (xp − x)} / ‖x − xp‖⁴
Φp – the flux; defines the brightness of the pixel light
np – normal at point p; gives the spatial emission
• This is evaluated for the indirect irradiance at each
surface point x with normal n
• Does not consider occlusion for the indirect light
sources -> color changes based on viewing angle
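The irradiance term described on this slide can be evaluated directly; a minimal Python sketch (note that the direction vectors stay unnormalized, which is why the denominator uses the fourth power of the distance):

```python
# Direct evaluation of the irradiance of one pixel light:
# Ep(x, n) = Phi_p * max{0, np . (x - xp)} * max{0, n . (xp - x)} / ||x - xp||^4
# The two missing vector lengths are absorbed by the fourth power of the
# distance, leaving the usual cosine terms over squared distance.

def rsm_irradiance(x, n, x_p, n_p, flux):
    d = tuple(a - b for a, b in zip(x, x_p))                   # x - xp
    dist2 = sum(c * c for c in d)
    cos_out = max(0.0, sum(a * b for a, b in zip(n_p, d)))     # np . (x - xp)
    cos_in = max(0.0, sum(a * -b for a, b in zip(n, d)))       # n . (xp - x)
    return flux * cos_out * cos_in / (dist2 * dist2)
```

Doubling the distance between two facing points quarters the result, i.e. the familiar inverse-square falloff with unit cosines.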
24. Reflective Shadow Maps
• Challenge: a typical RSM is 512x512 -> evaluating all
of those pixel lights is too expensive
• Importance sampling -> evaluate about 400 samples
• How to pick the sampling points:
1. Two uniformly distributed random numbers ξ1, ξ2 are applied to polar
coordinates: the sample offset is (rmax ξ1 sin(2π ξ2), rmax ξ1 cos(2π ξ2))
2. The varying density is compensated by weighting each sample with ξ1²
ξ1 – uniformly distributed random number
ξ2 – uniformly distributed random number
• Precompute the sampling pattern and
re-use it for all indirect light computations
• Still too expensive -> screen-space interpolation
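The two-step sampling recipe can be sketched in a few lines of Python (rmax and the sample count are illustrative; the ξ1² weight compensates for the higher sample density near the center of the polar pattern):

```python
import math, random

# Precomputed RSM importance-sampling pattern: polar coordinates driven by
# two uniform random numbers, with a density-compensating weight per sample.

def rsm_sampling_pattern(num_samples=400, r_max=0.1, seed=3):
    rng = random.Random(seed)
    pattern = []
    for _ in range(num_samples):
        xi1, xi2 = rng.random(), rng.random()
        dx = r_max * xi1 * math.sin(2.0 * math.pi * xi2)
        dy = r_max * xi1 * math.cos(2.0 * math.pi * xi2)
        weight = xi1 * xi1  # compensates the higher density near the center
        pattern.append((dx, dy, weight))
    return pattern
```

As the slide says, this pattern is computed once and re-used for all indirect light computations.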
25. Reflective Shadow Maps
• Screen-Space Interpolation
1. pass – compute the indirect illumination for a low-resolution image in screen space
2. pass – render at full resolution and check if we can interpolate from the four
surrounding low-res samples
• A low-res sample is regarded as suitable for interpolation if
– the sample’s normal is similar to the pixel’s normal,
– and its world-space location is close to the pixel’s location
• Each sample’s contribution is weighted by the factors used for bilinear interpolation
– if not all four samples are used -> re-normalize the weights
– if three or four of the samples are suitable -> interpolate between those
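The suitability test and the weight re-normalization can be sketched as follows (Python, with illustrative thresholds; a real implementation would use engine-specific epsilons):

```python
# Decide whether a full-res pixel can be interpolated from its four
# surrounding low-res samples, per the suitability rules on the slide.

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def interpolated_indirect(pixel_pos, pixel_normal, samples,
                          normal_eps=0.9, dist_eps=0.5):
    # samples: list of (bilinear_weight, world_pos, normal, indirect_color)
    suitable = []
    for w, pos, n, c in samples:
        dist2 = sum((a - b) ** 2 for a, b in zip(pos, pixel_pos))
        # suitable = similar normal AND nearby world-space position
        if dot3(n, pixel_normal) >= normal_eps and dist2 <= dist_eps ** 2:
            suitable.append((w, c))
    if len(suitable) < 3:
        return None  # fall back: compute this pixel's indirect light at full resolution
    total_w = sum(w for w, _ in suitable)
    # re-normalize the bilinear weights if a sample was rejected
    return [sum(w * c[i] for w, c in suitable) / total_w for i in range(3)]
```

If fewer than three of the four samples pass the test, the function signals a fall-back to the full-resolution path.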
26. Reflective Shadow Maps
• How to pick important pixel lights:
1. Not visible in the shadow map: x is not directly illuminated -> not visible in the shadow map
2. Normals point away: x-1 and x-2 are relatively close, but since their normals point away from x they do not contribute
3. Same floor: x1 is very close but lies on the same floor -> won’t contribute
4. Where 1 - 3 do not apply: x2 will contribute
5. What is important: the distance between x and a pixel light xp in the shadow map is a reasonable approximation of their distance in world space
-> if the depth values with respect to the light source differ significantly, the world-space distance is much bigger and we would thus overestimate the
influence
-> important indirect lights will always be close, and these must also be close in the shadow map
27. Splatting Indirect Illumination
• Builds on RSM
• Instead of iterating over all image pixels and gathering
indirect light from the RSM:
-> select a subset of RSM pixels == pixel lights and
distribute their light to the image pixels, using a splat
in screen space
-> the deferred-lighting idea re-visited
28. Splatting Indirect Illumination
• The Reflective Shadow Map stores:
– flux – N.L * Color
– normal – just the world-space normal
– position – offset in the negative direction of the normal
– gloss value – stored instead of a specular component
29. Splatting Indirect Illumination
•How to create a Reflective Shadow Map:
RESULT_RSM psCreateRSM( FRAGMENT_RSM fragment )
{
RESULT_RSM result;
float3 worldSpacePos = fragment.wsPos.xyz;
float3 lightVector = normalize( lightPosition - worldSpacePos );
float3 normal = normalize( fragment.normal );
// compute flux
float4 flux = materialColor * saturate( dot( normal.xyz, lightVector ) );
// we squeeze everything into two float4 quadruples, as vertex texture fetches are expensive!
float squeezedFlux = encodeFlux( flux.xyz );
float phongExponent = 1.0f;
float3 mainLightDirection = normal;
// flip the normal used for the position offset below if it faces away from the light
if ( dot( mainLightDirection, lightVector ) < 0.0f ) normal *= -1.0f;
result.color[ 0 ] = float4( mainLightDirection, squeezedFlux );
// we move position a little bit back in the direction of the normal
// this is done to avoid rendering problems along common boundaries of walls -> illumination integral has a
// singularity there
result.color[ 1 ] = float4( worldSpacePos - normal * 0.2f, phongExponent );
return result;
}
30. Splatting Indirect Illumination
• How do we fetch the RSM?
• Splat position: a pre-calculated distribution of sample
points in light space
-> 3D Poisson-disk distribution
Left – sample points in light space / Right – splats in the main view
31. Splatting Indirect Illumination
Pseudo code:
1. Fetch the pre-calculated distribution of sample points in
light space from a texture
2. Calculate the size of the ellipsoid bounding volume in
screen space (see the article for code)
33. Splatting Indirect Illumination
• Going back to the original Reflective Shadow Map
algorithm:
• Irradiance at surface point x with normal n due to
pixel light p is thus:
Ep(x, n) = Φp · max{0, np · (x − xp)} · max{0, n · (xp − x)} / ‖x − xp‖⁴
Φp – the flux; defines the brightness of the pixel light
np – normal at point p; gives the spatial emission
• This is evaluated for the indirect irradiance at each
surface point x with normal n
[Figure: sampling point PSampling with NormalSampling and target point
PTarget with NormalTarget, connected by the vector V]
34. Splatting Indirect Illumination
• PSampling’s contribution to PTarget (outgoing cosine):
ISampling = NSampling · (−V)
• PTarget’s contribution (incoming cosine):
ITarget = NTarget · V
• Attenuation:
A = 1.0 / (D² * Scale + Bias)
• Result:
Result = ISampling * ITarget * A * ColorScreenSpaceRT
36. Splatting Indirect Illumination
• Let’s single-step through the splatting pixel shader
float4 psSI( FRAGMENT_SI fragment ) : COLOR
{
float4 result;
// get surface info for this pixel
float4 wsPos = tex2Dproj( DSSample0, fragment.pos2D );
float4 normal = tex2Dproj( DSSample1, fragment.pos2D ) * 2.0f - 1.0f;
// decode world space position (8 bit render target!)
wsPos = ( wsPos - 0.5f ) * WSSIZE;
// compute lighting
float l2, lR, cosThetaI, cosThetaJ, Fij, phongExp;
// lightPos comes from the RSM read in the vertex shader
// lightPos – pixel light
// wsPos – fragment position
float3 R = fragment.lightPos - wsPos.xyz; // R = vector from fragment to pixel light
l2 = dot( R, R ); // squared length of R (needed again later)
R *= rsqrt( l2 ); // normalize R
lR = ( 1.0f / ( distBias + l2 * INV_WS_SIZE2_PI * 2 ) ); // distance attenuation (there's a global scene scaling factor "...WS_SIZE...")
37. Splatting Indirect Illumination
// lightDir comes from RSM read in vertex shader
// this is the world space normal stored in the RSM of the pixel light == direction of the pixel light
cosThetaI = saturate( dot( fragment.lightDir, -R ) ); // outgoing cosine
phongExp = fragment.lightDir.w;
// with a phong like energy attenuation and widening of the high intensity region
if ( phongExp > 1.0f )
cosThetaI = pow( cosThetaI, phongExp * l2 );
// compare world-space normal at fragment that gets rendered with R
cosThetaJ = saturate( dot( normal, R ) ); // incoming cosine
Fij = cosThetaI * cosThetaJ * lR; // putting everything together
#ifdef SMOOTH_FADEOUT
// screen-space position of center of splat
float3 t1 = fragment.center2D.xyz / fragment.center2D.w;
// screen-space position of the pixel we are currently shading
float3 t2 = fragment.pos2D.xyz / fragment.pos2D.w;
// lightFlux.w holds fade out value based on distance between camera and the pixel light position
// xy is based on screen-space and z is based on the distance between the camera and the pixel position
float fadeOutFactor = saturate( 2 - 6.667 * length( t1.xy - t2.xy ) / fragment.lightFlux.w );
Fij *= fadeOutFactor;
#endif
result = fragment.lightFlux * Fij; // transfer energy!
return result;
}
Green arrow – R
Orange arrow – fragment normal
Blue arrow – pixel light direction
39. Conclusion
• Screen-Space Global Illumination
– generates the impression of color bleeding with the
lowest possible amount of memory
– does not require any additional transform
– ”fits” into a PostFX pipeline
– has all the issues of SSAO, plus false colors
• Reflective Shadow Maps + SII
– higher quality than SSGI
– “fits” into a deferred-lighting engine and any cascaded
shadow map approach
– requires one or more additional G-Buffers rendered from
the point of view of the light
– more expensive
40. References
• [Brennan] Chris Brennan, “Diffuse Cube Mapping”, ShaderX, pp. 287-289
• [Dachsbacher] Carsten Dachsbacher, Marc Stamminger, “Reflective Shadow Maps”,
http://www.vis.uni-stuttgart.de/~dachsbcn/download/rsm.pdf
• [DachsbacherSii] Carsten Dachsbacher, Marc Stamminger, “Splatting Indirect Illumination”,
http://www.vis.uni-stuttgart.de/~dachsbcn/download/sii.pdf
• [Kajalin] Vladimir Kajalin, “Screen-Space Ambient Occlusion”, ShaderX7, pp. 413-424
• [Loos] Bradford James Loos, Peter-Pike Sloan, “Volumetric Obscurance”,
http://www.cs.utah.edu/~loos/publications/vo/vo.pdf
• [Nichols] Greg Nichols, Chris Wyman, “Multiresolution Splatting for Indirect Illumination”,
http://www.cs.uiowa.edu/~cwyman/publications/files/techreports/UICS-TR-08-04.pdf
• [Ritschel] Tobias Ritschel, “Imperfect Shadow Maps for Efficient Computation of Indirect
Illumination”, http://www.uni-koblenz.de/~ritschel/
• [Zink] Jason Zink, “Screen Space Ambient Occlusion”,
http://wiki.gamedev.net/index.php/D3DBook:Screen_Space_Ambient_Occlusion
Editor's Notes
Instead of computing the amount of occlusion produced by geometry hits through ray-tracing, we approximate it by the amount of solid geometry around a point.
ω – greek small letter omega
Φ – greek capital letter Phi
Irradiance – the amount of light or other radiant energy striking a given area of a surface; illumination