For this year's keynote at High Performance Graphics 2018, Colin Barré-Brisebois from SEED discussed the state of the art in real-time game ray tracing. He explored some of the connections between offline and real-time game ray tracing, and presented some of the open problems. Colin exposed a few potential solutions to those problems, and also proposed a call-to-arms on topics where the ray tracing research community and the games industry should unite in order to solve such open problems.
This talk provides additional details around the hybrid real-time rendering pipeline we developed at SEED for Project PICA PICA.
At Digital Dragons 2018, we presented how leveraging Microsoft's DirectX Raytracing enables intuitive implementations of advanced lighting effects, including soft shadows, reflections, refractions, and global illumination. We also dove into the unique challenges posed by each of those domains, discussed the tradeoffs, and evaluated where raytracing fits in the spectrum of solutions.
A Certain Slant of Light - Past, Present and Future Challenges of Global Illumination in Games - Colin Barré-Brisebois
Global illumination (GI) has been an ongoing quest in games. The perpetual tug-of-war between visual quality and performance often forces developers to take the latest and greatest from academia and tailor it to push the boundaries of what has been realized in a game product. Many elements need to align for success, including image quality, performance, scalability, interactivity, ease of use, as well as game-specific and production challenges.
First we will paint a picture of the current state of global illumination in games, addressing how the state of the union compares to the latest and greatest research. We will then explore various GI challenges that game teams face from the art, engineering, pipelines and production perspective. The games industry lacks an ideal solution, so the goal here is to raise awareness by being transparent about the real problems in the field. Finally, we will talk about the future. This will be a call to arms, with the objective of uniting game developers and researchers on the same quest to evolve global illumination in games from being mostly static, or sometimes perceptually real-time, to fully real-time.
This presentation was given at SIGGRAPH 2017 by Colin Barré-Brisebois (EA SEED) as part of the Open Problems in Real-Time Rendering course.
Rendering Technologies from Crysis 3 (GDC 2013) - Tiago Sousa
This talk covers changes in CryENGINE 3 technology during 2012, with DX11 related topics such as moving to deferred rendering while maintaining backward compatibility on a multiplatform engine, massive vegetation rendering, MSAA support and how to deal with its common visual artifacts, among other topics.
Course presentation at SIGGRAPH 2014 by Charles de Rousiers and Sébastien Lagarde at Electronic Arts about transitioning the Frostbite game engine to physically-based rendering.
Make sure to check out the 118 page course notes on: http://www.frostbite.com/2014/11/moving-frostbite-to-pbr/
During the last few months, we have revisited the concept of image quality in Frostbite. The core of our approach was to be as close as possible to a cinematic look. We used the concept of reference to evaluate the accuracy of produced images. Physically based rendering (PBR) was the natural way to achieve this. This talk covers all the different steps needed to switch a production engine to PBR, including the small details often bypassed in the literature.
The state of the art in real-time PBR techniques allowed us to achieve good overall results, but not without production issues. We present some techniques for improving convolution time for image-based reflections, proper ambient occlusion handling, and coherent lighting units, which are mandatory for level editing.
Moreover, we have managed to reduce the quality gap, highlighted by our systematic reference comparison, in particular related to rough material handling, glossy screen space reflection, and area lighting.
The technical part of PBR is crucial for achieving good results, but represents only the tip of the iceberg. Frostbite has become the de facto high-end game engine within Electronic Arts and is now used by a large number of game teams. Moving all these game teams from “old-fashioned” lighting to PBR has required a lot of education, which was done in parallel with the technical development. We have provided editing and validation tools to help the transition of art production. In addition, we have built a flexible material parametrisation framework to adapt to the various authoring tools and game teams’ requirements.
Talk by Fabien Christin from DICE at GDC 2016.
Designing a big city that players can explore by day and by night, while improving on the unique visuals of the first Mirror's Edge game, isn't an easy task.
In this talk, the tools and technology used to render Mirror's Edge: Catalyst will be discussed. From the physical sky to the reflection tech, the speakers will show how they tamed the new Frostbite 3 PBR engine to deliver realistic images with stylized visuals.
They will talk about the artistic and technical challenges they faced and how they tried to overcome them, from the simple light settings and Enlighten workflow to character shading and color grading.
Takeaway
Attendees will gain insight into the technical and artistic techniques used to create a dynamic time-of-day system with updating radiosity and reflections.
Intended Audience
This session is targeted at game artists, technical artists and graphics programmers who want to know more about Mirror's Edge: Catalyst's rendering technology, lighting tools and shading tricks.
Graphics Gems from CryENGINE 3 (SIGGRAPH 2013) - Tiago Sousa
This lecture covers rendering topics related to Crytek's latest engine iteration, the technology that powers titles such as Ryse, Warface, and Crysis 3. Among the covered topics, Sousa presented SMAA 1TX, an update featuring a robust and simple temporal anti-aliasing component, as well as performant, physically plausible camera-related post-processing techniques such as motion blur and depth of field.
Bindless Deferred Decals in The Surge 2 - Philip Hammer
These are the slides for my talk at Digital Dragons 2019 in Krakow.
Update: The recordings are now online on YouTube:
https://www.youtube.com/watch?v=e2wPMqWETj8
CEDEC 2018 - Towards Effortless Photorealism Through Real-Time Raytracing - Electronic Arts / DICE
Real-time raytracing holds the promise of simplifying rendering pipelines, eliminating artist-intensive workflows, and ultimately delivering photorealistic images. This talk by Tomasz Stachowiak provides a glimpse of the future through the lens of SEED's PICA PICA demo: a game made for artificial intelligence agents, with procedural level assembly, and no precomputation. We dive into technical details of several advanced rendering algorithms, and discuss how Microsoft's DirectX Raytracing technology allows for their intuitive implementation. Several challenges remain -- we will take a look at some of them, discuss how real-time raytracing fits in the spectrum of solutions, and start to plot the course towards robust and artist-friendly image synthesis.
SIGGRAPH 2018 - Full Rays Ahead! From Raster to Real-Time Raytracing - Electronic Arts / DICE
In this presentation, part of the "Introduction to DirectX Raytracing" course, Colin Barré-Brisebois of SEED discusses some of the challenges the team went through when moving from raster to real-time raytracing for Project PICA PICA.
With the highest-quality video options, Battlefield 3 renders its Screen-Space Ambient Occlusion (SSAO) using the Horizon-Based Ambient Occlusion (HBAO) algorithm. For performance reasons, the HBAO is rendered in half resolution using half-resolution input depths. The HBAO is then blurred in full resolution using a depth-aware blur. The main issue with such low-resolution SSAO rendering is that it produces objectionable flickering for thin objects (such as alpha-tested foliage) when the camera and/or the geometry are moving. After a brief recap of the original HBAO pipeline, this talk describes a novel temporal filtering algorithm that fixed the HBAO flickering problem in Battlefield 3 with a 1-2% performance hit at 1920x1200 on PC (DX10 or DX11). The talk includes algorithm and implementation details on the temporal filtering part, as well as generic optimizations for SSAO blur pixel shaders. This is joint work between Louis Bavoil (NVIDIA) and Johan Andersson (DICE).
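The pipeline above amounts to a per-pixel temporal filter over the AO: blend the current frame with reprojected history, and reject the history where the reprojected depth no longer matches the current depth (a disocclusion). The sketch below illustrates that idea; the blend factor, depth tolerance, and rejection heuristic are illustrative assumptions, not the shipped Battlefield 3 shader.

```python
def temporal_filter_ao(curr_ao, history_ao, curr_depth, reprojected_depth,
                       blend=0.9, depth_tolerance=0.02):
    """Blend current-frame AO with reprojected history, per pixel.

    History is rejected where the reprojected depth disagrees with the
    current depth (relative difference above `depth_tolerance`), which
    indicates a disocclusion.
    """
    out = []
    for ao, hist, d, d_prev in zip(curr_ao, history_ao,
                                   curr_depth, reprojected_depth):
        if abs(d - d_prev) / max(d, 1e-6) > depth_tolerance:
            out.append(ao)  # disocclusion: fall back to current AO
        else:
            out.append(blend * hist + (1.0 - blend) * ao)
    return out
```

With a matching depth the result leans heavily on history (stability); with a mismatched depth the current value is used as-is (no ghosting).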
Game engines have long been at the forefront of taking advantage of the ever-increasing parallel compute power of both CPUs and GPUs. This talk is about how parallel compute is utilized in practice on multiple platforms today in the Frostbite game engine, and how we think the parallel programming models, hardware and software in the industry should look in the next 5 years to help us make the best games possible.
The presentation describes the physically based lighting pipeline of Killzone: Shadow Fall, a PlayStation 4 launch title. The talk covers the studio's transition to a new asset creation pipeline based on physical properties. It also describes the light rendering systems used in the new 3D engine, built from the ground up for the upcoming PlayStation 4 hardware. A novel real-time lighting model simulating physically accurate area lights will be introduced, as well as a hybrid ray-traced/image-based reflection system.
We believe that physically based rendering is a viable way to optimize asset creation pipeline efficiency and quality. It also enables the rendering quality to reach a new level that is highly flexible depending on art direction requirements.
Secrets of CryENGINE 3 Graphics Technology - Tiago Sousa
In this talk, the authors give an overview of the deferred lighting approach used in CryENGINE 3, along with an in-depth description of the many techniques involved. Original file and videos at http://crytek.com/cryengine/presentations
In this talk, we present results from the real-time raytracing research done at SEED, a cross-disciplinary team working on cutting-edge, future graphics technologies and creative experiences at Electronic Arts. We explain in detail several techniques from “PICA PICA”, a real-time raytracing experiment featuring a mini-game for self-learning AI agents in a procedurally-assembled world. The approaches presented here are intended to inspire developers and provide a glimpse of a future where real-time raytracing powers the creative experiences of tomorrow.
A technical deep dive into the DX11 rendering in Battlefield 3, the first title to use the new Frostbite 2 Engine. Topics covered include DX11 optimization techniques, efficient deferred shading, high-quality rendering and resource streaming for creating large and highly-detailed dynamic environments on modern PCs.
At SIGGRAPH 2018, Colin Barré-Brisebois presented PICA PICA running on NVIDIA's new Turing architecture, with performance comparisons against Volta. A technique for real-time raytraced transparent shadows, developed by Henrik Halén of SEED, was also presented, as well as an experiment with rough glass.
A presentation I did for China GDC 2011.
I cover the basics of visibility optimization and present some practical examples of visibility systems used in modern video games.
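As a flavor of the building block most such visibility systems rest on, here is a minimal sphere-vs-frustum culling test. This is an illustrative sketch, not an example from the talk; the plane convention (inward-facing normals, so inside points satisfy n·p + d >= 0) is an assumption.

```python
def sphere_visible(center, radius, planes):
    """Conservative sphere-vs-frustum test.

    A bounding sphere is culled only if it lies fully behind any one
    frustum plane. Planes are (nx, ny, nz, d) tuples with inward-facing
    normals. The test is conservative: True means "potentially visible".
    """
    cx, cy, cz = center
    for nx, ny, nz, d in planes:
        # Signed distance from sphere center to the plane.
        if nx * cx + ny * cy + nz * cz + d < -radius:
            return False  # fully outside this plane
    return True
```

Higher-level systems (portals, PVS, occlusion queries) refine this same accept/reject decision over groups of objects.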
A talk given to students at the University of Texas's Game Development program. General information about my experiences in the game industry (from ~10 years ago), as well as more recent work in the industry.
GDC 2019 - SEED - Towards Deep Generative Models in Game Development - Electronic Arts / DICE
Deep learning is becoming ubiquitous in Machine Learning (ML) research, and it's also finding its place in industry-related applications. Specifically, deep generative models have proven incredibly useful at generating and remixing realistic content from scratch, making themselves a very appealing technology in the field of AI-enhanced content authoring. As part of this year's Machine Learning Tutorial at the Game Developers Conference 2019 (GDC), Jorge Del Val from SEED will cover in an accessible manner the fundamentals of deep generative modeling, including some common algorithms and architectures. He will also discuss applications to game development and explore some recent advances in the field.
Attendees will gain a basic understanding of the fundamentals of generative models and how to implement them. They will also grasp potential applications in the field of game development to inspire their work and companies. This talk does not require a mathematical or machine learning background, although previous knowledge of either is beneficial.
Henrik Halén (Lead Rendering Programmer) at Electronic Arts presented "Style and Gameplay in Mirror's Edge" at SIGGRAPH 2010's Stylized Rendering in Games course. https://www.cs.williams.edu/~morgan/SRG10/
SyysGraph 2018 - Modern Graphics Abstractions & Real-Time Ray Tracing - Electronic Arts / DICE
Graham Wihlidal and Colin Barré-Brisebois of SEED attended SyysGraph 2018 in Helsinki and presented to the group. The first section described aspects of Halcyon's rendering architecture, including information on explicit heterogeneous and virtual multi-GPU, render graph, and the remote render proxy backend. The second section discussed real-time ray tracing approaches and current open problems. The following day, this presentation was also given as a lecture at Aalto University.
Graham Wihlidal from SEED attended the Munich Khronos Meetup and presented some aspects of Halcyon's rendering architecture, as well as details of the Vulkan implementation. Graham presented components like high-level render command translation, render graph, and shader compilation.
CEDEC 2018 - Functional Symbiosis of Art Direction and Proceduralism - Electronic Arts / DICE
This talk by SEED's Anastasia Opara covers the approach for procedural layout generation and placement in Project PICA PICA. The project posed a unique challenge, as the levels were not created for humans but for self-learning AI agents. The level system therefore had to be flexible enough to meet the agents’ needs, ensure navigability and gameplay elements, and adhere to the art direction.
We used Houdini from the very early stages to the final release: from co-creating art-direction to exporting final levels into our in-house RnD engine Halcyon. From this talk, you will learn how in a team of only 3 artists we created a functional symbiosis of art direction and procedural system in under 2 months as well as what challenges and solutions we had during our ‘procedural journey’.
EPC 2018 - SEED - Exploring The Collaboration Between Proceduralism & Deep Learning - Electronic Arts / DICE
Proceduralism is a powerful language of rules, dependencies and patterns that can generate content indistinguishable from manually produced work. Yet there are new opportunities that hold great potential to enhance the existing techniques. In this talk, SEED's Anastasia Opara shares some of the early tests of marrying Proceduralism and Deep Learning and discusses how it can contribute to current workflows.
You can view a recording of the presentation from 2018's Everything Procedural Conference here:
https://www.youtube.com/watch?v=dpYwLny0P8M
The human ability to represent the surrounding world in the form of a ‘language’ is remarkable: it enables us to store information as compact internal abstractions. Proceduralism is also a form of language, where we view the world through rules, dependencies and patterns. And though rules are often perceived as something rigid, engineering them is a fluid and creative task, where analyzing our own thought framework often fuels the design process.
The past few years have seen a sharp increase in the complexity of rendering algorithms used in modern game engines. Large portions of the rendering work are increasingly written in GPU computing languages, and decoupled from the conventional “one-to-one” pipeline stages for which shading languages were designed. Following Tim Foley’s talk from SIGGRAPH 2016’s Open Problems course on shading language directions, we explore example rendering algorithms that we want to express in a composable, reusable and performance-portable manner. We argue that a few key constraints in GPU computing languages inhibit these goals, some of which are rooted in hardware limitations. We conclude with a call to action detailing specific improvements we would like to see in GPU compute languages, as well as the underlying graphics hardware.
This talk was originally given at SIGGRAPH 2017 by Andrew Lauritzen (EA SEED) for the Open Problems in Real-Time Rendering course.
This talk presents the approach Frostbite took to add support for HDR displays. It summarizes Frostbite's previous post-processing pipeline and its issues. Attendees will learn the decisions made to fix these issues, improve the color grading workflow, and support high-quality HDR and SDR output. This session details the display mapping used to implement the "grade once, output many" approach to targeting any display, and why an ad-hoc approach was chosen over filmic tone mapping. Frostbite retained 3D LUT-based grading flexibility, and the accuracy differences of computing these in decorrelated color spaces will be shown. This session also covers the main issues found in early adopter games, differences between HDR standards, optimizations to achieve performance parity with the legacy path, and why supporting HDR can also improve the SDR version.
Takeaway
Attendees will learn how and why Frostbite chose to support High Dynamic Range (HDR) displays. They will understand the issues faced and how they were resolved. This talk will be useful for those yet to support HDR, and will provide discussion points for those who already do.
Intended Audience
The intended audience is primarily rendering engineers, technical artists and artists; specifically those who focus on grading and lighting and those interested in HDR displays. Ideally attendees will be familiar with color grading and tonemapping.
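The "grade once, output many" idea can be illustrated with a toy display-mapping curve: one graded, scene-referred signal is mapped to each target display through a curve parameterized by that display's peak luminance. The Reinhard-style rolloff below is purely illustrative and is not Frostbite's actual display mapping.

```python
def display_map(nits_in, peak_nits):
    """Map a scene-referred luminance (in nits) to a target display.

    Uses a Reinhard-style rolloff that saturates at `peak_nits`, so the
    same graded signal can feed displays with different capabilities.
    Illustrative only - Frostbite's shipped curve differs.
    """
    return nits_in / (1.0 + nits_in / peak_nits)

# One graded value, two targets: an SDR display (~100 nits peak)
# and an HDR display (1000 nits peak).
graded = 400.0
sdr = display_map(graded, 100.0)    # heavily compressed toward 100 nits
hdr = display_map(graded, 1000.0)   # mostly preserved
```

The grading step runs once; only the final mapping differs per output, which is the core of the approach the talk describes.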
Talk by Graham Wihlidal (Frostbite Labs) at GDC 2017.
Checkerboard rendering is a relatively new technique, popularized recently by the introduction of the PlayStation 4 Pro. Many modern game engines are adding support for it right now, and in this talk, Graham will present an in-depth look at the new implementation in Frostbite, which is used in shipping titles like 'Battlefield 1' and 'Mass Effect Andromeda'. Despite being conceptually simple, checkerboard rendering requires a deep integration into the post-processing chain, in particular temporal anti-aliasing, dynamic resolution scaling, and poses various challenges to existing effects. This presentation will cover the basics of checkerboard rendering, explain the impact on a game engine that powers a wide range of titles, and provide a detailed look at how the current implementation in Frostbite works, including topics like object id, alpha unrolling, gradient adjust, and a highly efficient depth resolve.
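The basic resolve step behind checkerboard rendering can be sketched as follows: each frame shades only the pixels on one half of a checkerboard pattern, and the resolve fills the other half from the previous frame. This toy version copies history directly; a real implementation like the one described in the talk reprojects and clamps it and integrates tightly with temporal anti-aliasing.

```python
def checkerboard_resolve(curr, prev, frame_parity):
    """Combine two checkerboard half-frames into a full image.

    Frame N shades pixels where (x + y) % 2 == frame_parity; the
    remaining pixels are filled from the previous frame's resolve.
    `curr` and `prev` are 2D lists of equal size.
    """
    h, w = len(curr), len(curr[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == frame_parity:
                out[y][x] = curr[y][x]   # shaded this frame
            else:
                out[y][x] = prev[y][x]   # filled from history
    return out
```

Alternating `frame_parity` between 0 and 1 each frame shades every pixel once every two frames, halving per-frame shading cost.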
Talk by Yuriy O’Donnell at GDC 2017.
This talk describes how Frostbite handles rendering architecture challenges that come with having to support a wide variety of games on a single engine. Yuriy describes their new rendering abstraction design, which is based on a graph of all render passes and resources. This approach allows implementation of rendering features in a decoupled and modular way, while still maintaining efficiency.
A graph of all rendering operations for the entire frame is a useful abstraction. The industry can move away from “immediate mode” DX11 style APIs to a higher level system that allows simpler code and efficient GPU utilization. Attendees will learn how it worked out for Frostbite.
Presentation by Andrew Hamilton and Ken Brown from DICE at GDC 2016.
Photogrammetry has started to gain steam within the Games Industry in recent years. At DICE, this technique was first used on Battlefield and they fully embraced the technology and workflow for Star Wars: Battlefront. This talk will cover their research and development, planning and production, techniques, key takeaways and plans for the future. The speakers will cover photogrammetry as a technology, but more than that, show that it's not a magic bullet but instead a tool like any other that can be used to help achieve your artistic vision and craft.
Takeaway
Come and learn how (and why) photogrammetry was used to create the world of Star Wars. This talk will cover Battlefront's use of the technology from pre-production to launch as well as some of their philosophies around photogrammetry as a tool. Many visuals will be included!
Intended Audience
A content creator friendly talk intended for pretty much any developer, especially those involved in 3D content creation. It is not a technical talk focused on the code or engineering of photogrammetry. The speakers will quickly cover all basics, so absolutely no prerequisite knowledge required.
In this technical presentation Johan Andersson shows how the Frostbite 3 game engine is using the low-level graphics API Mantle to deliver significantly improved performance in Battlefield 4 on PC and future games from Electronic Arts. He will go through the work of bringing over an advanced existing engine to an entirely new graphics API, the benefits and concrete details of doing low-level rendering on PC and how it fits into the architecture and rendering systems of Frostbite. Advanced optimization techniques and topics such as parallel dispatch, GPU memory management, multi-GPU rendering, async compute & async DMA will be covered as well as sharing experiences of working with Mantle in general.
Technical talk from the AMD GPU14 Tech Day by Johan Andersson of the Frostbite team at DICE/EA about Battlefield 4 on PC, the first title that will use 'Mantle' - a very high-performance low-level graphics API developed in close collaboration between AMD and DICE/EA to get the absolute best performance and experience in Frostbite games on PC.
Talk by Johan Andersson (DICE/EA) in the Beyond Programmable Shading Course at SIGGRAPH 2012.
The other talks in the course can be found here: http://bps12.idav.ucdavis.edu/
For Battlefield 3, DICE took on its most difficult challenge so far. To raise the bar for character quality in games we developed our own deformation rig, combined it with the powerful ANT animation system (used in FIFA) and extensive motion capture usage. To create a believable experience we built and managed enormous amount of assets and ways of keeping these organized. The rigging process was one of the most challenging aspects of production, with the smallest change requiring an update for almost every single asset. With a modular rigging system and a flexible animation pipeline the production team could deliver on time and quality.
2. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Who Am I?
Senior Rendering Engineer at SEED - EA
Cross-discipline pioneering group within Electronic Arts
Combining creativity with applied research
Exploring the future of interactive entertainment with the goal to
enable anyone to create their own games and experiences
Offices in Stockholm, Los Angeles and Montréal
Previously
Principal Rendering Engineer / Technical Director at WB Games Montréal
Rendering Engineer at EA Montréal and DICE
3. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
PICA PICA
Exploratory mini-game & world
For self-learning AI agents to play,
not humans
“Imitation Learning with Concurrent
Actions in 3D Games” [Harmer 2018]
SEED’s Halcyon R&D framework
Goals
Explore hybrid rendering with DXR
Clean and consistent visuals
Procedural worlds [Opara 2018]
No precomputation
4. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Agenda
Motivations
State-of-the-Art
Open Problems and Beyond
5. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Why use ray tracing?
6. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Why not!
12. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Hybrid Rendering Pipeline
13. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Hybrid Rendering Pipeline
Direct shadows
(raytrace or raster)
Lighting
(compute + raytrace)
Reflections
(raytrace or compute)
Deferred shading
(raster)
Global Illumination
(compute and raytrace)
Post processing
(compute)
Transparency & Translucency
(raytrace and compute)
Ambient occlusion
(raytrace or compute)
14. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Reflections
Launch rays from G-Buffer
Trace at half resolution
¼ ray/pixel for reflection
¼ ray/pixel for reflected shadow
Reconstruct at full resolution
Also supports:
Arbitrary normals
Spatially-varying roughness
Extended info:
GDC 2018 & DD 2018
15. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Reflection Pipeline
Importance
sampling
Screen-space
reflection
Ray Tracing
Spatial
reconstruction
Envmap gap fill
Bilateral cleanup
Temporal
accumulation
16. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Materials
Multiple microfacet layers
Rapidly experiment with different looks
Bake down a number of layers for production
Energy conserving
Automatic Fresnel between layers
Inspired by [Weidlich 2007]
Arbitrarily Layered Micro-Facet Surfaces
Unified for all lighting & rendering
Raster, hybrid, path-traced reference
17. Raw Ray Trace Output (ray tracing, importance sampling)
23. Raw Ray Trace Output (ray tracing, importance sampling)
24. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Shadows
Launch a ray towards light
Ray misses → not in shadow
RAY_FLAG_SKIP_CLOSEST_HIT_SHADER
Handled by Miss Shader
Shadowed = !payload.miss;
Soft shadows?
Random direction from cone [PBRT]
Cone width drives penumbra
[1;N] rays & filtering
We used SVGF [Schied 2017]
Temporal accumulation
Multi-pass weighted blur
Variance-driven kernel size
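The cone-sampling step above can be sketched as follows: a minimal C++ sketch of PBRT-style uniform cone sampling, with names of our own (the slides show no code). The returned direction lives in a local frame whose +Z axis points at the light; an engine would rotate it into the light's actual frame.

```cpp
#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };

// Uniformly sample a direction inside a cone of half-angle thetaMax around +Z,
// as in PBRT; u1 and u2 are uniform random numbers in [0, 1).
// The cone width (thetaMax) drives the penumbra size.
Vec3 uniformSampleCone(float u1, float u2, float cosThetaMax) {
    // Interpolate cosTheta uniformly between 1 and cos(thetaMax):
    float cosTheta = (1.0f - u1) + u1 * cosThetaMax;
    float sinTheta = std::sqrt(std::max(0.0f, 1.0f - cosTheta * cosTheta));
    float phi = 2.0f * 3.14159265f * u2;
    return { std::cos(phi) * sinTheta, std::sin(phi) * sinTheta, cosTheta };
}
```

One such direction per pixel (the [1;N] rays above) gives a noisy penumbra that the SVGF-style filtering then cleans up.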
26. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
AO
Random cosine hemi sampling
Launch from g-buffer normals
AO = payload.miss ? 1.0 : 0.0
Integral of the visibility function V over the hemisphere Ω for the point p on a surface with normal n, with respect to the projected solid angle:
A(p) = (1/π) ∫Ω V(p, ω) (n · ω) dω
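The integral above is estimated with Monte Carlo. A minimal sketch, assuming the cosine-weighted hemisphere sampling the slide describes; `rayMisses` stands in for the actual ray trace, and all names are ours.

```cpp
#include <cmath>
#include <algorithm>
#include <functional>

struct Dir { float x, y, z; };

// Cosine-weighted hemisphere sample in the normal's local frame (+Z = normal),
// via Malley's method: lift a uniform disk sample onto the hemisphere.
Dir cosineSampleHemisphere(float u1, float u2) {
    float r = std::sqrt(u1);
    float phi = 2.0f * 3.14159265f * u2;
    float x = r * std::cos(phi), y = r * std::sin(phi);
    return { x, y, std::sqrt(std::max(0.0f, 1.0f - x * x - y * y)) };
}

// Monte Carlo AO: with cosine-weighted samples the (n.w)/pi factor of the
// integrand cancels against the pdf (cosTheta/pi), so the estimator reduces
// to the average of the miss results -- matching AO = payload.miss ? 1 : 0.
float estimateAO(int n, const std::function<bool(const Dir&)>& rayMisses) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) {
        // Deterministic sample points for illustration only:
        float u1 = (i + 0.5f) / n;
        float u2 = std::fmod(i * 0.618034f, 1.0f);
        sum += rayMisses(cosineSampleHemisphere(u1, u2)) ? 1.0f : 0.0f;
    }
    return sum / n;
}
```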
32. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Transparency & Translucency
Ray Tracing enables accurate light scattering
Refractions
Order-independent (OIT)
Variable roughness
Handles multiple IOR transitions
Beer-Lambert absorption
Translucency
Light scattering inside a medium
Improvement over Translucency in Frostbite
[Barré-Brisebois 2011]
Glass and Translucency
33. Transparency
Works for clear and rough glass
Clear
No filtering required
Rough / Blurry
Open cone angle with Uniform Cone
Sampling [PBRT]
Wider cone → more samples
Or temporal filtering
Apply phase function & BTDF
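The Beer-Lambert absorption mentioned above attenuates transmitted light exponentially with the distance travelled inside the medium. A minimal per-channel sketch; the coefficient values used in testing are illustrative, not PICA PICA's.

```cpp
#include <cmath>

struct Rgb { float r, g, b; };

// Beer-Lambert law: transmittance T = exp(-sigma_a * d) per color channel,
// where sigma_a is the absorption coefficient and d the distance the
// refracted ray travels inside the medium before the next IOR transition.
Rgb beerLambert(const Rgb& sigmaA, float distance) {
    return { std::exp(-sigmaA.r * distance),
             std::exp(-sigmaA.g * distance),
             std::exp(-sigmaA.b * distance) };
}
```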
34. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Translucency Breakdown
For every valid position & normal
35. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Translucency Breakdown
For every valid position & normal
Flip normal and push (ray) inside
36. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Translucency Breakdown
For every valid position & normal
Flip normal and push (ray) inside
Launch rays in uniform sphere dist.
(Importance-sample phase function)
37. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Translucency Breakdown
For every valid position & normal
Flip normal and push (ray) inside
Launch rays in uniform sphere dist.
(Importance-sample phase function)
Compute lighting at intersection
38. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Translucency Breakdown
For every valid position & normal
Flip normal and push (ray) inside
Launch rays in uniform sphere dist.
(Importance-sample phase function)
Compute lighting at intersection
Gather all samples
39. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Translucency Breakdown
For every valid position & normal
Flip normal and push (ray) inside
Launch rays in uniform sphere dist.
(Importance-sample phase function)
Compute lighting at intersection
Gather all samples
Update value in texture
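The "uniform sphere distribution" step in the breakdown above can be sketched like this (a stand-in: as the slides note, importance-sampling the phase function is the refinement):

```cpp
#include <cmath>
#include <algorithm>

struct V3 { float x, y, z; };

// Uniformly sample a direction on the unit sphere from two uniform random
// numbers in [0, 1): cosTheta uniform in [-1, 1], phi uniform in [0, 2pi).
V3 uniformSampleSphere(float u1, float u2) {
    float z = 1.0f - 2.0f * u1;
    float r = std::sqrt(std::max(0.0f, 1.0f - z * z));
    float phi = 2.0f * 3.14159265f * u2;
    return { r * std::cos(phi), r * std::sin(phi), z };
}
```

These directions are launched from the point pushed just inside the surface (along the flipped normal), lighting is computed at each intersection, and the gathered samples are written back to the texture as above.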
40. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Translucency Filtering
Can denoise spatially and/or temporally
Temporal: build an update heuristic
Reactive enough for moving lights & objects
Exponential moving average can be OK
Variance-adaptive mean estimation
Based on exponential moving average
Adaptive hysteresis
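A minimal sketch of the variance-adaptive moving average described above. The exact hysteresis mapping is our assumption, not PICA PICA's heuristic; the idea is simply that high variance relative to the mean lowers the hysteresis, so the history reacts faster to moving lights and objects.

```cpp
#include <cmath>
#include <algorithm>

// Exponential-moving-average accumulator with adaptive hysteresis:
// high relative standard deviation -> lower hysteresis -> shorter history.
struct AdaptiveEma {
    float mean = 0.0f, var = 0.0f;
    bool first = true;

    float update(float sample, float hMin = 0.6f, float hMax = 0.98f) {
        if (first) { mean = sample; var = 0.0f; first = false; return mean; }
        float relStdDev = std::sqrt(var) / std::max(std::abs(mean), 1e-4f);
        float h = std::clamp(hMax - relStdDev, hMin, hMax);  // adaptive hysteresis
        float d = sample - mean;
        mean = h * mean + (1.0f - h) * sample;  // EMA of the signal
        var  = h * var  + (1.0f - h) * d * d;   // EMA of squared deviation
        return mean;
    }
};
```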
41. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Improve image
consistency
Minimize “non-art”
overhead steps that
artists have to endure!
[Ritschel 2011]
Global Illumination
No precomputation
No manual parametrization
Adaptive Refinement
Static + Dynamic Scenes
User-Generated Content &
Experiences
44. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Spatial Storage
Can’t afford to solve every frame
Our GI budget: 250k rays / frame
World space surfels
Easy to accumulate
Distribute on the fly
No parameterization
Smooth result by construction
Position, normal, radius, animation info
45. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Surfel Skinning
46. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Surfel Screen Application
Render like deferred light sources
Diffuse only
Smoothstep distance attenuation
Mahalanobis metric
Squared dot of normals angular weight
Inspired by [Lehtinen 2008]
World-space 3D culling
Uniform grid
Each cell holds list of surfels
Find one cell per pixel, use all from list
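The per-surfel weight described above might look like the following sketch; the exact combination is our assumption, and a plain Euclidean falloff stands in for the Mahalanobis distance on the slide (which would make the falloff anisotropic around the surfel).

```cpp
#include <cmath>
#include <algorithm>

struct N3 { float x, y, z; };

static float dot3(const N3& a, const N3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Hermite smoothstep: 1 at t = 1 falling smoothly to 0 at t = 0.
static float smoothstep01(float t) {
    t = std::clamp(t, 0.0f, 1.0f);
    return t * t * (3.0f - 2.0f * t);
}

// Weight of a surfel's diffuse contribution to a pixel: smoothstep distance
// attenuation (1 at the surfel center, 0 at its radius) times the squared
// dot of surfel and pixel normals as the angular term.
float surfelWeight(float distance, float radius,
                   const N3& surfelNormal, const N3& pixelNormal) {
    float falloff = smoothstep01(1.0f - distance / radius);
    float d = std::max(dot3(surfelNormal, pixelNormal), 0.0f);
    return falloff * d * d;
}
```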
48. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Surfel Placement
Surfel Spawning From Camera @ 1% speed
49. DXR & mGPU!
Explicit heterogeneous multi-GPU
Parallel fork-join style
Resources copied through system memory using copy queue
Minimize PCI-E transfers
Approach
Run ray generation on primary GPU
Copy results in sub-regions to the other GPUs
Run tracing phases in parallel on separate GPUs (GPU 1-4)
Copy tracing results back to primary GPU
Run filtering on primary GPU
51. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Noise vs Ghosting vs Perf.
RTRT opens the door to an entirely new class of techniques for games
Still have to build techniques around trade-offs
More samples? less noise, better convergence, worse performance
Reuse samples? better performance, more memory, can add ghosting
The story remains the same: devise algorithms around real constraints
Noise, ghosting, performance, memory, bandwidth
Key discussion elements for your papers for gamedev adoption
RTRT as a high-end feature for a while
Performance is not going to happen overnight - can’t plug RT and just call it a day
Make RT techniques “pluggable” as transition happens
52. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Game Ray Tracing Constraints
Gamedev constraints?
The “Less-Than-N Milliseconds Threshold”:
Game & platform specific cut-off at which a technique can make it into a game
Significant change to aesthetics? Something else will have to go
Can only allocate so much memory and frame time
Sounds limiting… but there are some options
Techniques with unilateral visual gains easily justifiable
Pluggable advantage to ray tracing:
SSR → RT reflections, SSAO → RTAO
Dynamic vs static GI → harder to make pluggable… unless it’s incremental
53. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Transparency
Transparency is far from being solved!
Glass
Clear/rough + filtered + shadowed
Particles & Volumetric Effects
Can use miss shader to update volumes
Raymarching in hit shaders?
Non-trivial blending & filtering
PICA PICA: texture-space OIT with
refractions and scattering
Not perfect, but one step closer
Denoising techniques don’t work so well with transparency (and 1-2 spp)
Transparency in the Maxwell Renderer
54. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Partial Coverage
Foliage
Can still do alpha test
Animated → becomes a real problem
Defocus effects such as motion blur
and depth of field are still intractable
for the same reasons
Need denoising techniques that handle
partial coverage @ 1-2 spp
PBRT (Matt Pharr, Wenzel Jakob, Greg Humphreys)
55. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Specialized Denoising vs Reconstruction
PICA PICA: specialized denoising & reconstruction for view-dependent, light-dependent, or any other term where you can monitor variance across frames [Stachowiak 2018]
Can use general denoising on the whole image, but…
Denoising filter around a specific term will achieve greater quality
Potentially converges faster
One algorithm to rule-them-all?
Lately…
Gradient Estimation for Real-Time Adaptive Temporal Filtering
[Schied 2018]
Also, lots of progress with DL approaches
56. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Validate Against Ground Truth!
Validating against ground truth is key when building RTRT!
And easy to add to your existing real-time engine
Toggle between hybrid and path-tracer when working on a feature
Rapidly compare results against ground truth
Toggle between non-RT techniques and RT
i.e.: SSR → RT reflections, SSAO → RTAO
Check performance & check quality (and where you can cut corners)
PICA PICA: used constantly during production
Multi-layer material & specular, RTAO vs SSAO, Surfel GI vs path-traced GI
No additional maintenance required between shared code
Because of interop!
57. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Taking Advantage of Interop
DirectX offers easy interoperability between raster, compute and ray tracing
Ray Tracing, rasterization and compute shaders can share code & types
Evaluate your actual HLSL material shaders - directly usable for a hybrid ray tracing pipeline
The output from one stage can feed data for another
i.e.: Write to UAVs, read in next stage
i.e.: Prepare rays to be launched, and trace on another (i.e.: mGPU)
i.e.: Can update Shader Table from the GPU
Interop extends opportunities for solving new sparse problems
Interleave raster, compute and ray tracing
Use the power of each stage to your advantage
58. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Ray Tracing Literature
“Not much work done in the literature on the algorithm side for RTRT”
RTRT literature has focused on accelerating “correct” ray tracing
Lots of new challenges with game ray tracing
Literature has to adapt to game ray tracing constraints
Games → budgeted number of rays/frame, rays/pixel, fixed frame times & memory budgets
Games → light transport caches: surfels, voxels, lightmaps
New metrics for RTRT papers to be adopted by games
Q: How many rays-per-pixel needed to have real-time perfect indirect diffuse?
Q: What has to happen to get fully robust PBR real-time raytraced lighting?
With caustics and complex reflections & refractions…
59. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
DirectX Ray Tracing
DXR is somewhat of a black box
Opens up RT to the masses with some level of abstraction to make things simpler
You can still build your own RT approach on top of DXR
Black box = IHVs can optimize for the common use case
Research on acceleration structures will be limited to
IHVs because of the obfuscated nature of the API
Still a need! Get a job at an IHV?
Need ways to accelerate for non-obvious primitives
Ray-triangle with a BVH is the obvious one
How about cone-BVH for ray-marching? (à-la-Aaltonen in Claybook)
60. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
DXR Acceleration Structure
Is DXR’s Top + Bottom Acceleration Structure (AS) = best approach for RTRT?
Performance is not 100% clear
Top vs bottom counts?
Break tri meshes into bottom level blobs?
How many top items is too many?
Frequency of updates vs performance?
Throw everything static into one massive BVH?
Many factors can affect performance
Ray Tracing requires knowing everything about the scene
Hard to say how this scales to large (open) dynamic game worlds
(Diagram: Mesh 1-3 → bottom-level acceleration structures; Inst. 1-5 → top-level acceleration structure; shader table of ray generation (R), miss (M) and hit group (H) records)
61. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Dynamic Scenes
Dynamic scenes are not fully solved yet
Many animated characters…
Moving environments…
Moving foliage & vegetation…
Massive open worlds…
User generated content…
User created experiences…
…
Not clear if the current DXR Top/Bottom
acceleration structure is the best approach
for massively dynamic environments
Complex & Large Scale Environments in EA BioWare’s Anthem
62. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Acceleration Structures (1/)
Best tracing performance comes from one massive static BVH?
Viable for a big static scene, but most games are not static…
Top Level: has to be rebuilt whenever anything interesting happens
Animated objects, objects inserted/removed, LOD swap
Bottom Level: is there a sweet spot?
Lots of individual rigid objects: might have to bake transforms and merge into larger ones
How much memory can you spare for better tracing performance?
2+ levels hierarchies?
+ : Allows splitting worlds into update frequencies, and avoiding updates if nothing has changed
- : Tracking the object-to-world transforms becomes difficult: the ray needs a transform stack
- : Apps can end up creating a large instance hierarchy of single-node trees → bad
63. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Acceleration Structures (2/)
Idea: have two bottom level BVH's, one for static and one for dynamic geo?
Static:
Built & left alone
Traverse it first
Dynamic:
Rebuilt in the background as a long running task
Refitted every frame
After the rebuild swap the dynamic BVH out & refit
+ Supports many animated characters
While not taking a hit for rebuilding BVH for static geometry that doesn’t change.
- Doesn’t solve massive number of materials that games typically have…
64. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Procedural Geometry
Ray-Triangle intersection is the fast path
Well-known problem [Möller 1997] [Baldwin 2016]
Intersection of procedural geometry currently done via Intersection Shaders
Allows custom/parametric intersection code, for arbitrary shapes
Current (slow) path for procedural geometry
Procedural: hair, water, and particles are still a challenge
Hair: all done parametrically via intersection shaders?
Water: can be done via triangles, but reflections + refractions are challenging
Also, handle view-dependent LODing (i.e.: water patches) vs ray-dependent LODing?
IHVs: Procedural == unpredictable
Completely programmable → unpredictable performance and hard to optimize/accelerate
65. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Coherency is key for RTRT performance
Coherent adjacent work performing similar operations & memory access
Camera rays, texture-space shading
Incoherent → thrash caches, kills performance
Reflection, shadows, refraction, Monte Carlo
Managing Coherency
66. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Coherency is key for RTRT performance
Coherent adjacent work performing similar operations & memory access
Camera rays, texture-space shading
Incoherent → thrash caches, kills performance
Reflection, shadows, refraction, Monte Carlo
Hardware?
2009: Caustic Graphics @ 50-100M incoherent rays/sec
PICA PICA: 1920 x 1080 x 60FPS x 2.25 rpp = ~280M rps
Managing Coherency
67. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Coherency is key for RTRT performance
Coherent adjacent work performing similar operations & memory access
Camera rays, texture-space shading
Incoherent thrash caches, kills performance
Reflection, shadows, refraction, Monte Carlo
Hardware?
2009: Caustic Graphics @ 50-100M incoherent rays/sec
PICA PICA: 1920 x 1080 x 60FPS x 2.25 rpp = ~280M rps
Software?
Disney’s Hyperion [Eisenacher 2013]
Sort large out-of-core ray batches & ray-hits for deferred sharing
More R&D and tailor RTRT pipelines to reduce incoherency
[Eisenacher 2013]
Managing Coherency
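The throughput figure above is easy to verify: 1920 × 1080 pixels × 60 frames/s × 2.25 rays/pixel comes to just under 280 million rays per second.

```cpp
// Verifying the slide's ray-throughput arithmetic:
// 1920 x 1080 pixels x 60 frames/s x 2.25 rays/pixel = 279,936,000 rays/s.
double picaPicaRaysPerSecond() {
    return 1920.0 * 1080.0 * 60.0 * 2.25;
}
```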
68. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Coherency: Ray Batch Sizes
Example: 1K materials, fire 100k secondary rays
(Potentially) need to run 1k shaders each on a batch of 100 hits. Viable workload?
Can’t just shove all material variations in the SBT and let IHVs optimize, right?
Tipping point: how many materials are in a typical game scene, and how aggressively do they need to be merged?
Reduce shader variations by 1 order of magnitude, or 2 orders of magnitude?
Uber Shaders + Texture Space Shading?
PICA PICA Uber Shaders: Rebinding constants quite frequently was not an issue
Use texture-space gbuffers, or is that too much memory?
Great progress lately!
IHV / ISV / API / Research to work together to figure out the best way to solve this!
69. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Global Illumination
RTRT doesn’t completely solve real-time GI
Open problem even in offline rendering
Variance still too high
Can reduce frequency
Prefiltering, path-space filtering
Denoising & reconstruction
Pinhole GI & lighting is not solved
Incoherent shading → intractable performance
Have to resort to caching to amortize shading
PICA PICA: caching of GI via surfels
Some issues: only spawn surfels from what you see
Need to solve GI for user-generated content
[Ritschel 2011]
70. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Hybrid++
Ray Tracing primary visibility: we can’t only rely on RT primary visibility
Raster still shows some advantages for primary opaque visibility over RT
A different story with hardware ray tracing? Who knows
New combinations of raster, compute and ray tracing?
PICA PICA was a first step, but what’s next?
71. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Texture Level-of-Detail
What about texture level of detail?
Mipmapping [Williams 1983] is the standard method to avoid texture aliasing:
A screen-space pixel maps to approximately one texel in the mipmap hierarchy
Supported by all GPUs for rasterization via shading quads and derivatives
Left: level-of-detail (λ), partial derivatives and the parallelogram-approximated texture-space footprint of a pixel. Right: mipmap chain
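The λ in the figure is the standard rasterization LOD [Williams 1983]: the log2 of the pixel’s texture-space footprint, estimated from the screen-space derivatives of the texture coordinates. A minimal sketch (derivatives assumed to be in texels):

```cpp
// Standard mipmap level selection: lambda = log2 of the larger of the pixel's
// texture-space footprints along the screen x and y axes.
#include <algorithm>
#include <cassert>
#include <cmath>

float mipLevel(float dudx, float dvdx, float dudy, float dvdy) {
    float lenX = std::sqrt(dudx * dudx + dvdx * dvdx);  // footprint along screen x
    float lenY = std::sqrt(dudy * dudy + dvdy * dvdy);  // footprint along screen y
    return std::log2(std::max(lenX, lenY));  // clamp to the mip range in practice
}
```

A pixel covering 2×2 texels yields λ = 1, i.e. the first downsampled mip.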
72. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Texture Level-of-Detail
No shading quads for ray tracing!
Traditionally: Ray Differentials
Estimates the footprint of a pixel by computing world-space derivatives of the ray with respect to the image plane
Have to differentiate (virtual offset) rays
Heavier payload (12 floats) for subsequent rays can affect performance. Optimize!
Alternative: always sample mip 0 with bilinear filtering (with extra samples)
Leads to aliasing and additional performance cost
[Figure: Ray differentials [Igehy 1999] — a ray R(x) with origin P(x) and direction D(x), the offset ray R(x + ∂x), and the derivatives ∂P(x), ∂D(x) describing how origin and direction change with the image-plane offset ∂x.]
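The “12 floats” in the payload are the four 3-vectors of Igehy-style ray differentials: derivatives of the ray origin and direction with respect to the two image-plane axes. A hedged sketch of the payload and its simplest update (the straight-line transfer step; reflection and refraction need additional terms, and on a real hit the distance t itself varies across the pixel, which is omitted here):

```cpp
// Ray-differential payload per Igehy [1999]: 4 float3s = 12 floats.
#include <cassert>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct RayDiff {
    Vec3 dOdx, dOdy;  // derivatives of the ray origin w.r.t. image x and y
    Vec3 dDdx, dDdy;  // derivatives of the ray direction w.r.t. image x and y
};

// Transfer along a straight segment of length t: O' = O + t*D, so
// dO'/dx = dO/dx + t * dD/dx (direction derivatives are unchanged).
void transfer(RayDiff& rd, float t) {
    rd.dOdx = add(rd.dOdx, scale(rd.dDdx, t));
    rd.dOdy = add(rd.dOdy, scale(rd.dDdy, t));
}
```

Carrying this struct in every ray payload is exactly the bandwidth cost the slide flags as worth optimizing.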
73. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Texture Level-of-Detail
Together with Research, we developed a texture LOD technique for ray tracing:
Heuristic based on triangle properties, a curvature estimate, distance, and incident angle
Similar quality to ray differentials with a single trilinear lookup
Single value stored in the payload
Upcoming publication in Ray Tracing Gems: Tomas Akenine-Möller (NV), Jim Nilsson (NV), Magnus Andersson (NV), Colin Barré-Brisebois (EA), Robert Toth
Barely scratched the surface – still work to do!
Preprint: https://t.co/opJPBiZ6au
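To give a flavor of how a single payload value can replace full ray differentials, here is a hedged, illustrative sketch of a ray-cone-style heuristic in the same spirit (this is not the published formula; the base-LOD term, cone spread, and clamp are all assumptions):

```cpp
// Illustrative ray-cone LOD heuristic: the texture footprint grows with a
// per-triangle base term, hit distance, and the incident angle at the hit.
#include <algorithm>
#include <cassert>
#include <cmath>

float rayConeLOD(float baseLOD,        // per-triangle texel-density term
                 float hitDistance,    // t along the ray
                 float coneSpread,     // cone width growth per unit distance
                 float cosIncidence) { // |n.d| at the hit point
    float width = hitDistance * coneSpread;  // cone width when it reaches the hit
    // Grazing angles stretch the footprint, so divide by the (clamped) cosine.
    return baseLOD + std::log2(width / std::max(cosIncidence, 1e-4f));
}
```

Only the scalar cone state needs to live in the payload, versus the 12 floats of full ray differentials.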
74. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Summary
Real-time ray tracing brings the game to another level
Still a lot of work to do to bridge offline and real-time
DXR provides a playground where researchers and game developers can collaborate even more!
Would love to team up with you to tackle some of the challenges!
75. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
PICA PICA ASSETS
Now available on Sketchfab & SEED’s website. Free use for your R&D & papers!
glTF + FBX + textures. License: CC BY-NC 4.0 https://skfb.ly/6AJCp
76. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Thanks
SEED PICA PICA Team
Tomas Akenine-Möller
Sebastian Aaltonen
Johan Andersson
Joshua Barczak
Jasper Bekkers
Carlos Gonzalez-Ochoa
Jon Greenberg
Henrik Halén
John Hable
Steve Hill
Andrew Lauritzen
Aaron Lefohn
Krzysztof Narkowicz
Richard Schreyer
Tomasz Stachowiak
Graham Wihlidal
Thanks to everyone who shared feedback!
77. SEED // Search for Extraordinary Experiences Division
Stockholm – Los Angeles – Montréal – Remote
www.ea.com/seed
We’re hiring!
78. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
Questions?
Thoughts on RTRT?
79. SEED // Game Ray Tracing: State-of-the-Art and Open Problems
References [Andersson & Barré-Brisebois 2018] Andersson, Johan and Barré-Brisebois, Colin. “Shiny Pixels and Beyond: Real-Time Ray Tracing at SEED”, online.
[Baldwin 2016] Baldwin, Doug and Weber, Michael. “Fast Ray-Triangle Intersections by Coordinate Transformation”, online.
[Barré-Brisebois 2011] Barré-Brisebois, Colin and Bouchard, Marc. “Approximating Translucency for a Fast, Cheap and Convincing Subsurface Scattering Look”, online.
[Barré-Brisebois 2017] Barré-Brisebois, Colin. “A Certain Slant of Light: Past, Present and Future Challenges of Global Illumination in Games”, online.
[Eisenacher 2013] Eisenacher, Christian et al. “Sorted Deferred Shading for Production Path Tracing”, online.
[Harmer 2018] Harmer et al. “Imitation Learning with Concurrent Actions in 3D Games”, online.
[Hillaire 2018] Hillaire, Sébastien. "Real-time Ray Tracing for Interactive Global Illumination Workflows in Frostbite", online.
[Igehy 1999] Igehy, Homan. “Tracing Ray Differentials”, online.
[Jimenez 2016] Jimenez, Jorge et al. “Practical Realtime Strategies for Accurate Indirect Occlusion”, online.
[Lauritzen 2017] Lauritzen, Andrew. “Future Directions for Compute-for-Graphics”, online.
[Möller 1997] Möller, Tomas and Trumbore, Ben. “Fast, Minimum Storage Ray-Triangle Intersection”, online.
[Opara 2018] Opara, Anastasia. “Creativity of Rules and Patterns”, online.
[PBRT] Pharr, Matt. Jakob, Wenzel and Humphreys, Greg. “Physically Based Rendering”, Book, http://www.pbrt.org/.
[Ritschel 2011] Ritschel, Tobias et al. “The State of the Art in Interactive Global Illumination”, online.
[Schied 2017] Schied, Christoph et al. “Spatiotemporal Variance-Guided Filtering: Real-Time Reconstruction for Path-Traced Global Illumination”, online.
[Schied 2018] Schied, Christoph et al. “Gradient Estimation for Real-Time Adaptive Temporal Filtering”, online.
[Stachowiak 2015] Stachowiak, Tomasz. “Stochastic Screen-Space Reflections”, online.
[Stachowiak 2018] Stachowiak, Tomasz. “Stochastic All The Things: Ray Tracing in Hybrid Real-Time Rendering”, online.
[Weidlich 2007] Weidlich, Andrea and Wilkie, Alexander. “Arbitrarily Layered Micro-Facet Surfaces”, online.
[Williams 1983] Williams, Lance. “Pyramidal Parametrics”, online.