This talk provides additional details around the hybrid real-time rendering pipeline we developed at SEED for Project PICA PICA.
At Digital Dragons 2018, we presented how leveraging Microsoft's DirectX Raytracing enables intuitive implementations of advanced lighting effects, including soft shadows, reflections, refractions, and global illumination. We also dove into the unique challenges posed by each of those domains, discussed the tradeoffs, and evaluated where raytracing fits in the spectrum of solutions.
A Certain Slant of Light - Past, Present and Future Challenges of Global Illumination in Games (Electronic Arts / DICE)
Global illumination (GI) has been an ongoing quest in games. The perpetual tug-of-war between visual quality and performance often forces developers to take the latest and greatest from academia and tailor it to push the boundaries of what has been realized in a game product. Many elements need to align for success, including image quality, performance, scalability, interactivity, ease of use, as well as game-specific and production challenges.
First we will paint a picture of the current state of global illumination in games, addressing how the state of the union compares to the latest and greatest research. We will then explore various GI challenges that game teams face from the art, engineering, pipelines and production perspective. The games industry lacks an ideal solution, so the goal here is to raise awareness by being transparent about the real problems in the field. Finally, we will talk about the future. This will be a call to arms, with the objective of uniting game developers and researchers on the same quest to evolve global illumination in games from being mostly static, or sometimes perceptually real-time, to fully real-time.
This presentation was given at SIGGRAPH 2017 by Colin Barré-Brisebois (EA SEED) as part of the Open Problems in Real-Time Rendering course.
For this year's keynote at High Performance Graphics 2018, Colin Barré-Brisebois from SEED discussed the state of the art in real-time game ray tracing. He explored some of the connections between offline and real-time game ray tracing, and presented some of the open problems. Colin exposed a few potential solutions to those problems, and also proposed a call-to-arms on topics where the ray tracing research community and the games industry should unite in order to solve such open problems.
CEDEC 2018 - Towards Effortless Photorealism Through Real-Time Raytracing (Electronic Arts / DICE)
Real-time raytracing holds the promise of simplifying rendering pipelines, eliminating artist-intensive workflows, and ultimately delivering photorealistic images. This talk by Tomasz Stachowiak provides a glimpse of the future through the lens of SEED's PICA PICA demo: a game made for artificial intelligence agents, with procedural level assembly, and no precomputation. We dive into technical details of several advanced rendering algorithms, and discuss how Microsoft's DirectX Raytracing technology allows for their intuitive implementation. Several challenges remain -- we will take a look at some of them, discuss how real-time raytracing fits in the spectrum of solutions, and start to plot the course towards robust and artist-friendly image synthesis.
SIGGRAPH 2018 - Full Rays Ahead! From Raster to Real-Time Raytracing (Electronic Arts / DICE)
In this presentation part of the "Introduction to DirectX Raytracing" course, Colin Barré-Brisebois of SEED discusses some of the challenges the team had to go through when going from raster to real-time raytracing for Project PICA PICA.
The presentation describes the physically based lighting pipeline of Killzone: Shadow Fall, a PlayStation 4 launch title. The talk covers the studio's transition to a new asset creation pipeline based on physical properties. It also describes the light rendering systems used in a new 3D engine built from the ground up for the upcoming PlayStation 4 hardware. A novel real-time lighting model simulating physically accurate area lights is introduced, as well as a hybrid ray-traced / image-based reflection system.
We believe that physically based rendering is a viable way to optimize asset creation pipeline efficiency and quality. It also enables the rendering quality to reach a new level that is highly flexible depending on art direction requirements.
Talk by Graham Wihlidal (Frostbite Labs) at GDC 2017.
Checkerboard rendering is a relatively new technique, popularized recently by the introduction of the PlayStation 4 Pro. Many modern game engines are adding support for it right now, and in this talk, Graham will present an in-depth look at the new implementation in Frostbite, which is used in shipping titles like 'Battlefield 1' and 'Mass Effect Andromeda'. Despite being conceptually simple, checkerboard rendering requires deep integration into the post-processing chain (in particular temporal anti-aliasing and dynamic resolution scaling) and poses various challenges to existing effects. This presentation will cover the basics of checkerboard rendering, explain the impact on a game engine that powers a wide range of titles, and provide a detailed look at how the current implementation in Frostbite works, including topics like object id, alpha unrolling, gradient adjust, and a highly efficient depth resolve.
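The alternating half-resolution pattern at the heart of the technique can be sketched in a few lines. This is a hedged toy model: real checkerboard implementations (including the Frostbite one described above) operate on 2x2 pixel quads with MSAA sample tricks, whereas the hypothetical `shaded` and `coverage` functions below work on single pixels, purely to illustrate that each frame shades half the screen and two consecutive frames together cover all of it.

```python
def shaded(x: int, y: int, frame: int) -> bool:
    """True if pixel (x, y) is shaded on this frame.

    The pattern alternates like a checkerboard and flips every frame,
    so the pixels skipped this frame were shaded last frame.
    """
    return (x + y + frame) % 2 == 0

def coverage(width: int, height: int, frame: int) -> float:
    """Fraction of pixels shaded in a single frame (always one half)."""
    total = sum(shaded(x, y, frame) for y in range(height) for x in range(width))
    return total / (width * height)
```

The missing half is filled in from the previous frame's data, which is why the talk stresses the tight coupling with temporal anti-aliasing and reprojection.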
Course presentation at SIGGRAPH 2014 by Charles de Rousiers and Sébastien Lagarde at Electronic Arts about transitioning the Frostbite game engine to physically-based rendering.
Make sure to check out the 118-page course notes at: http://www.frostbite.com/2014/11/moving-frostbite-to-pbr/
During the last few months, we have revisited the concept of image quality in Frostbite. The core of our approach was to be as close as possible to a cinematic look. We used the concept of reference to evaluate the accuracy of produced images. Physically based rendering (PBR) was the natural way to achieve this. This talk covers all the different steps needed to switch a production engine to PBR, including the small details often bypassed in the literature.
The state of the art of real-time PBR techniques allowed us to achieve good overall results but not without production issues. We present some techniques for improving convolution time for image based reflection, proper ambient occlusion handling, and coherent lighting units which are mandatory for level editing.
Moreover, we have managed to reduce the quality gap, highlighted by our systematic reference comparison, in particular related to rough material handling, glossy screen space reflection, and area lighting.
The technical part of PBR is crucial for achieving good results, but represents only the tip of the iceberg. Frostbite has become the de facto high-end game engine within Electronic Arts and is now used by a large number of game teams. Moving all these game teams from old-fashioned lighting to PBR has required a lot of education, which has been done in parallel with the technical development. We have provided editing and validation tools to help the transition of art production. In addition, we have built a flexible material parametrisation framework to adapt to the various authoring tools and game teams' requirements.
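The "coherent lighting units" mentioned above come down to careful photometric bookkeeping. As a hedged illustration (the exact conventions Frostbite adopted are in the linked course notes; the function names here are made up), an ideal point light's luminous intensity, and the illuminance it produces, follow directly from the definitions of lumen, candela, and lux:

```python
import math

def point_light_intensity(flux_lumens: float) -> float:
    """Luminous intensity (candela) of an ideal point light.

    A point light emitting uniformly spreads its flux over the full
    sphere of 4*pi steradians.
    """
    return flux_lumens / (4.0 * math.pi)

def illuminance(intensity_cd: float, distance_m: float) -> float:
    """Illuminance (lux) received at a distance, via the inverse-square law."""
    return intensity_cd / (distance_m ** 2)
```

Keeping artists' light settings in real units like these is what makes level lighting transferable between scenes and comparable against photographic references.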
Graphics Gems from CryENGINE 3 (SIGGRAPH 2013) - Tiago Sousa
This lecture covers rendering topics related to Crytek's latest engine iteration, the technology which powers titles such as Ryse, Warface, and Crysis 3. Among the covered topics, Sousa presented SMAA 1TX, an update featuring a robust and simple temporal anti-aliasing component; performant and physically plausible camera-related post-processing techniques, such as motion blur and depth of field, were also covered.
We present the technology and ideas behind the unique lighting in Mirror's Edge from DICE, covering how DICE adopted global illumination into their lighting process and Illuminate Labs' current toolbox of state-of-the-art lighting technology.
Talk by Fabien Christin from DICE at GDC 2016.
Designing a big city that players can explore by day and by night, while improving on the unique visuals of the first Mirror's Edge game, isn't an easy task.
In this talk, the tools and technology used to render Mirror's Edge: Catalyst will be discussed. From the physical sky to the reflection tech, the speakers will show how they tamed the new Frostbite 3 PBR engine to deliver realistic images with stylized visuals.
They will talk about the artistic and technical challenges they faced and how they tried to overcome them, from the simple light settings and Enlighten workflow to character shading and color grading.
Takeaway
Attendees will gain insight into the technical and artistic techniques used to create a dynamic time-of-day system with updating radiosity and reflections.
Intended Audience
This session is targeted to game artists, technical artists and graphics programmers who want to know more about Mirror's Edge: Catalyst rendering technology, lighting tools and shading tricks.
Taking Killzone Shadow Fall Image Quality Into The Next Generation (Guerrilla)
This talk focuses on the technical side of Killzone Shadow Fall, the platform exclusive launch title for PlayStation 4.
We present the details of several new techniques that were developed in the quest for next-generation image quality, and the talk uses key locations from the game as examples. We discuss interesting aspects of the new content pipeline, the next-gen lighting engine, usage of indirect lighting, and various shadow rendering optimizations. We also describe the details of volumetric lighting, the real-time reflections system, and the new anti-aliasing solution, and include some details about the image-quality-driven streaming system. A common and very important theme of the talk is temporal coherency and how it was utilized to reduce aliasing and improve rendering quality and image stability above the baseline 1080p resolution seen in other games.
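Temporal coherency of the kind described above usually takes the form of an exponential history blend: each frame's result is folded into a history buffer instead of being displayed raw. A minimal sketch with a hypothetical `temporal_blend` helper (real systems, including the ones in this talk, also reproject the history using motion vectors and clamp it against the current frame's neighborhood to fight ghosting):

```python
def temporal_blend(history: float, current: float, alpha: float = 0.1) -> float:
    """Fold the current frame's sample into the history buffer.

    A small alpha keeps more history (more stable, slower to react);
    alpha = 1.0 discards the history entirely.
    """
    return history + alpha * (current - history)
```

Repeated blending of a noisy signal converges toward its mean, which is why temporally accumulated anti-aliasing and reflections look stable despite using few samples per frame.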
Rendering Technologies from Crysis 3 (GDC 2013) - Tiago Sousa
This talk covers changes in CryENGINE 3 technology during 2012, with DX11 related topics such as moving to deferred rendering while maintaining backward compatibility on a multiplatform engine, massive vegetation rendering, MSAA support and how to deal with its common visual artifacts, among other topics.
In this talk, we present results from the real-time raytracing research done at SEED, a cross-disciplinary team working on cutting-edge, future graphics technologies and creative experiences at Electronic Arts. We explain in detail several techniques from “PICA PICA”, a real-time raytracing experiment featuring a mini-game for self-learning AI agents in a procedurally-assembled world. The approaches presented here are intended to inspire developers and provide a glimpse of a future where real-time raytracing powers the creative experiences of tomorrow.
At SIGGRAPH 2018, Colin Barré-Brisebois presented PICA PICA running on NVIDIA's new Turing architecture, with performance comparisons against Volta. A technique for real-time raytraced transparent shadows, developed by Henrik Halén of SEED, was also presented, as well as an experiment with rough glass.
Syysgraph 2018 - Modern Graphics Abstractions & Real-Time Ray Tracing (Electronic Arts / DICE)
Graham Wihlidal and Colin Barré-Brisebois of SEED attended SyysGraph 2018 in Helsinki and presented to the group. The first section described aspects of Halcyon's rendering architecture, including information on explicit heterogeneous and virtual multi-GPU, render graph, and the remote render proxy backend. The second section discussed real-time ray tracing approaches and current open problems. The following day, this presentation was also given as a lecture at Aalto University.
Slides from Nathan Piasco's ICRA 2019 oral presentation on the paper "Learning Scene Geometry for Visual Localization in Challenging Conditions", a Best Paper in Robot Vision finalist.
This talk is about the experience we gained during the making of the Killzone Shadow Fall announcement demo.
We've gathered all the hard data about our assets, memory, CPU and GPU usage, and a whole bunch of tricks.
The goal of the talk is to help you form a clear picture of what's already possible to achieve on PS4.
The past few years have seen a sharp increase in the complexity of rendering algorithms used in modern game engines. Large portions of the rendering work are increasingly written in GPU computing languages, and decoupled from the conventional “one-to-one” pipeline stages for which shading languages were designed. Following Tim Foley’s talk from SIGGRAPH 2016’s Open Problems course on shading language directions, we explore example rendering algorithms that we want to express in a composable, reusable and performance-portable manner. We argue that a few key constraints in GPU computing languages inhibit these goals, some of which are rooted in hardware limitations. We conclude with a call to action detailing specific improvements we would like to see in GPU compute languages, as well as the underlying graphics hardware.
This talk was originally given at SIGGRAPH 2017 by Andrew Lauritzen (EA SEED) for the Open Problems in Real-Time Rendering course.
How the Universal Render Pipeline unlocks games for you - Unite Copenhagen 2019 (Unity Technologies)
Learn how the Boat Attack demo was created using the Universal Render Pipeline. These slides offer an in-depth look at the features used in the demo, including Shader Graph, Custom Render Passes, Camera Callback, and more.
Speaker:
Andre McGrail - Unity Technologies
Watch the session on YouTube: https://youtu.be/ZPQdm1T7aRs
A talk from the Develop Track at AWE USA 2018 - the World's #1 XR Conference & Expo in Santa Clara, California May 30- June 1, 2018.
Chris Varekamp (Philips Group Innovation, Research): Depth estimation, Processing & Rendering for Dynamic 6DoF VR
In this talk I will discuss how a real-time depth-based processing chain can be built using our experience in stereo-to-depth conversion for autostereoscopic displays.
http://AugmentedWorldExpo.com
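The stereo-to-depth conversion underlying such a processing chain rests on one relation: for a rectified stereo pair, depth is inversely proportional to disparity. A hedged sketch (the function name and parameters are illustrative, not from the talk):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters from stereo disparity: Z = f * B / d.

    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two cameras, in meters
    disparity_px: horizontal pixel shift of a feature between the two views
    """
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive (zero means infinite depth)")
    return focal_px * baseline_m / disparity_px
```

Nearby objects shift more between views (larger disparity), so they resolve to smaller depths; this inverse relationship also means depth precision degrades with distance.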
These are the slides for "Fully Convolutional Refined Auto-Encoding Generative Adversarial Networks for 3D Multi-Object Scenes", my work at the Stanford AI Lab as a visiting scholar.
Special thanks to Christopher Choy and Prof. Silvio Savarese.
Github:
https://github.com/yunishi3/3D-FCR-alphaGAN
Upcoming rendering technology including scriptable render pipelines, advanced lighting options and more.
Presenter: Arisa Scott (Graphics Product Manager, Unity Technologies)
GDC 2019 - SEED - Towards Deep Generative Models in Game Development (Electronic Arts / DICE)
Deep learning is becoming ubiquitous in Machine Learning (ML) research, and it's also finding its place in industry-related applications. Specifically, deep generative models have proven incredibly useful at generating and remixing realistic content from scratch, making themselves a very appealing technology in the field of AI-enhanced content authoring. As part of this year's Machine Learning Tutorial at the Game Developers Conference 2019 (GDC), Jorge Del Val from SEED will cover in an accessible manner the fundamentals of deep generative modeling, including some common algorithms and architectures. He will also discuss applications to game development and explore some recent advances in the field.
Attendees will gain a basic understanding of the fundamentals of generative models and how to implement them, and will grasp potential applications in the field of game development to inspire their work and companies. This talk does not require a mathematical or machine learning background, although previous knowledge of either is beneficial.
Henrik Halén (Lead Rendering Programmer at Electronic Arts) presented "Style and Gameplay in Mirror's Edge" as part of SIGGRAPH 2010's Stylized Rendering in Games course. https://www.cs.williams.edu/~morgan/SRG10/
Graham Wihlidal from SEED attended the Munich Khronos Meetup and presented some aspects of Halcyon's rendering architecture, as well as details of the Vulkan implementation. Graham presented components like high-level render command translation, render graph, and shader compilation.
CEDEC 2018 - Functional Symbiosis of Art Direction and Proceduralism (Electronic Arts / DICE)
This talk by SEED's Anastasia Opara covers the approach for procedural layout generation and placement in Project PICA PICA. The project posed a unique challenge, as the levels were not created for humans but for self-learning AI agents. Therefore, the level system had to be flexible to meet the agents' needs, ensure navigability and gameplay elements, and adhere to the art direction.
We used Houdini from the very early stages to the final release: from co-creating the art direction to exporting final levels into our in-house R&D engine, Halcyon. From this talk, you will learn how a team of only 3 artists created a functional symbiosis of art direction and a procedural system in under 2 months, as well as what challenges and solutions we had during our 'procedural journey'.
EPC 2018 - SEED - Exploring the Collaboration Between Proceduralism & Deep Learning (Electronic Arts / DICE)
Proceduralism is a powerful language of rules, dependencies and patterns that can generate content indistinguishable from manually produced content. Yet there are new opportunities that hold great potential to enhance the existing techniques. In this talk, SEED's Anastasia Opara shares some of the early tests of marrying Proceduralism and Deep Learning and discusses how it can contribute to current workflows.
You can view a recording of the presentation from 2018's Everything Procedural Conference here:
https://www.youtube.com/watch?v=dpYwLny0P8M
The human mechanism of representing the surrounding world in the form of a 'language' is an outstanding ability that enables us to store information as compact internal abstractions. Proceduralism is also a form of language, where we view the world through rules, dependencies and patterns. And though rules are often perceived as something rigid, their engineering is a fluid and creative task, where analyzing our own thought framework often fuels the design process.
This talk presents the approach Frostbite took to add support for HDR displays. It summarizes Frostbite's previous post-processing pipeline and its issues. Attendees will learn about the decisions made to fix these issues, improve the color grading workflow, and support high-quality HDR and SDR output. This session details the display mapping used to implement the "grade once, output many" approach to targeting any display, and why an ad-hoc approach was chosen as opposed to filmic tone mapping. Frostbite retained 3D LUT-based grading flexibility, and the accuracy differences of computing these in decorrelated color spaces will be shown. This session will also include the main issues found in early adopter games, differences between HDR standards, optimizations to achieve performance parity with the legacy path, and why supporting HDR can also improve the SDR version.
Takeaway
Attendees will learn how and why Frostbite chose to support High Dynamic Range [HDR] displays. They will understand the issues faced and how these were resolved. This talk will be useful for those still to support HDR and provide discussion points for those who already do.
Intended Audience
The intended audience is primarily rendering engineers, technical artists and artists; specifically those who focus on grading and lighting and those interested in HDR displays. Ideally attendees will be familiar with color grading and tonemapping.
Talk by Yuriy O’Donnell at GDC 2017.
This talk describes how Frostbite handles rendering architecture challenges that come with having to support a wide variety of games on a single engine. Yuriy describes their new rendering abstraction design, which is based on a graph of all render passes and resources. This approach allows implementation of rendering features in a decoupled and modular way, while still maintaining efficiency.
A graph of all rendering operations for the entire frame is a useful abstraction. The industry can move away from “immediate mode” DX11 style APIs to a higher level system that allows simpler code and efficient GPU utilization. Attendees will learn how it worked out for Frostbite.
Presentation by Andrew Hamilton and Ken Brown from DICE at GDC 2016.
Photogrammetry has started to gain steam within the Games Industry in recent years. At DICE, this technique was first used on Battlefield and they fully embraced the technology and workflow for Star Wars: Battlefront. This talk will cover their research and development, planning and production, techniques, key takeaways and plans for the future. The speakers will cover photogrammetry as a technology, but more than that, show that it's not a magic bullet but instead a tool like any other that can be used to help achieve your artistic vision and craft.
Takeaway
Come and learn how (and why) photogrammetry was used to create the world of Star Wars. This talk will cover Battlefront's use of of the technology from pre-production to launch as well as some of their philosophies around photogrammetry as a tool. Many visuals will be included!
Intended Audience
A content creator friendly talk intended for pretty much any developer, especially those involved in 3D content creation. It is not a technical talk focused on the code or engineering of photogrammetry. The speakers will quickly cover all basics, so absolutely no prerequisite knowledge required.
In this technical presentation Johan Andersson shows how the Frostbite 3 game engine is using the low-level graphics API Mantle to deliver significantly improved performance in Battlefield 4 on PC and future games from Electronic Arts. He will go through the work of bringing over an advanced existing engine to an entirely new graphics API, the benefits and concrete details of doing low-level rendering on PC and how it fits into the architecture and rendering systems of Frostbite. Advanced optimization techniques and topics such as parallel dispatch, GPU memory management, multi-GPU rendering, async compute & async DMA will be covered as well as sharing experiences of working with Mantle in general.
Technical talk from the AMD GPU14 Tech Day by Johan Andersson in the Frostbite team at DICE/EA about Battlefield 4 on PC which is the first title that will use 'Mantle' - a very high-performance low-level graphics API being in close collaboration by AMD and DICE/EA to get the absolute best performance and experience in Frostbite games on PC.
Talk by Johan Andersson (DICE/EA) in the Beyond Programmable Shading Course at SIGGRAPH 2012.
The other talks in the course can be found here: http://bps12.idav.ucdavis.edu/
For Battlefield 3, DICE took on its most difficult challenge so far. To raise the bar for character quality in games we developed our own deformation rig, combined it with the powerful ANT animation system (used in FIFA) and extensive motion capture usage. To create a believable experience we built and managed enormous amount of assets and ways of keeping these organized. The rigging process was one of the most challenging aspects of production, with the smallest change requiring an update for almost every single asset. With a modular rigging system and a flexible animation pipeline the production team could deliver on time and quality.
4. S E E D // Stochastic all the things: Raytracing in hybrid real-time rendering
Programming and Technology Track, Digital Dragons 2018
“PICA PICA”
▪ Exploratory mini-game & world
▪ For our self-learning AI agents to play, not for humans ☺
▪ “Imitation Learning with Concurrent Actions in 3D Games” [Harmer 2018]
▪ Uses SEED’s Halcyon R&D engine
▪ Goals
▪ Explore hybrid rendering with DXR
▪ Clean and consistent visuals
▪ Procedural worlds [Opara 2018]
▪ No precomputation
Agenda
▪ Motivation
▪ Reflections
▪ Translucency and Transparency
▪ Global Illumination
▪ Shadows
Why use raytracing?
Is this a fair comparison?
▪ “Classic” game solutions exist
▪ Local reflection volumes
▪ Planar reflections
▪ Lightmaps
▪ “Sky visibility” maps
▪ …
Hybrid Rendering Pipeline
▪ Deferred shading (raster)
▪ Direct shadows (raytrace or raster)
▪ Direct lighting (compute)
▪ Reflections (raytrace or compute)
▪ Global Illumination (raytrace)
▪ Ambient occlusion (raytrace or compute)
▪ Transparency & Translucency (raytrace)
▪ Post processing (compute)
Materials
▪ Multiple microfacet layers
▪ Rapidly experiment with different looks
▪ Bake down a number of layers for production
▪ Energy conserving
▪ Automatic Fresnel between layers
▪ Inspired by Arbitrarily Layered Micro-Facet Surfaces [Weidlich 2007]
▪ Unified for all lighting & rendering modes
▪ Raster, hybrid, path-traced reference
Objects with Multi-Layered Materials
Raytraced Reflections
▪ Launch rays from G-Buffer
▪ Raytrace at half resolution
▪ ¼ ray/pixel for reflection
▪ ¼ ray/pixel for reflected shadow
▪ Reconstruct at full resolution
▪ Arbitrary normals
▪ Spatially-varying roughness
Raytraced Reflections
Reflection Pipeline
▪ Importance sampling
▪ Screen-space reflection
▪ Raytracing
▪ Spatial reconstruction
▪ Envmap gap fill
▪ Bilateral cleanup
▪ Temporal accumulation
Reflection Sampling
▪ Microfacet importance sampling
▪ Normal distribution function (NDF) part of BRDF
▪ Very few rays, so quality matters
▪ Low-discrepancy Halton sequence
▪ Cranley-Patterson rotation per pixel [PBRT]
▪ Re-roll if ray below horizon
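The sampler described above — a Halton sequence decorrelated per pixel with a Cranley-Patterson rotation — can be sketched as follows (function names are illustrative, not from the talk):

```python
def halton(index, base):
    """Radical inverse of `index` in `base` (one Halton dimension)."""
    f, result = 1.0, 0.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def sample_2d(index, rotation=(0.0, 0.0)):
    """Low-discrepancy 2D sample: Halton bases 2 and 3 plus a per-pixel
    Cranley-Patterson rotation (add a random offset, wrap back into [0, 1))."""
    u = (halton(index, 2) + rotation[0]) % 1.0
    v = (halton(index, 3) + rotation[1]) % 1.0
    return u, v
```

The rotation keeps neighboring pixels from reusing the exact same sample points while preserving the sequence's low discrepancy.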
Layered Material Sampling
▪ One ray for the whole stack
▪ Stochastically select a layer
▪ Based on layer visibility estimate
▪ Bottom layers occluded by top layers
▪ View-dependent due to Fresnel
▪ Approximate with Schlick
▪ Sample BRDF of selected layer
▪ Multiple layers can generate same direction
▪ Query each layer about generated direction
▪ Add up probabilities
Multi-Layered Materials [Weidlich 2007]
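A minimal sketch of the layer-selection scheme above, assuming per-layer visibility estimates and sampler/PDF callables standing in for the real BRDF lobes (all signatures are assumptions):

```python
import random

def sample_layer_stack(visibilities, samplers, pdfs, u=None):
    """One ray for the whole material stack: stochastically pick a layer in
    proportion to its visibility estimate, sample that layer's BRDF, then sum
    the PDFs of every layer that can generate the chosen direction, each
    weighted by its selection probability."""
    total = sum(visibilities)
    u = random.random() if u is None else u
    r, cdf, chosen = u * total, 0.0, len(visibilities) - 1
    for i, w in enumerate(visibilities):
        cdf += w
        if r < cdf:
            chosen = i
            break
    direction = samplers[chosen]()
    # Multiple layers can generate the same direction, so add up probabilities.
    pdf = sum((w / total) * p(direction) for w, p in zip(visibilities, pdfs))
    return direction, pdf
```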
Reflection Raygen
▪ Read generated ray
▪ TraceRay()
▪ Same hit shaders as path-tracing reference
▪ Output packed in 2x RGBA 16F
▪ RGBA #1: Color R, Color G, Color B, G-buffer Depth
▪ RGBA #2: Dir X * Hit T, Dir Y * Hit T, Dir Z * Hit T, Inverse PDF
Why Stochastic?
▪ Gives the right answer, but
▪ Produces noise
▪ Thrashes caches
▪ Alternative: post-filter
▪ Needs arbitrarily large kernel
▪ Introduces bleeding
▪ Sensitive to aliasing
▪ Meet in the middle?
▪ Bias samplers
▪ Combine with spatial filtering
▪ High variance -> increase bias
▪ More research needed!
Image Space Gathering [Robison 2009]
Spatial reconstruction
▪ Based on Stochastic Screen-Space Reflections [Stachowiak 2015]
▪ For every full-res pixel, use 16 half-res ray hits
▪ Poisson-disk distribution around pixel
▪ Scale by local BRDF
▪ Divide by ray PDF
▪ Ratio estimator [Heitz 2018]
result = 0.0
weightSum = 0.0
for pixel in neighborhood:
    # ratio estimator: weight each half-res hit by the local BRDF over its sampling PDF
    weight = localBrdf(pixel.hit) / pixel.hitPdf
    result += color(pixel.hit) * weight
    weightSum += weight
result /= weightSum
Spatial Sample Pattern
▪ Four sample sets for pixels in 2x2 quad
▪ Threshold blue noise into four classes
▪ Class determines sample set
▪ Take 16 samples from each set in a disk
▪ Disjoint high-quality samples
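One way to realize the thresholding step above, assuming blue-noise values in [0, 1): split the range into quartiles, and let each pixel's class pick one of the four disjoint Poisson-disk sample sets.

```python
def noise_class(value, classes=4):
    """Threshold a blue-noise value in [0, 1) into one of `classes` bins;
    each pixel of a 2x2 quad then indexes its own disjoint sample set."""
    return min(int(value * classes), classes - 1)
```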
Bilateral cleanup
▪ Secondary bilateral filter
▪ Much dumber than reconstruction
▪ Introduces blur
▪ Only run where variance high
▪ Variance from spatial reconstruction
▪ Temporally smoothed with hysteresis = 0.5
▪ Variable kernel width, sample count
Variance estimate
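The variance smoothing and variance-driven kernel above might look like the following; the mapping and constants are assumptions, not the talk's exact values:

```python
def smooth_variance(prev, current, hysteresis=0.5):
    """Temporal smoothing of the variance estimate with hysteresis = 0.5,
    i.e. an exponential moving average."""
    return hysteresis * prev + (1.0 - hysteresis) * current

def kernel_width(variance, max_radius=8):
    """Hypothetical mapping from smoothed variance to bilateral kernel radius:
    the filter only widens (and only runs) where variance is high."""
    t = min(max(variance, 0.0), 1.0)
    return max(1, round(t * max_radius))
```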
Temporal Reprojection
▪ Two sources of velocity
▪ Per-pixel motion vector
▪ Reprojection of hit point
Temporal Reprojection
▪ Neither reprojection method is robust on its own
Per-pixel motion vector
Reprojection of hit point
Dual-Source Reprojection
▪ Sample history using both reprojection strategies
▪ Estimate local statistics during spatial reconstruction
▪ RGB mean
▪ RGB standard deviation
▪ Of used hit samples
▪ Weigh contributions
▪ dist = (rgb – rgb_mean) / rgb_dev
▪ w = exp2(-10 * luma(dist))
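The weighting formula above, written out; using Rec. 709 luma weights here is an assumption:

```python
def reprojection_weight(rgb, rgb_mean, rgb_dev, eps=1e-6):
    """Weight one reprojected history sample against the local statistics
    gathered during spatial reconstruction: normalized distance from the mean,
    collapsed to a luma value, mapped through exp2(-10 * luma)."""
    dist = [(c - m) / (d + eps) for c, m, d in zip(rgb, rgb_mean, rgb_dev)]
    luma = 0.2126 * abs(dist[0]) + 0.7152 * abs(dist[1]) + 0.0722 * abs(dist[2])
    return 2.0 ** (-10.0 * luma)
```

Samples that agree with the local neighborhood get a weight near 1; outliers decay rapidly toward 0.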
Reprojection Clamp
▪ Build color box from local statistics
▪ rgb_min = rgb_mean – rgb_dev
▪ rgb_max = rgb_mean + rgb_dev
▪ Clamp reprojected values to box
▪ rgb = clamp(rgb, rgb_min, rgb_max)
▪ Same idea as in temporal anti-aliasing [Karis 2014]
▪ Weigh clamped contributions
▪ rgb_sum += val * w
▪ Introduces bias
▪ Fixes most ghosting
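The clamp itself is a direct transcription of the bullets above, applied per channel:

```python
def clamp_history(rgb, rgb_mean, rgb_dev):
    """Neighborhood color-box clamp, as in temporal AA [Karis 2014]:
    clamp the reprojected value to mean ± standard deviation per channel."""
    lo = [m - d for m, d in zip(rgb_mean, rgb_dev)]
    hi = [m + d for m, d in zip(rgb_mean, rgb_dev)]
    return tuple(min(max(c, l), h) for c, l, h in zip(rgb, lo, hi))
```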
Transparency & Translucency
▪ Raytracing enables accurate light scattering
▪ Refractions
▪ Order-independent (OIT)
▪ Variable roughness
▪ Handles multiple IOR transitions
▪ Beer-Lambert absorption
▪ Translucency
▪ Light scattering inside a medium
▪ Inspired by Translucency in Frostbite [Barré-Brisebois 2011]
Glass and Translucency
Integration
▪ Multiple samples required for rough refractions and translucency
▪ Multi-layer screen-filtering complicated
▪ Layered framebuffers?
▪ Per-pixel linked lists?
▪ Filter in 2D or between layers?
▪ More research needed!
▪ Accumulate in texture-space instead
▪ Stable integration domain
▪ Static texture allocation
▪ 512*512 per object
▪ Time-sliced, ~1M rays/frame
Translucency Breakdown
▪ For every valid position & normal
▪ Flip normal and push (ray) inside
▪ Launch rays in uniform sphere dist.
▪ (Importance-sample phase function)
▪ Compute lighting at intersection
▪ Gather all samples
▪ Update value in texture
Translucency Filtering
▪ Can denoise spatially and/or temporally
▪ Temporal: build an update heuristic
▪ Reactive enough for moving lights & objects
▪ Exponential moving average can be OK
▪ Variance-adaptive mean estimation
▪ Based on exponential moving average
▪ Adaptive hysteresis
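A variance-adaptive moving average along these lines; the hysteresis bounds are assumptions, chosen only to illustrate the idea that high variance lowers hysteresis so the history reacts faster to moving lights and objects:

```python
def update_translucency(prev_mean, new_sample, variance, h_min=0.7, h_max=0.98):
    """Exponential moving average with variance-adaptive hysteresis:
    low variance -> heavy history (stable), high variance -> light history
    (reactive)."""
    t = min(max(variance, 0.0), 1.0)
    hysteresis = h_max - t * (h_max - h_min)
    return hysteresis * prev_mean + (1.0 - hysteresis) * new_sample
```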
Transparency
▪ Similar approach to translucency
▪ Launch ray using view’s origin and direction
▪ Refract based on medium's index of refraction
▪ Sample a BSDF for rough refraction
▪ Trace a ray in the scene & sample lighting
▪ Tint the result
▪ Chromatic aberration from interface
▪ Beer-Lambert absorption in medium
▪ Note: we don’t handle transparent shadows yet
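The Beer-Lambert term mentioned above gives the per-channel transmittance through a homogeneous medium, T = exp(-σ_a · d):

```python
import math

def beer_lambert(distance, absorption):
    """Beer-Lambert transmittance: exponential absorption over the distance
    travelled inside the medium, evaluated per color channel."""
    return tuple(math.exp(-a * distance) for a in absorption)
```

Channels with a higher absorption coefficient darken faster with depth, which is what tints thick glass.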
Global Illumination
▪ Important for consistency
▪ Minimize artist overhead
▪ We want a technique with:
▪ No precomputation
▪ No parametrization (UVs, proxies)
▪ Support for static and dynamic scenes
▪ Adaptive refinement
Spatial Storage
▪ Can’t afford to solve every frame
▪ Our GI budget: 250k rays / frame
▪ World space vs screen space
▪ World space is stable
▪ Screen-space fully dynamic
▪ World space surfels
▪ Easy to accumulate
▪ Distribute on the fly
▪ No parameterization
▪ Smooth result by construction
▪ Position, normal, radius, animation info
Surfel Skinning
Surfel Screen Application
▪ Render like deferred light sources
▪ Diffuse only
▪ Smoothstep distance attenuation
▪ Mahalanobis distance metric
▪ Angular weight: squared dot product of normals
▪ Inspired by [Lehtinen 2008]
▪ World-space 3D culling
▪ Uniform grid
▪ Each cell holds list of surfels
▪ Find one cell per pixel, use all from list
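A rough sketch of the per-pixel surfel weight described above, using a plain Euclidean falloff where the talk uses a Mahalanobis metric (the exact formulation is an assumption):

```python
def surfel_weight(dist, radius, n_pixel, n_surfel):
    """Weight of one surfel's contribution to a pixel: smoothstep distance
    attenuation times a squared-cosine angular term between the pixel's and
    the surfel's normals (back-facing surfels contribute nothing)."""
    t = min(max(1.0 - dist / radius, 0.0), 1.0)
    falloff = t * t * (3.0 - 2.0 * t)  # smoothstep
    cos_n = sum(a * b for a, b in zip(n_pixel, n_surfel))
    return falloff * max(cos_n, 0.0) ** 2
```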
Surfel Placement
Surfel Spawning From Camera @ 1% speed
Surfel Placement Algorithm
▪ Iterative screen-space hole filling
72. S E E D // Stochastic all the things: Raytracing in hybrid real-time rendering
Programming and Technology Track, Digital Dragons 2018
Surfel Placement Algorithm
▪ Iterative screen-space hole filling
▪ Calculate surfel coverage for each pixel
73. S E E D // Stochastic all the things: Raytracing in hybrid real-time rendering
Programming and Technology Track, Digital Dragons 2018
Surfel Placement Algorithm
▪ Iterative screen-space hole filling
▪ Calculate surfel coverage for each pixel
74. S E E D // Stochastic all the things: Raytracing in hybrid real-time rendering
Programming and Technology Track, Digital Dragons 2018
Surfel Placement Algorithm
▪ Iterative screen-space hole filling
▪ Calculate surfel coverage for each pixel
75. S E E D // Stochastic all the things: Raytracing in hybrid real-time rendering
Programming and Technology Track, Digital Dragons 2018
Surfel Placement Algorithm
▪ Iterative screen-space hole filling
▪ Calculate surfel coverage for each pixel
▪ Find lowest coverage in tile (16x16 pixels)
76. S E E D // Stochastic all the things: Raytracing in hybrid real-time rendering
Programming and Technology Track, Digital Dragons 2018
Surfel Placement Algorithm
▪ Iterative screen-space hole filling
▪ Calculate surfel coverage for each pixel
▪ Find lowest coverage in tile (16x16 pixels)
▪ Probabilistically spawn a surfel at that pixel
▪ Chance proportional to the pixel’s projected area
▪ Projected area derived from G-Buffer depth and normal
▪ Continue iterating where coverage is below a threshold
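A minimal sketch of one hole-filling iteration on a tile. The input names (`coverage`, `projected_area`) are hypothetical, and normalizing the spawn probability against the tile's largest projected area is an assumption; the deck does not give the exact probability formula.

```python
import random

def spawn_surfel(coverage, projected_area, tile, threshold=0.5, rng=random.random):
    """One hole-filling iteration on a 16x16 tile (illustrative).

    coverage[y][x]       -- current surfel coverage per pixel
    projected_area[y][x] -- pixel's projected area (from G-Buffer depth/normal)
    tile                 -- list of (x, y) pixel coordinates in the tile
    Returns the pixel where a surfel was spawned, or None.
    """
    # Find the pixel with the lowest coverage in the tile.
    x, y = min(tile, key=lambda p: coverage[p[1]][p[0]])
    if coverage[y][x] >= threshold:
        return None  # tile already sufficiently covered
    # Spawn probabilistically, with chance proportional to the pixel's
    # projected area (normalization against the tile max is an assumption).
    max_area = max(projected_area[p[1]][p[0]] for p in tile)
    if rng() < projected_area[y][x] / max_area:
        return (x, y)
    return None
```

Run every frame, this gradually fills coverage holes as the camera exposes new geometry, which is the behavior shown in the "spawning from camera" captures above.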
Irradiance Calculation
▪ Path trace from surfels
▪ Unidirectional, explicit light connections
▪ More samples for new surfels
[Diagram: rays path traced from a surfel through scene geometry, terminating at the sky or a light source]
▪ Limit light path length
▪ Sample previous GI at last bounce
▪ "Infinite" bounces amortized over frames
[Diagram: at the last bounce, nearby surfels are queried at the hit point to reuse their cached irradiance]
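A toy 1D model of why sampling the previous frame's GI at the last bounce yields "infinite" bounces over time: with a uniform albedo, the cached value converges across frames to the closed-form geometric-series result direct / (1 - albedo).

```python
def amortized_gi(direct, albedo, frames):
    """Toy model: each frame traces a single bounce and reads the
    previous frame's cached GI at the final hit, so additional
    bounces accumulate across frames instead of within one frame."""
    cached = 0.0
    for _ in range(frames):
        cached = direct + albedo * cached  # one bounce + cached GI
    return cached

# After enough frames this approaches the infinite-bounce
# steady state direct / (1 - albedo).
```

The real system converges per-surfel rather than globally, but the mechanism is the same: short paths each frame, multi-bounce light amortized over time.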
Monte Carlo Integration
▪ Monte Carlo assumes an immutable integrand
▪ Plain running-mean estimation
▪ Cannot be used as-is for dynamic GI
▪ A hard reset of the accumulator is undesirable
Interactive GI in Frostbite [Hillaire 2018]
x̄_0 = 0
x̄_{n+1} = lerp(x̄_n, x_{n+1}, 1/(n+1))
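The recurrence above is exactly the plain arithmetic mean, computed incrementally; a quick sketch:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def running_mean(samples):
    """Incremental mean: x̄_{n+1} = lerp(x̄_n, x_{n+1}, 1/(n+1)).
    Identical to the arithmetic mean of all samples seen so far."""
    mean = 0.0
    for n, x in enumerate(samples):
        mean = lerp(mean, x, 1.0 / (n + 1))
    return mean
```

Because the blend weight 1/(n+1) shrinks toward zero, new samples eventually stop mattering, which is precisely what breaks when the integrand (the lighting) changes.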
Adaptive Integration
▪ Give up on fully converging
▪ Exponential moving average
▪ Estimate short-term statistics
▪ Mean, variance
▪ Adapt the blend factor k
▪ Reduce k when variance is high
▪ Increase k when the mean drifts far from the short-term statistics
x̄_0 = 0
x̄_{n+1} = lerp(x̄_n, x_{n+1}, k)
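A sketch of the adaptive blend. The clamping range and the exact rule for choosing k are illustrative guesses, not the deck's formula; the intent they encode matches the bullets: distrust new samples when they are noisy, track faster when the long-term mean has drifted from the short-term statistics.

```python
def lerp(a, b, t):
    return a + (b - a) * t

def adaptive_blend(mean, short_mean, short_var, k_min=0.01, k_max=0.5):
    """Choose the EMA blend factor k from short-term statistics
    (the adaptation rule here is an illustrative guess)."""
    noise = short_var ** 0.5
    drift = abs(mean - short_mean)
    # More drift relative to noise -> larger k (faster tracking).
    k = drift / (drift + noise + 1e-6)
    return min(k_max, max(k_min, k))

def adaptive_mean(mean, sample, short_mean, short_var):
    """One step of the adaptive exponential moving average."""
    return lerp(mean, sample, adaptive_blend(mean, short_mean, short_var))
```

Unlike the 1/(n+1) schedule, k never decays to zero, so the estimator keeps responding to lighting changes at the cost of never fully converging.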
[Plot: adaptive integration in 1D, comparing the exponential moving average and the adaptive mean against the short-term distribution bounds]
Shadows
▪ Hard shadows trivial
▪ Launch a ray towards light position
▪ Check for any hit
▪ Soft shadows easy
▪ Sample direction from uniform cone [PBRT]
▪ Cone width determines penumbra
▪ … Easy but noisy
▪ 1 visibility sample per pixel not enough
▪ Filter instead of shooting more
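A sketch of the uniform-cone direction sampling from [PBRT], around the +Z axis; rotating the result into the actual light direction's frame is omitted for brevity.

```python
import math
import random

def sample_uniform_cone(cos_theta_max, u1=None, u2=None):
    """Uniformly sample a direction inside a cone around +Z, as in
    [PBRT]. cos_theta_max is the cosine of the cone half-angle,
    which controls the penumbra width: a wider cone gives softer
    shadows (and more noise per sample)."""
    if u1 is None:
        u1 = random.random()
    if u2 is None:
        u2 = random.random()
    cos_theta = (1.0 - u1) + u1 * cos_theta_max
    sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    phi = 2.0 * math.pi * u2
    return (math.cos(phi) * sin_theta, math.sin(phi) * sin_theta, cos_theta)
```

One such visibility ray per pixel is the noisy input that the filtering stage on the next slide cleans up.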
Shadow Filtering
▪ Based on Nvidia’s SVGF [Schied 2017]
▪ Temporal accumulation
▪ Multi-pass weighted blur
▪ Kernel size driven by variance estimate
▪ No shadow-specific heuristics
[(Over)simplified diagram of SVGF: the input is accumulated temporally, variance is estimated, and the result is blurred repeatedly to produce the output]
▪ Modified to reduce ghosting
▪ TAA-like color clip
▪ Variance-based bounds (5x5 kernel) [Salvi 2016]
[Comparison: baseline SVGF vs. SVGF with TAA-like clip]
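A scalar sketch of the TAA-like clip [Salvi 2016]: the history value is clamped to mean ± gamma * sigma of the current frame's neighborhood (a 5x5 kernel in the deck; the window contents are up to the caller here).

```python
def clip_history(history, neighborhood, gamma=1.0):
    """Clamp the temporally accumulated value to variance-based
    bounds of the current frame's spatial neighborhood, suppressing
    ghosting when the signal changes (scalar version for brevity;
    the real clip operates on color)."""
    n = len(neighborhood)
    mean = sum(neighborhood) / n
    var = sum((x - mean) ** 2 for x in neighborhood) / n
    sigma = var ** 0.5
    lo, hi = mean - gamma * sigma, mean + gamma * sigma
    return min(hi, max(lo, history))
```

When the new frame disagrees strongly with history, the bounds collapse around the new data, so stale shadow values cannot linger.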
Summary
▪ Replace fine-tuned hacks with unified approaches
▪ Reconstruction and filtering instead
▪ New tool in the box
▪ Solve sparse and incoherent problems
▪ Not a silver bullet
▪ Use rays wisely
▪ Pica Pica shoots ~2.25 rays per pixel
▪ Real-time visuals with (almost) path-traced quality achievable today
Thanks
▪ SEED
▪ Johan Andersson
▪ Colin Barré-Brisebois
▪ Jasper Bekkers
▪ Joakim Bergdahl
▪ Ken Brown
▪ Dean Calver
▪ Dirk de la Hunt
▪ Jenna Frisk
▪ Paul Greveson
▪ Henrik Halen
▪ Effeli Holst
▪ Andrew Lauritzen
▪ Magnus Nordin
▪ Niklas Nummelin
▪ Anastasia Opara
▪ Kristoffer Sjöö
▪ Ida Winterhaven
▪ Graham Wihlidal
▪ Microsoft
▪ Chas Boyd
▪ Ivan Nevraev
▪ Amar Patel
▪ Matt Sandy
▪ NVIDIA
▪ Tomas Akenine-Möller
▪ Nir Benty
▪ Jiho Choi
▪ Peter Harrison
▪ Alex Hyder
▪ Jon Jansen
▪ Aaron Lefohn
▪ Ignacio Llamas
▪ Henry Moreton
▪ Martin Stich
SEED // Search for Extraordinary Experiences Division
Stockholm – Los Angeles – Montréal – Remote
www.ea.com/seed
We’re hiring!
Questions?
References
▪ [Harmer 2018] Jack Harmer, Linus Gisslén, Henrik Holst, Joakim Bergdahl, Tom Olsson, Kristoffer Sjöö and Magnus Nordin. “Imitation Learning with Concurrent Actions in 3D Games”, available online.
▪ [Barré-Brisebois 2011] Colin Barré-Brisebois and Marc Bouchard. “Approximating Translucency for a Fast, Cheap and Convincing Subsurface Scattering Look”, available online.
▪ [Barré-Brisebois 2017] Colin Barré-Brisebois. “A Certain Slant of Light: Past, Present and Future Challenges of Global Illumination in Games”, available online.
▪ [Igehy 1999] Homan Igehy. “Tracing Ray Differentials”, available online.
▪ [PBRT] Matt Pharr, Wenzel Jakob and Greg Humphreys. “Physically Based Rendering”, book, http://www.pbrt.org/.
▪ [Schied 2017] Christoph Schied et al. “Spatiotemporal Variance-Guided Filtering: Real-Time Reconstruction for Path-Traced Global Illumination”, NVIDIA Research, available online.
▪ [Stachowiak 2015] Tomasz Stachowiak. “Stochastic Screen-Space Reflections”, available online.
▪ [Weidlich 2007] Andrea Weidlich and Alexander Wilkie. “Arbitrarily Layered Micro-Facet Surfaces”, available online.
▪ [Williams 1983] Lance Williams. “Pyramidal Parametrics”, available online.
▪ [Robison 2009] Austin Robison and Peter Shirley. “Image Space Gathering”, available online.
▪ [Heitz 2018] Eric Heitz, Stephen Hill and Morgan McGuire. “Combining Analytic Direct Illumination and Stochastic Shadows”, available online.
▪ [Karis 2014] Brian Karis. “High-Quality Temporal Supersampling”, available online.
▪ [Lehtinen 2008] Jaakko Lehtinen, Matthias Zwicker, Emmanuel Turquin, Janne Kontkanen, Frédo Durand, François Sillion and Timo Aila. “A Meshless Hierarchical Representation for Light Transport”, available online.
▪ [Jimenez 2016] Jorge Jimenez, Xian-Chun Wu, Angelo Pesce and Adrian Jarabo. “Practical Realtime Strategies for Accurate Indirect Occlusion”, available online.
▪ [Opara 2018] Anastasia Opara. “Creativity of Rules and Patterns”, available online.
▪ [Salvi 2016] Marco Salvi. “An Excursion in Temporal Supersampling”, available online.
▪ [Hillaire 2018] Sébastien Hillaire. “Real-time Raytracing for Interactive Global Illumination Workflows in Frostbite”, available online.
Bonus
Texture Level-of-Detail
What about texture level of detail?
▪ Mipmapping [Williams 1983] is the standard method to avoid texture aliasing:
▪ Screen-space pixel maps to approximately one texel in the mipmap hierarchy
▪ Supported by all GPUs for rasterization via shading quad and derivatives
Left: level-of-detail (λ), partial derivatives and the parallelogram-approximated texture-space footprint of a pixel. Right: mipmap chain
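For reference, a sketch of the rasterization-style LOD selection that ray tracing has to replace; `duv_dx`/`duv_dy` stand in for the per-pixel UV derivatives the shading quad normally provides.

```python
import math

def mip_level(duv_dx, duv_dy, tex_width, tex_height):
    """Classic mipmap LOD: lambda is log2 of the longer texel-space
    edge of the pixel's footprint, following the parallelogram
    approximation used with [Williams 1983]-style mipmapping."""
    # Scale the UV derivatives into texel units.
    ddx = (duv_dx[0] * tex_width, duv_dx[1] * tex_height)
    ddy = (duv_dy[0] * tex_width, duv_dy[1] * tex_height)
    rho = max(math.hypot(*ddx), math.hypot(*ddy))
    return max(0.0, math.log2(rho)) if rho > 0.0 else 0.0
```

For example, a pixel whose footprint spans four texels lands on mip level 2, where one texel again covers roughly one pixel.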
No shading quads for ray tracing!
▪ Traditionally: ray differentials
▪ Estimate the footprint of a pixel by computing world-space derivatives of the ray with respect to the image plane
▪ Requires differentiating (virtual offset) rays
▪ The heavier payload (12 floats) for subsequent rays can affect performance, so optimize!
▪ Alternative: always sample mip 0 with bilinear filtering (with extra samples)
▪ Leads to aliasing and additional performance cost
[Diagram: ray differentials [Igehy 1999]: a ray R(x) with origin P(x) and direction D(x) and its neighbor R(x + ∂x), whose positional and directional offsets ∂P(x) and ∂D(x) are tracked across the surface]
Together with NVIDIA Research, we developed a texture LOD technique for raytracing:
▪ Heuristic based on triangle properties, a curvature estimate, distance, and incident angle
▪ Similar quality to ray differentials with a single trilinear lookup
▪ A single value stored in the payload for subsequent rays
▪ Upcoming publication by Tomas Akenine-Möller (NVIDIA), Magnus Andersson (NVIDIA), Colin Barré-Brisebois (EA), Jim Nilsson (NVIDIA) and Robert Toth (NVIDIA)
[Comparison: ground truth, ray differentials, ours, mip 0]
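As a purely illustrative sketch of the shape such a heuristic can take (this is NOT the formula from the upcoming publication, whose details are not in this deck; the terms and their combination here are guesses), a single scalar LOD can grow with distance and at grazing angles and ride along in the ray payload:

```python
import math

def ray_lod(base_triangle_lod, hit_distance, cone_spread, cos_incident):
    """Illustrative ray-tracing texture LOD scalar (hypothetical).

    base_triangle_lod -- log2-scale texel density over the triangle
    hit_distance      -- distance travelled by the ray
    cone_spread       -- angular spread of the pixel's ray (radians)
    cos_incident      -- |cos| of the angle between ray and surface normal
    """
    footprint = hit_distance * cone_spread   # footprint grows with distance
    grazing = max(abs(cos_incident), 1e-4)   # grazing hits stretch the footprint
    return base_triangle_lod + math.log2(max(footprint, 1e-20)) - math.log2(grazing)
```

The appeal over ray differentials is visible even in the sketch: one float instead of 12 in the payload, updated with a few scalar operations per bounce.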
mGPU
▪ Explicit heterogeneous multi-GPU
▪ Parallel fork-join style
▪ Resources copied through system memory using the copy queue
▪ Minimize PCI-E transfers
▪ Approach
▪ Run ray generation on primary GPU
▪ Copy results in sub-regions to other GPUs
▪ Run tracing phases on separate GPUs
▪ Copy tracing results back to primary GPU
▪ Run filtering on primary GPU
[Diagram: the primary GPU runs ray generation, sub-regions are copied out to GPUs 1–4 for tracing, and the traced results are copied back to the primary GPU for filtering]