1. Zhenzhong Yi
Studio Technical Director, miHoYo
Head of Genshin Impact Console Development
From Mobile to Console: Genshin Impact’s Rendering Technology on Console
2. Genshin Impact
• Open-world action RPG
• Multiplatform support, including PS4, PC, and mobile versions
• Long game life-cycle with numerous updates over time
• Features an “anime” rendering style
3. About Myself
• Zhenzhong Yi
• Over 10 years of console game development experience. Worked for companies both in China and the US before joining miHoYo, including:
• Microsoft Xbox - Seattle, WA
• Avalanche Studios - New York City, NY
• Zindagi Games - Los Angeles, CA
• Ubisoft Shanghai
• Returned to China and joined miHoYo in early 2019 to build the console development team
6. • Unity is an extremely flexible engine
• Unity’s concise coding style allowed us to more conveniently customize the development of Genshin Impact's rendering pipeline
• Unity China’s technical support also provided great assistance and cooperation, so we truly appreciate their help
7. • More than half of our efforts were focused on making full use of the console’s hardware architecture in development and optimization
• We accumulated tons of experience through this process and had several technical breakthroughs. However…
• According to Sony NDA restrictions:
• We cannot share hardware-related content
• We cannot share information concerning low-level optimization
• We will not be discussing the CPU, I/O, or other such modules today
9. • Genshin Impact’s engine has two different customized rendering pipelines
• PC and console ➔ “console rendering pipeline”
• Android and iOS ➔ “mobile rendering pipeline”
• The overall tone is PBR-based stylized rendering
• The game’s versions are under simultaneous development, with mobile serving as the primary development platform and console development following close behind
• The selection of technologies must consider resource production and possible runtime costs of all platforms
Console Rendering Pipeline Overview
10. • Based on PBR, which keeps the light and shadow effects of the entire environment uniform
• Not completely energy conserving
• The lighting models of different materials are altered based on art requirements
• The game’s light and shadow effects are calculated in real time
• Time of day
• Dynamic weather
Console Rendering Pipeline Overview
11. • High-resolution output
• PS4 Pro – native 4K resolution
• Standard PS4 – 1440p rendering resolution, 1080p output resolution
• A clearer image is more conducive to our game’s art style
• Extensive use of compute shaders
• Over half of the features in the console pipeline were implemented via compute shaders
Console Rendering Pipeline Overview
12. • Lighting and materials appear very different from more realistic-looking games, since the art team had very special requirements for them
• Dirty, noisy, dark
• Fresh, bright, clean, anime-like
• We kept these 2 groups of words in mind at all times
Console Rendering Pipeline Overview
13. • A deeply customized Unity engine for mobile
• We couldn’t just rely on Unity's own PS4 platform implementation
• Nearly non-existent development resources
• Initially, the console team consisted of one member: me!
• Whole studio severely lacked console development experience
• Many other things to handle: Sony Accounts, PSN Store, PS4 TRC, etc.
• A tight development deadline
• Starting from zero with a timeline of approximately a year and a half
What We Started With
14. Principles We Followed
• When transforming the rendering pipeline for console, we adhered to the following ideas:
• Avoid excessive features
• Cool-looking technologies do not necessarily match the style of our game
• Choose practical and mature technologies whenever possible
• No time for trial and error
• It’s best to have interaction between multiple technologies
• Systematic transformation allows the resulting picture to appear more unified
• The benefits of combined picture improvements are more than additive: 1 + 1 > 2
16. • Selected a few technical points from scene lighting and shadow effects
• Shared ideas on how to upgrade Genshin Impact’s visual quality
• Focused on methodology
17. Shadows
• To match the art style, shadows needed to provide enough detail up close while still covering a large enough area
18. • Cascaded shadow maps + Poisson noise soft shadows
• Did not use the usual 4 cascades; instead we used up to 8 cascades
• More cascades bring better shadow effects, but also bring more performance overhead
• More draw calls resulting in more CPU overhead
• More cascades resulting in more GPU overhead
Shadows
19. • CPU optimization
• Used shadow caching to reduce draw call count
• First 4 cascades are updated with every frame
• Last 4 cascades are updated via interleaving
• In total, 5 cascades are updated each frame
• Every cascade is updated at least once every 8 frames (see the sketch below)
Shadows
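To make the schedule concrete, here is a minimal C++ sketch of one interleaving consistent with the bullets above. The helper name and the exact round-robin order are assumptions, not the shipped code; any rotation that touches each far cascade within the 8-frame budget would do.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical cascade scheduler: cascades 0-3 are re-rendered every frame;
// cascades 4-7 take turns, one per frame, so 5 cascade renders are issued
// per frame and every cascade stays well within the 8-frame refresh bound.
std::vector<int> CascadesToUpdate(uint64_t frameIndex, int numCascades = 8)
{
    std::vector<int> result;
    for (int i = 0; i < 4 && i < numCascades; ++i)
        result.push_back(i);                       // near cascades: every frame
    if (numCascades > 4)
        result.push_back(4 + static_cast<int>(frameIndex % (numCascades - 4)));
    return result;                                 // e.g. frame 0 -> {0,1,2,3,4}
}
```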
20. Shadows
• GPU optimization
• We used a screen space shadow map
• Each pixel takes 11 samples based on a Poisson disc to generate soft shadows; to eliminate banding, the sampling pattern is rotated per pixel
• The cost of the entire pass is about 2 to 2.6ms
• Do we really need to do such intensive calculations for every single pixel?
21. Shadows
• GPU optimizations
• Soft shadows are only useful at edges of shadows
• A full-screen mask map is generated to mark the shadow, non-shadow, and penumbra areas
• Soft shadows are only calculated for pixels marked as penumbra (see the sketch below)
• 2 – 2.6ms ➔ 1.3 – 1.7ms
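Below is an illustrative CPU-side sketch of the masked soft-shadow filter. The real version is a pixel/compute shader, and the 11-point disc and helper names here are made up; only the structure (early-out on the mask, rotated Poisson taps for penumbra pixels) follows the slides.

```cpp
#include <cmath>
#include <cstdint>

enum class ShadowClass : uint8_t { Lit, Shadowed, Penumbra };
struct Vec2 { float x, y; };

// Hypothetical 11-point Poisson disc (any reasonably distributed set works).
static const Vec2 kDisc[11] = {
    {0.00f,0.00f},{0.53f,0.21f},{-0.41f,0.39f},{0.19f,-0.58f},{-0.62f,-0.17f},
    {0.85f,-0.32f},{-0.12f,0.91f},{-0.87f,0.28f},{0.44f,0.77f},{0.31f,-0.93f},
    {-0.55f,-0.74f}};

float SoftShadow(ShadowClass c, float rotationAngle, float filterRadius,
                 float (*sampleShadowMap)(Vec2 offset))
{
    if (c == ShadowClass::Lit)      return 1.0f;   // skip the filter entirely
    if (c == ShadowClass::Shadowed) return 0.0f;
    const float s = std::sin(rotationAngle), co = std::cos(rotationAngle);
    float sum = 0.0f;
    for (const Vec2& p : kDisc) {                  // per-pixel rotation turns
        Vec2 o = { (p.x * co - p.y * s) * filterRadius,   // banding into noise
                   (p.x * s + p.y * co) * filterRadius };
        sum += sampleShadowMap(o);                 // 0 or 1 per tap
    }
    return sum / 11.0f;
}
```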
23. ⚫ Low-resolution calculation
⚫ ¼ x ¼ resolution
⚫ 16 pixels correspond to one mask value
⚫ Calculate each pixel to determine whether it’s in shadow or not, then merge the results to get the mask pixel
⚫ Accurate, but slow, and must be done 16 times
⚫ Optimization: select a small number of sample points to calculate and accept approximate results (see the sketch below)
⚫ A few samples cannot perfectly represent the 4x4 tile
⚫ Blur the mask map to enlarge the penumbra area
Mask Generation
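A minimal sketch of how one mask texel could be classified from a few representative samples; the 4-sample corner pattern is an assumption, since the slides only say "a small number of sample points". A later blur widens the penumbra region to hide the approximation.

```cpp
#include <cstdint>

enum class ShadowClass : uint8_t { Lit, Shadowed, Penumbra };

// Each mask texel covers a 4x4 tile of full-resolution pixels. Instead of
// testing all 16 pixels, classify the tile from a handful of samples.
ShadowClass ClassifyTile(bool (*inShadowAt)(int x, int y), int tileX, int tileY)
{
    const int offs[4][2] = { {0, 0}, {3, 0}, {0, 3}, {3, 3} }; // hypothetical
    int shadowed = 0;
    for (auto& o : offs)
        shadowed += inShadowAt(tileX * 4 + o[0], tileY * 4 + o[1]) ? 1 : 0;
    if (shadowed == 0) return ShadowClass::Lit;       // all agree: fully lit
    if (shadowed == 4) return ShadowClass::Shadowed;  // all agree: fully dark
    return ShadowClass::Penumbra;                     // mixed: needs filtering
}
```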
26. • Characters and scenery that are already in shadow appear to be floating
• Using ambient occlusion (AO), characters and scenery in the shadows will cast faint soft shadows around them
• The game uses 3 different kinds of AO technology
• HBAO provides more details in small areas
• AO Volume provides a wide range of AO for static objects
• Capsule AO provides a wide range of AO for characters
Multiple AO Technologies
31. • AO Volume generates a larger range of AO than HBAO
• For example, a tabletop can cast large AO on the floor
• HBAO cannot provide such effects
• Occlusion information for each object is generated offline in object local space
• The AO value is calculated at runtime from this local-space occlusion information
AO Volume
34. • AO Volume solves the large-range AO problem for static objects
• But the game’s characters have skeletal animations
• Their occlusion information cannot simply be generated offline
• We used capsule AO technology for characters (see the sketch below)
• Skeletal animations are used to update the capsules
• The occlusion is directional
Capsule AO
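For illustration, here is a common analytic approximation of directional capsule occlusion, in the spirit of the capsule AO cited in the references ([Michal Iwanicki, SIGGRAPH 2013]). The exact formula miHoYo uses is not given on the slides; this is a plausible stand-in, not their implementation.

```cpp
#include <algorithm>
#include <cmath>

// A capsule is the segment a-b swept by radius r; approximate its occlusion
// at a receiver by the sphere at the segment point closest to the receiver.
struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static Vec3  add(Vec3 a, Vec3 b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3  mul(Vec3 a, float s){ return {a.x*s, a.y*s, a.z*s}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

float CapsuleOcclusion(Vec3 p, Vec3 n,      // receiver position and normal
                       Vec3 a, Vec3 b, float r)
{
    Vec3 ab = sub(b, a);
    float t = std::clamp(dot(sub(p, a), ab) / std::max(dot(ab, ab), 1e-6f),
                         0.0f, 1.0f);
    Vec3 c = add(a, mul(ab, t));            // closest point on the segment
    Vec3 d = sub(c, p);
    float dist2 = std::max(dot(d, d), r * r);
    Vec3 dir = mul(d, 1.0f / std::sqrt(dist2));
    // Solid-angle-style falloff times a directional (cosine) term:
    float ao = (r * r / dist2) * std::max(dot(n, dir), 0.0f);
    return std::min(ao, 1.0f);              // 0 = unoccluded, 1 = fully occluded
}
```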
35. • A ½ x ½ resolution AO render target is bilaterally upsampled to the full-res AO texture
• A bilaterally filtered Gaussian blur is applied to eliminate noise. Remember, no noise!
• Points for further optimization:
• The bilateral filter has lots of repeated calculations
• 2-pass blur + 1-pass upsample = multiple AO reads and writes
• Solution: complete everything in a single compute shader pass (see the sketch below)
Applying AO
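A minimal sketch of the depth-aware (bilateral) upsampling step for the half-res AO buffer; the Gaussian weighting and sigma are illustrative, not the shipped values.

```cpp
#include <cmath>

// Each full-res pixel blends the 4 nearest low-res AO samples; each tap's
// bilinear weight is damped by how far its depth is from the full-res depth,
// so AO does not bleed across silhouettes.
float BilateralUpsampleAO(const float loAO[4],      // 4 nearest half-res taps
                          const float loDepth[4],   // their view-space depths
                          const float bilinearW[4], // standard bilinear weights
                          float hiDepth,            // full-res pixel depth
                          float depthSigma = 0.5f)  // assumed falloff
{
    float sum = 0.0f, wsum = 0.0f;
    for (int i = 0; i < 4; ++i) {
        float dz = (loDepth[i] - hiDepth) / depthSigma;
        float w  = bilinearW[i] * std::exp(-dz * dz); // depth-similarity term
        sum  += w * loAO[i];
        wsum += w;
    }
    return wsum > 1e-5f ? sum / wsum : loAO[0];       // fall back if degenerate
}
```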
36. • Implemented clustered deferred lighting, supporting 1024 lights in view
• The screen is divided into 64 x 64 pixel tiles, with each tile sliced into 16 clusters along the depth direction (see the sketch below)
Local Lighting
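A sketch of the pixel-to-cluster mapping implied by those numbers. The exponential depth slicing is an assumption; the slides do not specify the slicing curve, though exponential slicing is typical for clustered shading.

```cpp
#include <cmath>
#include <cstdint>

// Screen: 64x64-pixel tiles; each tile has 16 depth slices.
struct ClusterId { uint32_t tileX, tileY, slice; };

ClusterId PixelToCluster(uint32_t px, uint32_t py, float viewDepth,
                         float zNear, float zFar)
{
    ClusterId c;
    c.tileX = px / 64;
    c.tileY = py / 64;
    // Exponential slicing: slice = 16 * log(z / near) / log(far / near)
    float t = std::log(viewDepth / zNear) / std::log(zFar / zNear);
    int s = static_cast<int>(t * 16.0f);
    c.slice = static_cast<uint32_t>(s < 0 ? 0 : (s > 15 ? 15 : s));
    return c;
}
```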
38. • Nearly 100 lights in view can cast real-time shadows
• We could support more, but this is enough
• Shadow map resolution is adjusted dynamically based on distance and priority
• Baked static shadow textures + dynamic object shadows
• With lots of local lights, the large number of baked shadow textures puts a heavy load on both the game’s capacity and I/O
Local Light Shadows
39. • The static scene shadow texture is baked offline and then compressed
• A compute shader decompresses it at runtime; decompression is very fast
• On a base PS4, it takes about 0.05 ms to decompress a 1k x 1k shadow texture
Local Light Shadows
40. Local Light Shadow Texture Compression
• Every 2 x 2 block of 4 depth values is encoded into 32 bits
• Plane-equation mode or packed floating-point mode
• Optional high-precision compression uses 64 bits
• A quadtree is used to merge encoded data, further increasing the compression rate
• 16 x 16 blocks form a single tile; each tile has 1 quadtree (see the sketch below)
• Reference: [Bo Li, 2019]
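A toy C++ sketch of the plane-equation mode for one 2 x 2 block. The bit layout (16-bit base, two 7-bit slopes, 2 mode bits) and the tolerance are invented for illustration; see [Bo Li, 2019] for the real scheme.

```cpp
#include <cmath>
#include <cstdint>

// A shadow-map block lying on a single receiver plane can be reconstructed
// from a base depth plus per-axis slopes, so 4 depths fit in 32 bits.
struct PlaneBlock { uint32_t bits; };

bool TryEncodePlane(const float d[4], PlaneBlock& out)  // d: 2x2 depths in [0,1)
{
    float base = d[0];
    float sx = d[1] - d[0];                  // horizontal slope
    float sy = d[2] - d[0];                  // vertical slope
    float predicted = base + sx + sy;        // plane prediction for d[3]
    if (std::fabs(predicted - d[3]) > 1.0f / 65536.0f)
        return false;                        // not planar: use packed-float mode
    int b  = static_cast<int>(base * 65535.0f + 0.5f);
    int ix = static_cast<int>(sx * 65535.0f + 0.5f);
    int iy = static_cast<int>(sy * 65535.0f + 0.5f);
    if (ix < -64 || ix > 63 || iy < -64 || iy > 63)
        return false;                        // slopes too steep for 7 bits
    out.bits = (uint32_t(b) << 16) | (uint32_t(ix & 0x7F) << 9)
             | (uint32_t(iy & 0x7F) << 2) | 0x1;   // low bits: mode tag
    return true;
}
```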
41. • Compression rate: in default-precision mode, the compression rate for a typical indoor scene is about 20:1 – 30:1; high-precision mode achieves about 40 – 70% of that
• Compressing the shadow textures is essential; their size can be reduced by an order of magnitude
Local Light Shadow Texture Compression
44. Local Light Shadow Texture Compression
• The size of a 2k x 2k texture is reduced from 8MB to 274.4KB
• The default precision compression rate reaches up to 29.85:1
• The difference between the high-precision compressed texture and the uncompressed texture is indistinguishable to the naked eye
• The resulting texture size is 583.5KB, an approximate compression ratio of 14:1
46. Volumetric Fog
• If a local light has a projection texture, the volumetric fog will produce the corresponding effect
47. • Physically-based light scattering
• The volumetric fog can be controlled with different parameters in different areas
• Volumetric fog is illuminated by the light source
• A temporal filter is used for multi-frame blending, which results in a smoother and more stable fog image
• The GPU cost is less than 1ms
Volumetric Fog
48. Volumetric Fog
• Camera-view based
• The view frustum is divided into voxels aligned with the clusters of the clustered deferred lighting
• The fog parameters are saved in textures and loaded into the world via streaming
• They are injected into the voxels during calculation
• Local lights are taken into account
• Ray marching gathers the volumetric information (see the sketch below)
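A sketch of the front-to-back froxel integration such a system typically ends with; the structure is assumed from the slides (uniform slice depth for simplicity), not taken from miHoYo's code.

```cpp
#include <cmath>
#include <vector>

// After lighting has been injected per froxel, march front to back along
// each froxel column, accumulating in-scattered light and transmittance so
// any depth can later look up its fog by sampling the result.
struct Froxel { float scatteredLight; float extinction; };  // per slice
struct FogSample { float inscatter; float transmittance; };

std::vector<FogSample> IntegrateColumn(const std::vector<Froxel>& column,
                                       float sliceDepthMeters)
{
    std::vector<FogSample> out(column.size());
    float inscatter = 0.0f, transmittance = 1.0f;
    for (size_t i = 0; i < column.size(); ++i) {
        float sliceT = std::exp(-column[i].extinction * sliceDepthMeters);
        // Light scattered toward the camera in this slice, attenuated by
        // everything in front of it:
        inscatter += transmittance * column[i].scatteredLight * (1.0f - sliceT);
        transmittance *= sliceT;
        out[i] = { inscatter, transmittance };
    }
    return out;
}
```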
50. God Rays
• A separate pass to generate god rays
• ½ x ½ resolution
• Generated via ray marching, sampling up to 5 shadow cascades
• Provides adjustable parameters for the art team to add god rays on top of the volumetric fog
51. God Rays
• Volumetric fog can also generate god rays
• But the resulting in-game effect did not satisfy the art team
• The voxel resolution wasn’t high enough
• The intensity of god rays relies on the density of the volumetric fog, but dense fog on screen looks dirty
• Separately generated god rays have a higher resolution, sharper edges, and more room for adjustment by the art team (see the sketch below)
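A minimal sketch of the half-resolution god-ray march; the step count and uniform weighting are assumptions, and the cascade lookup is condensed into a callback.

```cpp
// March from the camera toward the scene depth, testing the cascaded shadow
// map at each step; the lit fraction of the ray approximates how much sun
// light scatters toward the pixel.
float GodRayIntensity(float pixelDepthMeters,
                      bool (*sunVisibleAt)(float distanceAlongRay),
                      int steps = 32)
{
    float lit = 0.0f;
    for (int i = 0; i < steps; ++i) {
        float d = pixelDepthMeters * (i + 0.5f) / steps; // jittered in practice
        if (sunVisibleAt(d))       // cascaded shadow-map test at this point
            lit += 1.0f;
    }
    return lit / steps;            // 0 = fully occluded, 1 = clear view of sun
}
```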
56. • Used reflection probes to provide reflection information for the scene
• Time of day + dynamic weather means we cannot use baked cubemaps
• Scene data is baked offline into a mini G-Buffer
• At runtime, cubemaps are generated using the current real-time lighting conditions
• The art team can place multiple reflection probes wherever necessary
Reflection Probes
57. • Reflection probes are updated at runtime
• The process consists of three steps: relight, convolve, then compress
• A compute shader processes all 6 faces of a cubemap simultaneously
• Calculations are spread over multiple frames, with one probe being processed at a time, looping continuously (see the sketch below)
Reflection Probes
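One way such amortization could be structured, sketched in C++. The one-stage-per-frame split is an assumption; the slides only say the three steps are spread over frames with one probe in flight at a time.

```cpp
#include <cstdint>

enum class Stage : uint8_t { Relight, Convolve, Compress };

struct ProbeUpdater {
    int probeCount = 0, current = 0;
    Stage stage = Stage::Relight;

    // Call once per frame; performs one stage of work on one probe.
    void Tick() {
        if (probeCount == 0) return;
        switch (stage) {
            case Stage::Relight:   // relight mini G-Buffer with current sun/sky
                stage = Stage::Convolve; break;
            case Stage::Convolve:  // prefilter mips for roughness
                stage = Stage::Compress; break;
            case Stage::Compress:  // e.g. block-compress the cubemap
                stage = Stage::Relight;
                current = (current + 1) % probeCount;  // next probe, loop forever
                break;
        }
    }
};
```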
61. • After relighting, the reflection probes contain the current lighting conditions, from which we can extract ambient information
• This is then saved as 3-band SH
• After the reflection probes are updated, the corresponding ambient probes are automatically updated
• Implemented using a compute shader (see the sketch below)
Ambient Probes
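A sketch of projecting a relit cubemap onto 3-band (9-coefficient) SH for the ambient probe. The SH basis constants are the standard ones; the face/texel iteration and solid-angle weighting are condensed into a single accumulation callback for brevity.

```cpp
#include <cmath>

struct RGB { float r, g, b; };

void EvalSH9(float x, float y, float z, float sh[9])   // normalized direction
{
    sh[0] = 0.282095f;                                  // l = 0
    sh[1] = 0.488603f * y;  sh[2] = 0.488603f * z;  sh[3] = 0.488603f * x;
    sh[4] = 1.092548f * x * y;                          // l = 2
    sh[5] = 1.092548f * y * z;
    sh[6] = 0.315392f * (3.0f * z * z - 1.0f);
    sh[7] = 1.092548f * x * z;
    sh[8] = 0.546274f * (x * x - y * y);
}

// Accumulate one cubemap texel; 'weight' is the texel's solid angle.
void AccumulateTexel(RGB c, float x, float y, float z, float weight,
                     RGB coeffs[9])
{
    float sh[9];
    EvalSH9(x, y, z, sh);
    for (int i = 0; i < 9; ++i) {
        coeffs[i].r += c.r * sh[i] * weight;
        coeffs[i].g += c.g * sh[i] * weight;
        coeffs[i].b += c.b * sh[i] * weight;
    }
}
```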
62. • For the sake of performance and data size, relighting does not consider shadows
• Both reflection probes and ambient probes will therefore leak light
• Ground located within a shadow will appear too bright
• Shadows are baked offline and saved as a shadow SH
• In the same way, we save each local light's lighting information as a local-light SH
• Finally, during the relight step we add in the shadow SH and local-light SH
Improvements on Image-Based Lighting
67. • Reflection probes are divided into indoor and outdoor types in order to handle different indoor and outdoor lighting conditions
• Our art team uses interior meshes to mark which pixels are affected by an indoor lighting environment
• The ambient probes accordingly generate different ambient lighting for indoor and outdoor environments
Interior Marks
71. Screen Space Reflection (SSR)
• A different technique from the one used for water surface reflections
• On a PS4 Pro, the GPU overhead is around 1.5 ms
• A temporal filter is used to increase stability
• A Hi-Z buffer is used for acceleration, allowing rays to trace across the full screen (see the sketch below)
• During the reflection, the color buffer of the previous frame is sampled
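A heavily simplified sketch of Hi-Z ray stepping, after the [Yasin Uludag, GPU Pro 5] reference cited at the end of the deck; interval refinement, screen-edge handling, and thickness tests are omitted, and the Hi-Z lookup is an assumed callback.

```cpp
#include <cmath>

// The Hi-Z buffer stores, per mip, the nearest depth of the texels it
// covers, so a ray can skip a whole 2^mip-texel region whenever it stays in
// front of that region's nearest depth.
struct Ray { float x, y, depth; float dx, dy, ddepth; };  // screen space

bool TraceHiZ(Ray r, float (*hiZMinDepth)(int mip, float x, float y),
              int maxMip, int maxIters, float* hitX, float* hitY)
{
    int mip = 0;
    for (int i = 0; i < maxIters; ++i) {
        float cell = std::ldexp(1.0f, mip);          // 2^mip texel footprint
        float nearest = hiZMinDepth(mip, r.x, r.y);
        if (r.depth < nearest) {
            // Still in front of everything here: take a big step, go coarser.
            r.x += r.dx * cell; r.y += r.dy * cell; r.depth += r.ddepth * cell;
            if (mip < maxMip) ++mip;
        } else if (mip > 0) {
            --mip;                                    // possible hit: refine
        } else {
            *hitX = r.x; *hitY = r.y;                 // mip-0 intersection
            return true;
        }
    }
    return false;                                     // out of budget or screen
}
```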
74. • As was visible in the previous images, even without SSR the game still has the reflection information provided by the reflection probes
• The deferred reflection pass performs the reflection and ambient calculations simultaneously
• The AO value is considered when reflection information is added to the lighting calculation, which effectively reduces light leaking
Runtime Reflection System
75. • HDR: High Dynamic Range display
• SMPTE ST 2084 transfer function (see the sketch below)
• Supports a max brightness of 10,000 nits
• WCG: Wide Color Gamut
• The Rec. 2020 color space covers 75.8% of the CIE 1931 color space, compared to Rec. 709, which is commonly used by HDTV and covers only 35.9%
• (Image source: Wikipedia)
HDR Display
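For reference, the SMPTE ST 2084 (PQ) encode/decode pair mentioned above; the constants come from the standard, with input luminance normalized so that 1.0 equals 10,000 nits.

```cpp
#include <cmath>

static const float m1 = 2610.0f / 16384.0f;            // 0.1593017578125
static const float m2 = 2523.0f / 4096.0f * 128.0f;    // 78.84375
static const float c1 = 3424.0f / 4096.0f;             // 0.8359375
static const float c2 = 2413.0f / 4096.0f * 32.0f;     // 18.8515625
static const float c3 = 2392.0f / 4096.0f * 32.0f;     // 18.6875

float PQEncode(float y)          // y in [0,1], 1.0 == 10,000 nits
{
    float p = std::pow(y, m1);
    return std::pow((c1 + c2 * p) / (1.0f + c3 * p), m2);
}

float PQDecode(float e)          // inverse: PQDecode(PQEncode(y)) == y
{
    float p = std::pow(e, 1.0f / m2);
    return std::pow(std::fmax(p - c1, 0.0f) / (c2 - c3 * p), 1.0f / m1);
}
```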
76. HDR Display
• Here’s a breakdown of Genshin Impact’s console rendering pipeline for SDR and HDR modes:
77. • Starting from version 1.2, Genshin Impact supports HDR10 on PS4
• No SDR tone mapping, color grading, or gamma correction
• The WCG color grading LUT is made in software like DaVinci
• The WCG color LUT pass takes less than 0.05 ms, including white balance, WCG color grading, and color expansion
• The ACES pipeline (RRT + ODT) was not used, as it does not fit our game’s style
• By blending the HDR output with tone mapping results, the scene within the SDR brightness range looks closer to the SDR version
HDR Display
78. • The issue of consistency within the SDR brightness range
• SDR: EOTF_BT1886(OETF_sRGB(color)) != color
• HDR: Inverse_PQ(PQ(color)) == color
• An OOTF curve is added at the end of the HDR pipeline to simulate the error introduced by the EOTF_BT1886(OETF_sRGB(color)) conversion (see the sketch below)
HDR Display
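A small sketch of the round-trip being simulated, using the standard sRGB OETF and BT.1886 EOTF: SDR TVs apply a gamma-2.4 EOTF to a signal encoded with the sRGB OETF, so the displayed value differs from the original scene value, and the HDR pipeline reproduces that difference to match the familiar SDR look.

```cpp
#include <cmath>

float OETF_sRGB(float L)        // linear [0,1] -> encoded signal
{
    return (L <= 0.0031308f) ? 12.92f * L
                             : 1.055f * std::pow(L, 1.0f / 2.4f) - 0.055f;
}

float EOTF_BT1886(float V)      // encoded signal -> displayed light (gamma 2.4)
{
    return std::pow(V, 2.4f);
}

// What a BT.1886 display makes of an sRGB-encoded signal; in general this
// does not equal linearColor, and that error is what the OOTF reproduces.
float SimulatedSDRResponse(float linearColor)
{
    return EOTF_BT1886(OETF_sRGB(linearColor));
}
```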
79. • Many art resources rely on hue shift generated by the tone mapping curve
• There is no tone map in a hue-preserving HDR pipeline
Addressing the Issue of Hue Shift
[Image comparison: SDR (hue shift) vs. HDR (hue preserving)]
80. • Blackbody radiation solution
• Temperature is used to calculate color
• Requires the art team to modify their assets
• Genshin Impact instead uses a method of simulating hue shift in the shader (see the sketch below)
• Does not require any modification of assets
• The final color grading now comes with a tone mapping curve, so the tone mapping blend mentioned before is no longer needed
• The calculations are combined in the LUT
Addressing the Issue of Hue Shift
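A sketch contrasting the two behaviors discussed above; the curve is a generic Reinhard stand-in, not the game's. Applying a curve per channel bends the hue of bright colors (orange fire drifts toward yellow), while scaling all channels by the tonemapped max channel preserves hue exactly; the shipped approach simulates the per-channel hue shift inside the LUT.

```cpp
#include <algorithm>

struct RGB { float r, g, b; };

static float Curve(float x) { return x / (1.0f + x); }   // placeholder curve

RGB TonemapPerChannel(RGB c)           // hue-shifting
{
    return { Curve(c.r), Curve(c.g), Curve(c.b) };
}

RGB TonemapHuePreserving(RGB c)        // channel ratios kept intact
{
    float m = std::max({c.r, c.g, c.b});
    if (m <= 0.0f) return {0, 0, 0};
    float s = Curve(m) / m;            // one scale factor for all channels
    return { c.r * s, c.g * s, c.b * s };
}
```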
81. • Console rendering pipeline overview
• Analysis of key technical points
• Summary and conclusion
Presentation Outline
82. • Global players' acceptance of Genshin Impact on the PlayStation 4 has far exceeded our expectations
• However, there are still many areas that require further optimization, and we will continue to make more improvements:
• Better performance
• Faster loading times
• More stability
• More graphics features that suit the game’s style
83. • Genshin Impact is only the beginning and is our first venture into console development
• Time is limited, resources are limited
• The team requires talent of all sorts
• We’ve gained experience in combining realistic rendering technology with an anime rendering style
• There is still much we’d like to do, especially with the next generation of consoles arriving now
• We look forward to more opportunities to exchange development experiences
84. • We will continue to develop future products
• The game programming team is continually recruiting new members
• Especially hiring for console development-related positions
• Join our team of tech otakus saving the world!
We Are Hiring!
85. With Special Thanks To:
• The entire Genshin Impact engine team
• Every member who contributed to Genshin Impact’s console development ♥
• Special member of the console team, Lulu🐱
• A very special thanks to Wenli Chen, Terry Liu, and all the global tech support specialists at Sony for all the support they have given us
86. References:
• Josiah Manson, 2016, “Fast Filtering of Reflection Probes”
• Bo Li, SIGGRAPH 2019, “A Scalable Real-Time Many-Shadowed-Light Rendering System”
• Michal Iwanicki, SIGGRAPH 2013, “Lighting Technology of The Last of Us”
• Paul Malin, 2018, “HDR Display in Call of Duty®”
• Yasin Uludag, 2014, GPU Pro 5, “Hi-Z Screen-Space Cone-Traced Reflections”
• Nathan Reed, GDC 2012, “Ambient Occlusion Fields and Decals in inFAMOUS 2”
• Fabian Bauer, SIGGRAPH 2019, “Creating the Atmospheric World of Red Dead Redemption 2: A Complete and Integrated Solution”