This session describes many of the SPU techniques used in the engine developed for KILLZONE 2 on the PlayStation 3. It first focuses on individual SPU techniques, then covers how these techniques work together in the game engine each frame.
This talk is about our experiences gained during the making of the Killzone Shadow Fall announcement demo.
We’ve gathered all the hard data about our assets, memory, CPU and GPU usage, and a whole bunch of tricks.
The goal of this talk is to help you form a clear picture of what is already possible to achieve on PS4.
The Rendering Technology of 'Lords of the Fallen' (Philip Hammer, Mary Chan)
This session is about some important aspects of the rendering pipeline of the upcoming Action-RPG "Lords of the Fallen", developed by Deck13 Interactive and CI Games for PS4, Xbox One, and PC. The topic covers several closely related areas like the deferred rendering system, image-based lighting using deferred cubemaps, deferred decals, and an approach for transparent object lighting and shadowing. More specifically, the lecture will cover several strategies to keep the G-Buffer as small and efficient as possible. This includes the description of a G-Buffer attribute-packing scheme and how per-material attributes can be exposed using special parameter lookup tables. Furthermore, a traditional problem of most deferred rendering systems is the seamless integration of transparent objects into the lighting. The lecture will present several ways to approach this problem, for example multi-pass deferred rendering, coloured transparent shadows, and a novel method for deferred particle lighting.
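As an illustration of the kind of G-Buffer attribute packing the abstract mentions, here is a hedged sketch: quantize per-pixel attributes into a single 32-bit word and keep only a small material index that points into a parameter lookup table. The field layout, function names and `MATERIAL_TABLE` contents are assumptions for illustration, not Deck13's actual format.

```python
def quantize8(x):
    """Map a float in [0, 1] to an 8-bit integer."""
    return max(0, min(255, int(round(x * 255.0))))

def dequantize8(b):
    return b / 255.0

def pack_gbuffer(normal_x, normal_y, roughness, material_id):
    """Pack two normal components, roughness and a material index
    into a single 32-bit word (8 bits per field)."""
    assert 0 <= material_id < 256
    return (quantize8(normal_x * 0.5 + 0.5)
            | (quantize8(normal_y * 0.5 + 0.5) << 8)
            | (quantize8(roughness) << 16)
            | (material_id << 24))

def unpack_gbuffer(word):
    nx = dequantize8(word & 0xFF) * 2.0 - 1.0
    ny = dequantize8((word >> 8) & 0xFF) * 2.0 - 1.0
    rough = dequantize8((word >> 16) & 0xFF)
    mat = (word >> 24) & 0xFF
    return nx, ny, rough, mat

# Per-material parameters live in a lookup table instead of the G-Buffer;
# a shader would fetch them with the 8-bit material index.
MATERIAL_TABLE = [
    {"metalness": 0.0, "subsurface": 0.0},   # 0: default dielectric
    {"metalness": 1.0, "subsurface": 0.0},   # 1: metal
    {"metalness": 0.0, "subsurface": 0.7},   # 2: skin
]
```

The design point is the same one the talk makes: per-material constants are hoisted out of the per-pixel storage, so the G-Buffer stays small while material variety stays rich.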
This presentation gives an overview of the rendering techniques used in KILLZONE 2. We put the main focus on the lighting and shadowing techniques of our deferred shading engine and how we made them play nicely with anti-aliasing.
Decima Engine: Visibility in Horizon Zero Dawn (Guerrilla)
Download the full presentation here: http://www.guerrilla-games.com/read/decima-engine-visibility-in-horizon-zero-dawn
Abstract: Horizon Zero Dawn presented the Decima engine with new challenges in rendering large and dense environments. In particular, we needed to be able to quickly query a very large set of potential objects to find which should be visible. This talk looks at the problems we faced moving from more constrained Killzone levels to Horizon's open world, and our approach to fast visibility queries using the PS4's asynchronous compute hardware. It also covers our recent work on efficiently collecting batches of object instances during the query to reduce load on the entire rendering pipeline. A basic familiarity with GPU compute will be helpful to get the most out of this talk.
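Decima runs these queries on the GPU with asynchronous compute, but the core of a visibility query plus instance batching can be sketched on the CPU. The sphere-vs-frustum test, the plane convention (inside means dot(n, p) + d >= 0) and all names below are illustrative assumptions, not Guerrilla's implementation.

```python
def sphere_visible(center, radius, planes):
    """A sphere is rejected if it lies fully behind any frustum plane."""
    cx, cy, cz = center
    for (nx, ny, nz, d) in planes:
        if nx * cx + ny * cy + nz * cz + d < -radius:
            return False
    return True

def query_visibility(objects, planes):
    """objects: list of (mesh_id, center, radius) bounding spheres.
    Returns {mesh_id: [object indices]} so each mesh can be drawn as one
    instanced batch instead of many individual draws, reducing load on
    the rest of the rendering pipeline."""
    batches = {}
    for i, (mesh_id, center, radius) in enumerate(objects):
        if sphere_visible(center, radius, planes):
            batches.setdefault(mesh_id, []).append(i)
    return batches
```

Grouping survivors by mesh during the query, rather than afterwards, is the batching idea the abstract refers to.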
Talk by Graham Wihlidal (Frostbite Labs) at GDC 2017.
Checkerboard rendering is a relatively new technique, popularized recently by the introduction of the PlayStation 4 Pro. Many modern game engines are adding support for it right now, and in this talk, Graham will present an in-depth look at the new implementation in Frostbite, which is used in shipping titles like 'Battlefield 1' and 'Mass Effect Andromeda'. Despite being conceptually simple, checkerboard rendering requires a deep integration into the post-processing chain, in particular temporal anti-aliasing, dynamic resolution scaling, and poses various challenges to existing effects. This presentation will cover the basics of checkerboard rendering, explain the impact on a game engine that powers a wide range of titles, and provide a detailed look at how the current implementation in Frostbite works, including topics like object id, alpha unrolling, gradient adjust, and a highly efficient depth resolve.
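A minimal sketch of the checkerboard idea, under the simplest possible assumptions: each frame shades only pixels where (x + y + frame) is even, and missing pixels are filled from the average of their shaded horizontal/vertical neighbours. A production implementation like the Frostbite one resolves against reprojected previous-frame data instead; this only shows the pattern and reconstruction structure.

```python
def shaded_this_frame(x, y, frame):
    """Checkerboard pattern: half the pixels, alternating per frame."""
    return (x + y + frame) % 2 == 0

def reconstruct(image, frame):
    """image: 2D list of floats holding valid values only at pixels shaded
    this frame. Fills the other half from shaded neighbours and returns a
    full-resolution 2D list."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if shaded_this_frame(x, y, frame):
                continue
            neighbours = []
            for dx, dy in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    # 4-neighbours of an unshaded pixel are always shaded,
                    # since their parity differs by one.
                    neighbours.append(image[ny][nx])
            out[y][x] = sum(neighbours) / len(neighbours)
    return out
```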
Taking Killzone Shadow Fall Image Quality Into The Next Generation (Guerrilla)
This talk focuses on the technical side of Killzone Shadow Fall, the platform exclusive launch title for PlayStation 4.
We present the details of several new techniques that were developed in the quest for next-generation image quality, and the talk uses key locations from the game as examples. We discuss interesting aspects of the new content pipeline, the next-gen lighting engine, the usage of indirect lighting, and various shadow rendering optimizations. We also describe the details of volumetric lighting, the real-time reflections system, and the new anti-aliasing solution, and include some details about the image-quality-driven streaming system. A common and very important theme of the talk is temporal coherency and how it was utilized to reduce aliasing and improve rendering quality and image stability beyond the baseline 1080p resolution seen in other games.
A Certain Slant of Light - Past, Present and Future Challenges of Global Illu... (Electronic Arts / DICE)
Global illumination (GI) has been an ongoing quest in games. The perpetual tug-of-war between visual quality and performance often forces developers to take the latest and greatest from academia and tailor it to push the boundaries of what has been realized in a game product. Many elements need to align for success, including image quality, performance, scalability, interactivity, ease of use, as well as game-specific and production challenges.
First we will paint a picture of the current state of global illumination in games, addressing how the state of the union compares to the latest and greatest research. We will then explore various GI challenges that game teams face from the art, engineering, pipelines and production perspective. The games industry lacks an ideal solution, so the goal here is to raise awareness by being transparent about the real problems in the field. Finally, we will talk about the future. This will be a call to arms, with the objective of uniting game developers and researchers on the same quest to evolve global illumination in games from being mostly static, or sometimes perceptually real-time, to fully real-time.
This presentation was given at SIGGRAPH 2017 by Colin Barré-Brisebois (EA SEED) as part of the Open Problems in Real-Time Rendering course.
For this year's keynote at High Performance Graphics 2018, Colin Barré-Brisebois from SEED discussed the state of the art in real-time game ray tracing. He explored some of the connections between offline and real-time game ray tracing, and presented some of the open problems. Colin exposed a few potential solutions to those problems, and also proposed a call-to-arms on topics where the ray tracing research community and the games industry should unite in order to solve such open problems.
With the highest-quality video options, Battlefield 3 renders its Screen-Space Ambient Occlusion (SSAO) using the Horizon-Based Ambient Occlusion (HBAO) algorithm. For performance reasons, the HBAO is rendered in half resolution using half-resolution input depths. The HBAO is then blurred in full resolution using a depth-aware blur. The main issue with such low-resolution SSAO rendering is that it produces objectionable flickering for thin objects (such as alpha-tested foliage) when the camera and/or the geometry are moving. After a brief recap of the original HBAO pipeline, this talk describes a novel temporal filtering algorithm that fixed the HBAO flickering problem in Battlefield 3 with a 1-2% performance hit at 1920x1200 on PC (DX10 or DX11). The talk includes algorithm and implementation details on the temporal filtering part, as well as generic optimizations for SSAO blur pixel shaders. This is a joint work between Louis Bavoil (NVIDIA) and Johan Andersson (DICE).
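A temporal filter of the general kind described can be sketched as an exponential blend with history rejection on depth mismatch. The blend weight and depth tolerance below are made-up parameters, and this is not the Battlefield 3 algorithm itself, just the common structure such filters share.

```python
def temporal_filter_ao(ao_current, ao_history, depth_current,
                       depth_reprojected, history_weight=0.9,
                       depth_tolerance=0.01):
    """Blend this frame's noisy AO with reprojected history, dropping the
    history on disocclusion (reprojected depth no longer matches)."""
    rel_diff = abs(depth_current - depth_reprojected) / max(depth_current, 1e-6)
    if rel_diff > depth_tolerance:
        return ao_current  # disocclusion: history is invalid
    return history_weight * ao_history + (1.0 - history_weight) * ao_current
```

The high history weight is what suppresses frame-to-frame flicker on thin geometry; the depth test is what keeps stale AO from smearing across disocclusions.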
Book of the Dead: Environmental Design, Tools, and Techniques for Photo-Real ... (Unity Technologies)
From asset creation and placement to lighting and rendering, this talk explores how Unity's demo team used techniques like photogrammetry and post-processing volumes to accomplish a major feat: Building a realistic forest setting with a high level of detail that supports crucial story beats.
We present the technology and ideas behind the unique lighting in MIRROR'S EDGE from DICE, covering how DICE adopted global illumination into their lighting process and Illuminate Labs' current toolbox of state-of-the-art lighting technology.
Rendering Technologies from Crysis 3 (GDC 2013), Tiago Sousa
This talk covers changes in CryENGINE 3 technology during 2012, with DX11 related topics such as moving to deferred rendering while maintaining backward compatibility on a multiplatform engine, massive vegetation rendering, MSAA support and how to deal with its common visual artifacts, among other topics.
Practical Occlusion Culling in Killzone 3 (Guerrilla)
Killzone 3 features complex occluded environments. To cull non-visible geometry early in the frame, the game uses PlayStation 3 SPUs to rasterize a conservative depth buffer and perform fast synchronous occlusion queries against it. This talk presents an overview of the approach and key lessons learned during its development.
Presentation by Andrew Hamilton and Ken Brown from DICE at GDC 2016.
Photogrammetry has started to gain steam within the Games Industry in recent years. At DICE, this technique was first used on Battlefield and they fully embraced the technology and workflow for Star Wars: Battlefront. This talk will cover their research and development, planning and production, techniques, key takeaways and plans for the future. The speakers will cover photogrammetry as a technology, but more than that, show that it's not a magic bullet but instead a tool like any other that can be used to help achieve your artistic vision and craft.
Takeaway
Come and learn how (and why) photogrammetry was used to create the world of Star Wars. This talk will cover Battlefront's use of the technology from pre-production to launch, as well as some of the team's philosophies around photogrammetry as a tool. Many visuals will be included!
Intended Audience
A content-creator-friendly talk intended for practically any developer, especially those involved in 3D content creation. It is not a technical talk focused on the code or engineering of photogrammetry. The speakers will quickly cover all the basics, so absolutely no prerequisite knowledge is required.
This session presents a detailed, programmer-oriented overview of our SPU-based shading system implemented in DICE's Frostbite 2 engine, and how it enables more visually rich environments in BATTLEFIELD 3 and better performance than traditional GPU-only renderers. We explain in detail how our SPU tile-based deferred shading system is implemented, and how it supports rich material variety, high-dynamic-range lighting, and large numbers of light sources of different types through an extensive set of culling, occlusion and optimization techniques.
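The light-culling step of a tile-based deferred shader can be sketched in a few lines. This version works in 2D screen space only and treats each point light as a screen-space circle; a real implementation (such as the one described above) also culls against each tile's min/max depth. The tile size and light representation are illustrative assumptions.

```python
TILE = 16  # pixels per tile side (illustrative)

def light_touches_tile(light, tx, ty):
    """Circle (cx, cy, r) vs tile rect, via closest point on the rect."""
    cx, cy, r = light
    x0, y0 = tx * TILE, ty * TILE
    px = min(max(cx, x0), x0 + TILE)
    py = min(max(cy, y0), y0 + TILE)
    return (cx - px) ** 2 + (cy - py) ** 2 <= r * r

def build_tile_light_lists(lights, width, height):
    """Returns {(tile_x, tile_y): [light indices]} so each tile only
    shades the lights that can actually affect it."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    lists = {}
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            hit = [i for i, l in enumerate(lights)
                   if light_touches_tile(l, tx, ty)]
            if hit:
                lists[(tx, ty)] = hit
    return lists
```

On the SPUs, each tile's light list is then shaded in one pass, which is where the performance win over naive per-light deferred passes comes from.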
Henrik Halén (Lead Rendering Programmer) at Electronic Arts presented "Style and Gameplay in the Mirror's Edge" at SIGGRAPH 2010's Stylized Rendering in Games. https://www.cs.williams.edu/~morgan/SRG10/
This talk presents the approach Frostbite took to add support for HDR displays. It will summarize Frostbite's previous post-processing pipeline and its issues. Attendees will learn the decisions made to fix these issues, improve the color grading workflow, and support high-quality HDR and SDR output. This session will detail the display mapping used to implement the "grade once, output many" approach to targeting any display, and why an ad-hoc approach was chosen over filmic tone mapping. Frostbite retained 3D LUT-based grading flexibility, and the accuracy differences of computing these in decorrelated color spaces will be shown. This session will also cover the main issues found in early adopter games, differences between HDR standards, optimizations to achieve performance parity with the legacy path, and why supporting HDR can also improve the SDR version.
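The "grade once, output many" structure can be shown with a toy pipeline: grading happens once on scene-referred values, then a display mapper compresses them to each target's peak luminance. The Reinhard-style curve and all parameters below are stand-ins (the talk describes an ad-hoc mapper, not this one); only the shape of the pipeline is the point.

```python
def grade(x):
    """One shared grading operation (here: toy exposure + contrast),
    applied identically for every output target."""
    return max(0.0, x * 1.5) ** 1.1

def display_map(x, peak_nits, paper_white_nits=100.0):
    """Compress a graded scene-referred value into [0, peak_nits)."""
    v = x * paper_white_nits
    return peak_nits * v / (v + peak_nits)

def render_pixel(scene_value, peak_nits):
    """Same grade, different display mapping per target."""
    return display_map(grade(scene_value), peak_nits)
```

Because only `display_map` varies per target, artists grade a single image, and SDR (100-nit) and HDR (e.g. 1000-nit) outputs stay consistent by construction: the HDR target simply retains more of the highlight range.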
Takeaway
Attendees will learn how and why Frostbite chose to support High Dynamic Range (HDR) displays. They will understand the issues faced and how these were resolved. This talk will be useful for those who have yet to support HDR, and will provide discussion points for those who already do.
Intended Audience
The intended audience is primarily rendering engineers, technical artists and artists; specifically those who focus on grading and lighting and those interested in HDR displays. Ideally attendees will be familiar with color grading and tonemapping.
This talk details various aspects of designing and developing videogames at Guerrilla. It highlights methods that are very similar to those used in the CGI industry, and illuminates some of the most important differences. It covers the complete breadth of videogame development, from artistic design to production pipelines and tool and engine development.
There are a lot of articles about games. Most of these are about particular aspects of a game, like rendering or physics. All engines, however, have a binding structure that ties all aspects of the game together. Usually there is a base class (Object, Actor and Entity are common names) that all objects in the game derive from, but very little is written on the subject. Only very recently have a couple of talks at game|tech briefly touched on it. Still, choosing a structure to build your game on is very important. The end user might not “see” the difference between a good and a bad structure, but this choice will affect many aspects of the development process. A good structure will reduce risk and increase the efficiency of the team.
Games always have too much stuff to render, but are often set in occluded environments, where much of what is in the view frustum is not visible to the camera. Occlusion culling tries to avoid rendering objects which the player can't see.
Discussion of occlusion for games has tended to focus on the use of GPU pixel counters to determine whether or not an object is visible. This places additional stress on a limited resource, and has latency which has to be worked around, requiring a more complex pipeline.
For KILLZONE 3 we took a different approach - we use the SPUs to render a conservative depth buffer and perform queries against it. This allows us to cull objects very early in the frame, avoiding any pipeline costs for invisible objects.
This presentation talks about the ideas (and dead ends) we explored along the way, as well as explaining in detail what we ended up with.
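A tiny CPU model of the approach described above: rasterize occluders into a low-resolution depth buffer conservatively (write each occluder's farthest depth, keep the nearest value so far per pixel), then query an object's screen rectangle using its nearest depth. Axis-aligned rectangles stand in for real occluder geometry, and all names and the 8x8 buffer size are illustrative.

```python
W, H = 8, 8
FAR = 1.0  # depth convention: 0 = near, 1 = far

def make_depth_buffer():
    return [[FAR] * W for _ in range(H)]

def rasterize_occluder(buf, x0, y0, x1, y1, max_depth):
    """Conservatively write the occluder's FARTHEST depth over its
    covered pixels, keeping the nearest occluder per pixel."""
    for y in range(y0, y1):
        for x in range(x0, x1):
            buf[y][x] = min(buf[y][x], max_depth)

def query_visible(buf, x0, y0, x1, y1, min_depth):
    """Query with the object's NEAREST depth: it is certainly hidden only
    if every covered pixel holds a strictly nearer occluder depth."""
    for y in range(y0, y1):
        for x in range(x0, x1):
            if buf[y][x] >= min_depth:
                return True
    return False
```

The conservative pairing (occluder's farthest depth vs object's nearest depth) guarantees false "visible" answers are possible but false "hidden" answers are not, which is exactly what makes synchronous early-frame culling safe.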
Automatic Annotations in Killzone 3 and Beyond (Guerrilla)
Voxel-based technologies have been successfully used to extract navigation data for NPCs. This talk shows how voxel data was used to automatically extract cover planes and more information in Killzone 3’s toolset.
Next-generation gaming brought high resolutions, very complex environments and large textures to our living rooms. With virtually every asset inflated, it is hard to use traditional forward rendering and still hope for rich, dynamic environments with extensive dynamic lighting. Deferred rendering, on the other hand, has traditionally been described as a nice technique for rendering scenes with many dynamic lights, but one that suffers from fill-rate problems and a lack of anti-aliasing; very few published games have used it.
In this talk, we will discuss our approach to face this challenge and how we designed a deferred rendering engine that uses multi-sampled anti-aliasing (MSAA). We will give in-depth description of each individual stage of our real-time rendering pipeline and the main ingredients of our lighting, post-processing and data management. We'll show how we utilize PS3's SPUs for fast rendering of a large set of primitives, parallel processing of geometry and computation of indirect lighting. We will also describe our optimizations of the lighting and our parallel split (cascaded) shadow map algorithm for faster and stable MSAA output.
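The parallel-split (cascaded) shadow map algorithm mentioned above has to choose where the cascade splits fall between the near and far planes. A common formulation (not necessarily Guerrilla's exact one) blends a logarithmic and a uniform distribution with a weight lambda:

```python
def split_distances(near, far, num_splits, lam=0.5):
    """Practical split scheme: returns num_splits + 1 distances from
    near to far, blending logarithmic and uniform distributions."""
    splits = []
    for i in range(num_splits + 1):
        t = i / num_splits
        log_d = near * (far / near) ** t   # logarithmic split
        uni_d = near + (far - near) * t    # uniform split
        splits.append(lam * log_d + (1.0 - lam) * uni_d)
    return splits
```

Logarithmic splits match perspective foreshortening (more resolution near the camera), uniform splits are stable; blending the two trades between the extremes, and stable split placement is one ingredient of flicker-free, MSAA-friendly shadows.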
Dynamic tactical position evaluation functions are procedures that use static and dynamic information about the world to make tactical decisions at run-time. By having NPCs use a procedural description of the solution to a tactical decision, the solution can vary depending on the inputs. Using static information about the world as input makes sure the tactics are applicable at any place on the map. Using dynamic inputs makes sure they are applicable in multiple situations.
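A minimal sketch of such an evaluation function, under illustrative assumptions: static map input is a precomputed per-position cover value, dynamic input is the distance to the current threat, and the weights and preferred engagement range are made up, not Guerrilla's actual terms.

```python
def evaluate_position(pos, cover_map, threat_pos,
                      preferred_range=10.0, w_cover=0.6, w_range=0.4):
    """Score a candidate position from static and dynamic inputs."""
    cover = cover_map.get(pos, 0.0)          # static input, in [0, 1]
    dx, dy = pos[0] - threat_pos[0], pos[1] - threat_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5        # dynamic input
    # Reward closeness to the NPC's preferred engagement range.
    range_score = max(0.0, 1.0 - abs(dist - preferred_range) / preferred_range)
    return w_cover * cover + w_range * range_score

def best_position(candidates, cover_map, threat_pos):
    """Re-evaluating at run-time lets the same procedure pick different
    positions as the threat moves."""
    return max(candidates,
               key=lambda p: evaluate_position(p, cover_map, threat_pos))
```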
Activity planned for the Effective Teaching and Active Learning workshop, part of the teaching development program of the Universidad Simón Bolívar in Caracas, Venezuela.
Unite Berlin 2018 - Book of the Dead: Optimizing Performance for High End Cons... (Unity Technologies)
In this session, the Unity Demo team provides their best tips and tricks for optimizing detailed, complex environment scenes for modern console performance.
Speakers:
Rob Thompson (Unity Technologies)
Leszek Szczepański, Gameplay programmer, Guerrilla Games
Horizon Zero Dawn was a long, almost 7-year project. But it had to start somewhere. This session will discuss the road Guerrilla Games has traveled during the prototyping stage of Horizon Zero Dawn.
Presentation video: http://tinyurl.com/pfhz96m
The introduction of Stage 3D in Adobe Flash Player and Adobe AIR lets you use techniques such as deferred lighting, screen-space dynamic shadows, MRT, and more through vertex and fragment shaders. Join Jean-Philippe Doiron, Principal R&D Architect at Frima Studio, and Jean-Philippe Auclair, R&D Architect, for a deep dive into GPU programming with the new Flash Player, and discover how to produce beautiful GPU effects that are reusable in your games and applications.
BSidesDelhi 2018: Headshot - Game Hacking on macOS (BSides Delhi)
Presenter: Jai Verma
Abstract: Game hacking is a major contributor to the hacking economy. People shell out a lot of money for hacks which help them cheat at multiplayer games. These hacks can range from enhancing in-game abilities for gaining an upper hand in the game, to bypassing in-app purchases. With the advent of anti-cheat such as VAC (Valve Anti Cheat), the techniques employed by hackers to cheat at games have become increasingly refined. These techniques are comparable to those utilised in sophisticated malware and rootkits. These techniques include runtime process injection, function interposition, and proxying network traffic to name a few. To avoid detection, hackers even resort to custom kernel drivers to interact with the game process. This talk will detail a few of the techniques utilised by game hackers and malware developers alike, through the hacking of an open-source game. I will showcase some hacks for an open-source FPS game by describing the method in detail for the macOS platform. While the demo will be macOS specific, the approach applies to all modern operating systems.
GS-4145, Oxide discusses how Mantle enables game engine performance, by Dan B... (AMD Developer Central)
Presentation GS-4145, in which Oxide discusses how Mantle enables game engine performance, by Dan Baker, Tim Kip and Guennadi Riguer at the AMD Developer Summit (APU13), November 11-13, 2013.
Horizon Zero Dawn: An Open World QA Case StudyGuerrilla
Download the full presentation here: http://www.guerrilla-games.com/
Abstract: A retrospective of how developer (Guerrilla) and publisher QA (SIE) worked together in partnership to help deliver a AAA experience. The presentation will focus on the victories and the challenges we faced in testing such an ambitious open-world title, and on leveraging automated test solutions and telemetry to inform exploratory test strategy. It will cover how we managed teams over a 21-month testing lifecycle, starting with small internal test teams before scaling up to a team of 70+ people across the globe, with tens of thousands of test hours invested. The presentation will cover the successes and challenges relating to our people: early engagement, communication, trust, agility and collaboration; our processes: risk management, test strategy, and post-launch support; and our tools: telemetry, risk registers, and worldwide build delivery.
Horizon Zero Dawn: A Game Design Post-MortemGuerrilla
Download the full presentation here: http://www.guerrilla-games.com/read/horizon-zero-dawn-a-game-design-postmortem
Abstract: Going through early prototypes and delving into design decisions and processes that shaped development of the game, this talk gives insight into the journey game design went through while moving from an ambitious paper concept to a finished open world action RPG, with all of the small and large design decisions and choices that have to be made along the way.
Putting the AI Back Into Air: Navigating the Air Space of Horizon Zero DawnGuerrilla
Download the full presentation here: http://www.guerrilla-games.com/read/putting-the-ai-back-into-air
Abstract: In this talk, we explain the technology behind the aerial navigation in Horizon Zero Dawn. In Horizon, we've represented the flyable air space using a run-time generated height map. Queries can be done on this height map for positional information and navigation. We present a hierarchical path planning algorithm for finding a progressively more detailed path between two points. Additionally, we will touch on some gameplay-related subjects, to show the additional challenges we faced in implementing the different flying behaviors, such as transitioning from air to ground and guided crash-landing.
Building Non-Linear Narratives in Horizon Zero DawnGuerrilla
Download the original PowerPoint presentation here: http://www.guerrilla-games.com/read/building-non-linear-narratives-in-horizon-zero-dawn
Open world RPGs require deep stories which emphasize player choice and freedom. The bigger the games grow, the more complex the systems managing these stories become. See how Guerrilla tackled the issue in Horizon Zero Dawn by creating a quest system with non-linearity at its base.
Player Traversal Mechanics in the Vast World of Horizon Zero DawnGuerrilla
Download the original PowerPoint presentation here: http://www.guerrilla-games.com/read/player-traversal-mechanics-in-the-vast-world-of-horizon-zero-dawn
Paul van Grinsven shows what is needed to make Aloy traverse the vast world of Horizon Zero Dawn, with its complex and organic environments. Various traversal mechanics are covered from a gameplay programmer's perspective, focusing on the interaction between code and animations. The different systems and techniques involved in the implementation of these mechanics are explained, and Van Grinsven looks at the underlying reasoning and design decisions.
The Real-time Volumetric Cloudscapes of Horizon Zero DawnGuerrilla
Download the full presentation (including movies) from http://www.guerrilla-games.com/publications.html#schneider1508.
Abstract: Real-time volumetric clouds in games usually pay for fast performance with a reduction in quality. The most successful approaches are limited to low-altitude, fluffy, translucent stratus-type clouds. For Horizon Zero Dawn, Guerrilla needed a solution that can fill a sky with evolving and realistic results that closely match highly detailed reference images, representing high-altitude cirrus clouds and all of the major low-level cloud types, including thick billowy cumulus clouds. These clouds need to be lit correctly according to the time of day, along with other cloud-specific lighting effects. Additionally, we are targeting GPU performance of 2ms. Our solution is a volumetric cloud shader which handles the aspects of modeling, animation and lighting logically without sacrificing quality or draw time. Special emphasis will be placed on our solutions for the directability of cloud shapes and formations, as well as on our lighting model and optimizations.
The Production and Visual FX of Killzone Shadow FallGuerrilla
With the arrival of next-generation consoles comes more processing power and memory... and with that, a whole lot more possibilities. Take an exclusive look behind the scenes of this PlayStation 4 launch title and hear about the challenges the team faced during production. Learn more about the engine's new lighting and shading tech and take a closer look at how some of the effects, like dynamic dust, rain and various particle effects, are achieved.
Out of Sight, Out of Mind: Improving Visualization of AI InfoGuerrilla
When AI was simple, debugging consisted of confirming that the character was doing the one thing you expected. Over time, debugging moved away from "what" and became more about "why?" or "why not?" The collision of information about the agents, the environment, the player and the game state creates an enormous amount of data that can affect the decisions that the characters make. This presentation demonstrates some features of ReView and shows how it was used for debugging the multiplayer bots in Killzone Shadow Fall.
The Next-Gen Dynamic Sound System of Killzone Shadow FallGuerrilla
We'll describe our new audio run-time and toolset that was built specifically for Killzone Shadow Fall on PlayStation 4. However, the ideas used are widely applicable and the focus is on integration with the game engine and fast iteration, with special attention to shortcuts to get your creative spark translated into in-game sounds as quickly as possible. We will demonstrate our implementation of these demands, which is a next-gen sound system that was designed to combine artistic freedom with high run-time performance. It should be interesting to both creative as well as technical minds who are looking for inspiration on what to expect from a modern sound design environment. To emphasize the performance advantage, we will show the point of view of both the sound designer and the programmer simultaneously. We will use examples starting with simple sounds and build up to increasingly more complex dynamic ones to illustrate the benefits of this unique approach.
Killzone Shadow Fall: Creating Art Tools For A New Generation Of GamesGuerrilla
This talk describes the tool improvements Guerrilla Games implemented to make Killzone Shadow Fall shine on the PlayStation 4. It highlights additions to the Maya pipeline, such as Viewport 2.0, Maya's coupling with in-game updates and in-engine deferred renderer features including real-time shadow-casting, volumetric lighting, hardware instancing, lens flares and color grading.
The presentation describes the physically based lighting pipeline of Killzone Shadow Fall, a PlayStation 4 launch title. The talk covers the studio's transition to a new asset creation pipeline based on physical properties. Moreover, it describes the light rendering systems used in the new 3D engine, built from the ground up for the upcoming PlayStation 4 hardware. A novel real-time lighting model simulating physically accurate area lights will be introduced, as well as a hybrid ray-traced / image-based reflection system.
We believe that physically based rendering is a viable way to optimize asset creation pipeline efficiency and quality. It also enables rendering quality to reach a new level while remaining highly flexible to art direction requirements.
A Hierarchically-Layered Multiplayer Bot System for a First-Person ShooterGuerrilla
This thesis research was the basis for Killzone 2’s multi-player bots. It proposes a hierarchically structured system to control the multiplayer bots, using one AI commander for each faction, each commanding several group leaders, each commanding several individual bots. The system is evaluated on the basis of a fully implemented AI for one of the multiplayer game modes.
This presentation describes the architecture behind the multiplayer bot AI for Killzone 2. As part of this, it describes the hierarchical task network (HTN) planner, terrain analysis, and the way commander, squad and individual bot AI work together for dynamic tactical gameplay.
2. The PlayStation®3’s SPUs
in the Real World
A KILLZONE 2 Case Study
Michiel van der Leeuw
Technical Director - Guerrilla Games
2
3. TAKEAWAY
• Things we did on SPUs
• What worked or didn’t work for us
• Practical advice on using SPUs
• Some food for thought
3
- Firstly a post-mortem
- Talk about what we did and if we liked it
- I'll try to include lots of numbers, because I like that in other presentations
- Will try to put this into perspective
- So you can apply this to your own game
4. KILLZONE 2
• Announced E3 2005
• Gold Master end 2008
• ~ 120 Average team size
• ~ 27 Programmers
4
Team:
- Peak 140 in Amsterdam, 50 at other Sony studios
- Average around 120
Coders (peak):
- 5 Network coders
- 9 Game coders
- 6 AI coders
- 7 Tech coders
- 27 Total
5. KILLZONE 2
Franchise and engine goals:
• Cinematic
• Dense
• Realistic
• Intense
Can SPUs help?
5
Cinematic:
- Movie-like quality in everything (animation, special effects)
- Make people forget it's a video game
Dense:
- Dense, lots of everything
- Over-the-top
Realistic:
- Not just from a graphics point of view
- Responds to player actions
- AI behaves logically
Intense:
- Frantic, lots of everything (see a pattern)
6. SPUs OVERVIEW
• 6 x 3.2GHz of processing power
• General purpose instruction set
• 256KB local memory per SPU
• No caches! Only local code
• Very fast DMA in/out
• Libraries for scheduling
6
Problems:
- Hard to program, especially at the start
- Lack of data and instruction caches makes them difficult
- Basically impossible to use as a 'normal' core out of the box
Virtues:
- Truly parallel
- Incredibly incredibly fast
- DMA allows hiding of all memory latency
Difficult to unlock the power
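The "DMA hides memory latency" point is usually realized with double buffering: while one local-store chunk is being processed, the next is already in flight. A minimal host-side sketch, assuming `dma_get`/`dma_wait` as synchronous memcpy stand-ins for the real MFC intrinsics (the names are illustrative, not the actual SDK API):

```cpp
#include <cassert>
#include <cstddef>
#include <cstring>

// Stand-ins for the MFC DMA intrinsics; on real hardware dma_get would be
// asynchronous and dma_wait would block on the tag group.
static void dma_get(void* local, const void* main_mem, size_t size, int /*tag*/) {
    std::memcpy(local, main_mem, size);
}
static void dma_wait(int /*tag*/) {}

static const size_t kChunk = 256;  // elements per DMA chunk

// Process 'count' floats from main memory through two local-store chunks,
// scaling each element by 2. On an SPU the compute loop would overlap the
// DMA of the next chunk.
void process_buffer(const float* input, float* output, size_t count) {
    float local[2][kChunk];
    size_t chunks = count / kChunk;
    if (chunks == 0) return;

    dma_get(local[0], input, kChunk * sizeof(float), 0);   // prime first chunk
    for (size_t c = 0; c < chunks; ++c) {
        int cur = (int)(c & 1), nxt = cur ^ 1;
        if (c + 1 < chunks)                                // kick off next transfer
            dma_get(local[nxt], input + (c + 1) * kChunk,
                    kChunk * sizeof(float), nxt);
        dma_wait(cur);                                     // ensure current chunk arrived
        for (size_t i = 0; i < kChunk; ++i)                // compute overlaps the DMA
            output[c * kChunk + i] = local[cur][i] * 2.0f;
    }
}
```

The same pattern generalizes to every job type listed later: pull a chunk in, compute, push results out, never stall on main memory.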
8. KILLZONE 2 SPU USAGE
• Animation
• AI threat prediction
• AI line of fire
• AI obstacle avoidance
• Collision detection
• Physics
• Particle simulation
• Particle rendering
• Scene graph
• Display list building
• IBL light probes
• Graphics post-processing
• Dynamic music system
• Skinning
• MP3 decompression
• Edge Zlib
• etc.
44 Job types
8
9. KILLZONE 2 SPU USAGE
• Per frame
• ~ 100 physics objects simulated
• ~ 600 image-based light probes
• ~ 250 rays cast per frame
• ~ 200 animations sampled
• ~ 250 particle systems active
• ~ 6 scene graph traversals
• ~ 2000 graphics objects drawn
• ~ 75 skinning batches
9
Notes:
- Started out with simple stuff (skinning, particle vertices)
- Ditched fluid sim running on SPUs (artistic reasons, couldn't get it to look right)
Next up:
- Movie from game
10. KILLZONE 2 SPU USAGE
Killzone 2 - SPU usage in cut scene
10
On-screen:
- Left side = PPU and SPU activity bars
- Top is PPU
- Below that are the six SPUs (flickering due to a bug in my code)
- Right side = SPU activity by job
- Column 1: Name
- Column 2: Job count
- Column 3: Time taken in percentage of one SPU on a 30 Hz frame
Usage:
- Lots of different things
- Lots of jobs, about 1000
- This is still light usage, 1.5 SPUs used
- Cut-scene
- No AI or gameplay
- Normally about 4 to 5 SPUs used
11. KILLZONE 2 SPU USAGE
Killzone 2 - SPU usage in cut scene
11
12. SPUs in
GRAPHICS
12
Presentation overview
- Take a look at different areas
- Graphics
- Game code
- AI
13. LIGHT PROBE SAMPLING
• Purpose: Make dynamic objects blend in with the lightmaps of the environment
• ~ 2500 static light probes per level
• Created offline during lightmap rendering
• Stored as 9x3 Spherical Harmonics in KD-tree
• When object needs to be lit by light probes
• A job is added to sample lighting for that object
• Blend four closest light probes
• Rotate result into view space
• Create 8x8 spherical mapped texture for sampling
13
Notes:
- Will explain with pictures to make it clear
- Sounds expensive, but it's not too bad
- Do roughly 600 light probes per frame (13% of one SPU)
- Sometimes more
- Multiple probes per object (legs, torso, arms), never bothered to optimize
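The blend of the four closest probes might look roughly like the sketch below. The 9x3 SH layout (9 coefficients per RGB channel) matches the slide; the inverse-distance weighting is an assumption, since the talk does not specify the exact blend weights.

```cpp
#include <cassert>
#include <cmath>

// One static light probe: position plus 9 SH coefficients x 3 color
// channels, as described on the slide.
struct LightProbe {
    float pos[3];
    float sh[27];
};

// Blend the four probes returned by the KD-tree query into a single SH
// set for the object, using normalized inverse-distance weights
// (an illustrative choice, not necessarily Guerrilla's).
void blend_probes(const LightProbe* probes, const float* query, float out_sh[27]) {
    float w[4], total = 0.0f;
    for (int p = 0; p < 4; ++p) {
        float dx = probes[p].pos[0] - query[0];
        float dy = probes[p].pos[1] - query[1];
        float dz = probes[p].pos[2] - query[2];
        // Small epsilon avoids division by zero when a probe sits exactly
        // at the query point.
        w[p] = 1.0f / (std::sqrt(dx*dx + dy*dy + dz*dz) + 1e-4f);
        total += w[p];
    }
    for (int i = 0; i < 27; ++i) out_sh[i] = 0.0f;
    for (int p = 0; p < 4; ++p) {
        float wn = w[p] / total;                 // normalized weight
        for (int i = 0; i < 27; ++i)
            out_sh[i] += wn * probes[p].sh[i];
    }
}
```

In the real job the blended SH would then be rotated into view space and baked into the 8x8 spherical-mapped texture mentioned above.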
17. LIGHT PROBE SAMPLING
Blended probe sampled in texture space
17
Notes:
- This is actually fake
- This is a latitude-longitude map
- In-game is spherical map
- But the artist felt this explained it better
18. LIGHT PROBE SAMPLING
Final shot with lightmaps and light probes
18
Notes:
- This is the final scene as you see it in-game
- Or on TV, in this case
- Or on PlayStation®Network, soon
- Can you see what is
- Lightmapped
- Lit by light probes?
19. LIGHT PROBE SAMPLING
Base scene, no light probes, no textures
19
Notes:
- Light probes replaced with single white probe
- All dynamic objects and destructibles are lit with probes
- Only floor is static in this case
20. LIGHT PROBE SAMPLING
...add light probes sampling and sunlight
20
Notes:
- All lighting and directional information in shadow is light probes
- The blend of sunlight and light probes 'pulls' the scene together
21. LIGHT PROBE SAMPLING
...add textures
21
Notes:
- Textures add less 'feeling' than lighting
CINEMATIC - Because you forget it's a game, objects don't "pop out" of the scene
22. PARTICLE SIMULATION
• We’re quite particle heavy, per 30 Hz frame:
• ~ 250 particle systems simulated
• ~ 150 systems drawn
• ~ 3000 particles updated
• ~ 200 collision ray casts
• Difficult to optimize for multi-core
• System had grown feature-heavy over time
• Had to refactor in-place, incremental steps
• Code was quite optimized, but generic C++
• Memory accesses all over the place
22
Notes:
- A lot of our look comes from effects
- Particles very key to our look
- Not designed for PlayStation®3! Like, really...
Features:
- Geometry/mesh particles
- Collision particles (ray casts)
- Recursive system (impact causes new particle system)
- Attach sounds
- Lots of curves and tweakability
You:
- Everyone has one of these hideous beasts in their code somewhere
23. PARTICLE SIMULATION
• System was refactored for SPUs in four steps
1. Vertex generation (~1 month)
2. Particle simulation inner loop (~2 months)
3. Initialization and deletion of particles (~3 months)
4. High-level management / glue (~4 months)
• Everything now done on SPUs except
• Updating global scene graph
• Starting & stopping sounds
• Had “Learning Experience” written all over it
23
Steps:
1. Simple, vectorised when moved as well
2. Also quite straightforward, particles already configured, just need simulation
3. Tough, had to read all artist curves, tweaks, variable-sized arrays, linked lists, etc. (data all over the place)
4. Elite, had to remove memory allocation altogether, difficult edge cases
- But definitely the biggest win, SPU pain caused us to do stuff 'the right way', which in the end helped
Notes:
- Now runs almost completely on SPUs
- Almost zero PPU overhead left
- And many many times faster than old version
- SPU memory locality = cache locality
- Custom memory allocator
- Every time we made it faster, the artists started using more of it
- Requiring us to further optimize
- Because we liked the look it gave us
- “Incremental” was the keyword
- Learning experience because we learned how memory-bound we really were, but never knew
24. CODE STYLE DIFFERENCE
PPU Version → SPU Version
• Single-threaded → Heavily parallel
• Malloc & free → Linear memory block
• Pointers to objects → Embedded objects
• Input control curves → Sampled lookup tables
• Raycasts during update → Queues raycast jobs
• ~ 20ms update on PPU → < 1ms update on PPU (+ ~ 15ms update on SPUs)
20x less PPU overhead!
Incredible amount of work
24
Notes:
- First looked more like game code than engine code
- Turned into lean and mean, memory-local system
- Forced into the change by the SPU architecture, but we don't regret it in the end
Performance:
- Optimized a lot along the way
- Final system now runs faster on SPUs than same code on PPU
- DMA hides memory latency
- Runs faster on Intel Xeons as well (no longer falls out of the D$)
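The "SPU style" column of the table can be illustrated with a small structure-of-arrays sketch: one linear block, embedded data instead of pointers, and artist curves pre-sampled into lookup tables. Field names and sizes are illustrative, not Guerrilla's actual layout.

```cpp
#include <cassert>
#include <cmath>

static const int kMaxParticles = 1024;
static const int kCurveSamples = 32;

// One contiguous allocation, DMA-able as a single transfer: no pointers,
// no per-particle malloc, everything embedded.
struct ParticleBlock {
    float pos_x[kMaxParticles];
    float pos_y[kMaxParticles];
    float vel_x[kMaxParticles];
    float vel_y[kMaxParticles];
    float age[kMaxParticles];      // normalized lifetime, 0..1
    float size[kMaxParticles];
    int   count;
};

// Artist "size over lifetime" curve, pre-sampled offline so the inner loop
// is a lerp instead of a spline evaluation.
float sample_curve(const float table[kCurveSamples], float t) {
    float f = t * (kCurveSamples - 1);
    int   i = (int)f;
    if (i >= kCurveSamples - 1) return table[kCurveSamples - 1];
    float frac = f - i;
    return table[i] * (1.0f - frac) + table[i + 1] * frac;
}

// Branch-light, cache/local-store friendly update loop.
void update_particles(ParticleBlock& p, const float size_curve[kCurveSamples], float dt) {
    for (int i = 0; i < p.count; ++i) {
        p.pos_x[i] += p.vel_x[i] * dt;
        p.pos_y[i] += p.vel_y[i] * dt;
        p.age[i]   += dt;
        p.size[i]   = sample_curve(size_curve, p.age[i]);
    }
}
```

The same layout is why the refactored system also ran faster on Xeons: linear memory access helps every architecture, not just the SPU.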
25. PARTICLE SIMULATION
Example movie
25
Notes:
- Explosions in presentation are cool
- Higher budgets than the actual game (3x higher)
At peak:
- 1700 particle emitters
- 1100 visible systems
- 120% of an SPU for update
- 110% of an SPU for vertex generation
Future:
- Code is fairly unoptimized; with optimization it'll probably run in-game
DENSE & INTENSE - This is a large part of what makes everything so over-the-top, in your face
26. IMAGE POST-PROCESSING
• Effects done on SPU
• Motion blur
• Depth of field
• Bloom
• SPUs assist the RSX with post-processing
1. RSX prepares low-res image buffers in XDR
2. RSX triggers interrupt to start SPUs
3. SPUs perform image operations
4. RSX already starts on next frame
5. Result of SPUs processed by RSX early in next frame
• Improved version available in PlayStation®Edge 1.1!
• Similar (but different) approach in PhyreEngine now!
26
Notes:
- Moved some work normally done on RSX to SPUs
- 20% on RSX was too much
- Freed up to draw more geometry/lighting
- Ported only the low-res effects
- SPUs not fast enough for full-res effects (for us)
- 4pm talk on PhyreEngine that uses SPU-assisted RSX post-processing
- Room 3002, West Hall, called “Deferred Lighting and Post Processing on PlayStation®3”
- Different tradeoff, may suit you better
- Two techniques now available to all 3rd parties!
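The five-step hand-off above is a one-frame-deep pipeline: the RSX submits work and only consumes the result at the start of the next frame. A host-side sketch with standard threading primitives standing in for the real RSX interrupt / SPU job-chain mechanism (the class and method names are illustrative):

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <thread>
#include <vector>

// Producer ("RSX") hands a low-res buffer to the worker ("SPUs") and moves
// on; the post-processed result is collected one frame later.
class PostProcessPipe {
public:
    // Producer: submit this frame's buffer and return immediately.
    void submit(std::vector<int> frame) {
        std::unique_lock<std::mutex> lock(m_);
        pending_ = std::move(frame);
        has_work_ = true;
        cv_.notify_one();
    }
    // Producer, next frame: block until the previous result is ready.
    std::vector<int> collect() {
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return has_result_; });
        has_result_ = false;
        return std::move(result_);
    }
    // Worker: process one submitted frame (here: double every value as a
    // stand-in for the image operations).
    void worker_step() {
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return has_work_; });
        has_work_ = false;
        result_ = std::move(pending_);
        for (int& v : result_) v *= 2;
        has_result_ = true;
        cv_.notify_one();
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::vector<int> pending_, result_;
    bool has_work_ = false, has_result_ = false;
};
```

The key design point is the extra frame of latency: because the result is not needed until the next frame, the GPU never waits on the SPUs.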
27. IMAGE POST-PROCESSING
• Comparison @ 30 Hz:
• RSX only: 20% RSX time, 0% SPU time, medium quality
• RSX + 5 SPUs: 14% RSX time, 12% SPU time, high quality
• SPUs are compute-bound
• Bandwidth not a problem
• Code can be optimized further
• Our trade-off: RSX vs. SPU time
• SPUs take longer
• But SPUs look better
• And RSX was our bottleneck
27
Notes:
- Top line is the RSX, bottom SPUs
- SPUs are slower
- By the time Killzone 2 shipped, Edge was faster!
- Dropped in at last moment
Quality:
- SPUs have higher internal precision
- Dropped in late in the day (everybody was optimizing)
- IGN picked up on it (preview vs. review code)
- Some of it was the quality itself
- Most of it was the tweaks done with the extra budget (increased light map resolution, detail textures, etc.)
28. IMAGE POST-PROCESSING
Input quarter-res image in XDR
28
Notes:
- Output from deferred renderer after MSAA resolve
- 640x360
- No visible motion, quite static
30. IMAGE POST-PROCESSING
Input quarter-res 2D motion vectors
30
Notes:
- Projected 2D motion vector
- Written out to a rendertarget in deferred pass
31. BLOOM + LENS REFLECTION
• Takes roughly 13.1% of one SPU
• SPUs do
• Depth-dependent intensity response curve
• Hierarchical 7x7 gaussian blur (16 bit fixed point)
• Upscaling results from different levels
• Internal Lens Reflection (inspired by Masaki Kawase)
• Accumulating into results buffer
31
Notes:
- First effect we ported, most linear
- 2.6% of five SPUs
- Repeated downsample until 1/64th buffer size
- Accumulate into premultiplied alpha compositing buffer
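The hierarchical blur above can be sketched as a single 7-tap horizontal pass in 16-bit fixed point. This is an illustrative sketch, not Guerrilla's actual SPU code: the binomial weights, Q8.8 pixel format, and border clamping are assumptions, and the real implementation is vectorized and runs the pass at several mip levels.

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// One horizontal pass of a 7-tap Gaussian in 16-bit fixed point (Q8.8).
// Binomial weights (1 6 15 20 15 6 1) sum to exactly 64, so the divide
// is a cheap shift -- SPU-friendly integer math, no float conversion.
std::vector<uint16_t> blur7_h(const std::vector<uint16_t>& src, size_t w, size_t h)
{
    static const int kW[7] = { 1, 6, 15, 20, 15, 6, 1 }; // sums to 64
    std::vector<uint16_t> dst(src.size());
    for (size_t y = 0; y < h; ++y) {
        for (size_t x = 0; x < w; ++x) {
            uint32_t acc = 0;
            for (int t = -3; t <= 3; ++t) {
                // Clamp at the borders instead of wrapping.
                long sx = (long)x + t;
                if (sx < 0) sx = 0;
                if (sx >= (long)w) sx = (long)w - 1;
                acc += (uint32_t)src[y * w + (size_t)sx] * (uint32_t)kW[t + 3];
            }
            dst[y * w + x] = (uint16_t)(acc >> 6); // divide by weight sum 64
        }
    }
    return dst;
}
```

A vertical pass with the same weights completes one blur level; repeating on downsampled buffers gives the hierarchy.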
32. MOTION BLUR
• Takes roughly 9.5% of one SPU
• Input
• Quarter-res image from deferred renderer
• Sixteenth-res 2D motion vector
• Steps:
• Blur motion vectors (dilation)
• Blur image along motion vectors (8-tap point samples)
• Combine blurred image with source
• Use motion vector amplitude as alpha
• Low-motion areas are unaffected (alpha = 0)
Notes:
- Second effect we ported
- 1.9% of five SPUs
- Worried about texture sampling
- Ended up not doing bilinear filtering
- Filtering on SPUs is very expensive
- Looks more grainy
- Exactly what we wanted!
- Nice artifacts may stay
- Leaves static parts untouched
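The per-pixel loop described above can be sketched roughly as follows. The sample spread and the alpha scale (full blend at 4+ pixels of motion) are assumptions for illustration; the real code works on tiles in local store and skips bilinear filtering, exactly as the notes say.

```cpp
#include <cmath>

struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// Blur one pixel along its (dilated) motion vector with 8 point samples,
// then derive a blend alpha from the vector amplitude so static areas
// (alpha ~0) are left untouched when the result is composited.
Color motion_blur_pixel(const Color* img, int w, int h, int px, int py,
                        Vec2 motion, float* out_alpha)
{
    Color acc = { 0, 0, 0 };
    for (int i = 0; i < 8; ++i) {
        // Point samples (no bilinear -- filtering on SPUs is very
        // expensive), spread from -0.5 to +0.5 of the motion vector.
        float t = (i + 0.5f) / 8.0f - 0.5f;
        int sx = px + (int)(motion.x * t);
        int sy = py + (int)(motion.y * t);
        if (sx < 0) sx = 0; if (sx >= w) sx = w - 1;
        if (sy < 0) sy = 0; if (sy >= h) sy = h - 1;
        const Color& s = img[sy * w + sx];
        acc.r += s.r; acc.g += s.g; acc.b += s.b;
    }
    acc.r /= 8; acc.g /= 8; acc.b /= 8;
    // Amplitude-based alpha: 0 for static pixels, saturating to 1.
    // Assumed scale: full blend at 4+ pixels of motion.
    float amp = sqrtf(motion.x * motion.x + motion.y * motion.y);
    float a = amp / 4.0f;
    *out_alpha = a > 1.0f ? 1.0f : a;
    return acc;
}
```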
33. DEPTH OF FIELD
• Takes roughly 23% of one SPU
• Input
• Quarter-res image from deferred renderer
• Quarter-res depth buffer
• Samples image in floating point
• 36 jittered disc point samples
• Much more than the 8 bilinear samples on RSX!
• Got rid of all banding / stair-stepping
• Weighted by ‘fuzziness’ value based on focus point
Notes:
- Last effect we ported
- 4.6% of five SPUs
- Never thought it would work
- But ended up being biggest success!
- The point samples are cheap
- So we could do a lot more!
Quality:
- Quality of effect determined largely by radius of circle of confusion
- Less by sampling quality
- The 36 jittered samples were a test
- But looked so good we kept them in
Edge bleeding:
- Intricate sampling methods to avoid edge bleeding
- From background to foreground
- But heavily blend front-to-background
- See Edge code
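The 36-jittered-disc-sample gather can be sketched like this. The jitter source (a cheap LCG) and the uniform-disc mapping are stand-ins for whatever jitter table the shipped code used; the fuzziness-to-radius mapping is likewise an assumption.

```cpp
#include <cmath>

struct Rgb { float r, g, b; };

// 36 jittered point samples on a disc whose radius is the circle of
// confusion. 'fuzziness' in [0,1] scales the CoC (derived from distance
// to the focus plane in the talk). Point samples are cheap on SPU, and
// the jitter turns banding/stair-stepping into noise.
Rgb dof_pixel(const Rgb* img, int w, int h, int px, int py,
              float fuzziness, float max_coc, unsigned seed)
{
    const float kRadius = fuzziness * max_coc;
    Rgb acc = { 0, 0, 0 };
    unsigned rng = seed;
    for (int i = 0; i < 36; ++i) {
        // Cheap LCG jitter -- a stand-in for a precomputed jitter table.
        rng = rng * 1664525u + 1013904223u;
        float u = (float)((rng >> 8) & 0xffffu) / 65535.0f; // [0,1]
        rng = rng * 1664525u + 1013904223u;
        float v = (float)((rng >> 8) & 0xffffu) / 65535.0f;
        // Uniform disc sample: sqrt on radius, uniform angle.
        float r   = kRadius * sqrtf(u);
        float ang = 6.2831853f * v;
        int sx = px + (int)(r * cosf(ang));
        int sy = py + (int)(r * sinf(ang));
        if (sx < 0) sx = 0; if (sx >= w) sx = w - 1;
        if (sy < 0) sy = 0; if (sy >= h) sy = h - 1;
        acc.r += img[sy * w + sx].r;
        acc.g += img[sy * w + sx].g;
        acc.b += img[sy * w + sx].b;
    }
    acc.r /= 36; acc.g /= 36; acc.b /= 36;
    return acc;
}
```

The edge-bleeding fixes mentioned in the notes (foreground/background weighting) sit on top of this basic gather; see the Edge post-processing code for one way to do them.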
34. IMAGE POST-PROCESSING
Image before post
Notes:
- Putting it all together
- Image without post
35. IMAGE POST-PROCESSING
Image generated on the SPUs (bloom, DoF, motion blur)
Notes:
- Result of all of the SPU jobs combined
- Quarter-res premultiplied alpha format
- Only areas affected by post-processing are present
Visual:
- Motion blur: Actor at front
- Bloom: Sunlight
- Depth of field: Gun is out of focus (it actually always is, too bad for the detail)
36. IMAGE POST-PROCESSING
Composited image
Notes:
- Final image as composited by the RSX
- Blending the out-of-focus and blurry areas in
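Because the SPU result buffer is in premultiplied-alpha format, the final composite on RSX is a single blend per pixel. A sketch of that blend equation (the RSX does this with its blend unit, not shader code):

```cpp
struct Rgba { float r, g, b, a; };
struct Rgb3 { float r, g, b; };

// Composite the quarter-res SPU post buffer (premultiplied alpha) over
// the frame: out = post.rgb + frame.rgb * (1 - post.a). Areas untouched
// by post-processing carry post = (0,0,0,0) and pass the frame through.
Rgb3 composite(Rgb3 frame, Rgba post)
{
    Rgb3 out;
    out.r = post.r + frame.r * (1.0f - post.a);
    out.g = post.g + frame.g * (1.0f - post.a);
    out.b = post.b + frame.b * (1.0f - post.a);
    return out;
}
```

Premultiplied alpha is what lets bloom (additive, alpha ~0 with non-zero colour) and depth of field (replacement, alpha ~1) accumulate into the same buffer.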
37. IMAGE POST-PROCESSING
Input quarter-res image in XDR
CINEMATIC - All of these techniques come directly from film. If we hadnʼt done them on SPUs we may not have been able to do them at this quality
38. SPUs in
GAME CODE
Notes:
- Enough graphics
- On to game code, the most difficult area to use SPUs
39. ANIMATION SAMPLING
• Using Edge Animation (in PS3 SDK)
• Our extensions for IK, look-at controller, etc.
• Work per 30 Hz frame
• ~40 animation jobs
• ~200 animations sampled
• Less than 12% SPU time on one SPU
• Was ~20% PPU time with our old code!
Edge Animation Is FAST!
Notes:
- Used to be bottleneck on PS2 and on PPU
- Very heavy on animation
- Blending dozens of animations (2 to 50 per humanoid)
- Large part of Edge Animation design inspired by our requirements
- High-level animation code still a bottleneck
Advice:
- If you have animation sampling showing up on PPU
- Use this, itʼs incredibly fast
REALISTIC - We require hundreds of animations to make the characters stick to the ground and move realistically. Animation is key to this, and so is its performance
40. SPUs in
ARTIFICIAL INTELLIGENCE
Notes:
- Thatʼs largely it for game code
- We also do raycasts from game code
- Says something about what we can use SPUs for
- Very difficult to use SPUs in pure logic code
- Also felt it wouldnʼt give us much to focus there
- Didnʼt suit our franchise needs
41. WAYPOINT COVER MAPS
• Killzone 2's cover maps are waypoint-based
• Each waypoint has a depth cubemap
• Generated off-line in pre-processing step
• Allows line-of-fire checks between waypoints
• This is how the AI understands the environment
• Suitable for SPUs
• Small data size (compressed cubemaps)
• Very compute-heavy
• Can DMA waypoint data around easily
Notes:
- The AI in Killzone 2 ʻseesʼ based on waypoints
- Levels are populated with thousands of waypoints, divided into zones
Reasoning:
- Allows reasoning about tactical waypoint positions
- Can a person standing on waypoint A see me if Iʼm on waypoint B?
- How can I plan a path from A to B without somebody on waypoint C seeing me?
- Large problem space to search
- But allows intelligent behavior
- More checks than we could ever do before
42. WAYPOINT COVER MAPS
Example waypoint cover map
Notes:
- White areas are close by
- Red area is player position
- Depth compressed to 6 bits
- Less detail in floor/ceiling than walls
FAIL:
- Tried all sorts of novel compression techniques on cubemaps
- FAIL
- Simple solution worked best
Visual:
- Small hallway, so lots of nearby walls
- Dark area looks out into room in camera-facing direction
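The line-of-sight query against a waypoint's depth cubemap boils down to: B is visible from A if the distance A→B is less than the depth A's cubemap stores in that direction. A sketch with one texel per face (the real maps have resolution and 6-bit quantized depths; face selection by dominant axis is the standard cubemap lookup):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct CoverMap {
    // +X, -X, +Y, -Y, +Z, -Z; one depth texel per face in this sketch.
    float face_depth[6];
};

// Standard cubemap face selection: pick the axis with the largest
// absolute component, signed.
int dominant_face(Vec3 d)
{
    float ax = fabsf(d.x), ay = fabsf(d.y), az = fabsf(d.z);
    if (ax >= ay && ax >= az) return d.x >= 0 ? 0 : 1;
    if (ay >= az)             return d.y >= 0 ? 2 : 3;
    return d.z >= 0 ? 4 : 5;
}

// Can a threat at 'to' be seen from waypoint 'from'?
bool can_see(Vec3 from, Vec3 to, const CoverMap& cm)
{
    Vec3 d = { to.x - from.x, to.y - from.y, to.z - from.z };
    float dist = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
    return dist < cm.face_depth[dominant_face(d)];
}
```

Small compressed cubemaps plus this kind of pure arithmetic is why the notes call the workload "suitable for SPUs": the data DMAs around easily and the checks are compute-bound.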
44. THREAT PREDICTION
• Example: You hide behind cover
• Result: AI searches for you, suppressive fire...
• How? (Killzone AI doesn’t cheat! - that much)
• AI remembers time and waypoint of last contact
• Mark waypoints where threat could move to
• If waypoints are visible then remove from list
• i.e. you can see waypoints and threat is not there!
• If waypoint list grows too long, then stop expanding
• If threat’s predicted position is a small set then
• Based on how AI’s brain is configured…
• ...attack position or search based on possible location
Notes:
- (run through slide first)
- AI doesnʼt cheat
- We donʼt tell it the player position, ever
- (though game scripters may cheat...)
How this works:
- When an AI loses sight or sound of a threat
- At start of AI update, fire max 4 jobs (one for each ʻlost threatʼ)
- Process current set of possible positions
- Scan all waypoints in this list to see if the player is visible there
- If found -> threat found!
- If not, expand adjacent waypoints to see which ones are visible
- Result is:
- Possible location list
- Fed back into the AI planner and ʻbrainʼ
Deals with:
- People running into a building
- You hiding behind cover and standing up to shoot
- People inside a building shooting out of windows, etc.
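One update step of the waypoint expansion described above can be sketched as follows. This is an assumed simplification of the shipped logic: prune waypoints the AI can currently see (the threat would have been spotted there), grow the set one link outward, and give up if the set exceeds a cap.

```cpp
#include <cstddef>
#include <vector>

struct Waypoint { std::vector<int> links; }; // adjacency in the waypoint graph

// visible[i]: can some AI currently see waypoint i (via the cover maps)?
// 'possible' is the current set of waypoints the lost threat could be at.
// Returns the new set, or an empty set if it grew past 'cap' (too
// uncertain -- stop expanding, as the slide says).
std::vector<int> expand_threat(const std::vector<Waypoint>& wps,
                               const std::vector<bool>& visible,
                               const std::vector<int>& possible,
                               size_t cap)
{
    std::vector<bool> in_set(wps.size(), false);
    std::vector<int> next;
    for (size_t i = 0; i < possible.size(); ++i) {
        int w = possible[i];
        // If we can see w and the threat isn't there, he can't be at w.
        if (!visible[w] && !in_set[w]) { in_set[w] = true; next.push_back(w); }
        // He may have moved one link further since the last update.
        for (size_t l = 0; l < wps[w].links.size(); ++l) {
            int n = wps[w].links[l];
            if (!visible[n] && !in_set[n]) { in_set[n] = true; next.push_back(n); }
        }
    }
    if (next.size() > cap) next.clear(); // set exploded: stop expanding
    return next;
}
```

The surviving set is what gets fed back into the planner as the threat's possible location.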
45. THREAT PREDICTION
Threat Prediction using SPUs and cover maps
Notes:
- Part of Killzone 2 demo level
- Warehouse section
- Green = waypoint grid
46. THREAT PREDICTION
Enemy
Line of sight
Friendly’s position known
Encounter between enemy and friendly
Notes:
- Immediately engaged by enemy
- Knows player position
48. THREAT PREDICTION
Possible player hiding spots
Player might re-appear here
Enemy can’t see these two waypoints
Player enters cover position
Notes:
- Re-appearance waypoint is very important, good place to shoot!
- Helghast shoot at windows, etc. to pin you down
- NOT pre-programmed!
Adjusts to your actions!
50. LINE OF FIRE
• Problem: AI agents running into each other’s line-of-fire
• Solution: Line-of-fire annotations
• Each AI agent publishes ‘hints’
• Calculate which waypoints may be in my line-of-fire
• Published as ‘advice’ for other entities
• Most AI tasks use this advice in path planning
• SPU does many tests
• Each line-of-fire against each waypoint link
• Tapered-capsule vs. tapered-capsule
Notes:
- Challenge to make AI look believable in dynamic situations
- Standing in your or their friendʼs line of fire is STUPID
- Breaks all immersion
Approach:
- For each entity engaged in combat
- Check current threat and see which waypoint links lie in the line of fire
- Hints are in the form “donʼt stand here or be shot in the head, KTHXBYE”
Result:
- Flanking, walking around enemies engaged in battle
- Behavior is ʻinterpretedʼ by humans as being more intelligent
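The slide names tapered-capsule vs. tapered-capsule tests. A sketch of the core geometry, approximating each tapered capsule with an ordinary capsule using a conservative radius: two capsules intersect when the distance between their core segments is below the sum of the radii. The segment-segment distance follows the standard closest-points algorithm (Ericson, Real-Time Collision Detection).

```cpp
#include <algorithm>
#include <cmath>

struct V3 { float x, y, z; };
static V3 sub(V3 a, V3 b) { V3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
static float dot(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Squared distance between segments p1q1 and p2q2.
float seg_seg_dist2(V3 p1, V3 q1, V3 p2, V3 q2)
{
    V3 d1 = sub(q1, p1), d2 = sub(q2, p2), r = sub(p1, p2);
    float a = dot(d1, d1), e = dot(d2, d2), f = dot(d2, r);
    float s, t;
    const float EPS = 1e-8f;
    if (a <= EPS && e <= EPS) { s = t = 0.0f; }                      // both degenerate
    else if (a <= EPS) { s = 0.0f; t = std::min(std::max(f / e, 0.0f), 1.0f); }
    else {
        float c = dot(d1, r);
        if (e <= EPS) { t = 0.0f; s = std::min(std::max(-c / a, 0.0f), 1.0f); }
        else {
            float b = dot(d1, d2), denom = a * e - b * b;
            s = denom > EPS ? std::min(std::max((b * f - c * e) / denom, 0.0f), 1.0f) : 0.0f;
            t = (b * s + f) / e;
            if (t < 0.0f)      { t = 0.0f; s = std::min(std::max(-c / a, 0.0f), 1.0f); }
            else if (t > 1.0f) { t = 1.0f; s = std::min(std::max((b - c) / a, 0.0f), 1.0f); }
        }
    }
    V3 c1 = { p1.x + d1.x * s, p1.y + d1.y * s, p1.z + d1.z * s };
    V3 c2 = { p2.x + d2.x * t, p2.y + d2.y * t, p2.z + d2.z * t };
    V3 d = sub(c1, c2);
    return dot(d, d);
}

// Does the line of fire (muzzle->target, radius fire_r) clip the
// waypoint link (link_a->link_b, radius link_r)? If so, annotate the
// link as "don't stand here".
bool line_of_fire_blocks_link(V3 muzzle, V3 target, float fire_r,
                              V3 link_a, V3 link_b, float link_r)
{
    float rr = fire_r + link_r;
    return seg_seg_dist2(muzzle, target, link_a, link_b) < rr * rr;
}
```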
53. LINE OF FIRE
Red links = In line of fire of upper NPC
REALISTIC & INTENSE - Our AI code on SPUs allow the AI agents to act realistically
The most intense battles in our game are the ones where the AI is chasing you, hunting you down, responding to you.
It doesnʼt become just another pattern-matching game. These guys can be tough as nails, because theyʼre smart.
55. SCHEDULING
• Almost all SPU code we have are jobs
• Many different job managers (middleware)
• Managed by own high-level job manager
• Manages the other job managers / workloads
• Easy to write your own, or grab somebody else’s
• High-level scheduling hardcoded in main loop
• Use barriers, mutexes and semaphores sensibly
• Could use generalization, but good enough for now
• None of this is rocket science, just work
Notes:
- High-level scheduling decisions done by hand
- Point of this all is that we had ideas on how to over-engineer it
- But under-engineering works fine and it probably wonʼt change
- Pick whatever suits you best
- PPU and SPU can both add jobs to the queue
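A minimal flavour of the "job manager that manages the other job managers": a flat frame schedule where each entry carries a barrier index, and a job may only run once every job in earlier barriers has finished. Real SPU job managers (SPURS, the Edge job manager) are far more involved; this single-threaded sketch only shows the shape of the scheduling.

```cpp
#include <cstddef>
#include <vector>

typedef void (*JobFn)(void*);

struct Job { JobFn fn; void* arg; int barrier; };

struct FrameSchedule {
    std::vector<Job> jobs;
    void add(JobFn fn, void* arg, int barrier) {
        Job j = { fn, arg, barrier };
        jobs.push_back(j);
    }
    // Single-threaded stand-in for N SPUs draining the queue: run the
    // barrier groups in ascending order; within a group, order is free.
    // Assumes barrier indices are consecutive starting at 0.
    void run() {
        int barrier = 0;
        bool ran;
        do {
            ran = false;
            for (size_t i = 0; i < jobs.size(); ++i)
                if (jobs[i].barrier == barrier) { jobs[i].fn(jobs[i].arg); ran = true; }
            ++barrier;
        } while (ran);
    }
};

// Demo jobs used in the usage example below.
static std::vector<int> g_log;
static void record0(void*) { g_log.push_back(0); }
static void record1(void*) { g_log.push_back(1); }
```

As the notes say, none of this is rocket science: hardcoding the high-level schedule in the main loop with sensible barriers was good enough for Killzone 2.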
56. PPU THREAD
[Frame timeline diagram: the PPU THREAD runs PHYSICS → AI → GAME CODE → PRE-DRAW → KICK, then repeats;
SPU JOBS fill SPU 0 through SPU 5; the RSX PUSH BUFFER runs below]
LEGEND
Audio · System, unzip, etc. · Havok Physics · AI Jobs · Animation · Ray casts ·
Particle update · Particle vertices · Skinning · Main scene graph · Main display list · IBL Sampling ·
Forward display list · Lighting display list · Shadowmap scene graph · Shadowmap display list · Post-processing
Buildup is roughly:
- draw SPU6 audio background jobs, audio encoding, nextsynth
- draw other background jobs, decompression
- show AI kicking off logic jobs, picking up the results in the same frame
- show monolithic Havok block with one SPU unclaimed
- show animation jobs starting during the game update
- show particle update jobs starting
- show PPU draw activity
- show skinning jobs starting
- show particle results being picked up on PPU
- while PPU continues, SPU starts firing off particle draw jobs
- main scene graph query begins
- query continues while particle and skinning jobs run
- when query is done, the transparent, lighting and normal geometry jobs start
- the lighting job fires off a few extra scene traversals for shadowcasting
- the geometry job fires off hundreds of lightprobe jobs
- a few shadowmap scene graph queries end and start a shadowmap displaylist builder job
- the SPU post-process jobs are added later in the frame
- this concludes all of the data that needs to be done in the frame
58. RESULTS
• SPUs cost us a lot of development time
• But our code is now future-proof
• Scales well to many cores
• Built a toolset so we can continue this path
• Supported our franchise values
• Cinematic
• Dense
• Realistic
• Intense
59. WHAT WE DID (WRONG)
• We started off re-writing a lot of stuff for SPUs
• Special SPU versions of code
• Minimalistic, mirrored equivalents of old structures
• Huge amount of code and data duplication
• Difficult to develop, maintain and debug
• Massive waste of time! Reverted it all!
• Learned how bloated our data structures were
• And how lean they could be
We first found it hard to optimize our code for SPUs because our data structures were full of pointer indirections and virtual functions. They were not friendly to the SPUʼs hardcore local memory scheme. We also had this feeling that you had to have all your data structures for SPUs really, really optimized.
So we ended up doing what many others did: creating special code for SPUs that worked on simplified data structures, derived from the bloated data structures we had. This ended up as a lot of double bookkeeping and special-case code that was hard to maintain, ugly, often incompatible, and just a bad idea.
We reached some sort of mentality shift at some point when we started to notice we could actually compile our entire engine for SPUs - even if it wouldnʼt run, we could at least parse the data structures and use them directly on SPUs. So we took another good look at our data structures, and started optimizing them.
- AI coders:
- difficult to find these tasks that have small domain and large loops
- was relatively easy, not too complicated
- hard to find code whose design fits the SPUs and even worse to change design
- Failed
- A* planning in parallel (now ported to SPU for new game, but only on one SPU)
60. WHAT WE DID (RIGHT)
• All header files cross compile for PPU and SPU
• Many data structures simplified and ‘lowered’
• Low-level code is cross-platform inline in headers
• Code and data structures shared
• DMA high-level engine classes to SPU and back
• Use C++ templates in SPU code
• We try to treat them as generic CPUs and spend our time
making generic optimizations, debugging tools, etc.
Notes:
- This was a big mind-switch
- In the end SPUs are not ʻjust another coreʼ
- But it helps if you try to squint your eyes and treat them as one as much as possible
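The shared-header pattern from this slide can be sketched as follows. The `dma_get`/`dma_put` wrappers are hypothetical names: on a real SPU build they would wrap the SDK DMA calls plus a tag wait, while on the PPU (or any other platform) they degenerate to a copy, so the exact same update code compiles everywhere.

```cpp
#include <cstddef>
#include <cstring>

// Shared header: same POD layout on PPU and SPU, 16-byte friendly.
struct Particle {
    float pos[4];
    float vel[4];
};

#if defined(__SPU__)
// A real SPU build would define dma_get/dma_put as wrappers around the
// SDK's DMA primitives (cellDmaGet/cellDmaPut plus a tag wait).
#else
// Host/PPU build: DMA degenerates to a plain copy.
inline void dma_get(void* ls, const void* ea, size_t sz) { std::memcpy(ls, ea, sz); }
inline void dma_put(const void* ls, void* ea, size_t sz) { std::memcpy(ea, ls, sz); }
#endif

// One cross-compiling update function: fetch the object into "local
// store", work on it, write it back.
void update_particle(Particle* main_mem, float dt)
{
    Particle local;
    dma_get(&local, main_mem, sizeof(Particle));
    for (int i = 0; i < 3; ++i)
        local.pos[i] += local.vel[i] * dt;
    dma_put(&local, main_mem, sizeof(Particle));
}
```

Keeping one version of the code like this is exactly the "treat them as generic CPUs" mind-switch the slide describes.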
62. INVEST IN THE FUTURE
• The future is memory-local and excessively parallel
• SPUs are just one of these 'new architectures'
• Optimize for the concept, not the implementation
• Keep code portable, maintain only one version
• Keep time in schedule for parallelization of code
Notes:
- GPGPU, Larrabee, etc. all work well with similar algorithms
Side-notes:
- SPUs wouldnʼt have been so difficult if the PPU would have been faster
- But theyʼre here and if you invest in them youʼll get payoff in the future
- If you donʼt the problem wonʼt go away, yes itʼs hard
- But if you embrace them youʼll at least get the benefit (on all platforms)
63. TREAT CPU POWER AS A CLUSTER
• Many separate equal resources
• Think in workloads / jobs
• Build latency into algorithms
• Favor computation-intensive code
• Avoid random memory accesses, globals
Notes:
- Makes it easier to think about how to schedule things
- Excessively parallel is something we already know, use that experience
64. DON’T OPTIMIZE TOO EARLY
• Blocking DMA is just as expensive as a cache miss
• You can always low-level optimize later
• Don’t re-write systems!
• Re-factor them in-place
• Work incrementally
• Don’t port your spaghetti code
Notes:
- Optimized code is difficult to maintain
Our approach:
- If it runs fine on the PPU -> DONʼT port it
- If you get it on the SPU -> COOL!
- If itʼs slow but you donʼt stall on it -> DONʼT optimize it
In general:
- BE LAZY!!
65. FOCUS ON YOUR GAME
• SPUs can add a lot to your game
• Do things you didn’t think were possible
• Takes a lot of effort to get right
• So choose the things that will matter!
Notes:
- Almost too simple of a message
- But canʼt stress this enough
- Donʼt just port anything
- Think:
- Will gamers notice this?
- Improve my MetaCritic score?
- SPUs helped define the look and feel of Killzone 2
- What will they do to your game?
66. RECOMMENDATIONS
• Invest in the Future
• Treat Your CPU Power as a Cluster
• Don’t Optimize Too Early
• Focus on Your Game
67. Thank You for Listening
Any Questions?
Notes:
- Run around and find me
- Happy to share experiences or code