Star Ocean 4 - Flexible Shader Management and Post-processing
 

http://research.tri-ace.com/

    Presentation Transcript

    • STAR OCEAN 4: Flexible Shader Management and Post-processing
      • Yoshiharu Gotanda, Research and Development Dept., tri-Ace Inc.
    • What is STAR OCEAN 4?
      • Star Ocean: The Last Hope
        • The latest installment of an RPG series from Square-Enix
        • Developed by tri-Ace Inc.
        • On Xbox 360
        • Released in February 2009
          • For North America and Japan
          • Coming soon for Europe
        • Developed on an in-house game engine
          • Also used for Infinite Undiscovery (Xbox 360) and Valkyrie Profile: Covenant of the Plume (NDS)
    • History of STAR OCEAN
      • Drives the studio’s development environment
        • Each game developed for each platform
        • Changed development environments every time
    • STAR OCEAN
      • Released in 1996
        • On SNES (only for Japan)
        • No clear game engine
          • All game code written in assembly
          • Used in-house script language for events
          • Used in-house art tools
    • STAR OCEAN: The Second Story
      • Released in 1998-99
        • On PlayStation
        • Used TPFC (1st gen.)
          • Basic I/O (virtualization) and memory management, threading, math functions.
          • Written in C++
          • Also used for Valkyrie Profile
          • Used in-house script language for events
        • “STAR OCEAN: Blue Sphere” released for Game Boy in 2001 (only for Japan)
    • STAR OCEAN: Till the End of Time
      • Released in 2003-04
        • On PlayStation 2
        • Used TPFC2 and tpgcLib (2nd gen.)
          • TPFC for PS2 and Graphics Engine
            • Supported 3dsmax
            • Custom shader engine
            • Flexible Particle System
            • HDR Rendering and Spherical Harmonics Lighting
            • Animation driven physics
              • Most cloth and accessories simulated in real-time, even within cutscenes
            • Also used for Radiata Stories and Valkyrie Profile: Silmeria
          • Used in-house interactive cutscene tool
    • STAR OCEAN: The Last Hope
      • Released in 2009
        • On Xbox 360
        • Used the 3rd generation engine
          • Also used for Infinite Undiscovery (Xbox 360) and Valkyrie Profile: Covenant of the Plume (NDS)
        • Used in-house interactive cutscene tool
          • 99.5% of cutscenes are rendered in real-time
            • Some scenes used FMV by Square-Enix
    • 3rd Generation Engine
      • Aska
        • Designed based upon the 2nd gen engine
        • Integrated Game Development Environment
        • Supports Maya
        • Fully multi-threaded engine
        • Flexible shader system
        • Camera simulation
        • Flexible Particle System
        • HDR Rendering and flexible light/shadow system
        • Animation driven physics
        • Multi-platform
          • PC
          • Xbox 360
          • Nintendo DS
          • Other platforms… 
    • Video (Dev env.)
    • Flexible Shader Management
      • Fully flexible shader management
        • Artists can create materials in Maya
          • Using Hypershade interface
          • Shader binary is automatically generated according to the artists' settings
    • Design Policy
      • Artists create shaders in Maya
        • They can create shaders without a programmer
          • Not limited by programmers
          • Can try new ideas immediately
        • Need to train artists
          • How to set parameters and construct shaders
          • Knowledge of physics (a little bit)
          • Templates
    • Pros and cons
      • Shader generated at run-time (during development)
        • Pros
          • Don’t have to include shader binary in resource files
          • Artists can freely create shaders
          • Easily support the variation of shaders at run-time
          • Easy to manage a shader binary file
        • Cons
          • Explosion of the number of shader variations
            • Large shader binary
          • Must create possible shader variations
            • Must play through all possible content in the game.
    • Subdivided shaders
      • Implement small shader nodes with specific functionality
        • Correspond to Shader Node in Maya
        • Artists can connect each input and output freely
          • UV, Color, Normal, Alpha…
    • Shader samples - Lighting
      • Shade surfaces with the result from lighting
        • Phong
        • Anisotropic Phong
        • Blinn-Phong
        • Normalized Blinn
        • Ashikhmin
        • Kajiya-Kay
        • Marschner
          • Albedo Map
          • Specular Map
          • Gloss(shininess) Map
          • Fresnel Map
          • Offset Map
          • Translucency
          • Ambient Occlusion
    • Shader samples – Normal, UV
      • Normal generation
        • Normal Mapping
        • Parallax Mapping
        • Parallax Occlusion Mapping
      • UV generation
        • Reflection
          • Sphere
          • Dual Paraboloid
          • Cube
        • UV offset (refraction)
        • UV animation
      • Blur effects
        • Rectangle Distortion
        • Spherical Distortion
        • Radial
        • Others…
    • Shader samples – Shadow, projection
      • Shadow
        • Multiple Uniform
        • Multiple LiSPSM
        • Cascade Shadow
          • PCF
        • Projection Shadow
        • Omni directional Cube Shadow
        • Static Vertex Shadow
        • Static Shadow
      • Projector
        • Color
        • Cube
        • Normal map
    • Shader samples – Computation
      • Arbitrary Blending
        • Shader node that simply blends computed color, normal and so on.
        • Additive, subtractive, alpha blending, Photoshop-like blending and so on.
        • Normal blending
        • Absorption
      • Swizzle
      • Depth processing
        • Detail Mapping
          • Blend normal maps based on the depth
            • Can blend any kind of textures
        • Shader LOD based on the depth
      • Branch
        • Conditional shader branching
          • Can be used for optimization (depends on hardware)
    • Shader samples – Others
      • Alpha generation
        • Fresnel
        • Soft Polygon
      • Texture sampling
        • Shader node fetching a texture
          • Standard texture
          • Multipass rendered texture
          • Global texture
        • Compressed HDR texture
      • Vertex Shaders
        • Skinning
        • Volume Rendering
        • Vertex Color
        • Vertex blending
      • Particle rendering shader
        • Many kinds of particle renderers
      • And so on…
    • Shader samples – Post-processing effects
      • Tone mapping
        • Standard tone mapping
        • Film Simulation
          • Reproduce specifications of films or C-MOS sensors
          • Reproduce film grain or digital noise
        • Dithering
      • Lens simulation
        • Focus Blur (DOF)
        • Glare
        • Physically-based Lens Structure
        • Motion blur
          • Camera
          • Object
      • Color filters
        • Contrast
        • Brightness
        • Monotone
        • Tone Curve
        • Color Temperature
      • Other effects
        • Outdoor Light Scattering
        • Light Shaft simulation
        • Screen-space Ambient Occlusion
    • Other features
      • Shader flexibility
        • Arbitrary number of…
        • Or any combination of…
          • Per-pixel lights and light type
          • Per-vertex lights
          • Shadows and shadow algorithms
          • Projectors
    • Shader node example – simple
      • Node graph (diagram): Specular Map Texture, Normal Map Texture, Albedo Texture → Ashikhmin BRDF Shader
    • Shader node example – artistic
      • Node graph (diagram): Color Texture for albedo, Tangent-space Normal Map Texture, Specular Color Texture → Phong BRDF Shader and Fresnel Shader; output diffuse color, gray-scaled color output from the Fresnel Shader
    • Shader node example – background
      • Node graph (diagram): Parallax Occlusion Map Shader (output normal), Dual Paraboloid Environment Map Texture, Color Map for albedo with modified UV → Blinn-Phong BRDF Shader
    • Shader node example – effect
      • Node graph (diagram): Animated UV Offset, blending 2 textures; UVs are shifted based on the normal map
    • Shader node example – detail map
      • Node graph (diagram): LOD factor computation node, Base Normal Map Texture, Normal Map blended with three other textures using the LOD factor, Albedo Texture
    • AHSL
      • In-house shader language based on HLSL and Cg
        • Each shader node has its own AHSL source file
          • Some shader nodes correspond to multiple source files
        • AHSL Shader Manager class generates shader binaries by linking all code specified in Maya materials at run-time
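        • As a rough sketch (not actual AHSL; the node and function names below are hypothetical), each per-node source file can be pictured as a small HLSL-style function whose inputs and outputs match the Maya node's connections:
          // Hypothetical texture-fetch node (one source file per node)
          sampler2D sAlbedo;
          float4 NodeFetchAlbedo(float2 vUV)
          {
              return tex2D(sAlbedo, vUV);
          }

          // Hypothetical Blinn-Phong lighting node; its inputs come from whatever
          // nodes the artist connected in Maya (albedo, normal, gloss, ...)
          float3 NodeBlinnPhong(float3 vAlbedo, float3 vNormal, float3 vLightDir,
                                float3 vViewDir, float fShininess)
          {
              float3 vHalf = normalize(vLightDir + vViewDir);
              float  fDiff = saturate(dot(vNormal, vLightDir));
              float  fSpec = pow(saturate(dot(vNormal, vHalf)), fShininess);
              return vAlbedo * fDiff + fSpec.xxx;
          }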
    • Shader Immediate Constants
      • Change some constant values to immediate values
        • For optimization
        • Use immediate values for generation of source code instead of using constant registers for specific constant vectors
          • (0,0,0,0)
          • (0,0,0,1)
          • (1,1,1,0)
          • (1,1,1,1)
      // Using a constant register:
      float4 eConstCol0;
      float4 LoadConst0() { return eConstCol0; }
      // Using an immediate value instead:
      float4 LoadConst0() { return float4(0,0,0,0); }
    • Shader Cache
      • Store compiled shaders in the shader cache file on the devkit
        • Cache components
          • ShaderKey
            • Specifies a unique ID (includes shader structure)
          • Constant table
            • Information of constants
          • Shader binary
        • Assumes that this file contains all possible shader combinations which are used in a game
    • Shader profile data
      • Supports development
        • Stored in different file from the cache file
        • Store information corresponding to each ShaderKey
          • Total time used, Time last used
          • # of queries, Time to generate
          • Name of build devkit
          • Region information
          • # of merges
            • And so on…
    • Previous problems…
      • Development period
        • Star Ocean: TLH was developed almost simultaneously with Infinite Undiscovery
          • Infinite Undiscovery finished slightly earlier
        • Most problems related to shader management observed in Infinite Undiscovery
          • Same problems occurred in Star Ocean: TLH
          • Most problems were solved in Infinite Undiscovery
            • Some information came from Infinite Undiscovery
    • Problems
      • This implementation requires shader generation
        • Essentially compile shaders at run-time
          • Generated shaders stored in the cache file
          • Rebuild in the case of a version update
            • Or remove the cache file
        • A foreseen problem with cache file generation and management when beginning engine design
          • Became tangible as the project approached its end (Infinite Undiscovery)
    • The size of cache file
      • Don’t compile shaders in the shipped build
        • Fundamentally the shader cache is created during QA
          • Shaders CAN be compiled in the shipped build
            • Didn’t use, looks ugly
        • The size of the cache file increased as the project approached its end
          • Estimated at 10 Mbytes at first
            • In practice it exceeded the estimate
    • Solutions
      • To solve the size problem
        • Decompress each shader binary at run-time
        • Separated shader cache to L1 and L2
        • Supported multiple shader cache files
        • Created a tool to manage shader files in Windows
        • Implemented performance vs. size control parameters
    • But…
      • Increased the size
        • Despite a lot of effort
          • The size exceeded 50Mbytes! (Infinite Undiscovery)
          • More than 30,000 cache combinations
          • Separated the cache files but they still exceeded acceptable file size
        • Started analyzing the details of the shader cache files
    • The problem of Shader Adaptors
      • Shader Adaptors dominated most of the shader combinations
        • What is a Shader Adaptor?
          • Shaders added at run-time
            • Shadows, projectors and so on…
            • These kinds of shaders occupied 80%
              • Especially shadows
              • There was a shader supporting 5 shadows for an object
    • Example
      • (Diagram: many shaders using 5 Shader Adaptors; all Adaptors in this shader are shadow maps)
    • Limitation
      • Added a limit on the number of Shader Adaptors (shadows)
        • The number can be limited during generation or in the tool
          • The size was dramatically decreased
            • Caused appearance problems
              • Manually adjusted
    • Non-generated shaders
      • Implemented fallback functionality for shaders that were not generated
        • In the case of Shader Adaptors
          • The base shaders are used instead
          • Better than the object disappearing
    • Cache file creation
      • The cache file was created by the QA team
        • Created the cache file when the specifications and resources relating to the shader were fixed
          • Using debug functionality
          • Play through the game
          • Merge files created by multiple testers
        • This process was much tougher than expected
          • Several weeks with tens of testers
          • Done again and again because it was an unfamiliar system
    • Statistics
      • Infinite Undiscovery separated the file into 4 files.
        • Common Region contains shaders shared for both areas
      Name of Region    Size     # of shaders
      Boot              18k          26
      Common            9.8M       9158
      Area1             9.0M       6804
      Area2             14.8M     11120
    • Statistics
      • Star Ocean: TLH separated the file into 4 files
        • Star Ocean: TLH only has Boot Region as a common shader binary
          • Disc1, 2 and 3 shader files include common shaders
      Name of Region    Size     # of shaders
      Boot              293k        356
      Disc1             24.1M     18882
      Disc2             25.4M     20417
      Disc3             19.8M     15810
    • Summary of size problem
      • The engine was being developed simultaneously with the games in earlier projects
        • Variations in Shader Adaptors
          • Many shadow algorithms implemented
        • Variations in Shader Immediate Constants
          • A solution was found
            • But it wasn't used at that time
        • Increased the number of shaders as new shaders were implemented
          • Final shaders differed dramatically despite similar appearance
          • Artists repeated a trial and error process because it was their first experience
    • Disadvantages
      • The cost of shader cache creation
        • Limitation of automatic creation
        • The problem of file size
          • Cache file size too big
            • Probably no problem for next-gen?
          • Consequently, resource creation should be conscious of this
          • Try to cut down the shader count
      • Difficulty of creating shaders
        • Artists must know the mechanism of shaders
    • Advantages
      • High flexibility
        • Artists can create various shaders without programmers
          • Can create an unreasonable combination of shaders
        • Performance optimization
          • Shader Immediate Constants…
        • Optimal shader code can be automatically generated using a shader compiler
    • More problems
      • Solution for shader creation
        • More automation
          • Create estimated shader combinations offline?
          • Create shaders using scene information or database?
        • Optimize shader generation
          • Rebuild
          • Distributed shader generation
        • Reduce combinations
          • Run-time shader profiler
        • GPU support?
          • DX11?
    • This system is…
      • Also used for
        • Particle rendering
          • Many variations of particles
        • Post-processing shader
          • Flexible post-processing
    • Flexible post-processing
      • Our engine tries to implement
        • Camera system simulation
          • Physically plausible
          • Visually plausible
        • In order to achieve photo-realistic camera simulation, HDR rendering is very important
          • Camera simulation is an efficient approach to utilize an HDR rendering result
        • Post-processing is a realistic solution to simulate phenomena occurring in a camera
          • Ray-tracing is an ideal solution
    • Photo-realistic Camera System
        • Our engine tries to implement a physically-based camera system
      200mm F2.8
    • Why physics?
      • Unnatural effects from the camera system
        • An audience feels incongruity when seeing situations that would be impossible to shoot with a real camera
          • People grow up watching a lot of visuals shot through real cameras
            • TV programs
            • Films
          • Very shallow depth of field with a wide-angle lens or a far focal distance?
          • Large F value when shooting a dark scene
            • Long time exposure? Motion Blur?
          • No noise or grain within dark scenes
            • Motion blur or camera shake is needed when shooting hand-held?
            • Be conscious of shooting situations
            • Time value can’t exceed the frame rate in the case of movies
          • No diffraction in bright pixels
          • And so on…
        • Physically plausible effects make visuals persuasive and give presence
          • Has been used in animation for a long time
      (Photos: effect from diffraction; less noise but long motion blur vs. noisy but short motion blur)
    • Before physics
      • Rendering must be processed in linear color space
        • Difficult to process properly without linear color space
          • Shading (BRDF)
          • Post-processing
            • The law of energy conservation
          • Proper texture filtering
          • Be aware of the color space
          • sRGB space of textures should be converted to linear
          • Ideally spectrum based rendering?
        • Finally apply gamma correction
          • Appropriate gamma correction for the output device
    • Bokeh Effect (DOF)
    • High quality bokeh
      • Characteristics of our bokeh
        • Can adjust actual camera parameters
          • F-stop, Focal length, real lens, ISO sensitivity, etc…
        • Quality settings hardly affect the look of the bokeh
          • Reduced buffer size, tap count, etc…
        • High quality, fine blur
          • Try to focus on only one eye in close-up shots
            • Very shallow depth of field
        • Easy shader management with AHSL system
    • Bokeh – Low profile: 1/16 & 1/64 reduced buffers, 33 taps, GPU 1.3 ms
    • Bokeh – Standard profile: 1/4 & 1/16 reduced buffers, 21 taps, GPU 3.1 ms
    • Bokeh – Reference profile: full size buffer, 227 taps, GPU 246.6 ms
    • Lens Equation
      • The size of bokeh can be computed with the Lens Equation
      x: diameter of blur (CoC), o: object distance, p: focal distance, f: focal length, F: F-stop
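      • A common thin-lens form of this relation (a hedged reconstruction; the original slide presented the formula as an image):
        x = \frac{f^2}{F\,(p - f)} \cdot \frac{|o - p|}{o}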
    • Blur kernel
      • Blurred light is distributed equally
        • Without considering aberrations
        • Blur kernel should be flat
          • Not Gaussian
      (Diagram: in focus — focus on the image plane; out of focus — farther distances focus off the plane)
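      • A minimal sketch of an equal-weight (flat) gather, assuming precomputed unit-disc offsets; the tap count and offsets are illustrative, not the shipped kernel:
        sampler2D sColor;
        #define TAP_COUNT 21
        float2 vDiscOffsets[TAP_COUNT];   // unit-disc offsets, set by the application

        float4 FlatBokehGather(float2 vUV, float fCoCRadiusUV)
        {
            float4 vSum = 0;
            for (int i = 0; i < TAP_COUNT; ++i)
            {
                // Every tap gets the same weight: light inside the CoC is spread equally
                vSum += tex2D(sColor, vUV + vDiscOffsets[i] * fCoCRadiusUV);
            }
            return vSum / TAP_COUNT;
        }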
    • Gaussian blur
      • Bokeh with Gaussian blur is often seen
        • Not visually pleasing
        • Looks softer than reality
          • No sharp edge on blur (especially in bright places)
      (Comparison images: Gaussian vs. flat kernel)
    • Blend with blurred image
      • Bokeh by simply blending with a blurred image
        • Also not visually pleasing
        • Looks like the soft focus effect
    • Blend vs. per-pixel blur (comparison images: blend vs. per-pixel, 21 taps)
    • Bokeh diagram (overview; steps listed roughly in processing order)
      • Create masks for all the bokeh shaders from the depth buffer
      • Reduced color buffer → mipmapped color buffers
      • Fixed tap count blur on the Large Reduced Buffer
      • High tap count blur on the Small Reduced Buffer
      • Blend
      • Output (gathered during the tonemapping process)
    • Bokeh diagram (detail; steps listed roughly in processing order)
      • Create Half Reduced Buffer
      • Generate a lot of masks (A,R,G,B) × 2 from depth
      • Dilate front mask
      • Create mipmap src. (put the mask into A)
      • High tap count blur on the Small Reduced Buffer (SRB)
      • Fixed tap count blur and blend with the SRB on the large reduced buffer
      • Output (gathered during the tonemapping process)
    • Mipmapped source textures
      • Create mipmapped textures of the back buffer
        • Trade off between the tap count and sharpness using tri-linear filtering
      (Comparison: different source texture LOD bias, same tap count — more correct blur radius vs. smoother but less correct)
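      • A sketch of this trade-off, assuming the mip level is chosen from the blur radius (the bias and clamp are illustrative):
        sampler2D sMippedColor;            // mipmapped copy of the back buffer

        float4 SampleBlurSource(float2 vUV, float fCoCRadiusPixels, float fLodBias)
        {
            // Larger blur radius -> coarser mip; trilinear filtering hides the low
            // tap count at the cost of a slightly less correct blur radius
            float fLod = log2(max(fCoCRadiusPixels, 1.0)) + fLodBias;
            return tex2Dlod(sMippedColor, float4(vUV, 0, fLod));
        }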
    • Bleeding artifacts
      • A phenomenon where foreground pixels bleed into the background
      (Images: bleeding artifact vs. fixed)
    • Solution for bleeding artifact
      • Create blurred image from the source image and the mask image
        • Bleeding occurs on bright boundaries because of bilinear filtering
      • Create blurred image with the source image pre-multiplied by the mask image
        • Expensive, need to process the foreground blur and background blur separately
      (Images: using source image & mask — bleeding remains; using pre-multiplied source image)
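      • A sketch of how the pre-multiplied result can be resolved, assuming the mask is stored in alpha and blurred together with the color (names are illustrative):
        sampler2D sPremultiplied;   // rgb = source color * mask, a = mask, both blurred identically

        float4 ResolvePremultipliedBlur(float2 vUV)
        {
            float4 vBlurred = tex2D(sPremultiplied, vUV);
            // Dividing by the blurred mask removes the contribution of masked-out
            // pixels that bilinear filtering would otherwise bleed into the result
            return float4(vBlurred.rgb / max(vBlurred.a, 1e-4), vBlurred.a);
        }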
    • Creating foreground mask
      • Expanding mask texture with 10 taps for foreground bokeh
      • fetch 10 surrounding points for dilate
      • 9 tap Gaussian blur (4 fetches)
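      • A sketch of the dilation pass, with hypothetical sample offsets standing in for the 10 surrounding points:
        sampler2D sFrontMask;
        float2 vTexelSize;
        float2 vDilateOffsets[10];   // the 10 surrounding sample points (hypothetical values)

        float DilateFrontMask(float2 vUV)
        {
            // Expand the foreground mask so the foreground blur fully covers its edges
            float fMask = tex2D(sFrontMask, vUV).a;
            for (int i = 0; i < 10; ++i)
            {
                fMask = max(fMask, tex2D(sFrontMask, vUV + vDilateOffsets[i] * vTexelSize).a);
            }
            return fMask;
        }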
    • Bokeh AHSL shader variation
      • Max 3 × 2^3 = 24 variations
        • Region (Front, Back, Both)
        • Tap count (odd, even)
        • Mask type
        • Eclipse
    • Aperture shape
      • F-Stop describes how wide the aperture is
      • The shape of aperture is…
        • More circular when opened
        • Tends towards a polygonal shape as it is closed
      (Aperture photos: fully opened, F5.6, F8.0, F13)
    • Variable bokeh shape
      • Bokeh shape changes with F-stop
      (Images: 100mm 1/30 F12.5, our simulation; 100mm 1/250 F4.3, our simulation; 28-200mm F3.5-5.6 lens, 6 blades, 91 taps)
    • Comparison with Real Photo
      • Our simulation vs. real photo at F2.8, F3.2, F3.5, F4, F5.6
      • A light source of 5mm in diameter at a distance of 3m from the camera; the focus distance is 1.5m, with a 70-200mm F2.8 lens at 200mm
    • Comparison with Real Photo
      • Our simulation vs. real photo at F8, F11, F16, F22, F32
    • One more thing
      • During the experiment, the difference in size between F2.8 and F3.5 in the real photo is smaller than the theoretical calculation
        • Actually, bokeh at F2.8 is optically vignetted
      (Images: F2.8 without vignetting simulation; F2.8; F3.5; real photo at F2.8 & F3.5)
    • Optical Vignetting
      • The phenomenon where light entering near the edge of the lens at an oblique angle can't reach the film because it is obstructed by the lens barrel
        • The actual physical phenomenon is very complicated
        • Our implementation for lens structures is approximated rather than using real physical properties
      (Diagram: the blue line indicates a ray obstructed before reaching the film)
    • Optical Vignetting
      • Recognized by…
        • The corners of the picture are darkened
        • “Cat eye effect” (eclipsed bokeh)
      (Photos: 24mm F2.8 vs. 24mm F5.6)
    • Implementation
      • Compute the attenuation curve and eclipsing ratio using a virtual lens database
      (Image: our simulation — illuminance falloff from the Cosine Fourth Law and optical vignetting)
    • Struct LensParameter
      struct LensParameter {
          string szName[64];
          u8  nAppertureAngleNumber;
          f32 fDesignedFilmSize;
          f32 fMinFStop;
          f32 fMaxFStop;
          f32 fFStopZoom;
          f32 fMinFocusDepth;
          f32 fMinProjectionDistance;
          f32 fMaxProjectionDistance;
          // (ASCII diagram in the original slide: entrance size, open aperture size and
          //  vignetting size along the lens barrel, with iris distance, entrance distance,
          //  vignetting distance and flange back marked)
          f32 fEntranceDistance;
          f32 fEntranceSize;
          f32 fApertureDistance;
          f32 fOpenApertureSize;
          // ...
    • View angle change
      • The focal length changes with the focal distance in a real camera
        • Means that changing the focal distance affects the view angle
      (Photos: focus on this glass — the view angle change is easy to recognize)
      • This lens can focus between 1.5m – 10m – Inf.
      • Even with a 50cm focal distance change, as shown in these pictures, a recognizable view change appears; this change can be easily calculated with the lens equation
    • Comparison of view angle
      • When focusing on a closer point, the focal length gets longer
    • Comparison of view angle
    • Theory
      • If you want to change the focal distance (a) and don’t want to change the view angle (m), change the distance to the image plane (b)
        • However the distance to the image plane can’t be moved because it’s based on camera design
          • For some lenses, distance to the image plane can be “virtually” changed
        • Therefore, the view angle changes when changing the focal distance
      Lens formula — a: distance to object, b: distance to image plane, f: focal length, m: magnification (FOV)
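      • The underlying thin-lens relation (standard form, reconstructed here since the slide showed it as an image):
        \frac{1}{a} + \frac{1}{b} = \frac{1}{f}, \qquad m = \frac{b}{a} = \frac{f}{a - f}
      • With b fixed by the camera design, changing the focal distance a changes m = b/a, i.e. the view angle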
    • Motion Blur (photo: 35mm, 1/60 sec, F14)
    • What is motion blur?
      • Film is exposed to light while the shutter is open
        • As the camera or subjects move during exposure, blur appears
      • Δt (the frame time) ≠ Time Value (the exposure time)
      • Implementation is easy and produces high quality visuals
    • Camera motion blur
      • Create a velocity map from the depth buffer using the different matrices of the camera across frames
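      • A common way to build that velocity map (a sketch assuming the previous frame's view-projection matrix is available; matrix names are illustrative, not the engine's):
        sampler2D sDepth;
        float4x4  mInvViewProjCurrent;   // current frame: clip space -> world space
        float4x4  mViewProjPrevious;     // previous frame: world space -> clip space

        float2 ComputeCameraVelocity(float2 vUV)
        {
            // Reconstruct the world position of this pixel from the depth buffer
            float  fDepth  = tex2D(sDepth, vUV).r;
            float2 vClipXY = vUV * float2(2, -2) + float2(-1, 1);
            float4 vWorld  = mul(float4(vClipXY, fDepth, 1), mInvViewProjCurrent);
            vWorld /= vWorld.w;

            // Re-project with last frame's camera and take the screen-space difference
            float4 vPrevClip = mul(vWorld, mViewProjPrevious);
            float2 vPrevUV   = vPrevClip.xy / vPrevClip.w * float2(0.5, -0.5) + 0.5;
            return vUV - vPrevUV;   // per-pixel velocity in UV space
        }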
    • Object Motion Blur
      • Create a velocity map of the object in another pass
        • Smear the velocity map based on each velocity vector
      (1/64 reduced buffer)
    • Without motion blur
    • With motion blur
    • Tonemapping & filters
    • Glare filter
      • Multiple Gaussian Filter
        • Not physically-based
        • “Log base” emphasis instead of “Threshold base”
        • “Energy conservation” is implemented (optional)
      (Images: multiple Gaussian blurs summed into the glare result)
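      • One possible reading of the “log base” emphasis (an assumption for illustration, not the actual shader): weight each pixel by a log curve instead of cutting below a threshold:
        sampler2D sHDRColor;

        float4 GlareBrightPass(float2 vUV)
        {
            float4 vCol = tex2D(sHDRColor, vUV);
            float  fLum = dot(vCol.rgb, float3(0.299, 0.587, 0.114));
            // Dark pixels still contribute a little; bright pixels are boosted
            // smoothly rather than being thresholded
            float  fWeight = log2(fLum + 1);
            return vCol * fWeight;
        }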
    • Tonemapping curve
      • Instead of using a specific algorithm, use specifications from films or C-MOS sensors to create the curve table
        • The curve is compressed based on the log basis
          • float u = saturate(log2(vInCol.r+1)/2.32); vOutCol.r = tex1D(s, u).r;
    • Print
      • Negative films are developed in negative colors
        • Negative film requires printing for correct colors
          • Removal of the orange mask is also required
        • Film duplication process was implemented
    • Film profile comparison (images: Original, used in the game; Company K, Reversal; Company F, Reversal)
    • Film profile comparison (images: Original, used in the game; Company K, Reversal; Company K, Negative)
    • Film profile comparison (images: Company F, Negative; Company C, Digital)
    • Film profile comparison (images: Original, used in the game; Company K, Reversal; no film profile, Reinhard)
    • Film profile comparison (all) (images: Original; K, Rev.; K, Neg.; F, Rev.; F, Neg.; C, Digital; Reinhard)
    • Film grain
      • Developed a tool to create film grain based on a simulation
        • “Multiply” blending
      (Images: ISO 50 vs. ISO 400)
    • C-MOS noise
      • “Additive” blending
      • Bayer pattern × White noise
        • Considering color interpolation
    • Results of grain (images: digital noise vs. film grain, enlarged and brightened in Photoshop; digital camera profile)
    • Other simulations
      • Auto Focus / Auto Exposure
        • AF and AE are also implemented based on a real camera
          • Professionally, AF and AE are essentially not used
          • Only used for special cases
            • AE was used for in-game exposure control
    • Custom Filters
      • Combination of matrix and texture fetch
        • Easy to edit in Maya
          • Brightness & Contrast
          • Tone curve
          • Color temperature
          • Gamma
          • Chroma correction
    • Dither
      • Keep color information during dither process when processing gamma correction
        • HDR should have more than 8 bits of information
        • Fp32 in the shader
        • Disadvantage: image noise introduced by dithering
        • Dithering combined with the display's video processing may make visuals worse
        • More than 8 bits of information can be used with DeepColor
          • When using analog interfaces, this depends on the A/D converter in the output device
      (Image: emphasized dithering)
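      • A minimal sketch of the idea (the noise source and scale are illustrative): keep full float precision through gamma correction, then add sub-LSB noise just before quantizing to 8 bits:
        sampler2D sNoise;   // small tiling noise texture (hypothetical)

        float3 DitherTo8Bit(float3 vTonemapped, float2 vScreenPos)
        {
            // Gamma-correct at full float precision first
            float3 vGamma = pow(saturate(vTonemapped), 1.0 / 2.2);
            // Add +/-0.5 LSB of noise so the 8-bit quantization does not band
            float  fNoise = tex2D(sNoise, vScreenPos / 64.0).r - 0.5;
            return vGamma + fNoise / 255.0;
        }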
    • Tonemapping flow
      • Input HDR buffers
      • Gather buffers (DOF, glare)
      • Pre matrix transform (spectral film characteristic)
      • Added noise (film grain, C-MOS noise)
      • Tex-fetched transform (film curve, tone curve, gamma …)
      • Matrix transform (chroma correction, …)
      • Dithering
      • Output to LDR buffer
    • Post-process AHSL shader variations
      • Max 2^10 = 1024 variations
        • Focus blur enabled (+ vignetting)
        • Textured tonemap
        • Pre matrix transform
        • Matrix transform
        • Film grain (or CCD noise)
        • Dithering
        • Depth buffer (input, output)
    • Results (images: without post-process; with post-process, high quality; with post-process, original) — 85mm F6.3
    • Results (images: without post-process; with post-process, high quality; with post-process, original) — 50mm F2.8
    • Results (images: without post-process; with post-process, high quality; with post-process, original) — 24-70mm F12 at 58mm
    • Problem
      • Artists couldn’t handle many parameters
        • The parameters are based on real camera parameters
          • F-stop
          • Time Value
          • Focal distance
          • Lens choice
          • Film or sensor choice
          • Sensitivity
          • Etc…
        • However, in order to control quality, a lot of other (non-physical) parameters are implemented
          • Templates were implemented
    • Finding a solution
      • In-house seminars were held
        • It was difficult to cover all parameters, though
        • Even so, for Star Ocean: TLH
          • Some simulations weren’t used
          • Some quality parameters weren’t set efficiently
            • Quality ended up lower than needed in some cases
            • Lower quality than the prepared templates, traded for performance
              • Even though there was no gain performance-wise
      (Image: unnatural bokeh caused by too few taps)
    • Summary
      • Camera parameters affect each other
        • Large F-stop number
          • Long time value or high sensitivity
            • Long motion blur or coarser grain
            • Small bokeh (Focus blur)
        • Small F-stop number
          • Short time value or low sensitivity
            • Short motion blur or finer grain
            • Large bokeh
            • Optical Vignetting
        • Changing focal distance
          • Affects the angle of view and the size of bokeh at the same time
        • Various optical phenomena occur due to limitations in lens structure and camera
    • More simulations
      • More physically based simulations
        • Diffraction simulation
          • Rather than simple glare and stars
        • Aberration simulation
        • Ghost simulation
          • Rather than an image-based one
        • Scatter-based implementation
    • Acknowledgements
      • Copyrights
        • © 1996 tri-Ace Ltd./MEIMU/SQUARE ENIX CO., LTD. All Rights Reserved.
        • © 1998 tri-Ace Inc./LINKS/Minato Koio/SQUARE ENIX CO., LTD. All Rights Reserved.
        • © 2001 tri-Ace Inc./Mayumi Azuma/SQUARE ENIX CO., LTD. All Rights Reserved.
        • © 2003 tri-Ace Inc./SQUARE ENIX CO., LTD. All Rights Reserved.
        • © 2004 tri-Ace Inc./SQUARE ENIX CO., LTD. All Rights Reserved.
        • © 2009 SQUARE ENIX CO., LTD. All Rights Reserved. Developed by tri-Ace Inc.
    • Acknowledgements
      • Assistance for slides
        • Tatsuya Shoji, R&D
        • Daisuke Sugiura, R&D
        • Paul Caristino, R&D
        • Chuk Tang, R&D
        • Masaki Kawase, Silicon Studio Corporation
        • Many screenshots and video from Star Ocean: TLH team
    • Questions?
      • These slides can be found on
        • http://research.tri-ace.com/
          • More details about flexible shader management and camera simulation are described in the other slides
            • Unfortunately, written in Japanese 
        • Questions are welcome
          • [email_address]