The Importance of Terminology
and sRGB Uncertainty
Notes - 0.4
colour-science.org
1
Foreword
This presentation is the organised and formatted embodiment of the Colour
Science notes I have taken along the years. It is aimed at the VFX industry,
and is the work in-progress subset of a broader and generic Colour Science
presentation. Its creation wouldn’t have been possible without the works and
references cited in the Bibliography section.
Thomas Mansencal
2
The sRGB Uncertainty
3
The sRGB Uncertainty
• “Understanding linear and sRGB color spaces”: what does this mean?
sRGB is intrinsically linear!
• “We’ll start by learning how the sRGB and linear color spaces differ.”
• Such statements are confusing for non-experts because they omit an explicit
emphasis on which component of the sRGB colourspace is affected.
4
What is Colour?
“Almost everyone knows what color is. After all, they have had firsthand
experience of it since shortly after birth. However, very few can precisely
describe their color experiences or even precisely define color.” [1]
1. Fairchild, M. D. (2013). Color Appearance Models (3rd ed.). Wiley. ISBN:B00DAYO8E2 5
What is Colour?
• Characteristic of visual perception that can be described by attributes of
hue, brightness (or lightness) and colourfulness (or saturation or chroma).
[1]
• Colour is perceived when light interacts with the human visual system
(HVS).
1. CIE. (n.d.). 17-198 colour (perceived). Retrieved June 26, 2014, from http://eilv.cie.co.at/term/198 6
Additive RGB Colourspace
7
Additive RGB Colourspace
• An additive RGB colourspace is defined by specifying 3 mandatory
components:
• Primaries
• Whitepoint
• Conversion Functions (OECF and EOCF)
8
Additive RGB Colourspace
• An additive RGB colourspace is a colorimetric colour space having three
colour primaries (generally red, green and blue) such that CIE XYZ
tristimulus values can be determined from the RGB colour space values
by forming a weighted combination of the CIE XYZ tristimulus values for
the individual colour primaries, where the weights are proportional to the
radiometrically linear colour space values for the corresponding colour
primaries. [1]
• NOTE 2 Additive RGB colour spaces are defined by specifying the CIE
chromaticity values for a set of additive RGB primaries and a colour
space white point, together with a colour component transfer
function.
1. ISO. (2004). INTERNATIONAL STANDARD ISO 22028-1 - Photography and graphic technology - Extended colour encodings for digital image storage,
manipulation and interchange, 2004. 9
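To make the NOTE above concrete, here is a minimal Python / NumPy sketch (not from any particular library) deriving the RGB to CIE XYZ matrix from a set of primaries and a whitepoint chromaticity. It uses the BT.709 / sRGB primaries quoted later in this deck; the D65 chromaticity coordinates (0.3127, 0.3290) are the standard published values and are an assumption here, since the slides only name D65.

import numpy as np

def normalised_primary_matrix(primaries_xy, whitepoint_xy):
    """Compute the RGB -> CIE XYZ matrix from chromaticity coordinates."""
    def xy_to_XYZ(xy):
        # Convert an xy chromaticity to XYZ tristimulus values with Y = 1.
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])

    P = np.column_stack([xy_to_XYZ(xy) for xy in primaries_xy])
    W = xy_to_XYZ(whitepoint_xy)

    # Scale each primary so that their weighted sum reproduces the whitepoint.
    S = np.linalg.solve(P, W)

    return P * S

M = normalised_primary_matrix(
    [(0.6400, 0.3300), (0.3000, 0.6000), (0.1500, 0.0600)], (0.3127, 0.3290))
print(np.around(M, 4))  # first row is approximately [0.4124, 0.3576, 0.1805]

The result matches the familiar sRGB / BT.709 to XYZ matrix, illustrating that primaries, whitepoint and a colour component transfer function are enough to define an additive RGB colourspace.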
Primaries
10
Primaries
• The primaries chromaticity coordinates define the gamut of colours that
can be encoded by a given RGB colourspace.
• While commonly represented as triangles on a chromaticity diagram (such
as the CIE 1931 Chromaticity Diagram), RGB colourspace gamuts define
the boundaries of an actual solid within the CIE xyY colourspace.
11
Whitepoint
• The colourspace whitepoint is defined as the colour stimulus to which
colour space values are normalized. [1]
• Any colour lying on the neutral axis normal to the xy plane and passing
through the whitepoint, no matter its luminance, will be achromatic.
1. ISO. (2004). INTERNATIONAL STANDARD ISO 22028-1 - Photography and graphic technology - Extended colour encodings for digital image storage,
manipulation and interchange, 2004. 12
Whitepoint
13
Conversion Functions (Transfer Functions)
• A colour component conversion function is defined as a single variable,
monotonic mathematical function applied individually to one or more
colour channels of a colour space. [1]
• They perform the mapping between the linear light components /
tristimulus values and a non-linear R'G'B' video signal.
• They are commonly used for faithful representation of images and
perceptual coding, in relation to the display’s non-linear response and the
non-linearity of the HVS.
1. ISO. (2004). INTERNATIONAL STANDARD ISO 22028-1 - Photography and graphic technology - Extended colour encodings for digital image storage,
manipulation and interchange, 2004. 14
Opto-electronic conversion function
15
Opto-electronic conversion function
• The opto-electronic conversion function (OECF or OETF) maps (encodes)
estimated tristimulus values in a scene to a non-linear R'G'B' video
component signal value.
• Typical OECFs are usually expressed by a power function with an
exponent between 0.4 and 0.5.
16
Electro-optical conversion function
17
Electro-optical conversion function
• The electro-optical conversion function (EOCF or EOTF) maps (decodes) a
non-linear R'G'B' video component signal to a tristimulus value at the
display.
• Typical EOCFs are usually expressed by a power function with an
exponent between 2.2 and 2.6.
18
Misleading Terminology
19
Misleading Terminology
Until Nuke 10, Nuke’s Read node colorspace knob only specifies an
electro-optical conversion function and does not perform a gamut change.
20
Non Linearity of the Human Visual System
1. Davson, H. (1990). Physiology of the Eye (5th ed.). Elsevier Science Ltd. ISBN:978-0080379074 - colour-science.org 21
Non Linearity of the Human Visual System
• Weber’s law states that the just-noticeable difference (JND) between two
stimuli is proportional to the magnitude of the stimulus: an increment is
judged relative to the previous amount.
• Fechner mathematically characterised Weber’s law showing that it follows
a logarithmic transformation: the perceived magnitude of a stimulus is
proportional to the logarithm of the physical stimulus intensity.
22
Non Linearity of the Human Visual System
• Fechner’s scaling has been found to apply to the perception of brightness,
at moderate and high brightness, with perceived brightness being
proportional to the logarithm of the actual intensity.
• At lower levels of brightness, the de Vries-Rose law applies which states
that the perception of brightness is proportional to the square root of the
actual intensity.
23
Non Linearity of the Human Visual System
• Stevens’s law supersedes Fechner's law and addresses its lack of
generality.
• Plotted on a logarithmic scale, the physical-perceptual relationships
measured in his experiments were characterised by straight lines with different
slopes, suggesting that the relationship between perceptual magnitude and
stimulus intensity follows a power law with a varying exponent.
24
Stevens’s Law
25
Stevens’s Law
26
Lightness - CIE L*
27
Lightness - CIE L*
• Because of the various HVS adaptation mechanisms, perceived
brightness has a non-linear relationship with the actual physical intensity of
the stimulus.
• It is commonly approximated by a cube root.
• Multiple approximations of lightness (or value in the Munsell Renotation
System) were proposed leading to the creation of CIE L* in 1976.
• CIE L* characterises the perceptual response to relative luminance.
28
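A minimal sketch (Python / NumPy assumed, function name illustrative) of the CIE 1976 L* computation just described: a cube-root response above a small linear segment near black.

import numpy as np

def lightness_CIE1976(Y, Y_n=100.0):
    """CIE L* from relative luminance Y (same units as the reference white Y_n)."""
    ratio = np.asarray(Y, dtype=float) / Y_n
    # Cube-root response above (6 / 29) ** 3, linear segment below it.
    return np.where(ratio > (6 / 29) ** 3,
                    116 * np.cbrt(ratio) - 16,
                    (29 / 3) ** 3 * ratio)

print(lightness_CIE1976(18.0))  # middle grey (18%) gives approximately L* = 50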
Colour Imaging System
29
Colour Imaging System
• A colour imaging system embodies any combination of technologies and
devices required to perform:
• Image capture
• Signal processing
• Image formation
30
Colour Imaging System
31
Image Capture
• Image capture / acquisition of colour stimuli can be performed in a
number of different ways using for example:
• An electronic device (electronic video camera, DSLR)
• Photographic film
32
Electronic Capture
• A movie camera may use a solid-state image sensor (CCD or CMOS) that
absorbs photons of light.
• As photon absorption occurs, electrons are collected into charge
packets.
• The image signal is produced by a sequential readout of the packets.
33
Electronic Capture
• Accurate image reproduction requires the capture device to be at least
trichromatic implying that colour stimuli spectral power distributions must
be separated into 3 colour signals.
• This separation can be achieved with:
• A beam splitter / colour filters combined with three sensors on high-end
capture devices, resulting in reduced noise and increased resolution.
• A single sensor covered with a mosaic of colour filters on systems
requiring a small form factor and lower price.
• Three sensor layers with different responses to wavelengths of light
stacked together similarly to photographic film (Foveon).
34
Photographic Film Capture
1. https://www.fujifilmusa.com/shared/bin/AF3-150E_Sensia100_Data_Sheet_2003.pdf 35
Photographic Negative Film
• A photographic film has red-, green-, and blue-light-sensitive layers
coated on a transparent base.
• The red and green layers are also sensitive to blue light, thus a yellow
filtering layer is placed above them. It will be made colourless during
chemical processing.
• Light sensitivity is induced by silver halide grains with appropriate spectral
response scattered within each light sensitive layer. The sensitive layers
also contain an appropriate dye coupler.
36
Image Formation
1. https://commons.wikimedia.org/wiki/File:AdditiveColor.svg
2. https://commons.wikimedia.org/wiki/File:SubtractiveColor.svg 37
Image Formation
• The processed image signals control colour-forming elements of the
image formation medium / device.
• Two categories of image formation exist:
• Additive colour
• Subtractive colour
38
Additive Colour Formation
• CRT, LCD or plasma displays mix red, green and blue light through pixels
adjacency.
• DLP and digital cinema projectors perform superposition by using a beam
combiner.
39
Subtractive Colour Formation
• Photographic film uses cyan, magenta and yellow dyes to absorb red,
green and blue light.
• Similarly, most printing processes use CMY inks.
• Colour stimuli formed by subtractive colour are dependent on (and affected
by) the viewing light source.
40
Picture Rendering
• The colour imaging system usually achieves representation of a scene in a
way that matches viewer expectation of the appearance of that scene
instead of attempting to reproduce physical colour stimuli quantities.
• A sunlit outdoor scene can have a luminance of 50,000 cd.m-2 but may be
displayed on a consumer electronic display with a peak white luminance of
320 cd.m-2.
41
Picture Rendering
• The different viewing conditions and image formation medium / device
capabilities impose that scene luminance must be mapped to image
formation medium / device luminance.
• A simple linear mapping from scene luminance to image formation
medium / device luminance is not satisfactory.
• Picture rendering adjusts the tone scale to achieve a perceptually uniform
mapping.
42
Non Triviality of Picture Rendering
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 43
Non Triviality of Picture Rendering
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 44
Non Triviality of Picture Rendering
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 45
Non Triviality of Picture Rendering
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 46
Effect of Lateral-Brightness Adaptation
Images seen with a dark surround appear to have less contrast than if
viewed with a dim, average or bright surround.
47
Effect of Lateral-Brightness Adaptation
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 48
Colour Encoding
• A colour encoding is a digital representation of colours for image
processing, storage, and interchange between systems.
• A colour encoding specification (standardised input / output interface of a
colour imaging system) must define:
• A colour encoding method which determines the meaning of the
encoded data or what will be represented by the data.
• A colour encoding data metric characterising the colourspace and the
numerical units used to encode the data, or how the representation
will be numerically expressed.
49
Image States
• The image state concept was defined by Madden & Giorgianni.
• Some signal processing operations make the image transition to a different
colorimetric state.
• An image may exist in scene state which is not directly viewable on typical
image formation devices and must be transitioned to a new state, the
rendered state.
50
Image States
• A colour encoding specification defined in relation to scene quantities is
said to be scene-referred: it has a colorimetric link to a scene.
• A colour encoding specification defined in relation to digital display
characteristics is said to be display-referred (rendered state): it has a
colorimetric link to a digital display device.
51
Display-Referred Imaging
• Raw image processors used by photographers (Lightroom, Darktable,
DCRaw, etc…) perform picture rendering on the raw scene-referred data
to deliver a display-referred image.
• Artists achieving direct content creation in 2d applications are generating
display-referred content.
• Images available on the Internet, such as on Google Images or texture
vendors’ websites, are output- / display-referred.
• A photograph taken on a mobile phone and uploaded to a social network
is display-referred.
52
Display-Referred Imaging
• Display-referred imagery created and exhibited on a display that matches
a standard reference (using sRGB specification and viewing conditions)
will appear the same across similar display devices without any further
action required.
• A photograph processed on a consumer graphics desktop and output as
an sRGB JPG or PNG file will look approximately the same on other
consumer graphics desktops.
53
Display-Referred Imaging
• Display-referred imagery usually has a restricted luminance dynamic
range and limited colour gamut, thus some of the original captured scene-
referred data is lost upon encoding.
• This is unsuitable if the image is meant to be viewed on different image
formation devices with wider dynamic range.
54
Sony F35 - Out of Gamut Colours
1. http://www.oscars.org/science-technology/sci-tech-projects/aces 55
Sony F35 - Out of Gamut Colours
56
Sony F35 - Out of Gamut Colours
57
Scene-Referred Imaging
• A scene-referred representation of data contains enough information to
achieve the desired appearance of the scene on a variety of image
formation media / devices.
• Scene-referred imaging is the basis of physically-based rendering,
allowing realistic light interaction to be reproduced using plausible light
quantities. It makes realistic camera effects (motion-blur, defocus)
possible.
58
Scene-Referred Imaging
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 59
Scene-Referred Imaging
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 60
Scene-Referred Imaging
1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 61
Scene-Referred Imaging
• Measured scene linear-light quantities are usually normalised to a known
reference.
• Commonly, middle grey is set at luminance = 0.18 (see the normalisation
sketch below), which is the reflectance of:
• Reference Kodak 18% Grey Card
• Background colour of a DSC Labs CamAlign ChromaDuMonde chart
• The reflectance of an X-Rite ColorChecker neutral 5 (.70 D) sample is ≈ 19%!
62
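A hedged sketch of the normalisation just described: scene linear-light values are exposure-scaled so that the measured grey card value maps to 0.18. The function name and example values are illustrative, not the API of a specific package.

import numpy as np

def normalise_to_middle_grey(scene_linear, measured_grey, target_grey=0.18):
    """Exposure-scale scene-referred linear values so the grey card sits at 0.18."""
    return np.asarray(scene_linear, dtype=float) * (target_grey / measured_grey)

# A grey card read at 0.36 in the plate is brought back to 0.18 (one stop down).
print(normalise_to_middle_grey([0.36, 0.72, 0.09], measured_grey=0.36))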
Scene-Referred Imaging
63
Energy Conservation
• Anti-aliasing or image filtering operations should be energy preserving: the
total light emitted from the display should remain the same after the
processing operations.
• Resizing an image should not affect its luminance.
• Those operations must be performed on linear image data (see the sketch below).
64
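The sketch below illustrates the point with a 2x2 box downsample: performed on linear data, the mean linear-light value is preserved exactly; performed on gamma-encoded data (a simple 1 / 2.2 encoding is assumed here for illustration), it is not.

import numpy as np

def downsample_2x2(image):
    """Average non-overlapping 2x2 blocks (a simple energy-preserving resize)."""
    h, w = image.shape
    return image.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(4)
linear = rng.random((4, 4))                    # scene linear-light values
encoded = linear ** (1 / 2.2)                  # gamma-encoded representation

print(linear.mean(), downsample_2x2(linear).mean())            # means match
print(linear.mean(), (downsample_2x2(encoded) ** 2.2).mean())  # mean has shifted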
Linear Data Base
65
Blur in Linear Colourspace
66
Blur in Non-Linear Colourspace
67
Digital Image - Raster Graphics
• A digital image is a rectangular data structure (a 2 or 3-dimensional array)
of picture elements (pixels).
• A pixel colour is determined by a single code for achromatic images or
multiple codes for chromatic images (commonly three).
68
Digital Image - Raster Graphics
69
Digital Image - Raster Graphics
70
Digital Image - Raster Graphics
71
Quantization
• Quantization is the process of mapping a continuous signal (or large set of
input values) to a smaller set.
• Information between quantizer steps is discarded and lost.
• Quantization error (signal distortion) decreases the signal-to-noise ratio (SNR).
• Banding and contouring artefacts can be reduced by introducing a small
amount of noise (≈ 1 / 2 quantizer step) prior to the quantization. Dithering
decreases the SNR.
72
Quantization
73
Quantization
74
Quantization
75
Quantization
76
Quantization
Input Signal
1. Getty Images. (n.d.). Getty Images Test Image. Retrieved June 20, 2003, from https://www.drycreekphoto.com/tools/ 77
Quantization
4-Bit Linear Quantization
1. Getty Images. (n.d.). Getty Images Test Image. Retrieved June 20, 2003, from https://www.drycreekphoto.com/tools/ 78
Quantization
4-Bit Perceptually Uniform Quantization
1. Getty Images. (n.d.). Getty Images Test Image. Retrieved June 20, 2003, from https://www.drycreekphoto.com/tools/ 79
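A hedged sketch of the comparison shown above: 4-bit quantisation applied directly to linear-light values versus applied to perceptually encoded values (a simple 1 / 2.2 gamma is assumed as the perceptually uniform encoding), with both results decoded back to linear for comparison.

import numpy as np

def quantize(values, bits):
    """Uniformly quantise values in [0, 1] to 2**bits levels."""
    levels = 2 ** bits - 1
    return np.round(np.clip(values, 0, 1) * levels) / levels

ramp = np.linspace(0, 1, 11)                              # linear-light test ramp

linear_4bit = quantize(ramp, 4)                           # linear quantisation
perceptual_4bit = quantize(ramp ** (1 / 2.2), 4) ** 2.2   # perceptually uniform

# Errors concentrate in the shadows for the linear case.
print(np.abs(linear_4bit - ramp))
print(np.abs(perceptual_4bit - ramp))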
Perceptual Uniformity
80
Perceptual Uniformity
81
Perceptual Uniformity
82
Perceptual Uniformity
• A colour imaging system is perceptually uniform if a small perturbation of a
component value is approximately equally perceptible across the range of
that value. [1]
• Most electronic colour imaging systems account for the non-linearity of the HVS
and its perceptual response to brightness when encoding RGB scene
relative luminance values (linear-light values) into R’G’B’ perceptually
uniform values. This is commonly achieved with a non-linear transfer function
(gamma, L*).
• They leverage the non-linearity of the HVS to reduce the bandwidth and
number of bits needed per pixel by optimising digital code allocation.
1. Poynton, C. (n.d.). Perceptual Uniformity. Retrieved March 5, 2016, from http://www.poynton.com/notes/Timo/Perceptual_uniformity.html 83
Perceptual Uniformity
• Cathode ray tube (CRT) display electron gun characteristics imposed an
EOCF that is approximately the inverse of the HVS perception of brightness.
• The HVS perceptual response to brightness, combined with the CRT power
function, produces code values that are displayed in a perceptually uniform way.
• Modern display devices (LCD, plasma, DLP) replicate this behaviour by
imposing a 2.2, 2.4 or 2.6 power function (Gamma Correction) through
signal processing circuitry.
84
Perceptual Uniformity
85
Perceptual Uniformity - Linear Ramp
86
Perceptual Uniformity - Linear Ramp
87
Perceptual Uniformity - Perceptually Uniform Ramp
88
Perceptual Coding
1. Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267 89
Perceptual Coding
1. Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267 90
Perceptual Coding
91
Perceptual Coding
• The luminance difference between L and L + ΔL is noticeable when ΔL is
about 1% of L.
• The 1.01 (101 / 100) ratio is known as the Weber contrast or fraction.
92
Perceptual Coding
• An ideal non-linear transfer function will allocate code values to minimise
the just-noticeable difference (JND).
• On a linear-light value scale, code 100 is the location where the Weber
contrast reaches 1%.
• The Weber contrast increases for codes below 100, raising the perceptible
difference between adjacent codes and possibly producing banding and
contouring artefacts.
• The Weber contrast decreases for codes over 100; higher codes become
increasingly wasteful and could be discarded without affecting perception.
93
Perceptual Coding
• High-quality image reproduction requires a contrast ratio >= 30:1, as
shown by NTSC engineers in the 1950s.
• Using 8-bit linear-light coding, the contrast ratio that can be reproduced
without artefacts is only 2.55:1.
• Achieving a contrast ratio >= 30:1 with linear-light coding requires 12 bits,
resulting in an artefact-free contrast ratio of 40.95:1; however, most of
those codes cannot be visually discriminated.
94
Perceptual Coding
Maintaining a 1.01 Weber contrast over a scene relative luminance range of
[1, 100], a contrast ratio of 100:1, requires approximately 462 codes (≈ 9
bits). [1]
1. Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display. Color Research and Application, 39(1), 6–15. doi:10.1002/
col.21768 95
\frac{\log 100}{\log 1.01} \approx 462; \qquad 1.01^{462} \approx 100

C = \frac{\log(CR)}{\log(WC)}

where C is the number of codes, CR is the contrast ratio and WC is the desired
Weber contrast.
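The formula above translates directly into code; a quick check with the quoted numbers (100:1 contrast ratio, 1.01 Weber contrast):

import math

def required_codes(contrast_ratio, weber_contrast=1.01):
    """Number of codes needed to keep each step below the given Weber contrast."""
    return math.log(contrast_ratio) / math.log(weber_contrast)

print(required_codes(100))   # ~462.8 codes, roughly 9 bits
print(1.01 ** 462)           # ~99, back to approximately the 100:1 ratio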
16-Bit Integer & Half Float
Perceptual coding is not required when using 16-bit integer (artefact-free
contrast ratio of 655.35:1) or half-float representations (Weber contrast of
0.1% [1], 2^10 = 1024 code values per stop).
1. Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display. Color Research and Application, 39(1), 6–15. doi:10.1002/
col.21768 96
8-Bit Colour Imaging System Dynamic Range
The dynamic range associated with code value 1 on an 8-bit colour imaging
system is closer to 200,000:1 (or 600,000:1) than the 255:1 (or 256:1)
dynamic range often alleged, a figure based on the incorrect assumption that
linear-light values are encoded.
97
\left(\frac{1}{255}\right)^{2.4} \approx 0.0000016 \approx \frac{1}{600000}
Gamma
• Gamma (γ) is a numerical parameter giving the exponent of a power
function assumed to approximate the relationship between a signal
quantity (such as a video signal code) and light power. [1]
• Gamma Encoding (γE), characteristic of OECFs uses an exponent
approximately between 0.4 and 0.5.
• Gamma Decoding (γD), characteristic of EOCFs uses an exponent
approximately between 2.2 and 2.6.
1. Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267 98
Gamma Encoding - OECF
99
Gamma Decoding - EOCF
100
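A minimal sketch of the pure power-function gamma encoding and decoding illustrated above, ignoring the linear segments that practical OECFs such as BT.709 or sRGB add near black (those are sketched later in this deck).

import numpy as np

def gamma_encode(linear, exponent=1 / 2.2):
    """OECF-style encoding: scene linear-light to non-linear R'G'B'."""
    return np.power(np.clip(linear, 0, None), exponent)

def gamma_decode(encoded, exponent=2.2):
    """EOCF-style decoding: non-linear R'G'B' back to linear light."""
    return np.power(np.clip(encoded, 0, None), exponent)

print(gamma_encode(0.18))                 # ~0.46, middle grey encodes near half
print(gamma_decode(gamma_encode(0.18)))   # round-trips back to 0.18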
Digital Colour Imaging System End-to-End Power Function
To overcome the loss in apparent contrast, the end-to-end power function of
a digital colour imaging system may have appropriate exponent values of 1,
1.25, and 1.5 for respectively bright, dim, and dark surrounds. [1]
1. Hunt, R. W. G. (2004). The Reproduction of Colour (6th ed.). Chichester, UK: Wiley. doi:10.1002/0470024275 101
End-to-End γ
102
Gamma Correction Misconceptions
• NTSC monochrome television was created in the 1940s and non-linear
coding was a well-understood element of good visual performance.
• The significance of perceptual uniformity has been generally forgotten: video
engineers seem to see gamma correction as a means to address the CRT “non-
linearity defect”.
• ‘‘If gamma correction was not already necessary for physical reasons at
the CRT, we would have to invent it for perceptual reasons.” [1]
1. Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display. Color Research and Application, 39(1), 6–15. doi:10.1002/
col.21768 103
Digital Video & HD
• The luminance output of a CRT is proportional to input raised to the 5 / 2
power. A studio reference display CRT has a gamma ≈ 2.4.
• Gamma correction, by means of an OECF, is applied to pre-compensate for the
CRT display’s non-linear power function and achieve perceptual
uniformity.
• In order to account for the different viewing conditions between the original
scene and the presentation, the correction under-compensates for the actual CRT
display non-linearity.
• This under-compensation yields an end-to-end power function with
exponent ≈ 1.2 which produces a pleasing television viewing experience in
dim surrounds.
104
Digital Video & HD
• Image Structure
• 1920 x 1080 progressive (24Hz, 30Hz), 16:9 aspect ratio
• 1920 x 1080 interlaced (30Hz), 16:9 aspect ratio
• 1280 x 720 progressive (24Hz, 30Hz, 60Hz), 16:9 aspect ratio
105
ITU-R BT.1886
• ITU-R BT.1886 defines the reference electro-optical transfer function for
CRT and LCD displays used in HDTV studio production.
• ITU-R BT.1886 adopts a power function with exponent γ = 2.4 (see the sketch below).
• The recommendation doesn’t standardise reference white and viewing
conditions.
• ITU-R BT.2035 defines a reference viewing environment for evaluation of
HDTV program material.
106
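A sketch of the BT.1886 EOTF (exponent 2.4), parameterised by the display white and black luminances as in the recommendation; the luminance values used below are illustrative.

import numpy as np

def eotf_BT1886(V, L_w=100.0, L_b=0.01):
    """BT.1886 EOTF: video signal V in [0, 1] to screen luminance in cd/m^2."""
    gamma = 2.4
    n = L_w ** (1 / gamma) - L_b ** (1 / gamma)
    a = n ** gamma                          # user gain
    b = L_b ** (1 / gamma) / n              # user black lift
    return a * np.maximum(V + b, 0) ** gamma

print(eotf_BT1886(0.0), eotf_BT1886(1.0))   # approximately L_b and L_w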
ITU-R BT.1886
• HD Studio Mastering (Typical)
• Reference white is typically set at 100-120 cd.m-2.
• Surround luminance is expected to be very dim at around 1% of reference white luminance.
• Typical intra-image contrast ratio is 1000:1.
• HD Consumer (Typical)
• Reference white is typically set at 200 cd.m-2.
• Surround luminance is expected to be dim at around 5% of reference white luminance.
• Typical intra-image contrast ratio is 400:1.
107
ITU-R BT.2035
• ITU-R BT.2035 defines a reference viewing environment for evaluation of
HDTV program material.
• D.R.A.F.T
108
ITU-R BT.709 / Rec. 709
ITU-R BT.709 is the international standard defining the parameter values for
HDTV.
109
BT.709 OECF
• The BT.709 OECF defines a 0.45 exponent but its effective power function
exponent is γE ≈ 0.5.
• The BT.709 OECF is a piece-wise function: in order to reduce noise in dark
regions, a line segment limits the slope of the power function (the slope of a
pure power function is infinite at zero); see the sketch below.
110
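A sketch of the piece-wise BT.709 OECF just described, with the published constants (a 4.5-slope linear segment below 0.018 and a 0.45 power segment above it):

import numpy as np

def oecf_BT709(L):
    """BT.709 OECF: scene linear-light L in [0, 1] to non-linear video signal V."""
    L = np.asarray(L, dtype=float)
    return np.where(L < 0.018,
                    4.5 * L,                          # linear segment near black
                    1.099 * L ** 0.45 - 0.099)        # 0.45 power segment

print(oecf_BT709([0.0, 0.018, 0.18, 1.0]))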
BT.709 - BT.1886 End-to-End γ
111
BT.709 Colourspace
112
BT.709 Colourspace
• Primaries: R [0.6400, 0.3300], G [0.3000, 0.6000], B [0.1500, 0.0600]
• Illuminant / Whitepoint: D65
• Pointer’s Gamut Coverage: 81.1674568 %
• Visible Spectrum Coverage: 36.6606209 %
113
UHDTV
The UHD Alliance (UHDA) developed three specifications to support the
next-generation premium home entertainment experience covering the
entertainment ecosystem in the following categories: [1]
• Devices
• Distribution
• Content
1. UHDA. (2016). UHD Alliance Defines Premium Home Entertainment Experience. Retrieved January 8, 2016, from http://www.uhdalliance.org/uhd-alliance-
press-releasejanuary-4-2016/ 114
UHDTV - Devices
A UHDA-compliant device must meet or exceed the following specifications:
• Image Resolution: 3840×2160
• Color Bit Depth: 10-bit signal
• Color Palette (Wide Color Gamut)
• Signal Input: BT.2020 color representation
• Display Reproduction: More than 90% of P3 colours
• High Dynamic Range
• SMPTE ST2084 EOTF
• A combination of peak brightness and black level of either:
• More than 1000 nits peak brightness and less than 0.05 nits black level
• More than 540 nits peak brightness and less than 0.0005 nits black level
115
UHDTV - Distribution
A UHDA-compliant distribution channel must support:
• Image Resolution: 3840×2160
• Color Bit Depth: Minimum 10-bit signal
• Color: BT.2020 color representation
• High Dynamic Range: SMPTE ST2084 EOTF
116
UHDTV - Content Mastering
A UHDA content master must meet the following requirements:
• Image Resolution: 3840×2160
• Color Bit Depth: Minimum 10-bit signal
• Color: BT.2020 color representation
• High Dynamic Range: SMPTE ST2084 EOTF
Specifications of UHDA recommended mastering display:
• Display Reproduction: Minimum 100% of P3 colours
• Peak Brightness: More than 1000 nits
• Black Level: Less than 0.03 nits
117
ITU-R BT.2020 / Rec. 2020
ITU-R BT.2020 defines the parameter values for ultra-high definition
television systems for production and international programme exchange.
118
ITU-R BT.2020 / Rec. 2020
• Image Structure
• 7680 × 4320, 16:9 aspect ratio, 1:1 pixel aspect ratio
• 3840 × 2160, 16:9 aspect ratio, 1:1 pixel aspect ratio
• Frame frequency (Hz): 120, 60, 60/1.001, 50, 30, 30/1.001, 25, 24, 24/1.001
• Progressive scan mode
119
BT.2020 OECF
The BT.2020 OECF is the same as the BT.709 OECF and is expected to be used in
conjunction with the BT.1886 EOCF, yielding an end-to-end power function
with exponent ≈ 1.2.
120
BT.2020 - BT.1886 End-to-End γ
121
BT.2020 Colourspace
122
BT.2020 Colourspace
• Primaries: R [0.708, 0.292], G [0.170, 0.797], B [0.131, 0.046]
• Illuminant / Whitepoint: D65
• Pointer’s Gamut Coverage: 99.9635339 %
• Visible Spectrum Coverage: 70.7051466 %
123
SMPTE ST 2084
• SMPTE ST 2084 (PQ) is the international standard defining the EOTF
characterizing high-dynamic-range reference displays used primarily for
mastering non-broadcast content.
• The perceptual quantizer has been modeled by Dolby Laboratories using
the Barten (1999) contrast sensitivity function.
• Display peak luminance is expected to reach 10,000 cd.m-2, with a 10
or 12-bit data representation.
124
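A sketch of the ST 2084 (PQ) EOTF using the constants published in the standard; it maps a non-linear signal in [0, 1] to absolute luminance up to 10,000 cd.m-2.

import numpy as np

def eotf_ST2084(N):
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal N in [0, 1] to cd/m^2."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    N = np.asarray(N, dtype=float)
    V = N ** (1 / m2)
    return 10000 * (np.maximum(V - c1, 0) / (c2 - c3 * V)) ** (1 / m1)

print(eotf_ST2084([0.0, 0.5, 1.0]))   # 0, ~92 and 10000 cd/m^2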
sRGB - ST 2084 EOCF
125
Multimedia & Desktop Graphics
The sRGB IEC 61966-2-1:1999 specification is defined for multimedia
applications and desktop graphics, considering a brighter surround than that
of a studio reference display.
126
sRGB IEC 61966-2-1:1999
• sRGB adopts the ITU-R BT.709 RGB colourspace gamut but a different set of
OECF / EOCF.
• The sRGB reference white is specified at 80 cd.m-2, in accordance with CRT displays.
• Surround luminance is expected to be average at around 20% of
reference white luminance.
• Typical intra-image contrast ratio is 100:1.
• Modern LCD displays commonly peak at 320 cd.m-2.
127
sRGB OECF
• The sRGB OECF doesn’t account for picture rendering: the end-to-end gamma
is ≈ 1.0 when associated with the sRGB EOCF (γD ≈ 2.2), thus it is not suitable
for image capture (see the sketch below).
• sRGB is defined as a display-referred colour encoding.
128
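For reference, a sketch of the sRGB piece-wise OECF / EOCF pair discussed above (their end-to-end behaviour approximates a 2.2 power function):

import numpy as np

def oecf_sRGB(L):
    """sRGB encoding: linear-light in [0, 1] to non-linear signal."""
    L = np.asarray(L, dtype=float)
    return np.where(L <= 0.0031308, 12.92 * L, 1.055 * L ** (1 / 2.4) - 0.055)

def eocf_sRGB(V):
    """sRGB decoding: non-linear signal back to linear-light."""
    V = np.asarray(V, dtype=float)
    return np.where(V <= 0.04045, V / 12.92, ((V + 0.055) / 1.055) ** 2.4)

print(oecf_sRGB(0.18), eocf_sRGB(oecf_sRGB(0.18)))   # ~0.46 and 0.18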
BT.709 - sRGB End-to-End γ
129
sRGB - sRGB End-to-End γ
130
sRGB Colourspace
131
sRGB Colourspace
• Primaries (Rec. 709): R [0.6400, 0.3300], G [0.3000, 0.6000], B [0.1500, 0.0600]
• Illuminant / Whitepoint: D65
• Pointer’s Gamut Coverage: 81.1674568 %
• Visible Spectrum Coverage: 36.6606209 %
132
Digital Cinema
• Picture rendering was traditionally imposed by a camera negative film
gamma ≈ 0.5-0.6, an inter-positive film having a unity gamma and a
release print film stock with gamma ≈ 2.8-3.2, resulting in an end-to-end
gamma ≈ 1.4-1.8, suitable for dark film projection surrounds.
• DCI / SMPTE standard reference digital cinema projectors apply a 2.6
gamma to the X’Y’Z’ DCDM (Digital Cinema Distribution Master) non-linear
components.
133
Digital Cinema
• The X’Y’Z’ DCDM is encoded with JPEG-2000 compression.
• The X’Y’Z’ DCDM image file format is mapped into TIFF. Colour channels
are represented by 12-bit unsigned integer code values. These 12 bits are
placed into the most significant bits of 16-bit words, with the remaining 4
bits filled with zeroes.
• Image Structure
• 4096 x 2160, 24Hz, 1:1
• 2048 x 1080, 24Hz, 1:1
• 2048 x 1080, 48Hz, 1:1
134
Digital Cinema
• Digital cinema standards are display-referred: colour appearance of the
digital intermediate is fully baked into the X’Y’Z’ DCDM.
• Digital cinema reference white is specified at 48 cd.m-2.
• Surround luminance is expected to be dark (0% of reference white
luminance).
• Typical intra-image contrast ratio is 100:1.
• DCI-P3 is the wide gamut RGB colourspace in which digital cinema
material is mastered.
135
DCI-P3 Colourspace
136
DCI-P3 Colourspace
• Primaries: R [0.680, 0.320], G [0.265, 0.690], B [0.150, 0.060]
• Illuminant / Whitepoint: [0.314, 0.351]
• Pointer’s Gamut Coverage: 88.2782774 %
• Visible Spectrum Coverage: 45.4533861 %
137
Digital Capture for Digital Cinema
• Motion picture camera vendors commonly encode their scene-referred
data using a log encoding function ('ALEXA Log C', 'C-Log', 'Panalog', 'S-
Log', ‘V-Log', etc…) tailored to account for camera-specific dynamic range
and noise characteristics.
• They also define dedicated gamuts accounting for the specific spectral
responses of their respective cameras.
138
Camera Vendors Log Encoding Functions
139
Camera Vendors Gamuts
140
Digital Capture for Digital Cinema
• Those log encoding functions draw inspiration from the Cineon Digital Film
System developed by Eastman Kodak Company.
• Cineon is a logarithmic encoding of the colour film negative optical
density.
• “Film has traditionally been represented by a characteristic curve which
plots density vs log exposure. This is a log/log representation. In defining
the calibration for the Cineon digital film system, Eastman Kodak Co.
talked to many experts in the film industry to determine the best data
metric to use for digitizing film. The consensus was to use the familiar
density metric and to store the film as logarithmic data.” [1]
1. Kodak. (1995). Conversion of 10-bit Log Film Data To 8-bit Linear or Video Data for The Cineon Digital Film System. 141
Cineon Digital LAD
1. KODAK Digital LAD Test Image - Eastman Kodak Company 142
sRGB Digital LAD
1. KODAK Digital LAD Test Image - Eastman Kodak Company 143
Visual Effects Colour Pipeline
1. Selan, J. (2012). Cinematic color. ACM SIGGRAPH 2012 Posters on - SIGGRAPH ’12, 1–54. doi:10.1145/2343483.2343492 - colour-science.org 144
Visual Effects Colour Pipeline
• Visual effects vendors generate scene-referred imagery that is seamlessly
integrated into client plates while not altering their image state.
• This fundamental principle is at the heart of visual effects, as shots with
visual effects must be intercut with shots without visual effects (or
coming from other vendors).
• The digital intermediate (DI) expects a delivery that is a high-fidelity
representation of the original capture.
145
Visual Effects Colour Pipeline
• The visual effects colour pipeline is a complex colour imaging system built
on individual chained colour imaging systems.
• Colour encoding specifications must be defined (and identifiable, so they can
be accounted for) for every input / output signal processing operation.
146
Working Colour Encoding Specification
• A modern paradigm is to define a working colour encoding specification
(for example based on ACEScg, DCI-P3, or Rec. 2020 gamuts and
representing scene-referred linear-light quantities) and convert all the input
imagery with their respective colour encoding specifications to that
working specification.
• Plates are usually converted to the working colour encoding specification
by using an invertible decoding 1D LUT specific to their originating
gamma / log encoding, and then to the working gamut by means of a 3x3
matrix (or a 3D LUT), as sketched below.
147
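A hedged sketch of the plate conversion just described: a per-channel 1D LUT decode (a generic three-entry placeholder curve, not any vendor's actual encoding) followed by a 3x3 gamut matrix; every name and value below is illustrative.

import numpy as np

def decode_log_1d(encoded, lut_in, lut_out):
    """Per-channel 1D LUT decode from an encoded signal to linear light."""
    return np.interp(encoded, lut_in, lut_out)

def to_working_gamut(rgb_linear, matrix_3x3):
    """3x3 matrix change of gamut applied to linear-light RGB values."""
    return np.asarray(rgb_linear) @ np.asarray(matrix_3x3).T

# Illustrative placeholder data: a 3-entry decoding curve and an identity gamut matrix.
lut_in = np.array([0.0, 0.5, 1.0])
lut_out = np.array([0.0, 0.18, 1.0])
plate_rgb = np.array([0.25, 0.5, 0.75])      # one encoded RGB triplet
linear_rgb = decode_log_1d(plate_rgb, lut_in, lut_out)
print(to_working_gamut(linear_rgb, np.eye(3)))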
Working Colour Encoding Specification
Some facilities perform the compositing stage within the client delivery
gamut: it can be beneficial when the working colour encoding specification
doesn’t encompass the captured plates’ gamut (avoiding negative values,
which are complicated to handle).
Note: ARRI Alexa cameras are notorious for having a very wide gamut.
148
View Transform
• The scene-referred data is visualised using a dedicated view transform
(1D LUT or 3D LUT) that commonly models the typical characteristic curve
of a print film (print film emulation, S-curve, sigmoid function combined
with a log curve, etc…).
• The view transform is never baked into the DI delivered imagery.
149
sRGB - View Transform
150
Duiker Print Film Emulation View Transform
151
Compositing
• Plates are neutralised using an invertible process to overcome lighting
changes across a sequence.
• This permits reusability of light rigs at the rendering stage and establishes
better consistency across shots during the compositing stage.
• The neutralisation is reversed on compositing output.
152
Texturing & Matte Painting
• D.R.A.F.T
153
Digital Intermediate & Mastering
1. http://www.parkroad.co.nz/wp-content/uploads/2015/10/Clare_Mahana_DI.jpg 154
Digital Intermediate & Mastering
• Digital intermediate is a display-referred finishing process originally
involving motion picture digitisation, colour manipulation (colour timing /
grading, contrast adjustment, etc…) and recording back to film to
create a master internegative.
• The viewing environment replicates the final exhibition viewing
environment and is adapted to each type of exhibition image
formation device (digital cinema, typical home theater, etc…).
• Calibration tolerances to the standards (DCI / SMPTE) are very strict.
155
Digital Intermediate & Mastering
• The DI process is commonly split into an initial pass that neutralises per
shot variation and a secondary pass that defines the colour artistic intent /
look of the film.
• The DI house may provide a Colour Decision List (CDL) or 3D LUT per
shot to visual effects vendors to give them an overview of the look being
developed.
156
Digital Intermediate & Mastering
• DI often creates masters for multiple image formation medium / devices.
• Artistic grading is performed on the “gold standard” image formation
device (usually the digital cinema projector) with approval of the director.
• Trim passes are executed for the other image formation devices and will
include specific corrections for the respective devices’ characteristics and
viewing conditions.
157
Academy Color Encoding System
1. http://www.orbitnet.com/ampas/ACES_1.html 158
Academy Color Encoding System
• ACES is a colour management and image interchange system designed
for production, mastering and long-term archiving of motion pictures. [1]
• It enables consistent, high-quality colour management from production to
distribution.
• It provides digital image encoding and specifications preserving original
imagery latitude and colour range while establishing a common standard
so deliverables can be efficiently and predictably created and preserved.
1. The Academy of Motion Picture Arts and Sciences. (n.d.). ACES. Retrieved March 22, 2016, from http://www.oscars.org/science-technology/sci-tech-
projects/aces 159
ACES Components - Input
• Reference Input Capture Device (RICD)

The RICD, an ideal capturing device, records all the colour (and dynamic
range) of a given scene. It provides a documented, unambiguous and
fixed relationship between scene colours and encoded RGB values.
• Input Device Transform (IDT)

An image captured by a physical or virtual camera is transformed by the
IDT into ACES RGB relative exposure values that the RICD would have
recorded if used in-place.
160
ACES Components - Output
• Reference Rendering Transform (RRT)

ACES images are an intermediate representation and cannot be used for
final image evaluation. The RRT is an idealised replacement for print-film
emulations (S-Curve) with an extremely wide gamut and high dynamic
range (32 stops).
• Output Device Transform (ODT)

The ODT performs rendering of the RRT wide gamut and dynamic range
on a given physical display, accounting for its specific characteristics
(gamut, dynamic range, and EOCF) and viewing conditions.
161
ACES Components - Negative Film
• Academy Printing Density (APD)

Reference printing density for calibrating film scanners and film recorders.
• Academy Density Exchange (ADX)

Densitometric encoding (similar to Cineon) used for capturing data from
film scanners.
162
ACES Encodings
• ACES2065-1 (ACES Primaries 0, AP0)

The ACES common colour encoding colourspace used for exchange of full fidelity images
and archiving.
• ACEScg (ACES Primaries 1, AP1)

A linearly encoded colourspace for CG rendering and compositing, using the improved set
of primaries that encompass Rec. 2020 and DCI-P3 gamuts.
• ACEScc (ACES Primaries 1, AP1)

A logarithmically encoded colourspace for use in colour grading applications, using the AP1
primaries.
• ACESproxy (ACES Primaries 1, AP1)

A lightweight encoding using the AP1 primaries, for transmission over HD-SDI (or other
production transmission schemes) and on-set look management. Not intended to be stored or
used in production imagery or for final colour grading / mastering.

163
ACES Primaries 0, AP0
• Primaries: R [0.73470, 0.26530], G [0.00000, 1.00000], B [0.00010, -0.07700]
• Illuminant / Whitepoint: D60
• Pointer’s Gamut Coverage: 100.0000000 %
• Visible Spectrum Coverage: 98.0872282 %
164
ACES Primaries 1, AP1
• Primaries: R [0.713, 0.293], G [0.165, 0.830], B [0.128, 0.044]
• Illuminant / Whitepoint: D60
• Pointer’s Gamut Coverage: 99.9905787 %
• Visible Spectrum Coverage: 74.0533610 %
165
ACES Encodings
166
Colour Grading - GoG
1. https://vimeo.com/116019668 167
Colour Grading - GoG
168
y = ax + b \qquad (1)

y = (ax + b + c(1 - x))^{1/\gamma} \qquad (2)

Y_o = (\mathrm{gain} \times Y_i + \mathrm{offset} + \mathrm{lift} \times (1 - Y_i))^{1/\mathrm{gamma}} \qquad (3)

where Y_i is the input luminance and Y_o the output luminance.
Note 1: (1) is the slope-intercept form of a linear equation.
Note 2: On a television, the contrast and brightness controls are respectively
mapped to the gain and offset variables.
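Equation (3) above as a small Python function; negative intermediate values are clipped before the power purely for numerical safety, which is not part of the equation itself.

import numpy as np

def lift_gamma_gain(Y, lift=0.0, gamma=1.0, gain=1.0, offset=0.0):
    """Equation (3): grading operator applied to the input luminance Y."""
    Y = np.asarray(Y, dtype=float)
    return np.clip(gain * Y + offset + lift * (1 - Y), 0, None) ** (1 / gamma)

ramp = np.linspace(0, 1, 5)
print(lift_gamma_gain(ramp))                 # neutral settings: ramp unchanged
print(lift_gamma_gain(ramp, lift=0.1))       # raised blacks, white preserved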
Neutral
169
Positive Gain
170
Negative Gain
171
Neutral
172
Positive Offset
173
Negative Offset
174
Neutral
175
Positive Lift
176
Negative Lift
177
Neutral
178
Positive Gamma
179
Negative Gamma
180
Neutral
181
1D LUT & 3D LUT
• A 1D LUT is a one-dimensional table indexed by a single variable.
Expensive runtime computations are replaced with a simpler array indexing
operation / look-up.
• A 3D LUT is a three-dimensional table (3D lattice) indexed by three variables,
where each variable (lattice axis) represents a colour component.
Output colour values for input points not exactly matching
lattice points are interpolated (see the sketch below).
182
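A sketch of the 1D LUT indexing with linear interpolation just described; a 3D LUT performs the analogous trilinear interpolation on a lattice of RGB triplets.

import numpy as np

def apply_lut_1d(values, lut, domain=(0.0, 1.0)):
    """Look up values in a 1D LUT, linearly interpolating between entries."""
    positions = np.linspace(domain[0], domain[1], len(lut))
    return np.interp(values, positions, lut)

# A 5-entry LUT encoding a 1 / 2.2 gamma, applied to a linear ramp.
lut = np.linspace(0, 1, 5) ** (1 / 2.2)
print(apply_lut_1d(np.array([0.0, 0.125, 0.5, 1.0]), lut))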
Bibliography
• Fairchild, M. D. (2013). Color Appearance Models (3rd ed.). Wiley.
ISBN:B00DAYO8E2
• Wyszecki, G., & Stiles, W. S. (2000). Color Science: Concepts and
Methods, Quantitative Data and Formulae. Wiley. ISBN:978-0471399186
• Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and
Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267
• Madden, T. E., & Giorgianni, E. J. (2007). Digital Color Management (Vol.
20). doi:10.1002/9780470994375
• Dutré, P., Bekaert, P., & Bala, K. (2006). Advanced Global Illumination, 2,
384. ISBN:1439864950
183
Bibliography
• ISO. (2004). INTERNATIONAL STANDARD ISO 22028-1 - Photography and graphic technology -
Extended colour encodings for digital image storage, manipulation and interchange, 2004.
• International Telecommunication Union. (2011). Recommendation ITU-R BT.1886 - Reference electro-
optical transfer function for flat panel displays used in HDTV studio production BT Series Broadcasting
service.
• International Telecommunication Union. (2015). Recommendation ITU-R BT.709-6 - Parameter values
for the HDTV standards for production and international programme exchange BT Series Broadcasting
service (Vol. 5). Retrieved from https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.709-6-201506-I!!
PDF-E.pdf
• International Telecommunication Union. (2013). Recommendation ITU-R BT.2035 - A reference viewing
environment for evaluation of HDTV program material or completed programmes BT Series
Broadcasting service.
• International Telecommunication Union. (2015). Recommendation ITU-R BT.2020 - Parameter values for
ultra-high definition television systems for production and international programme exchange (Vol. 1).
Retrieved from https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2020-2-201510-I!!PDF-E.pdf
184
Bibliography
• Reinhard, E. (2009). A Reassessment of the Simultaneous Dynamic Range of the Human Visual
System, 17–24.
• Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display.
Color Research and Application, 39(1), 6–15. doi:10.1002/col.21768
• Selan, J. (2012). Cinematic color. ACM SIGGRAPH 2012 Posters on - SIGGRAPH ’12, 1–54. doi:
10.1145/2343483.2343492
• Kodak. (2002). KODAK: Student Filmmaker’s Handbook. Retrieved from http://ultra.sdk.free.fr/misc/
TechniquePhoto/Kodak Student Handbook.pdf
• Gilchrist, A. (2008). Perceptual organization in lightness. Vasa, 1–25. Retrieved from http://
www.gestaltrevision.be/pdfs/oxford/Gilchrist-Perceptual_organization_in_lightness.pdf
• Nilsson, M. (2015). BT Media and Broadcast - Ultra High Definition Video Formats and
Standardisation. Retrieved from http://www.mediaandbroadcast.bt.com/wp-content/uploads/
D2936-UHDTV-final.pdf
185
Bibliography
• Brendel, H. (2005). ARRI COMPANION TO DI - Chapter 2. Motion Picture
Film. Retrieved March 12, 2016, from http://dicomp.arri.de/digital/
digital_systems/DIcompanion/ch02.html
• Pritchard, B. R. (n.d.). Why Colour Negative is Orange. Retrieved March
19, 2016, from http://www.brianpritchard.com/
why_colour_negative_is_orange.htm
• https://github.com/colour-science/colour-ipython
• Wikipedia. (n.d.).
186
Zoology 4th semester series (krishna).pdfZoology 4th semester series (krishna).pdf
Zoology 4th semester series (krishna).pdf
 
Engler and Prantl system of classification in plant taxonomy
Engler and Prantl system of classification in plant taxonomyEngler and Prantl system of classification in plant taxonomy
Engler and Prantl system of classification in plant taxonomy
 
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCRStunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
 
Chemistry 4th semester series (krishna).pdf
Chemistry 4th semester series (krishna).pdfChemistry 4th semester series (krishna).pdf
Chemistry 4th semester series (krishna).pdf
 
A relative description on Sonoporation.pdf
A relative description on Sonoporation.pdfA relative description on Sonoporation.pdf
A relative description on Sonoporation.pdf
 
Presentation Vikram Lander by Vedansh Gupta.pptx
Presentation Vikram Lander by Vedansh Gupta.pptxPresentation Vikram Lander by Vedansh Gupta.pptx
Presentation Vikram Lander by Vedansh Gupta.pptx
 
Broad bean, Lima Bean, Jack bean, Ullucus.pptx
Broad bean, Lima Bean, Jack bean, Ullucus.pptxBroad bean, Lima Bean, Jack bean, Ullucus.pptx
Broad bean, Lima Bean, Jack bean, Ullucus.pptx
 
Animal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptxAnimal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptx
 
CELL -Structural and Functional unit of life.pdf
CELL -Structural and Functional unit of life.pdfCELL -Structural and Functional unit of life.pdf
CELL -Structural and Functional unit of life.pdf
 
Formation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disksFormation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disks
 
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCESTERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
 
Natural Polymer Based Nanomaterials
Natural Polymer Based NanomaterialsNatural Polymer Based Nanomaterials
Natural Polymer Based Nanomaterials
 
GFP in rDNA Technology (Biotechnology).pptx
GFP in rDNA Technology (Biotechnology).pptxGFP in rDNA Technology (Biotechnology).pptx
GFP in rDNA Technology (Biotechnology).pptx
 
The Philosophy of Science
The Philosophy of ScienceThe Philosophy of Science
The Philosophy of Science
 
Botany krishna series 2nd semester Only Mcq type questions
Botany krishna series 2nd semester Only Mcq type questionsBotany krishna series 2nd semester Only Mcq type questions
Botany krishna series 2nd semester Only Mcq type questions
 
Nanoparticles synthesis and characterization​ ​
Nanoparticles synthesis and characterization​  ​Nanoparticles synthesis and characterization​  ​
Nanoparticles synthesis and characterization​ ​
 

The Importance of Terminology and sRGB Uncertainty - Notes - 0.4

  • 14. Conversion Functions (Transfer Functions) • A colour component conversion function is defined as a single variable, monotonic mathematical function applied individually to one or more colour channels of a colour space. [1] • They perform the mapping between the linear-light components / tristimulus values and a non-linear R'G'B' video signal. • They are commonly used for faithful representation of images and perceptual coding, in relation to the display's non-linear response and the non-linearity of the HVS. 1. ISO. (2004). INTERNATIONAL STANDARD ISO 22028-1 - Photography and graphic technology - Extended colour encodings for digital image storage, manipulation and interchange, 2004. 14
  • 16. Opto-electronic conversion function • The opto-electronic conversion function (OECF or OETF) maps (encodes) estimated tristimulus values in a scene to a non-linear R'G'B' video component signal value. • OECFs are typically expressed by a power function with an exponent between 0.4 and 0.5. 16
  • 18. Electro-optical conversion function • The electro-optical conversion function (EOCF or EOTF) maps (decodes) a non-linear R'G'B' video component signal to a tristimulus value at the display. • EOCFs are typically expressed by a power function with an exponent between 2.2 and 2.6. 18
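As a minimal sketch of these two directions in Python, the snippet below encodes linear-light values with a pure power function and decodes them back; the exponents and function names are illustrative stand-ins, not the conversion functions of any particular standard.

```python
import numpy as np

def oecf_power(L, exponent=1 / 2.2):
    # Encode scene linear-light values with a pure power function
    # (illustrative exponent, not a specific standard's OECF).
    return np.maximum(L, 0) ** exponent

def eocf_power(V, exponent=2.2):
    # Decode a non-linear signal back to linear-light values.
    return np.maximum(V, 0) ** exponent

L = np.array([0.0, 0.18, 0.5, 1.0])
V = oecf_power(L)            # e.g. 0.18 -> ~0.46
L_roundtrip = eocf_power(V)  # recovers the original linear values
```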
  • 20. Misleading Terminology Until Nuke 10, Nuke’s Read node colorspace knob only specifies an electro-optical conversion function and does not perform a gamut change. 20
  • 21. Non Linearity of the Human Visual System 1. Davson, H. (1990). Physiology of the Eye (5th ed.). Elsevier Science Ltd. ISBN:978-0080379074 - colour-science.org 21
  • 22. Non Linearity of the Human Visual System • Weber’s law states that the just-noticeable difference (JND) between two stimuli is proportional to the magnitude of the stimuli: an increment is judged relative to the previous amount. • Fechner mathematically characterised Weber’s law showing that it follows a logarithmic transformation: the perceived magnitude of a stimulus is proportional to the logarithm of the physical stimulus intensity. 22
  • 23. Non Linearity of the Human Visual System • Fechner’s scaling has been found to apply to the perception of brightness, at moderate and high brightness, with perceived brightness being proportional to the logarithm of the actual intensity. • At lower levels of brightness, the de Vries-Rose law applies which states that the perception of brightness is proportional to the square root of the actual intensity. 23
  • 24. Non Linearity of the Human Visual System • Stevens’s law supersedes Fechner's law and addresses its lack of generality. • Plotted on a logarithmic scale, the physical-perceptual relationships measured in his experiments were characterised by straight lines with different slopes, suggesting that the relationship between perceptual magnitude and stimulus intensity follows a power law with varying exponent. 24
  • 28. Lightness - CIE L* • Because of the various HVS adaptation mechanisms, perceived brightness has a non-linear relationship with the actual physical intensity of the stimulus. • It is commonly approximated by a cube root. • Multiple approximations of lightness (or value in the Munsell Renotation System) were proposed leading to the creation of CIE L* in 1976. • CIE L* characterises the perceptual response to relative luminance. 28
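As an illustration of that cube root approximation, here is a minimal sketch of the CIE 1976 L* computation from relative luminance, assuming Yn = 100 as the reference white; the function name is hypothetical.

```python
import numpy as np

def lightness_CIE1976(Y, Y_n=100):
    # CIE L* from relative luminance Y, with Y_n the reference white
    # luminance: cube root above a small threshold, linear segment below.
    t = np.asarray(Y, dtype=float) / Y_n
    epsilon = (6 / 29) ** 3
    f = np.where(t > epsilon, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    return 116 * f - 16

# Middle grey (18% relative luminance) maps to L* ≈ 49.5, roughly half way
# up the perceptual lightness scale.
print(lightness_CIE1976(18))
```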
  • 30. Colour Imaging System • A colour imaging system embodies any combination of technologies and devices required to perform: • Image capture • Signal processing • Image formation 30
  • 32. Image Capture • Image capture / acquisition of colour stimuli can be performed in a number of different ways using for example: • An electronic device (electronic video camera, DSLR) • Photographic film 32
  • 33. Electronic Capture • A movie camera may use a solid-state image sensor (CCD or CMOS) that absorbs photons of light. • As photon absorption occurs, electrons are collected into charge packets. • The image signal is produced by a sequential readout of the packets. 33
  • 34. Electronic Capture • Accurate image reproduction requires the capture device to be at least trichromatic, implying that colour stimuli spectral power distributions must be separated into 3 colour signals. • This separation can be achieved with: • A beam splitter / colour filters combined with three sensors on high-end capture devices, resulting in reduced noise and increased resolution. • A single sensor covered with a mosaic of colour filters on systems requiring a small form factor and lower price. • Three sensor layers with different responses to wavelengths of light stacked together, similar to photographic film (Foveon). 34
  • 35. Photographic Film Capture 1. https://www.fujifilmusa.com/shared/bin/AF3-150E_Sensia100_Data_Sheet_2003.pdf 35
  • 36. Photographic Negative Film • A photographic film has red-, green-, and blue-light-sensitive layers coated on a transparent base. • The red and green layers are also sensitive to blue light, thus a yellow filtering layer is placed above them. It will be made colourless during chemical processing. • Light sensitivity is induced by silver halide grains with appropriate spectral response scattered within each light sensitive layer. The sensitive layers also contain an appropriate dye coupler. 36
  • 37. Image Formation 1. https://commons.wikimedia.org/wiki/File:AdditiveColor.svg 2. https://commons.wikimedia.org/wiki/File:SubtractiveColor.svg 37
  • 38. Image Formation • The processed image signals control colour-forming elements of the image formation medium / device. • Two categories of image formation exist: • Additive colour • Subtractive colour 38
  • 39. Additive Colour Formation • CRT, LCD or plasma displays mix red, green and blue light through pixel adjacency. • DLP digital cinema projectors perform superposition using a beam combiner. 39
  • 40. Subtractive Colour Formation • Photographic film uses cyan, magenta and yellow dyes to absorb red, green and blue light. • Similarly, most printing processes use CMY inks. • Colour stimuli formed by subtractive colour are dependent on (and affected by) the viewing light source. 40
  • 41. Picture Rendering • The colour imaging system usually achieves representation of a scene in a way that matches viewer expectation of the appearance of that scene instead of attempting to reproduce physical colour stimuli quantities. • A sunlit outdoor scene can have a luminance of 50,000 cd.m-2 but may be displayed on a consumer electronic display with a peak white luminance of 320 cd.m-2. 41
  • 42. Picture Rendering • The different viewing conditions and image formation medium / device capabilities require that scene luminance be mapped to image formation medium / device luminance. • A simple linear mapping from scene luminance to image formation medium / device luminance is not satisfactory. • Picture rendering adjusts the tone scale to achieve a perceptually uniform mapping. 42
  • 43. Non Triviality of Picture Rendering 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 43
  • 44. Non Triviality of Picture Rendering 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 44
  • 45. Non Triviality of Picture Rendering 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 45
  • 46. Non Triviality of Picture Rendering 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 46
  • 47. Effect of Lateral-Brightness Adaptation Images seen with a dark surround appear to have less contrast than if viewed with a dim, average or bright surround. 47
  • 48. Effect of Lateral-Brightness Adaptation 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 48
  • 49. Colour Encoding • A colour encoding is a digital representation of colours for image processing, storage, and interchange between systems. • A colour encoding specification (standardised input / output interface of a colour imaging system) must define: • A colour encoding method which determines the meaning of the encoded data or what will be represented by the data. • A colour encoding data metric characterising the colourspace and the numerical units used to encode the data or how the representation will be numerically expressed. 49
  • 50. Image States • The image state concept was defined by Madden & Giorgianni. • Some signal processing operations make the image transition to a different colorimetric state. • An image may exist in scene state which is not directly viewable on typical image formation devices and must be transitioned to a new state, the rendered state. 50
  • 51. Image States • A colour encoding specification defined in relation to scene quantities is said to be scene-referred: it has a colorimetric link to a scene. • A colour encoding specification defined in relation to digital display characteristics is said to be display-referred (rendered state): it has a colorimetric link to a digital display device. 51
  • 52. Display-Referred Imaging • Raw image processors used by photographers (Lightroom, Darktable, DCRaw, etc…) perform picture rendering on the raw scene-referred data to deliver a display-referred image. • Artists achieving direct content creation in 2d applications are generating display-referred content. • Images available on the Internet such as on Google Images or texture vendors’ websites are output- / display-referred. • A photograph taken on a mobile phone and uploaded to a social network is display-referred. 52
  • 53. Display-Referred Imaging • Display-referred imagery created and exhibited on a display that matches a standard reference (using sRGB specification and viewing conditions) will appear the same across similar display devices without any further action required. • A photograph processed on a consumer graphics desktop and output as an sRGB JPG or PNG file will look approximately the same on other consumer graphics desktops. 53
  • 54. Display-Referred Imaging • Display-referred imagery usually has a restricted luminance dynamic range and limited colour gamut, thus some of the original captured scene-referred data is lost upon encoding. • This is unsuitable if the image is meant to be viewed on different image formation devices with wider dynamic range. 54
  • 55. Sony F35 - Out of Gamut Colours 1. http://www.oscars.org/science-technology/sci-tech-projects/aces 55
  • 56. Sony F35 - Out of Gamut Colours 56
  • 57. Sony F35 - Out of Gamut Colours 57
  • 58. Scene-Referred Imaging • Scene-referred representation of data contains enough information to achieve the desired appearance of the scene on a variety of image formation media / devices. • Scene-referred imaging is the basis of physically-based rendering, allowing realistic light interaction to be reproduced using plausible light quantities. It makes realistic camera effects (motion-blur, defocus) possible. 58
  • 59. Scene-Referred Imaging 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 59
  • 60. Scene-Referred Imaging 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 60
  • 61. Scene-Referred Imaging 1. Fairchild, M. D. (n.d.). The HDR Photographic Survey. Retrieved April 15, 2015, from http://rit-mcsl.org/fairchild/HDRPS/HDRthumbs.html 61
  • 62. Scene-Referred Imaging • Measured scene linear-light quantities are usually normalised to a known reference. • Commonly, middle grey is set at luminance = 0.18, which is the reflectance of: • The reference Kodak 18% Grey Card • The background colour of a DSC Labs CamAlign ChromaDuMonde chart • Note: the reflectance of an X-Rite ColorChecker neutral 5 (.70 D) sample is ≈ 19%! 62
  • 64. Energy Conservation • Anti-aliasing or image filtering operations should be energy preserving: the total light emitted from the display should remain the same after the processing operations. • Resizing an image should not affect its luminance. • Those operations must be performed on linear image data. 64
  • 66. Blur in Linear Colourspace 66
  • 67. Blur in Non-Linear Colourspace 67
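The comparison illustrated by the two slides above can be sketched in a few lines of Python; the 1D box blur and the simple 2.2 power stand-in transfer function are assumptions for illustration only.

```python
import numpy as np

def box_blur_1d(x, radius=1):
    # Simple 1D box blur via a normalised convolution kernel.
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(x, kernel, mode="same")

# A hard edge between two encoded values, with a 2.2 power function standing
# in for the transfer function.
encoded = np.array([0.2] * 5 + [0.9] * 5)

# Incorrect: filtering the non-linearly encoded values directly.
blurred_encoded = box_blur_1d(encoded)

# Correct: decode to linear light, filter, then re-encode for display.
blurred_linear = box_blur_1d(encoded ** 2.2) ** (1 / 2.2)

# The results differ around the edge: averaging encoded values darkens the
# transition because the mean is taken in a non-linear space.
print(blurred_encoded[3:7])
print(blurred_linear[3:7])
```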
  • 68. Digital Image - Raster Graphics • A digital image is a rectangular data structure (a 2 or 3-dimensional array) of picture elements (pixels). • A pixel colour is determined by a single code for achromatic images or multiple codes for chromatic images (commonly three). 68
  • 69. Digital Image - Raster Graphics 69
  • 70. Digital Image - Raster Graphics 70
  • 71. Digital Image - Raster Graphics 71
  • 72. Quantization • Quantization is the process of mapping a continuous signal (or large set of input values) to a smaller set. • Information between quantizer steps is discarded and lost. • Quantization error (signal distortion) decreases signal-to-noise ratio (SNR). • Banding and contouring artefacts can be reduced by introducing a small amount of noise (≈ 1 / 2 quantiser step) prior to quantization. Dithering decreases the SNR. 72
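A rough Python illustration of uniform quantization with optional dithering; the bit depth, the noise amplitude of about half a quantizer step and the function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def quantize(x, bits=4, dither=False):
    # Uniform quantization to 2**bits levels, optionally adding about half a
    # quantizer step of noise before rounding to trade banding for grain.
    steps = 2 ** bits - 1
    x = np.asarray(x, dtype=float)
    if dither:
        x = x + rng.uniform(-0.5, 0.5, x.shape) / steps
    return np.clip(np.round(x * steps), 0, steps) / steps

ramp = np.linspace(0, 1, 16)
print(quantize(ramp, bits=4))               # visible plateaus (banding)
print(quantize(ramp, bits=4, dither=True))  # plateaus broken up by noise
```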
  • 77. Quantization Input Signal 1. Getty Images. (n.d.). Getty Images Test Image. Retrieved June 20, 2003, from https://www.drycreekphoto.com/tools/ 77
  • 78. Quantization 4-Bit Linear Quantization 1. Getty Images. (n.d.). Getty Images Test Image. Retrieved June 20, 2003, from https://www.drycreekphoto.com/tools/ 78
  • 79. Quantization 4-Bit Perceptually Uniform Quantization 1. Getty Images. (n.d.). Getty Images Test Image. Retrieved June 20, 2003, from https://www.drycreekphoto.com/tools/ 79
  • 83. Perceptual Uniformity • A colour imaging system is perceptually uniform if a small perturbation of a component value is approximately equally perceptible across the range of that value. [1] • Most electronic colour imaging systems account for non linearity of the HVS and its perceptual response to brightness when encoding RGB scene relative luminance values (linear-light values) into R’G’B’ perceptually uniform values.
 
 This is commonly achieved with a logarithmic transfer function (gamma, L*). • They leverage the non linearity of the HVS to reduce the bandwidth and number of bits needed per pixel by optimising the allocation of digital codes. 1. Poynton, C. (n.d.). Perceptual Uniformity. Retrieved March 5, 2016, from http://www.poynton.com/notes/Timo/Perceptual_uniformity.html 83
  • 84. Perceptual Uniformity • Cathode ray tube (CRT) display electron gun characteristics imposed an EOCF that is approximately the inverse of the HVS perception of brightness. • The HVS perceptual response to brightness associated with the CRT power function produces code values displayed in a perceptually uniform way. • Modern display devices (LCD, plasma, DLP) replicate this behaviour by imposing a 2.2, 2.4 or 2.6 power function (Gamma Correction) through signal processing circuitry. 84
  • 86. Perceptual Uniformity - Linear Ramp 86
  • 87. Perceptual Uniformity - Linear Ramp 87
  • 88. Perceptual Uniformity - Perceptually Uniform Ramp 88
  • 89. Perceptual Coding 1. Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267 89
  • 90. Perceptual Coding 1. Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267 90
  • 92. Perceptual Coding • The luminance difference between L and L + ΔL is noticeable when ΔL is about 1% of L. • The 1.01 (101 / 100) ratio is known as the Weber contrast or fraction. 92
  • 93. Perceptual Coding • An ideal non-linear transfer function will allocate code values to minimise the just-noticeable difference (JND). • On a linear-light values scale, code 100 is the location where Weber contrast reaches 1%. • Weber contrast increases for codes below 100, raising the perceptible difference between adjacent codes and possibly producing banding and contouring artefacts. • Weber contrast decreases for codes above 100; higher codes become increasingly wasteful and could be discarded without affecting perception. 93
  • 94. Perceptual Coding • High-quality image reproduction requires a contrast ratio >= 30:1 as shown by the NTSC engineers in the 1950s. • Using 8-bit linear-light coding, the contrast ratio that can be reproduced without artefacts is only 2.55:1. • Achieving a contrast ratio >= 30:1 with linear-light coding requires 12 bits, resulting in an artefact-free contrast ratio of 40.95:1; however, most of those codes cannot be visually discriminated. 94
  • 95. Perceptual Coding Maintaining a 1.01 Weber contrast over scene relative luminance range of [0.01, 100], contrast ratio of 100:1, requires approximately 462 codes (≈ 9 bits): log(100) / log(1.01) ≈ 462 and 1.01^462 ≈ 100. In general, C = log(CR) / log(WC), where C is the number of codes, CR is the contrast ratio and WC is the desired Weber contrast. [1] 1. Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display. Color Research and Application, 39(1), 6–15. doi:10.1002/col.21768 95
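The code count relationship above is easily evaluated; this small Python helper is only an illustration of the formula, and the function name is hypothetical.

```python
import math

def codes_required(contrast_ratio, weber_contrast=1.01):
    # Number of codes needed to keep the ratio between adjacent codes at the
    # desired Weber contrast over the given contrast ratio: C = log(CR) / log(WC).
    return math.log(contrast_ratio) / math.log(weber_contrast)

print(codes_required(100))                        # ≈ 462 codes
print(math.ceil(math.log2(codes_required(100))))  # ≈ 9 bits
```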
  • 96. 16-Bit Integer & Half Float Perceptual coding is not required when using 16-bit integer (artefact-free contrast ratio of 655.35:1) or half float representations (Weber contrast of 0.1% [1], 2^10 = 1024 code values per stop). 1. Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display. Color Research and Application, 39(1), 6–15. doi:10.1002/col.21768 96
  • 97. 8-Bit Colour Imaging System Dynamic Range The dynamic range associated with code 1 on an 8-bit colour imaging system is closer to 200,000:1 (or 600,000:1) than the 255:1 (or 256:1) dynamic range often alleged, because of the incorrect assumption that linear-light values are encoded: (1 / 255)^2.4 ≈ 0.0000016 ≈ 1 / 600,000. 97
  • 98. Gamma • Gamma (γ) is a numerical parameter giving the exponent of a power function assumed to approximate the relationship between a signal quantity (such as a video signal code) and light power. [1] • Gamma Encoding (γE), characteristic of OECFs uses an exponent approximately between 0.4 and 0.5. • Gamma Decoding (γD), characteristic of EOCFs uses an exponent approximately between 2.2 and 2.6. 1. Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267 98
  • 99. Gamma Encoding - OECF 99
  • 100. Gamma Decoding - EOCF 100
  • 101. Digital Colour Imaging System End-to-End Power Function To overcome the loss in apparent contrast, the end-to-end power function of a digital colour imaging system may have appropriate exponent values of 1, 1.25, and 1.5 for respectively bright, dim, and dark surrounds. [1] 1. Hunt, R. W. G. (2004). The Reproduction of Colour (6th ed.). Chichester, UK: Wiley. doi:10.1002/0470024275 101
  • 103. Gamma Correction Misconceptions • NTSC monochrome television was created in the 1940s and non linear coding was a well understood element of good visual performance. • The significance of perceptual uniformity has been generally forgotten: video engineers seem to see gamma correction as a means to address the CRT “non linearity defect”. • ‘‘If gamma correction was not already necessary for physical reasons at the CRT, we would have to invent it for perceptual reasons.” [1] 1. Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display. Color Research and Application, 39(1), 6–15. doi:10.1002/col.21768 103
  • 104. Digital Video & HD • The luminance output of a CRT is proportional to the input raised to the 5 / 2 power. A studio reference display CRT has a gamma ≈ 2.4. • Gamma correction by means of an OECF is applied to pre-compensate the CRT display’s non-linear power function and achieve perceptual uniformity. • In order to account for the different viewing conditions between the original scene and the presentation, the correction under-compensates the actual CRT display non-linearity. • This under-compensation yields an end-to-end power function with exponent ≈ 1.2 which produces a pleasing television viewing experience in dim surrounds. 104
  • 105. Digital Video & HD • Image Structure • 1920 x 1080 progressive (24Hz, 30Hz), 16:9 aspect ratio • 1920 x 1080 interlaced (30Hz), 16:9 aspect ratio • 1280 x 720 progressive (24Hz, 30Hz, 60Hz), 16:9 aspect ratio 105
  • 106. ITU-R BT.1886 • ITU-R BT.1886 defines the reference electro-optical transfer function for CRT and LCD displays used in HDTV studio production. • ITU-R BT.1886 adopts a power function with exponent γ = 2.4. • The recommendation doesn’t standardise reference white and viewing conditions. • ITU-R BT.2035 defines a reference viewing environment for evaluation of HDTV program material. 106
  • 107. ITU-R BT.1886 • HD Studio Mastering (Typical) • Reference white is typically set at 100-120 cd.m-2 . • Surround luminance is expected to be very dim at around 1% of reference white luminance. • Typical intra-image contrast ratio is 1000:1. • HD Consumer (Typical) • Reference white is typically set at 200 cd.m-2 . • Surround luminance is expected to be dim at around 5% of reference white luminance. • Typical intra-image contrast ratio is 400:1. 107
  • 108. ITU-R BT.2035 • ITU-R BT.2035 defines a reference viewing environment for evaluation of HDTV program material. • D.R.A.F.T 108
  • 109. ITU-R BT.709 / Rec. 709 ITU-R BT.709 is the international standard defining the parameter values for HDTV. 109
  • 110. BT.709 OECF • BT.709 OECF defines a 0.45 exponent but its effective power function exponent is γE ≈ 0.5. • BT.709 OECF is a piece-wise function: in order to reduce noise in dark regions, a line segment limits the slope of the power function (the slope of a power function is infinite at zero). 110
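A sketch of the BT.709 OECF as described above, with the linear segment below 0.018 and the 0.45 power function above; the function name is an assumption for illustration.

```python
import numpy as np

def oecf_BT709(L):
    # ITU-R BT.709 OECF: a linear segment below 0.018 limits the slope near
    # zero, a 0.45 power function with offset is used above.
    L = np.asarray(L, dtype=float)
    return np.where(L < 0.018, 4.5 * L, 1.099 * L ** 0.45 - 0.099)

# Both pieces meet at 0.081 for L = 0.018; middle grey encodes to ≈ 0.409.
print(oecf_BT709([0.0, 0.018, 0.18, 1.0]))
```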
  • 111. BT.709 - BT.1886 End-to-End γ 111
  • 113. BT.709 Colourspace • Primaries: 
 
 [0.6400, 0.3300]
 [0.3000, 0.6000]
 [0.1500, 0.0600] • Illuminant / Whitepoint: D65 • Pointer’s Gamut Coverage: 81.1674568 % • Visible Spectrum Coverage: 36.6606209 % 113
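As an illustration of how the primaries and whitepoint above define the colourspace, the sketch below derives the RGB to CIE XYZ matrix from them; the D65 chromaticity coordinates are assumed to be [0.3127, 0.3290] and the function names are illustrative.

```python
import numpy as np

def xy_to_XYZ(xy, Y=1.0):
    # Chromaticity coordinates to tristimulus values for a given luminance.
    x, y = xy
    return np.array([x * Y / y, Y, (1 - x - y) * Y / y])

def normalised_primary_matrix(primaries, whitepoint):
    # RGB -> XYZ matrix such that RGB = [1, 1, 1] maps to the whitepoint.
    P = np.column_stack([xy_to_XYZ(xy) for xy in primaries])
    S = np.linalg.solve(P, xy_to_XYZ(whitepoint))
    return P * S

# BT.709 primaries with the D65 whitepoint assumed at [0.3127, 0.3290].
M = normalised_primary_matrix(
    [(0.6400, 0.3300), (0.3000, 0.6000), (0.1500, 0.0600)],
    (0.3127, 0.3290))
print(np.round(M, 4))  # first row ≈ [0.4124, 0.3576, 0.1805]
```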
  • 114. UHDTV The UHD Alliance (UHDA) developed three specifications to support the next-generation premium home entertainment experience covering the entertainment ecosystem in the following categories: [1] • Devices • Distribution • Content 1. UHDA. (2016). UHD Alliance Defines Premium Home Entertainment Experience. Retrieved January 8, 2016, from http://www.uhdalliance.org/uhd-alliance-press-releasejanuary-4-2016/ 114
  • 115. UHDTV - Devices A UHDA-compliant device must meet or exceed the following specifications: • Image Resolution: 3840×2160 • Color Bit Depth: 10-bit signal • Color Palette (Wide Color Gamut) • Signal Input: BT.2020 color representation • Display Reproduction: More than 90% of P3 colours • High Dynamic Range • SMPTE ST2084 EOTF • A combination of peak brightness and black level of either: • More than 1000 nits peak brightness and less than 0.05 nits black level • More than 540 nits peak brightness and less than 0.0005 nits black level 115
  • 116. UHDTV - Distribution A UHDA-compliant distribution channel must support: • Image Resolution: 3840×2160 • Color Bit Depth: Minimum 10-bit signal • Color: BT.2020 color representation • High Dynamic Range: SMPTE ST2084 EOTF 116
  • 117. UHDTV - Content Mastering UHDA Content Master must meet the following requirements: • Image Resolution: 3840×2160 • Color Bit Depth: Minimum 10-bit signal • Color: BT.2020 color representation • High Dynamic Range: SMPTE ST2084 EOTF Specifications of UHDA recommended mastering display: • Display Reproduction: Minimum 100% of P3 colours • Peak Brightness: More than 1000 nits • Black Level: Less than 0.03 nits 117
  • 118. ITU-R BT.2020 / Rec. 2020 ITU-R BT.2020 defines the parameter values for ultra-high definition television systems for production and international programme exchange. 118
  • 119. ITU-R BT.2020 / Rec. 2020 • Image Structure • 7680 × 4320, 16:9 aspect ratio, 1:1 • 3840 × 2160, 16:9 aspect ratio, 1:1 • Frequency:120, 60, 60/1.001, 50, 30, 30/1.001, 25, 24, 24/1.001 • Progressive scan mode 119
  • 120. BT.2020 OECF BT.2020 OECF is the same as BT.709 OECF and is expected to be used in conjunction with BT.1886 EOCF, yielding an end-to-end power function with exponent ≈ 1.2. 120
  • 121. BT.2020 - BT.1886 End-to-End γ 121
  • 123. BT.2020 Colourspace • Primaries: 
 
 [0.708, 0.292]
 [0.170, 0.797]
 [0.131, 0.046] • Illuminant / Whitepoint: D65 • Pointer’s Gamut Coverage: 99.9635339 % • Visible Spectrum Coverage: 70.7051466 % 123
  • 124. SMPTE ST 2084 • SMPTE ST 2084 (PQ) is the international standard defining the EOTF characterizing high-dynamic-range reference displays used primarily for mastering non-broadcast content. • The perceptual quantizer has been modeled by Dolby Laboratories using the Barten (1999) contrast sensitivity function. • Display peak luminance is expected to reach 10,000 cd.m-2 using a 10 or 12-bit data representation. 124
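A hedged sketch of the ST 2084 EOTF using the commonly published constants; the function and constant names are assumptions for illustration, not a reference implementation.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants as commonly published.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def eotf_ST2084(N, peak_luminance=10000):
    # Decode a PQ-encoded non-linear signal N in [0, 1] to absolute
    # luminance in cd.m-2.
    N = np.asarray(N, dtype=float)
    V = N ** (1 / M2)
    return peak_luminance * (np.maximum(V - C1, 0) / (C2 - C3 * V)) ** (1 / M1)

print(eotf_ST2084([0.0, 0.5, 1.0]))  # code 1.0 maps to 10,000 cd.m-2
```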
  • 125. sRGB - ST 2084 EOCF 125
  • 126. Multimedia & Desktop Graphics The sRGB IEC 61966-2-1:1999 specification is defined for multimedia applications and desktop graphics, considering a brighter surround than that of a studio reference display. 126
  • 127. sRGB IEC 61966-2-1:1999 • sRGB adopts the ITU-R BT.709 RGB colourspace gamut but a different set of OECF / EOCF. • sRGB reference white is specified at 80 cd.m-2 in accordance with CRT displays. • Surround luminance is expected to be average at around 20% of reference white luminance. • Typical intra-image contrast ratio is 100:1. • Modern LCD displays commonly peak at 320 cd.m-2. 127
  • 128. sRGB OECF • sRGB OECF doesn’t account for picture rendering: the end-to-end gamma is ≈ 1.0 when associated with the sRGB EOCF (γD ≈ 2.2), thus it is not suitable for image capture. • sRGB is defined as a display-referred colour encoding. 128
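A sketch of the sRGB encoding and decoding functions in their IEC 61966-2-1 piecewise form (linear segment near black, offset 1/2.4 power above, ≈ a 2.2 power overall); the function names are assumed for illustration.

```python
import numpy as np

def oecf_sRGB(L):
    # sRGB encoding: linear segment near black, offset 1/2.4 power above.
    L = np.asarray(L, dtype=float)
    return np.where(L <= 0.0031308, 12.92 * L, 1.055 * L ** (1 / 2.4) - 0.055)

def eocf_sRGB(V):
    # sRGB decoding, the inverse of the encoding above.
    V = np.asarray(V, dtype=float)
    return np.where(V <= 0.04045, V / 12.92, ((V + 0.055) / 1.055) ** 2.4)

print(oecf_sRGB(0.18))  # middle grey encodes to ≈ 0.46
print(eocf_sRGB(0.5))   # half code decodes to ≈ 0.21 linear
```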
  • 129. BT.709 - sRGB End-to-End γ 129
  • 130. sRGB - sRGB End-to-End γ 130
  • 132. sRGB Colourspace • Primaries (Rec. 709): 
 
 [0.6400, 0.3300]
 [0.3000, 0.6000]
 [0.1500, 0.0600] • Illuminant / Whitepoint: D65 • Pointer’s Gamut Coverage: 81.1674568 % • Visible Spectrum Coverage: 36.6606209 % 132
  • 133. Digital Cinema • Picture rendering was traditionally imposed by a camera negative film gamma ≈ 0.5-0.6, an inter-positive film having a unity gamma and a release print film stock with gamma ≈ 2.8-3.2, resulting in an end-to-end gamma ≈ 1.4-1.8, suitable for dark film projection surrounds. • The DCI / SMPTE standard reference digital cinema projector applies a 2.6 gamma to the X’Y’Z’ DCDM (Digital Cinema Distribution Master) non linear components. 133
  • 134. Digital Cinema • The X’Y’Z’ DCDM is encoded with JPEG-2000 compression. • The X’Y’Z’ DCDM image file format is mapped into TIFF. Colour channels are represented by 12-bit unsigned integer code values. These 12 bits are placed into the most significant bits of 16-bit words, with the remaining 4 bits filled with zeroes. • Image Structure • 4096 x 2160, 24Hz, 1:1 • 2048 x 1080, 24Hz, 1:1 • 2048 x 1080, 48Hz, 1:1 134
  • 135. Digital Cinema • Digital cinema standards are display-referred: colour appearance of the digital intermediate is fully baked into the X’Y’Z’ DCDM. • Digital cinema reference white is specified at 48 cd.m-2. • Surround luminance is expected to be dark (0% of reference white luminance). • Typical intra-image contrast ratio is 100:1. • DCI-P3 is the wide gamut RGB colourspace in which digital cinema material is mastered. 135
  • 137. DCI-P3 Colourspace • Primaries: 
 
 [0.680, 0.320]
 [0.265, 0.690]
 [0.150, 0.060] • Illuminant / Whitepoint: 0.314, 0.351 • Pointer’s Gamut Coverage: 88.2782774 % • Visible Spectrum Coverage: 45.4533861 % 137
  • 138. Digital Capture for Digital Cinema • Motion picture camera vendors commonly encode their scene-referred data using a log encoding function ('ALEXA Log C', 'C-Log', 'Panalog', 'S-Log', 'V-Log', etc…) tailored to account for camera-specific dynamic range and noise characteristics. • They also define dedicated gamuts accounting for the specific spectral responses of their respective cameras. 138
  • 139. Camera Vendors Log Encoding Functions 139
  • 141. Digital Capture for Digital Cinema • Those log encoding functions draw inspiration from the Cineon Digital Film System developed by Eastman Kodak Company. • Cineon is a logarithmic encoding of the colour film negative optical density. • “Film has traditionally been represented by a characteristic curve which plots density vs log exposure. This is a log/log representation. In defining the calibration for the Cineon digital film system, Eastman Kodak Co. talked to many experts in the film industry to determine the best data metric to use for digitizing film. The consensus was to use the familiar density metric and to store the film as logarithmic data.” [1] 1. Kodak. (1995). Conversion of 10-bit Log Film Data To 8-bit Linear or Video Data for The Cineon Digital Film System. 141
  • 142. Cineon Digital LAD 1. KODAK Digital LAD Test Image - Eastman Kodak Company 142
  • 143. sRGB Digital LAD 1. KODAK Digital LAD Test Image - Eastman Kodak Company 143
  • 144. Visual Effects Colour Pipeline 1. Selan, J. (2012). Cinematic color. ACM SIGGRAPH 2012 Posters on - SIGGRAPH ’12, 1–54. doi:10.1145/2343483.2343492 - colour-science.org 144
  • 145. Visual Effects Colour Pipeline • Visual effects vendors generate scene-referred imagery that is seamlessly integrated onto client plates while not altering their image state. • This fundamental principle is at the heart of visual effects as shots with visual effects must be intercut with shots without visual effects (or coming from other vendors). • The digital intermediate (DI) expects a delivery that is a high fidelity representation of the original capture. 145
  • 146. Visual Effects Colour Pipeline • The visual effects colour pipeline is a complex colour imaging system built on individual chained colour imaging systems. • Colour encoding specifications must be defined (and identifiable to be accounted for) for every input / output signal processing operations. 146
  • 147. Working Colour Encoding Specification • A modern paradigm is to define a working colour encoding specification (for example based on ACEScg, DCI-P3, or Rec. 2020 gamuts and representing scene-referred linear-light quantities) and convert all the input imagery with their respective colour encoding specifications to that working specification. • Plates are usually converted to the working colour encoding specification by using an invertible decoding 1D LUT specific to their originating gamma / log encoding and then to the working gamut by means of a 3x3 matrix (or a 3D LUT). 147
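A minimal sketch of that ingestion step, assuming a hypothetical plate decoded with a pure 2.2 power function and an identity gamut matrix as placeholders for the camera-specific 1D LUT and 3x3 matrix:

```python
import numpy as np

def decode_plate(encoded_rgb, decoding_1d):
    # Per-channel decode of a gamma / log encoded plate using an invertible
    # 1D function (stand-in for the camera specific 1D LUT).
    return decoding_1d(np.asarray(encoded_rgb, dtype=float))

def to_working_gamut(linear_rgb, matrix_3x3):
    # Gamut conversion of linear-light RGB values with a 3x3 matrix applied
    # to the last axis of an image array.
    return np.einsum("ij,...j->...i", matrix_3x3, linear_rgb)

# Hypothetical example: a pure 2.2 power decode and an identity matrix.
plate = np.array([[0.25, 0.50, 0.75]])
linear = decode_plate(plate, lambda x: x ** 2.2)
working = to_working_gamut(linear, np.eye(3))
print(working)
```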
  • 148. Working Colour Encoding Specification Some facilities perform the compositing stage within the client delivery gamut: it can be beneficial when the working colour encoding specification doesn’t encompass the captured plates’ gamut (avoiding negative values, which are complicated to handle). Note: ARRI Alexa cameras are notorious for having a very wide gamut. 148
  • 149. View Transform • The scene-referred data is visualised using a dedicated view transform (1D LUT or 3D LUT) that commonly models the typical characteristic curve of a print film (print film emulation, S-Curve, sigmoid function combined with a log curve, etc…). • The view transform is never baked into the DI delivered imagery. 149
  • 150. sRGB - View Transform 150
  • 151. Duiker Print Film Emulation View Transform 151
  • 152. Compositing • Plates are neutralised using an invertible process to overcome lighting changes across a sequence. • This permits reusability of light rigs at the rendering stage and establishes better consistency across shots during the compositing stage. • The neutralisation is reversed on compositing output. 152
  • 153. Texturing & Matte Painting • D.R.A.F.T 153
  • 154. Digital Intermediate & Mastering 1. http://www.parkroad.co.nz/wp-content/uploads/2015/10/Clare_Mahana_DI.jpg 154
  • 155. Digital Intermediate & Mastering • Digital intermediate is a display-referred finishing process originally involving motion picture digitisation, colour manipulation (colour timing / grading, contrast adjustment, etc…) and recording back to film again to create a master internegative. • The viewing environment replicates the final exhibition viewing environment, and is adapted accordingly to each type of exhibition image formation device (digital cinema, typical home theater, etc…). • Calibration tolerances to the standards (DCI / SMPTE) are very strict. 155
  • 156. Digital Intermediate & Mastering • The DI process is commonly split into an initial pass that neutralises per shot variation and a secondary pass that defines the colour artistic intent / look of the film. • The DI house may provide a Colour Decision List (CDL) or 3D LUT per shot to visual effects vendors to give them an overview of the look being developed. 156
  • 157. Digital Intermediate & Mastering • DI often creates masters for multiple image formation medium / devices. • Artistic grading is performed on the “gold standard” image formation device (usually the digital cinema projector) with approval of the director. • Trim passes are executed for the other image formation devices and will include specific corrections for the respective devices characteristics and viewing conditions. 157
  • 158. Academy Color Encoding System 1. http://www.orbitnet.com/ampas/ACES_1.html 158
  • 159. Academy Color Encoding System • ACES is a colour management and image interchange system designed for production, mastering and long-term archiving of motion pictures. [1] • It enables consistent, high-quality colour management from production to distribution. • It provides digital image encoding and specifications preserving original imagery latitude and colour range while establishing a common standard so deliverables can be efficiently and predictably created and preserved. 1. The Academy of Motion Picture Arts and Sciences. (n.d.). ACES. Retrieved March 22, 2016, from http://www.oscars.org/science-technology/sci-tech-projects/aces 159
  • 160. ACES Components - Input • Reference Input Capture Device (RICD)
 The RICD, an ideal capturing device, records all the colour (and dynamic range) of a given scene. It provides a documented, unambiguous and fixed relationship between scene colours and encoded RGB values. • Input Device Transform (IDT)
 An image captured by a physical or virtual camera is transformed by the IDT into ACES RGB relative exposure values that the RICD would have recorded if used in-place. 160
  • 161. ACES Components - Output • Reference Rendering Transform (RRT)
 ACES images are an intermediate representation and cannot be used for final image evaluation. The RRT is an idealised replacement for print-film emulations (S-Curve) with an extremely wide gamut and high dynamic range (32 stops). • Output Device Transform (ODT)
 The ODT performs rendering of the RRT wide gamut and dynamic range on a given physical display, accounting for its specific characteristics (gamut, dynamic range, and EOCF) and viewing conditions. 161
  • 162. ACES Components - Negative Film • Academy Printing Density (APD)
 Reference printing density for calibrating film scanners and film recorders. • Academy Density Exchange (ADX)
 Densitometric encoding (similar to Cineon) used for capturing data from film scanners. 162
  • 163. ACES Encodings • ACES2065-1 (ACES Primaries 0, AP0)
 The ACES common colour encoding colourspace used for exchange of full fidelity images and archiving. • ACEScg (ACES Primaries 1, AP1)
 A linearly encoded colourspace for CG rendering and compositing, using the improved set of primaries that encompass Rec. 2020 and DCI-P3 gamuts. • ACEScc (ACES Primaries 1, AP1)
A logarithmically encoded colourspace for use in colour grading applications, using the AP1 primaries. • ACESproxy (ACES Primaries 1, AP1)
A lightweight encoding using the AP1 primaries, for transmission over HD-SDI (or other production transmission schemes) and on-set look management. Not intended to be stored or used in production imagery or for final colour grading / mastering.
 163
  • 164. ACES Primaries 0, AP0 • Primaries: 
 
 [0.73470, 0.26530]
 [0.00000, 1.00000]
 [0.00010, -0.07700] • Illuminant / Whitepoint: D60 • Pointer’s Gamut Coverage: 100.0000000 % • Visible Spectrum Coverage: 98.0872282 % 164
  • 165. ACES Primaries 1, AP1 • Primaries: 
 
 [0.713, 0.293]
 [0.165, 0.830]
 [0.128, 0.044] • Illuminant / Whitepoint: D60 • Pointer’s Gamut Coverage: 99.9905787 % • Visible Spectrum Coverage: 74.0533610 % 165
  • 167. Colour Grading - GoG 1. https://vimeo.com/116019668 167
  • 168. Colour Grading - GoG y = ax + b (1); y = (ax + b + c(1 - x))^(1/γ) (2); Yo = (gain × Yi + offset + lift × (1 - Yi))^(1/gamma) (3), where Yi is the input luminance and Yo the output luminance. Note 1: (1) is the slope-intercept form of a linear equation. Note 2: On a television, the contrast and brightness controls are respectively mapped to the gain and offset variables. 168
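A small Python sketch of the grading operator in equation (3); the parameter values in the usage line are illustrative assumptions.

```python
import numpy as np

def lift_gamma_gain(Y, lift=0.0, gamma=1.0, gain=1.0, offset=0.0):
    # Grading operator from equation (3): gain scales the signal, offset and
    # lift raise it (lift weighted towards the blacks), gamma shapes mid-tones.
    Y = np.asarray(Y, dtype=float)
    Y = gain * Y + offset + lift * (1 - Y)
    return np.maximum(Y, 0) ** (1 / gamma)

ramp = np.linspace(0, 1, 5)
print(lift_gamma_gain(ramp, lift=0.05, gamma=1.2, gain=0.9))
```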
  • 182. 1D Lut & 3D Lut • A 1D Lut is a single variable indexed one dimensional table. 
 
 Expensive runtime computations are replaced with a simpler array indexing operation / look-up. • A 3D Lut is a three variable indexed three dimensional table (3D lattice) where each variable (lattice axis) represents a colour component. 
 
 Output colour values for input points that do not exactly match lattice points are interpolated. 182
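A minimal sketch of applying a 1D LUT with linear interpolation (a 3D LUT works similarly, with trilinear interpolation over the lattice); the function name and LUT size are assumptions.

```python
import numpy as np

def apply_1d_lut(x, lut, domain=(0.0, 1.0)):
    # Look up x in a 1D LUT, linearly interpolating between table entries.
    samples = np.linspace(domain[0], domain[1], len(lut))
    return np.interp(x, samples, lut)

# A 17-point 1D LUT baking a simple 1 / 2.2 power function.
lut = np.linspace(0, 1, 17) ** (1 / 2.2)
print(apply_1d_lut([0.0, 0.18, 1.0], lut))  # close to the analytic values
```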
  • 183. Bibliography • Fairchild, M. D. (2013). Color Appearance Models (3rd ed.). Wiley. ISBN:B00DAYO8E2 • Wyszecki, G., & Stiles, W. S. (2000). Color Science: Concepts and Methods, Quantitative Data and Formulae. Wiley. ISBN:978-0471399186 • Poynton, C. (2012). Digital Video and HD, Second Edition: Algorithms and Interfaces (2nd ed.). Elsevier / Morgan Kaufmann. ISBN:978-0123919267 • Madden, T. E., & Giorgianni, E. J. (2007). Digital Color Management (Vol. 20). doi:10.1002/9780470994375 • Dutré, P., Bekaert, P., & Bala, K. (2006). Advanced Global Illumination, 2, 384. ISBN:1439864950 183
  • 184. Bibliography • ISO. (2004). INTERNATIONAL STANDARD ISO 22028-1 - Photography and graphic technology - Extended colour encodings for digital image storage, manipulation and interchange, 2004. • International Telecommunication Union. (2011). Recommendation ITU-R BT.1886 - Reference electro-optical transfer function for flat panel displays used in HDTV studio production BT Series Broadcasting service. • International Telecommunication Union. (2015). Recommendation ITU-R BT.709-6 - Parameter values for the HDTV standards for production and international programme exchange BT Series Broadcasting service (Vol. 5). Retrieved from https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.709-6-201506-I!!PDF-E.pdf • International Telecommunication Union. (2013). Recommendation ITU-R BT.2035 - A reference viewing environment for evaluation of HDTV program material or completed programmes BT Series Broadcasting service. • International Telecommunication Union. (2015). Recommendation ITU-R BT.2020 - Parameter values for ultra-high definition television systems for production and international programme exchange (Vol. 1). Retrieved from https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2020-2-201510-I!!PDF-E.pdf 184
  • 185. Bibliography • Reinhard, E. (2009). A Reassessment of the Simultaneous Dynamic Range of the Human Visual System, 17–24. • Poynton, C., & Funt, B. (2014). Perceptual uniformity in digital image representation and display. Color Research and Application, 39(1), 6–15. doi:10.1002/col.21768 • Selan, J. (2012). Cinematic color. ACM SIGGRAPH 2012 Posters on - SIGGRAPH ’12, 1–54. doi:10.1145/2343483.2343492 • Kodak. (2002). KODAK: Student Filmmaker’s Handbook. Retrieved from http://ultra.sdk.free.fr/misc/TechniquePhoto/Kodak Student Handbook.pdf • Gilchrist, A. (2008). Perceptual organization in lightness. Vasa, 1–25. Retrieved from http://www.gestaltrevision.be/pdfs/oxford/Gilchrist-Perceptual_organization_in_lightness.pdf • Nilsson, M. (2015). BT Media and Broadcast - Ultra High Definition Video Formats and Standardisation. Retrieved from http://www.mediaandbroadcast.bt.com/wp-content/uploads/D2936-UHDTV-final.pdf 185
  • 186. Bibliography • Brendel, H. (2005). ARRI COMPANION TO DI - Chapter 2. Motion Picture Film. Retrieved March 12, 2016, from http://dicomp.arri.de/digital/digital_systems/DIcompanion/ch02.html • Pritchard, B. R. (n.d.). Why Colour Negative is Orange. Retrieved March 19, 2016, from http://www.brianpritchard.com/why_colour_negative_is_orange.htm • https://github.com/colour-science/colour-ipython • Wikipedia. (n.d.). 186