Dr. Mohieddin Moradi
mohieddinmoradi@gmail.com
1
Dream
Idea
Plan
Implementation
The first experiments were mechanical.
• 1873 - May & Smith experiment with selenium.
• Selenium is sensitive to light.
• Forms the basis for all early televisions.
• 1884 - The Nipkow disk.
• Many basic concepts were laid down by the Nipkow disk.
• Scanning and synchronisation.
2
A Short History of Television
Scanning Holes
Transmitter Receiver
3
A Short History of Television
1923 - Vladimir Zworykin develops the Kinescope, a cathode-ray picture tube for displaying
television images. (The term "kinescope" later also came to denote a film recording of a television
program, shot directly through a lens focused on the screen of a video monitor.)
4
A Short History of Television
1924 - John Logie Baird transmits first image
5
A Short History of Television
1925 - Vladimir K. Zworykin demonstrated 60-line TV.
Curved line structure typical of mechanical television
6
A Short History of Television
7
Super 16 mm Film
(included optical sound)
Review of 16 mm Film Standards
Q1 Spatial Resolution (HD, UHD)
Q2 Temporal Resolution (Frame Rate) (HFR)
Q3 Dynamic Range (SDR, HDR)
Q4 Color Gamut (BT.709, BT.2020)
Q5 Coding (Quantization, Bit Depth)
Q6 Compression Artifacts
.
.
.
Not only more pixels, but better pixels
8
Elements of High-Quality Image Production
Spatial Resolution (Pixels)
HD, FHD, UHD1, UHD2
Temporal Resolution (Frame rate)
24fps, 30fps, 60fps, 120fps …
Dynamic Range (Contrast)
From 100 nits to HDR
Color Space (Gamut)
From BT.709 to BT.2020
Quantization (Bit Depth)
8 bits, 10 bits, 12 bits …
9
Major Elements of High-Quality Image
Production
SD (PAL): 720 x 576 (0.414 MP)
HDTV 720p: 1280 x 720 (0.922 MP)
HDTV 1080: 1920 x 1080 (2.07 MP)
Digital Cinema 2K: 2048 x 1080 (2.21 MP)
UHDTV 1: 3840 x 2160 (8.3 MP)
Digital Cinema 4K: 4096 x 2160 (8.84 MP)
UHDTV 2: 7680 x 4320 (33.18 MP)
Digital Cinema 8K: 8192 x 4320 (35.39 MP)
Wider viewing angle
More immersive
10
Q1: Spatial Resolution
Motion blur and motion judder: conventional frame rate vs. high frame rate
Wider viewing angle ⇒ increased perceived motion artifacts ⇒ higher frame rates needed
50 fps minimum (100 fps being vetted)
11
Q2: High Frame Rate (HFR)
– Deeper Colors
– More Realistic Pictures
– More Colorful
Wide Color Space (ITU-R Rec. BT.2020): 75.8% of CIE 1931
Color Space (ITU-R Rec. BT.709): 35.9% of CIE 1931
WCG
CIE 1931 Color Space
12
Q3: Wide Color Gamut
Standard Dynamic Range
High Dynamic Range
(More Vivid, More Detail)
13
Q4: High Dynamic Range
10 bits ⇒ 1024 levels
8 bits ⇒ 256 levels
14
– More colours
– More bits (10-bit)
– Banding, Contouring
Q5: Quantization (Bit Depth)
Brief Summary of ITU-R BT.709, BT.2020, and BT.2100
− ITU-R BT.709, BT.2020 and BT.2100 address transfer function, color space, matrix coefficients,
and more.
− The following table is a summary comparison of those three documents.
Parameter               ITU-R BT.709            ITU-R BT.2020           ITU-R BT.2100
Spatial Resolution      HD                      UHD, 8K                 HD, UHD, 8K
Frame rates             24, 25, 30, 50, 60      24, 25, 30, 50, 60,     24, 25, 30, 50, 60,
                                                100, 120                100, 120
Interlace/Progressive   Interlace, Progressive  Progressive             Progressive
Color Space             BT.709                  BT.2020                 BT.2020
Dynamic Range           SDR (BT.1886)           SDR (BT.1886)           HDR (PQ, HLG)
Bit Depth               8, 10                   10, 12                  10, 12
Color Representation    RGB, YCbCr              RGB, YCbCr              RGB, YCbCr, ICtCp
15
16
Visible Light
Visible Light
17
Radiometry and Photometry
Radiometry
– The science of measuring light in any portion of the electromagnetic spectrum
including infrared, ultraviolet, and visible light.
– This range includes the infrared, visible, and ultraviolet regions of the electromagnetic spectrum
– Wavelengths from 1000 to 0.01 micrometer (1 µm = 10⁻⁶ meter = 10⁻³ millimeter)
Photometry
– Photometry is like radiometry except that it weights everything by the sensitivity of the human eye
– Deals with only the visible spectrum (=visible band)
– A wavelength range of about 380 to 780 nanometers (1 nm = 10⁻⁹ meter)
– Deals not with the perception of color itself, but with the perceived strength of various
wavelengths
18
Radiant Flux (Radiant Power): The total power of emitted electromagnetic radiation, in any
portion of the spectrum including infrared, ultraviolet, and visible light. Unit: watt (W, or J/s).
Luminous Flux (Luminous Power): The emitted electromagnetic power weighted by the "luminosity
function", a model of the human eye's sensitivity to various wavelengths (visible light). Unit:
lumen (lm).
Luminous Intensity: The quantity of visible light emitted by a light source in a given direction
per unit solid angle. Unit: candela (1 cd = 1 lm/sr).
Illuminance: The amount of light (luminous flux) falling on a surface. Units: lux (lumens per
square meter, 1 lx = 1 lm/m²) and foot-candle (lumens per square foot, 1 fc = 1 lm/ft²).
Luminance: The luminous intensity reflected or emitted from an object per unit area in a specific
direction. Unit: candela per square meter (cd/m², or nit).
Radiometry and Photometry
19
− Radiance is the total amount of energy that flows from the light source, and it is usually
measured in watts (W).
− Luminance, measured in lumens (lm), is a measure of the amount of energy that an observer
perceives from a light source. For example, light emitted from a source operating in the far
infrared region of the spectrum could have significant energy (radiance), but an observer
would hardly perceive it; its luminance would be almost zero.
− Brightness is a subjective descriptor that is practically impossible to measure. It embodies the
achromatic notion of intensity, and is one of the key factors in describing color sensation.
20
Brightness
− A piece of white paper reflects almost all light colors emitted from the light source and thus
looks white.
− In contrast, a pure green object only reflects green light (spectrum) and absorbs all other light
colors.
− The light colors that each object reflects are governed by the characteristics of the object’s
surface.
Light and Color
21
Visual Fields
Rabbit vs. Human
22
23
Human Visual System
Cornea
Retina
Sclera (the tough white coat of the eye)
Pupil
Choroid
Human Visual System
Image Formation
cornea, sclera, pupil, iris, lens, retina, fovea
Transduction
retina, rods, and cones
Processing
optic nerve, brain
24
25
Human Visual System
Human Visual System
Structure of the retina layers
The human eye has one lens (used to focus) …
… an iris (used to adjust the light level)…
… and retina (used to sense the image).
The retina is made up of rod and cone shaped cells.
• About 120,000,000 rods used for black & white.
• About 7,000,000 cones used for colour.
S : 430 nm (blue) (2%)
M: 535 nm (green) (33%)
L : 590 nm (red) (65%)
S = Short wavelength cone
M = Medium wavelength cone
L = Long wavelength cone
27
Human Visual System
28
− The highest point on each curve is called the "peak wavelength", indicating the
wavelength of radiation to which the cone is most sensitive.
Normalized Human Cone Sensitivity
Human Visual System
Fovea - Small region (1 or 2°) at the center of the visual field containing the highest density of
cones (and no rods).
• The centre of the image is the fovea.
– The fovea sees colour only.
• The nerve leaves the eye at the blind spot.
29
Human Visual System
− The fovea is a small, dense region of receptors (only cones, no rods) that gives the highest visual acuity.
− Outside the fovea there are fewer receptors overall, with a larger proportion of rods.
30
Human Visual System
Retina
Retina has photosensitive receptors at back of eye
31
Human Visual System
Rods
• Contain photo-pigment
• Respond to low energy
• Enhance sensitivity
• Concentrated in retina, but outside of fovea
• One type, sensitive to grayscale changes

Cones
• Contain photo-pigment
• Respond to high energy
• Enhance perception
• Concentrated in fovea, exist sparsely in retina
• Three types, sensitive to different wavelengths
32
Human Visual System
(Saturated: very bright, shiny, clear, and lively color)
(Unsaturated: not bright or shiny color)
Hue, Saturation and Luminosity
33
HUE
(Saturation = 255, Luminance = 128)
34
Saturation
(Hue = 156, Luminance = 150; saturation ranges 255 to 0)
35
Luminance
(Hue = 156, Saturation = 200; luminance ranges 255 to 0)
36
Hue, Saturation and Luminosity
• Hue is a measure of the colour.
− Sometimes called “Chroma Phase”.
• Saturation is a measure of colour intensity.
− Sometimes simply called “Color Intensity”.
• Luminosity (Luminance) (Intensity (Gray Level)) is a measure of brightness.
− Sometimes simply called “Brightness” or “Lightness” (!?).
Hue, Saturation and Luminosity
37
[Figure: photon count (#Photons) vs. wavelength for blue, green, and yellow; the mean of the distribution determines hue]
The dominant color as perceived by an observer
Hue, Saturation and Luminosity
Hue is an attribute associated with the dominant wavelength in a mixture of light waves.
− Hue represents dominant color as perceived by an observer.
− Thus, when we call an object red, orange, or yellow, we are referring to its hue.
38
[Figure: #Photons vs. wavelength; high, medium, and low variance of the distribution correspond to low, medium, and high saturation]
The relative purity or the amount of white light mixed with a hue
Hue, Saturation and Luminosity
Saturation refers to the relative purity or the amount of white light mixed with a hue.
− The pure spectrum colors are fully saturated.
− Colors such as pink (red and white) and lavender (violet and white) are less saturated, with the degree
of saturation being inversely proportional to the amount of white light added.
39
[Figure: #Photons vs. wavelength; the area under the distribution determines luminosity (lightness), from dark to bright]
It embodies the achromatic notion of intensity
Hue, Saturation and Luminosity
Brightness is a subjective descriptor that is practically impossible to measure.
− It embodies the achromatic (gray level) notion of intensity, and is one of the key factors in describing
color sensation.
40
Hue is an attribute associated with the dominant wavelength in a mixture of light waves.
− Hue represents dominant color as perceived by an observer.
− Thus, when we call an object red, orange, or yellow, we are referring to its hue.
Saturation refers to the relative purity or the amount of white light mixed with a hue.
− The pure spectrum colors are fully saturated.
− Colors such as pink (red and white) and lavender (violet and white) are less saturated, with the degree of saturation
being inversely proportional to the amount of white light added.
− Hue and saturation taken together are called chromaticity and, therefore, a color may be characterized by its
brightness and chromaticity.
Brightness is a subjective descriptor that is practically impossible to measure.
− It embodies the achromatic (gray level) notion of intensity, and is one of the key factors in describing color sensation.
Hue, Saturation and Luminosity
41
Hue, Saturation and Luminosity
42
Hue, Saturation and Luminosity
cylindrical coordinate system
43
Color Space
44
Color Space
45
Color Space
46
Hue, Saturation and Intensity (HSI Colour Model)
− Brightness is a subjective
descriptor that is practically
impossible to measure.
− It embodies the achromatic
notion of intensity and is
one of the key factors in
describing color sensation.
− We do know that intensity
(gray level) is a most useful
descriptor of achromatic
images. This quantity
definitely is measurable and
easily interpretable.
47
H = θ           if B ≤ G
H = 360° − θ    if B > G

θ = cos⁻¹ { ½[(R − G) + (R − B)] / [(R − G)² + (R − B)(G − B)]^½ }

S = 1 − 3·min(R, G, B) / (R + G + B)

I = (R + G + B) / 3

[Figure: the H component (S = 1, V = 1), the S component (H = 1, V = 1), and the I component (H = 1, S = 0)]
Hue, Saturation and Intensity
48
Various Colour Space Components
49
50
Additive vs. Subtractive Color Mixing
Subtractive Color Mix
The paint absorbs or
subtracts out wavelengths
and the color you see is
the wavelengths that
were reflected back to you
(not absorbed)
Additive mixture
The wavelengths are
added together so the
final color you see is the
sum of the wavelengths.
51
Additive Primary Colours
Additive Primary colours
• Red, Green & Blue are additive primaries - used for light.
52
Additive Color Mixing
When colors combine by adding the color spectra:
[Figure: red and green spectra on a 400-700 nm axis; red and green make… yellow!]
53
Subtractive Primaries Colours
− A subtractive color model explains the mixing of a limited set of dyes, inks, paint pigments to create a
wider range of colors, each the result of partially or completely subtracting (that is, absorbing) some
wavelengths of light and not others.
− The color that a surface displays depends on which parts of the visible spectrum are not absorbed and
therefore remain visible.
54
Subtractive Color Mixing
When colors combine by multiplying
the color spectra.
[Figure: cyan and yellow spectra on a 400-700 nm axis; cyan and yellow make… green!]
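The "multiply the spectra" rule can be sketched numerically. The two reflectance spectra below are idealized step functions I made up for illustration, not measured data; only the band both pigments reflect (roughly the green region) survives the product:

```python
import numpy as np

wavelengths = np.arange(400, 701, 10)  # nm

# Idealized (not measured) reflectance spectra: cyan reflects short and
# middle wavelengths, yellow reflects middle and long wavelengths.
cyan = np.where(wavelengths < 560, 1.0, 0.1)
yellow = np.where(wavelengths > 480, 1.0, 0.1)

# Subtractive mixing: each pigment filters the light, so spectra multiply.
mix = cyan * yellow

# Only the overlap survives at full strength -- the green band.
green_band = wavelengths[mix > 0.5]
print(green_band.min(), green_band.max())  # 490 550
```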
55
56
Subtractive Color Mixing, Examples
− All colour images can be broken down into 3 primary colours.
− Subtractive primaries: Magenta, Yellow & Cyan.
− Additive primaries :Red, Green & Blue
57
Additive vs. Subtractive Color Primaries
Secondary and Tertiary Colours
− Secondary additive colours: Cyan, Magenta, Yellow
− Secondary subtractive colours: Red, Green, Blue
− Secondary additive colours are the primary subtractive colours, and vice versa
− Additive tertiary: White
− Subtractive tertiary: Black
58
Using Subtractive and Additive Primaries.
Using subtractive primaries.
• Colour printers have Cyan, Magenta & Yellow pigments.
• Black often included.
Using additive primaries.
• Colour primaries are Red, Green & Blue
• Film and drama set lighting uses additive primaries.
• Video uses additive primaries.
• The camera splits image into 3 primaries.
• Television builds image from 3 primaries.
59
− The color circle (color wheel) originated with Sir Isaac Newton, who in the seventeenth century created
its first form by joining the ends of the color spectrum.
− The color circle is a visual representation of colors that are arranged according to the chromatic
relationship between them.
Colour Circle (Colour Wheel)
60
− Based on the color wheel, for example, the proportion of any color can be increased by decreasing the
amount of the opposite (or complementary) color in the image.
− Similarly, it can be increased by raising the proportion of the two immediately adjacent colors or
decreasing the percentage of the two colors adjacent to the complement.
− Suppose, for instance, that there is too much magenta in an RGB image. It can be decreased: (1) by
removing both red and blue, or (2) by adding green.
To decrease magenta: remove red and blue, or add green.
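A toy numeric illustration of the two correction options; the pixel value and correction amounts are arbitrary:

```python
import numpy as np

# A hypothetical pixel with a magenta cast: R and B high, G low (8-bit scale).
pixel = np.array([220.0, 90.0, 210.0])

# Option 1 from the text: remove some red and blue.
less_rb = pixel - np.array([30.0, 0.0, 30.0])

# Option 2: add green, magenta's complement on the colour wheel.
more_g = pixel + np.array([0.0, 40.0, 0.0])

print(less_rb, more_g)  # both moves pull the pixel away from magenta
```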
Colour Circle (Colour Wheel)
61
Human Cone Sensitivity
− The highest point on each curve is called the "peak wavelength", indicating the wavelength of
radiation to which the cone is most sensitive.
Normalized Human Cone Sensitivity
62
Spectral Distribution of CIE Illuminants
63
Emission Spectrum and Reflectance Spectrums
64
− For any given object, we can measure its emission (or reflectance) spectrum, and use that to precisely
identify a color.
− If we can reproduce the spectrum, we can certainly reproduce the color!
− The sunlight reflected from a point on a lemon might have a reflectance spectrum that looks like this:
S(λ)
Emission Spectrum and Reflectance Spectrums
65
Spectral Power Distribution of Light Reflected from Specimen
66
Ex: Cones extraction for a point on the lemon
− By looking at the normalized areas under the curves, we can see how much the radiation reflected
from the real lemon excites each of the cones.
− In this case, the normalized excitations of the S, M, and L cones are 0.02, 0.12, and 0.16 respectively.
Normalized Excitation of the S, M and L Cones
67
Two phenomena demonstrate that perceived brightness is not a simple function of intensity.
− Mach Band Effect: The visual system tends to undershoot or overshoot around the boundary of regions of
different intensities.
− Simultaneous Contrast: a region’s perceived brightness does not depend only on its intensity.
Perceived Brightness Relation with Intensity
68
Mach band effect.
Perceived intensity is
not a simple function
of actual intensity.
Examples of simultaneous contrast.
All the inner squares have the same intensity, but they appear
progressively darker as the background becomes lighter
− The term masking usually refers to a destructive interaction or
interference among stimuli that are closely coupled in time or
space.
− This may result in a failure in detection or errors in recognition.
− Here, we are mainly concerned with the detectability of one
stimulus when another stimulus is present simultaneously.
− The effect of one stimulus on the detectability of another,
however, does not have to decrease detectability.
Masking Recall
69
I: Gray level (intensity value)
Masker: Background I₂ (one stimulus)
Disk: Another stimulus I₁
At the threshold ∆I = I₂ − I₁, the object can be noticed by the HVS with a 50% chance.
− Under what circumstances can the disk-shaped object be
discriminated from the background (as a masker stimulus) by
the HVS? Weber’s law:
− Weber’s law states that for a relatively very wide range of I
(Masker), the threshold for disc discrimination, ∆𝑰, is directly
proportional to the intensity I.
• Bright Background: a larger difference in gray levels is needed
for the HVS to discriminate the object from the background.
• Dark Background: the intensity difference required could be
smaller.
Masking Recall
70
Contrast Sensitivity Function (CSF)
I: Gray level (intensity value)
Masker: Background I₂ (one stimulus)
Disk: Another stimulus I₁
At the threshold ∆I = I₂ − I₁, the object can be noticed by the HVS with a 50% chance.

∆I / I = Constant (≈ 0.02)
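Weber's law is simple enough to state in code. A sketch using the Weber fraction of about 0.02 quoted in the text; the exact value varies with viewing conditions, and the function name is mine:

```python
def weber_threshold(background, k=0.02):
    """Just-noticeable intensity difference under Weber's law: dI = k * I.

    k ~ 0.02 is the Weber fraction quoted in the text (an approximation).
    """
    return k * background

# A brighter background needs a larger absolute gray-level difference
# before the disk becomes discriminable.
print(weber_threshold(50))   # dark background
print(weber_threshold(200))  # bright background
```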
− The HVS demonstrates light adaptation characteristics and, as a consequence, is sensitive to
relative changes in brightness. This effect is referred to as "luminance masking".
“ Luminance Masking: The perception of brightness is not a linear function of the luminance”
− In fact, the threshold of visibility of a brightness pattern is a linear function of the background luminance.
− In other words, brighter regions in an image can tolerate more noise due to distortions before it becomes
visually annoying.
− The direct impact that luminance masking has on image and video compression is related to
quantization.
− Luminance masking suggests a nonuniform quantization scheme that takes the contrast sensitivity function
into consideration.
Luminance Masking
71
− It can be observed that the noise is more visible in the dark area than in the bright area if comparing, for
instance, the dark portion and the bright portion of the cloud above the bridge.
72
The bridge in Vancouver: (a) Original and (b) Uniformly corrupted by AWGN.
Luminance Masking
Luminance Masking
− The perception of brightness is not a linear function of the luminance.
− The HVS demonstrates light adaptation characteristics and as a consequence of that it is sensitive to
relative changes in brightness.
Contrast Masking
− The changes in contrast are less noticeable when the base contrast is higher than when it is low.
− The visibility of certain image components is reduced due to the presence of other strong image
components with similar spatial frequencies and orientations at neighboring spatial locations.
Contrast Masking
73
With same MSE:
• The distortions are clearly visible in the ‘‘Caps’’ image.
• The distortions are hardly noticeable in the ‘‘Buildings’’ image.
• The strong edges and structure in the ‘‘Buildings’’ image
effectively mask the distortion, while it is clearly visible in the
smooth ‘‘Caps’’ image.
This is a consequence of the contrast masking property of the HVS i.e.
• The visibility of certain image components is reduced due to
the presence of other strong image components with similar
spatial frequencies and orientations at neighboring spatial
locations.
Contrast Masking
74
(a) Original ‘‘Caps’’ image (b) Original ‘‘Buildings’’ image
(c) JPEG compressed image, MSE = 160 (d) JPEG compressed image, MSE = 165
(e) JPEG 2000 compressed image, MSE =155 (f) AWGN corrupted image, MSE = 160.
− In developing a quality metric, a signal is first decomposed into several frequency bands and the HVS
model specifies the maximum possible distortion that can be introduced in each frequency component
before the distortion becomes visible.
− This is known as the Just Noticeable Difference (JND).
− The final stage in the quality evaluation involves combining the errors in the different frequency
components, after normalizing them with the corresponding sensitivity thresholds, using some metric such
as the Minkowski error.
− The final output of the algorithm is either
• a spatial map showing the image quality at different spatial locations
• a single number describing the overall quality of the image.
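The normalize-then-pool step can be sketched as a Minkowski sum over frequency bands. This is an illustrative implementation under my own naming, not any particular published metric; the exponent default is also an assumption:

```python
import numpy as np

def minkowski_pool(errors, thresholds, p=4.0):
    """Pool per-band errors after normalizing each by its JND threshold.

    p = 2 gives RMS pooling; larger p weights the worst band more heavily.
    A result below 1 means every band's distortion stays near or below
    its visibility threshold.
    """
    normalized = np.abs(np.asarray(errors)) / np.asarray(thresholds)
    return float(np.mean(normalized ** p) ** (1.0 / p))

# Three bands, all with unit thresholds; one band is visibly distorted,
# so the pooled score exceeds 1.
print(minkowski_pool([0.5, 1.5, 0.2], [1.0, 1.0, 1.0]))
```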
Developing a Quality Metric Using Just Noticeable Difference (JND)
75
Frequency Response of the HVS
− Spatial Frequency Response
− Temporal Frequency Response and Flicker
− Spatio-temporal Response
− Smooth Pursuit Eye Movement
76
Y(x, y, t) = Y₀ + m·sin[2π(fₓx + f_y y) + φ₀] · sin(2πf_t t + φ′₀)
Y(x, y, t) = Y₀ + m·sin(2πf_t t + φ₀)
Y(x, y, t) = Y₀ + m·sin[2π(fₓx + f_y y) + φ₀]
Spatial Frequency
− Spatial frequency measures how fast the image intensity changes in the image plane
− Spatial frequency can be completely characterized by the variation frequencies in
two orthogonal directions (horizontal and vertical)
− 𝒇 𝒙: cycles/horizontal unit distance
− 𝒇 𝒚: cycles/vertical unit distance
− It can also be specified by magnitude and angle of change
77
f_s = √(fₓ² + f_y²)        φ = tan⁻¹(f_y / fₓ)
fₓ = f_s·cos φ             f_y = f_s·sin φ
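These relations are a plain Cartesian-to-polar conversion; a small sketch (the function name is mine):

```python
import math

def polar_spatial_freq(fx, fy):
    """Magnitude fs and orientation phi (degrees) of a 2-D spatial frequency."""
    fs = math.hypot(fx, fy)
    phi = math.degrees(math.atan2(fy, fx))
    return fs, phi

# The (5, 0) and (5, 10) examples used in the text:
print(polar_spatial_freq(5, 0))   # (5.0, 0.0)
print(polar_spatial_freq(5, 10))  # (~11.18, ~63.43 degrees)
```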
Spatial Frequency
78
− Two-dimensional signals:
(a) (fₓ, f_y) = (5, 0):  f_s = √25 = 5,  φ = tan⁻¹(0/5) = 0°
(b) (fₓ, f_y) = (5, 10):  f_s = √125 ≈ 11.2,  φ = tan⁻¹(10/5) ≈ 63.4°
− The horizontal and vertical units are the width and height of the image, respectively.
− Therefore, fₓ = 5 means that there are five cycles along each row.
− Previously defined spatial frequency depends on viewing distance.
− Angular frequency is what matters to the eye! (viewing distance is included in it)
79
θ = 2·tan⁻¹(h/2d) rad ≈ (h/d) rad = (180/π)·(h/d) degrees

f_s[cpd] = f_s[cycles/unit distance] / θ = (π/180)·(d/h)·f_s[cycles/unit distance]
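The conversion to cycles per degree can be sketched as follows, using the small-angle approximation above; the screen height and viewing distance in the example are arbitrary, and the function name is mine:

```python
import math

def cycles_per_degree(f_cycles_per_unit, height, distance):
    """Convert spatial frequency in cycles per unit distance (here, per
    picture height) to cycles per degree of visual angle, using the
    small-angle approximation theta ~ (180/pi) * (h/d) degrees."""
    theta_deg = math.degrees(height / distance)
    return f_cycles_per_unit / theta_deg

# Example with arbitrary numbers: a 0.6 m tall picture viewed from 3 m
# subtends about 11.5 degrees of visual angle.
print(cycles_per_degree(100, height=0.6, distance=3.0))  # ~8.7 cpd
```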
Spatial Frequency (cycles per degree)
− If the stimulus is a spatially periodical pattern (or grating), it is defined
by its spatial frequency, which is the number of cycles per unit of
subtended angle.
− The luminance profile of a sinusoidal grating of frequency f oriented
along a spatial direction defined by the angle 𝝋 can be written as:
− where 𝒀 𝟎 is the mean luminance of the grating, m is its amplitude
and 𝒇 𝒙 and 𝒇 𝒚 are its spatial frequencies along the x and y directions,
respectively (measured in cycles per degree, cpd) ; that is:
80
Y(x, y) = Y₀ + m·sin[2π(fₓx + f_y y)]

f_s = √(fₓ² + f_y²)        φ = tan⁻¹(f_y / fₓ)
fₓ = f_s·cos φ             f_y = f_s·sin φ
Spatial Frequency (cycles per degree)
− Ex: The value of the phase, 𝝋, determines the luminance at the
origin of coordinates (𝑥 = 0, 𝑦 = 0).
• If 𝝋 is zero or a multiple of π rad, luminance is zero at the origin
and the pattern has odd symmetry
• If 𝝋 is 𝝅/𝟐 rad or an odd multiple of 𝝅/𝟐 rad, luminance is
maximum at the origin and the pattern has even symmetry.
81
f_s = √(fₓ² + f_y²)        φ = tan⁻¹(f_y / fₓ)
fₓ = f_s·cos φ             f_y = f_s·sin φ
Spatial Frequency (cycles per degree)
Contrast Measurement
− Contrast is the physical parameter describing the magnitude of the luminance variations around the
mean in a scene.
− Types of stimulus to measure contrast sensitivity
• Aperiodic stimulus
• Periodic stimulus (Sinusoidal grating, Square grating)
82
Contrast Measurement
Aperiodical stimulus and Weber’s contrast
− If the stimulus is aperiodical, contrast is simply defined as
− where △ 𝑌 is the luminance amplitude of the stimulus placed
against the luminance 𝑌0 (i.e., the background), provided that
𝑌0 ≠ 0
83
C = ∆Y / Y₀
Contrast Measurement
Michelson's contrast
− Note that the grating shown in the figure has f_y = 0 and φ = π/2 rad.
− Contrast is defined in this case as follows:
− This definition can be applied to any periodical pattern, no
matter if sinusoidal, square or any other type.
84
C = m / Y₀ = (Y_max − Y_min) / (Y_max + Y_min)

Y(x, y) = Y₀ + m·sin(2πfₓx)
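Michelson's contrast is straightforward to compute from a sampled grating; a sketch with arbitrary grating parameters (the function name is mine):

```python
import numpy as np

def michelson_contrast(luminance):
    """C = (Ymax - Ymin) / (Ymax + Ymin) for a periodic pattern."""
    y_max, y_min = luminance.max(), luminance.min()
    return (y_max - y_min) / (y_max + y_min)

# Sinusoidal grating Y(x) = Y0 + m*sin(2*pi*fx*x) with Y0 = 100, m = 40,
# so the contrast should come out near m / Y0 = 0.4.
x = np.linspace(0.0, 1.0, 1000)
grating = 100.0 + 40.0 * np.sin(2 * np.pi * 5 * x)
print(michelson_contrast(grating))  # ~0.4
```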
Contrast Sensitivity Function (CSF)
Contrast Sensitivity
− Contrast sensitivity is the inverse of the minimum contrast necessary to
detect an object against a background or to distinguish a spatially
modulated pattern from a uniform stimulus (threshold contrast).
Contrast Sensitivity Function (CSF)
− Contrast sensitivity is a function of the spatial frequency (f) and the
orientation (θ or 𝝋) of the stimulus; that is why we talk of the Contrast
Sensitivity Function or CSF.
85
C_min = m_min / Y₀ = (Y_max − Y_min) / (Y_max + Y_min)

S = 1 / C_min = (Y_max + Y_min) / (Y_max − Y_min)

Y(x, y) = Y₀ + m·sin(2πfₓx)
Different Spatial Frequency in x Direction
86
87
Spatial Contrast Sensitivity Function
[Figure: sweep grating in which frequency increases to the right and contrast decreases upward; the visible/invisible boundary traces out the contrast sensitivity S = 1/C_min]
88
− Similar to a band-pass filter
• Most sensitive to mid frequencies
• Least sensitive to high frequencies
− Also depends on the orientation of grating
• Most sensitive to horizontal and vertical ones
Spatial Contrast Sensitivity Function
89
Chrominance Spatial Contrast Sensitivity Function
90
− Essentially, the HVS is more sensitive to lower spatial frequencies
and less sensitive to high spatial frequencies.
• The sensitivity to luminance information peaks at around 5–8
cycles/deg. This corresponds to a contrast grid with a stripe
width of 1.8 mm at a distance of 1 m.
• Luminance sensitivity falls off either side of this peak and has
little sensitivity above 50 cycles/deg.
• The peak of the chrominance sensitivity curves occurs at a
lower spatial frequency than that for luminance and the
response falls off rapidly beyond about 2 cycles/deg. It should
also be noted that our sensitivity to luminance information is
about three times that for Red-Green and that the Red-Green
sensitivity is about twice that of Blue-Yellow.
Chrominance Spatial Contrast Sensitivity Function
91
The rapid eye movements that allow us to quickly scan a visual
scene.
− The different stabilization settings used to remove the effect of
saccadic eye movements.
− The stabilization allowed us to control retinal image motion
independent of eye movements.
− Specifically, the image of the stimulus display was slaved to the
subject's eye movements. The image of the display screen moved
in synchrony with the eye movement so that its image remained
stable on the retina, irrespective of eye velocity.
− With this technique, we could control the retinal velocity by
manipulating the movement of the stimulus on the display.
Saccadic Eye Movements and Stabilization
92
Saccadic eye movement enhances the sensitivity but lowers the frequency
at which the peak occurs (the peak shifts to the left).
• Filled circles were obtained under normal,
unstablized conditions
• Open squares, with optimal gain setting for
stabilization (to control retinal image motion
independent of eye movements)
• Open circles, with the gain changed about 5
percent.
[Figure: spatial contrast sensitivity (log scale) vs. spatial frequency (cpd) under the three conditions above]
Spatial Contrast Sensitivity Function
Modulation Transfer Function (MTF)
93
Modulation Transfer Function (MTF)
− In optics, a well-known function of spatial frequency used to characterize the quality of an imaging
system is the Modulation Transfer Function (MTF)
− It can be measured by obtaining the image (output) produced by the system of a sinusoidal pattern of
frequency f, orientation θ and contrast 𝐶𝑖𝑛(𝑓, 𝜃) (input)
− Where 𝐶 𝑜𝑢𝑡(𝑓, 𝜃) is the contrast of the image, which is also a sinusoid, provided the system is linear and
spatially invariant.
− This formula may be read as the contrast transmittance of a filter.
94
MTF(f, θ) = C_out(f, θ) / C_in(f, θ)
Modulation Transfer Function (MTF)
− If the visual system is treated globally as a linear, spatially invariant, imaging system and if it is
assumed that, at threshold, the output contrast must be constant, i.e. independent of spatial
frequency, it is easy to demonstrate that the MTF and the CSF of the visual system must be
proportional.
− In fact, if the input is a threshold contrast grating, the MTF can be written as:
− We are assuming that 𝐶𝑡ℎ𝑟𝑒𝑠,𝑜𝑢𝑡(𝑓, 𝜃) is an unknown constant, let us say 𝑅0
− Thus, both functions have the same shape and differ only in an unknown global factor.
95
MTF(f, θ) = C_thres,out(f, θ) / C_thres,in(f, θ)

MTF(f, θ) = R₀ / C_thres,in(f, θ) = R₀ · CSF(f, θ)
Temporal Frequency
Temporal frequency measures temporal variation (cycles/s)
− In a video, the temporal frequency is actually 2-dimensional; each point in space has its own
temporal frequency
− Non-zero temporal frequency can be caused by camera or object motion
96
Temporal Contrast Sensitivity Function: TCSF
− A spatially uniform pattern whose luminance is time modulated by a sinusoidal function is
mathematically described as follows:
− Where 𝒇 𝒕 is the temporal frequency
− Again we have
97
Y(t) = Y₀ + m·sin(2πf_t t)

S = 1 / C_min = (Y_max + Y_min) / (Y_max − Y_min)
TCSF: The function relating
contrast sensitivity to
temporal frequency
98
The responses obtained with different mean brightness levels, B, measured in trolands.
Critical flicker frequency: The lowest frame rate at
which the eye does not perceive flicker.
– Provides guideline for determining the
frame rate when designing a video system
(Brightness level↑ ⇒ Shift to right)
– Critical flicker frequency depends on the
mean brightness of the display:
– 60 Hz is typically sufficient for watching
TV.
– Watching a movie needs a lower frame rate than TV.
Temporal Contrast Sensitivity Function: TCSF
[Figure: temporal contrast sensitivity (log scale) vs. flicker frequency (Hz), for mean brightness levels of 0.06, 0.65, 7.1, 77, 850, and 9300 trolands]
Contrast Sensitivity In The Spatio-temporal Domain
A luminance pattern that changes as a function of position (x,y) and time, t, is a spatio-temporal pattern.
− Spatiotemporal patterns usually employed as stimuli are counterphase gratings and travelling gratings.
99
Y(x, y, t) = Y₀ + m·sin[2π(fₓx + f_y y)] · sin(2πf_t t)
Contrast Sensitivity In The Spatio-temporal Domain
− In counterphase sine gratings, luminance is sinusoidally modulated both in space (with frequencies 𝒇 𝒙, 𝒇 𝒚)
and in time (with frequency 𝒇 𝒕).
− In counterphase sine gratings, if 𝒇 𝒚 = 0 , the corresponding luminance profile would be:
100
Y(x, y, t) = Y₀ + m·sin[2π(fₓx + f_y y)] · sin(2πf_t t)

Y(x, y, t) = Y₀ + m·sin(2πfₓx) · sin(2πf_t t)
Contrast Sensitivity In The Spatio-temporal Domain
A travelling grating is a spatial pattern that moves with a given velocity, 𝒗.
− If we assume 𝒇 𝒚 = 0 , the luminance profile would be:
− where 𝒇 𝒕 = 𝒗 × 𝒇 𝒙 is the temporal frequency of the luminance modulation caused by the motion at each
point (𝑥, 𝑦) of the pattern
− The sign (±) accompanying the variable 𝑣 indicates whether the grating is moving towards the left or the
right, respectively.
101
Y(x, y, t) = Y₀ + m·sin[2π(fₓx + f_t t)]
Y(x, y, t) = Y₀ + m·sin[2πfₓ(x ± vt)]
f_t = v × fₓ
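The relation f_t = v × fₓ can be checked numerically: sample the travelling grating at a fixed point and count oscillations. A sketch with arbitrary values of fₓ and v:

```python
import numpy as np

fx = 4.0  # spatial frequency, cycles per unit distance
v = 2.5   # grating velocity, unit distances per second
ft_expected = v * fx  # 10 cycles per second

# Luminance seen at the fixed point x = 0 over one second
# (mean level Y0 and amplitude m drop out of the frequency estimate).
t = np.linspace(0, 1, 10000, endpoint=False)
y = np.sin(2 * np.pi * fx * (0 - v * t))

# Each full cycle produces two zero crossings.
zero_crossings = np.sum(np.diff(np.sign(y)) != 0)
print(zero_crossings / 2)  # close to ft_expected
```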
Contrast Sensitivity In The Spatio-temporal Domain (Spatiotemporal CSF)
− The contrast sensitivity becomes here a 3D function, or a 2D one if the spatial pattern is 1D, that is, if the modulation occurs only along the x or the y direction.
102
[Figure: the spatio-temporal CSF surface, with cross-sections of the 2D surface at different values of the temporal (left) or the spatial (right) frequency yielding, respectively, the spatial and the temporal CSFs.]
Spatiotemporal Response
103
[Figure: spatiotemporal frequency response of the HVS. Left: spatial contrast sensitivity versus spatial frequency (cpd) for temporal frequencies of 1, 6, 16 and 22 Hz. Right: temporal contrast sensitivity versus temporal frequency (Hz) for spatial frequencies of 0.5, 4, 16 and 22 cpd.]
− The reciprocal relation between spatial and temporal sensitivity was used in TV system design:
Spatial Frequency ↑ ⇒ Temporal Frequency ↓   Temporal Frequency ↑ ⇒ Spatial Frequency ↓
− Interlaced scan provides a tradeoff between spatial and temporal resolution.
104
Relation between Motion, Spatial and Temporal Frequency
− Suppose a single object with constant velocity (temporal frequency caused by linear motion).
− Illustration of the constant-intensity assumption under motion: every point (x, y) at t = 0 is shifted by (vx·t, vy·t) to (x + vx·t, y + vy·t) at time t, without change in color or intensity.
− Alternatively, a point (x, y) at time t corresponds to the point (x − vx·t, y − vy·t) at time zero.
Relation between Motion, Spatial and Temporal Frequency
Consider an object moving with speed (vx, vy).
Assume the image pattern at t = 0 is ψ0(x, y).
The image pattern at time t is:
ψ(x, y, t) = ψ0(x − vx·t, y − vy·t)
Taking the Continuous Space Fourier Transform (CSFT):
Ψ(fx, fy, ft) = Ψ0(fx, fy) · δ(ft + vx·fx + vy·fy)
Relation between motion, spatial, and temporal frequency:
ft = −(vx·fx + vy·fy)
This means that the temporal frequency ft of the image of a moving object depends on the motion as well as on the spatial frequency of the object.
Example: a plane with a vertical bar pattern moving vertically causes no temporal change, but moving horizontally it causes the fastest temporal change.
105
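The relation ft = −(vx·fx + vy·fy) and the vertical-bar example above can be sketched as follows (function name is my illustration):

```python
# Temporal frequency induced by linear motion (vx, vy) on a spatial
# frequency component (fx, fy): ft = -(vx*fx + vy*fy).

def temporal_frequency(vx, vy, fx, fy):
    return -(vx * fx + vy * fy)

# Vertical bar pattern: spatial frequency only along x (fx = 4, fy = 0).
print(temporal_frequency(0.0, 3.0, 4.0, 0.0))  # moving vertically   -> 0.0
print(temporal_frequency(3.0, 0.0, 4.0, 0.0))  # moving horizontally -> -12.0
```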
Smooth Pursuit Eye Movement
− Smooth pursuit: the eye tracks moving objects.
− Net effect: it reduces the velocity of moving objects on the retinal plane, so the eye can perceive much higher raw temporal frequencies than indicated by the temporal frequency response.
− Temporal frequency caused by object motion when the object is moving at (vx, vy): ft = −(vx·fx + vy·fy)
− Observed temporal frequency at the retina when the eye is moving at (ṽx, ṽy):
f̃t = ft + (ṽx·fx + ṽy·fy)
f̃t = 0 if ṽx = vx and ṽy = vy
106
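The smooth-pursuit effect above can be verified numerically (a sketch; the function name is mine):

```python
# Object motion gives ft = -(vx*fx + vy*fy); an eye moving at
# (ve_x, ve_y) observes ft_retina = ft + (ve_x*fx + ve_y*fy).

def retinal_ft(vx, vy, ve_x, ve_y, fx, fy):
    ft = -(vx * fx + vy * fy)            # temporal frequency from object motion
    return ft + (ve_x * fx + ve_y * fy)  # shifted by eye motion

# Perfect tracking (eye velocity = object velocity) nulls the
# temporal frequency on the retina:
print(retinal_ft(3.0, 2.0, 3.0, 2.0, 4.0, 1.0))  # 0.0
# No tracking leaves the full motion-induced frequency:
print(retinal_ft(3.0, 2.0, 0.0, 0.0, 4.0, 1.0))  # -14.0
```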
A Review on Standard Definition (SD)
− Television is transmitted and recorded as frames.
• Similar to film.
− Each frame is scanned in the camera or camcorder.
• This is called a raster scan.
• Raster scan scans line by line from top to bottom.
• Each line is scanned from left to right.
107
Field, Frame, Progressive and Interlaced Scan
− A continuous top-to-bottom scan is called a progressive scan.
− At low frame rates, progressive scans tend to flicker.
− Television splits each frame into two scans:
• One for the odd lines and another for the even lines.
• Each of these interlaced scans is called a field.
• Therefore odd lines (odd field) + even lines (even field) = 1 frame.
− This is called an interlaced scan.
Interlace benefits:
I. The bandwidth needed for the odd field plus the even field equals the bandwidth needed for one progressive frame at half the rate (e.g. 50i vs. 25p).
II. Interlaced scans flicker far less than progressive scans of the same bandwidth (e.g. 50i vs. 25p).
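The field/frame split can be sketched with a toy frame (hypothetical code, not broadcast-equipment logic):

```python
# Split a progressive frame into two interlaced fields, then weave
# them back: odd field + even field = 1 frame.

def split_fields(frame):
    """Field 1 = odd lines (1, 3, 5, ...), Field 2 = even lines (2, 4, ...),
    counting the first line of the frame as line 1."""
    odd = frame[0::2]   # lines 1, 3, 5, ...
    even = frame[1::2]  # lines 2, 4, 6, ...
    return odd, even

def weave(odd, even):
    """Interleave the two fields back into one full frame."""
    frame = []
    for o, e in zip(odd, even):
        frame.extend([o, e])
    return frame

frame = [f"line{i}" for i in range(1, 9)]  # an 8-line toy frame
f1, f2 = split_fields(frame)
assert weave(f1, f2) == frame              # the two fields rebuild the frame
print(f1)  # ['line1', 'line3', 'line5', 'line7']
```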
108
[Diagram: the 576 active lines of the frame split into two fields. Field 1 carries the odd lines (1, 3, …, 575) and Field 2 the even lines (2, 4, …, 576).]
109
110
111
Interlaced Scan
112
Progressive Scan
field2field1
113
Interlaced Scan
Even Field
114
Odd Field
115
116
Even Field + Odd Field
[Diagram: interleaving the odd lines of the odd field with the even lines of the even field rebuilds the complete 576-line frame.]
117
Progressive Scan
118
Vertical Blanking
[Diagram: 625-line vertical blanking. Field 1 picture lines run from line 23 to line 310 and Field 2 from line 336 to line 623; the remaining lines (1–22, 311–335, 624–625) form the vertical blanking intervals and carry the field syncs.]
SDI
The 720 Standard (SMPTE 296M)
119
The 1080 Standard (SMPTE 274M)
120
Image Formats for High Definition
121
High Definition Formats
There are three factors in defining High Definition formats
Resolution
1. 1920 × 1080
2. 1280 ×720
Scanning method (i /p)
1. Interlaced
2. Progressive
Frame rate (fps)
1. 23.98 (24)
2. 25
3. 29.97 (30)
4. 50
5. 59.94 (60)
122
1080i vs. 1080p vs. 720p
1080i
– Widely used format which is defined as 1080 lines, 1920 pixels per line, interlace scan.
– The 1080i statement alone does not specify the frame rate, which can be:
25 or 30 frames per second (50 or 60 fields per second)
1080p
– 1080 x 1920 size pictures, progressively scanned. Frame rates can be:
24, 25, 30, 50, or 60 fps
720p
– 1280 x 720 size picture progressively scanned.
24, 25, 30, 50, or 60 fps
− Progressive scan at a high picture refresh rate portrays action well, such as in sporting events, and allows smoother slow-motion replays.
123
In Displays
In Displays
The High Definition Signal
Horizontal Interval
SMPTE 274M (The 1080 standard)
124
The High Definition Signal
Vertical Blanking
125
126
The High Definition Signal
Vertical Blanking
[Diagram: 1125-line interlaced raster. Field 1 active lines run from line 21 to line 560 and Field 2 from line 584 to line 1123; the remaining lines form the vertical blanking intervals.]
The High Definition Signal
Vertical Blanking
[Diagram: 1125-line progressive raster. Active lines run from line 42 to line 1121; lines 1–41 and 1122–1125 form the vertical blanking interval.]
127
Progressive scan
− Delivers higher spatial resolution for a given frame size
(better detail)
– Has the same (temporal) look as film
– Good for post and transfer to film
– No motion tear
Interlaced scan
− Delivers higher temporal resolution for a given frame size
(better motion portrayal)
– Has the same (temporal) look as video
– Shooting is easier
– Post production on video is easier
– Interlacing causes motion tears and ‘video’ look
128
Scanning Techniques Pros and Cons
129
Odd and even lines are in different places when there is fast motion
[Figure: odd field, even field, and the combined odd + even frame, compared for no motion and for fast motion.]
Scanning Techniques Pros and Cons
Progressive or Interlace Shooting?
130
Progressive (25p)
131
Interlaced (field 1) (50i)
132
Interlaced (field 2) (50i)
133
Interlaced Frame (50i)
134
Progressive (25p): delivers higher spatial resolution for a given frame size (better detail).
Interlace (50i): delivers higher temporal resolution for a given frame size (better motion portrayal).
135
Interlaced Frame (50i) and Progressive Frame (25p)
In High Speed
Progressive (50p) Interlace (50i)
136
Standard Monochrome Signals
The first commercial standards were 60 lines.
The original ‘high definition’ was 405 lines, monochrome.
Later standards were 525 and 625 lines.
• Half the number of lines in each field.
Each line consists of an active line and horizontal blanking.
• The active line carries the picture; the horizontal blanking covers the flyback interval.
• Horizontal syncs in the horizontal blanking lock the picture horizontally.
Each field and frame consists of active video and vertical blanking.
• Active video is all the lines within the picture.
• Vertical blanking comprises the lines that are not seen.
• Vertical syncs in the vertical blanking lock the picture vertically.
Signal is “zero” for black.
Signal increases as the brightness increases.
Negative signal used for synchronisation.
137
Composite Video Signal (Monochrome)
[Waveform: one 64 µs video line, 12 µs of horizontal blanking (front porch, sync, back porch) followed by 52 µs of active vision signal.]
Composite video signal = video signal + blanking + sync pulse
138
139
The Basic Television Signal
The Basic Television Signal
140
141
The Basic Television Signal
The basic television signal
142
The basic television signal
Short white areas of the
line for the sails produce
sharp white spikes in the
signal.
This part of the line
with black shadows
produces a low signal.
Trees and bushes with
light and dark areas
produce an undulating
signal.
The sky is bright and
produces a high signal
almost as high as the white
sails.
Shadows in the
trees produce a
low signal.
Very small bright area between the
trees produces a very sharp spike in
the signal
143
Deflection System
144
Color Video Signal Formats
145
− Colour pictures can be broken down into three primaries: Red, Green and Blue.
− The original plan was to use these primaries directly in colour television.
− The individual colour signals are called components.
RGB
 RGB signals offer the most faithful reproduction in both image brightness and color depth. This
is because they are obtained right after the R, G, and B imagers with minimum video
processing in between.
 Each one of the R, G, and B signals contains information on brightness and color in the same
channel.
 RGB signals are also called full bandwidth signals because they are not limited in bandwidth,
which is the case with other signal formats. This is another reason why RGB signals provide the
best video quality.
146
Color Video Signal Formats
Camera Monitor
Transmission
147
Color Video Signal Formats
148
Color Video Signal Formats
149
Color Video Signal Formats
150
Color Video Signal Formats
Problem with Red, Green & Blue components
Many existing black-&-white television customers.
• Needed to keep these customers happy when colour TV was introduced.
Old black and white signal needed.
Matrix in the camera converts from RGB to Y (R-Y) (B-Y) .
• Y is the black-&-white signal.
• (R-Y) and (B-Y) are two colour difference signals
151
Y, R, G and B
If we measure a human eye’s sensitivity to every wavelength, we get a luminosity function.
[Figure: relative human sensitivity versus wavelength (400–700 nm), peaking near 550 nm; relative responses to the blue, green and red primaries are about 17%, 92% and 47%.]
Normalising these sensitivities gives the luminance weights:
G: 0.92 / (0.17 + 0.92 + 0.47) = 0.59
R: 0.47 / (0.17 + 0.92 + 0.47) = 0.30
B: 0.17 / (0.17 + 0.92 + 0.47) = 0.11
Y = 0.11B + 0.3R + 0.59G
152
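The normalisation behind Y = 0.11B + 0.3R + 0.59G can be reproduced in a few lines (a sketch; variable names are mine):

```python
# Relative eye sensitivities to the blue, red and green primaries
# (~17%, 47%, 92%) scaled so the weights sum to 1.

sens = {"B": 0.17, "R": 0.47, "G": 0.92}
total = sum(sens.values())                       # 1.56
weights = {c: round(v / total, 2) for c, v in sens.items()}
print(weights)  # {'B': 0.11, 'R': 0.3, 'G': 0.59}

def luma(r, g, b):
    """Y = 0.11B + 0.30R + 0.59G (SD luminance equation)."""
    return 0.11 * b + 0.30 * r + 0.59 * g

print(round(luma(1, 1, 1), 2))  # white -> 1.0
```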
Video Signal Formats
Y/R-Y/B-Y
– The Y/R-Y/B-Y signal is called the component signal.
– The Y/R-Y/B-Y signal is obtained by feeding the RGB signal to a matrix circuit, which separates it into color information and
brightness information.
– This makes the signal easier to process.
– Information on the total brightness of the three RGB signals is combined into one signal called the luminance signal (Y),
while information on color is packed into two signals called the color difference signals (R-Y/B-Y).
– Information on luminance is not bandwidth-restricted and is equivalent to that of the RGB signal.
– Information on color (R-Y/B-Y) is bandwidth-limited to a certain extent, but kept sufficient for the human eye’s sensitivity to
fine color detail, which is less than that to brightness.
153
Y/R-Y/B-Y Components
154
Matrix
R BG
Y R-Y B-Y
Old Black & White televisions ignore the colour
components and only use the monochrome component
155
Video Signal Formats
Composite Video
– The composite signal is obtained by adding the luminance (Y) and chrominance (C) signals of the Y/C signal to form one
signal, which contains both brightness and color information.
– This is achieved in such a way that the bandwidth of the chrominance signal overlaps with that of the luminance signal.
This allows the composite signal to provide both luminance and chrominance information (color images) using the same
bandwidth as the black and white signal.
– Technically, this is achieved by modulating the color signals on a carrier signal (= color subcarrier) that does not interfere
with the luminance signal’s spectrum.
– The frequency of the color carrier signal is determined so its spectrum interleaves with the spectrum of the luminance
signal.
 For NTSC video, this is approximately 3.58 MHz.
 For PAL video, it is approximately 4.43 MHz.
– This prevents the chrominance (C) and luminance signals (Y) from mixing with each other when they are added together
to form the composite signal.
– The composite signal can be separated back into its luminance and chrominance components using special filters,
known as comb filters. 156
157
Video Signal Formats
Composite Video
PAL Colour Signal
− Improved European colour television standard.
− Co-designed in Germany and England.
− More complex than NTSC, but better colours.
− 625 total lines in each frame.
− 576 picture lines.
− Interlaced scanning at 25 frames per second.
− 50 fields per second.
PAL = Phase Alternation by Line
158
U=0.493(B’-Y’)
V=0.877(R’-Y’)
159
160
The Color Bars Test Signal
U=0.493 (B’-Y’)
V=0.877 (R’-Y’)
These particular weighting factors
ensure that the subcarrier
excursions are around 33% above
white level for saturated yellow
and cyan color bars and 33%
maximum below black level for
red and blue bars.
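The ~33% excursion claim above can be checked numerically. This is a sketch of my own (the peak composite excursion is taken as Y ± √(U² + V²), using the CCIR-601 luminance weights):

```python
import math

def excursion(r, g, b):
    """Luma and peak subcarrier amplitude for a 100% colour-bar colour."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.493 * (b - y)
    v = 0.877 * (r - y)
    return y, math.hypot(u, v)   # luma and sqrt(U^2 + V^2)

# Yellow and cyan peak ~33% above white (1.0);
# red and blue dip ~33% below black (0.0):
for name, rgb in [("yellow", (1, 1, 0)), ("cyan", (0, 1, 1)),
                  ("red", (1, 0, 0)), ("blue", (0, 0, 1))]:
    y, c = excursion(*rgb)
    print(name, round(y + c, 2), round(y - c, 2))
```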
161
NTSC Color Signal
 NTSC is a standard-definition composite video signal format primarily used in North America, Japan, Korea, Taiwan, and
parts of South America.
 Its name is an acronym for National Television Systems Committee.
 Tends to suffer from bad colours.
• Nicknamed “Never The Same Colour”!
 525 total lines in each frame.
• 483 picture lines.
 Interlaced scanning at 30 frames per second.
• Actually 29.97 frames per second to be exact.
• 60 fields per second.
 Color information is encoded on a 3.58-MHz sub-carrier, which is transmitted together with the luminance information.
162
I: Orange–Cyan axis, Q: Green–Purple axis
Hue = arctan(Q / I)
Saturation = √(I² + Q²)
163
• The positive polarity of Q is purple, the negative is green. The positive polarity of I is orange, the negative is
cyan. Thus, Q is often called the "green-purple" or "purple-green" axis information and I is often called the
"orange-cyan" or "cyan-orange" axis information.
• The human eye is more sensitive to spatial variations in the "orange-cyan" (the color of face!) than it is for
the "green-purple“. Thus, the "orange-cyan" or I signal has a maximum bandwidth of 1.5 MHz and the
"purple-green" only has a maximum bandwidth of 0.5 MHz.
• Now, the Q and I signals are both modulated onto a 3.58 MHz carrier wave. However, they are modulated 90 degrees out of phase (QAM). These two signals are then summed together to make the C, or chrominance, signal.
• The nomenclature of the two signals aids in remembering what is going on. The I signal is In-phase with the
3.58 MHz carrier wave. The Q signal is in Quadrature (i.e. 1/4 of the way around the circle or 90 degrees
out of phase, or orthogonal) with the 3.58 MHz carrier wave.
164
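The hue/saturation relations above can be sketched directly (function name is my illustration):

```python
import math

def hue_saturation(i, q):
    """Hue is the phase angle of the (I, Q) vector; saturation its magnitude."""
    hue_deg = math.degrees(math.atan2(q, i))  # phase relative to the I axis
    saturation = math.hypot(i, q)             # sqrt(I^2 + Q^2)
    return hue_deg, saturation

# Equal I and Q gives a 45-degree hue:
h, s = hue_saturation(0.3, 0.3)
print(round(h, 1), round(s, 3))  # 45.0 0.424
```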
NTSC Color Signal
• Position the band limited chrominance at the high end of the luminance spectrum, where the luminance
is weak, but still sufficiently lower than the audio (at 4.5 MHz).
• The two chrominance components (I and Q) are multiplexed onto the same sub-carrier using QAM.
• The resulting video signal including the baseband luminance signal plus the chrominance components
modulated to fc is called composite video signal.
165
NTSC Color Signal
Colour Television Standards
NTSC (1953), PAL (1963), SECAM (1958–1967)

Color System | Frames per Second | Lines | Bandwidth (MHz) | B&W Modulation | Color Modulation | Audio Modulation
NTSC         | 30                | 525   | 6               | AM             | QAM              | FM
PAL          | 25                | 625   | 7-8             | AM             | QAM              | FM
SECAM        | 25                | 625   | 7-8             | AM             | FM               | FM
166
The chromaticity of a color is then specified by two derived parameters x and y
167
CIE xy Chromaticity Diagram
(The CIE 1931 color space chromaticity diagram)
The CIE 1931 color space
chromaticity diagram.
locus of non-spectral purples
X, Y, and Z are the imaginary primaries.
Y means luminance, Z is somewhat equal
to blue, and X is a mix of cone response
curves chosen to be orthogonal to
luminance and non-negative.
A color gamut is the complete range of colors allowed for a specific color space.
No video, film or printing technology is able to fill this space.
• NTSC and PAL are well inside natural colours.
• The extent of any technology is called the gamut.
Outside edge defines fully saturated colours.
Purple is “impossible”.
Each corner of the gamut defines the primary colours.
Color Gamut
168
169
Color Gamut
170
Color Gamut
171
Color Gamut
(Inner triangle: HDTV primaries, Outer triangle: UHDTV primaries)
[Figure: CIE xy chromaticity plots (x and y from 0 to 0.8) of the measured colours of (a) a carnation and (b) geranium and marigold flowers, shown against the HDTV and UHDTV gamut triangles.]
Wide Color Gamut Makes Deeper Colors Available
172
173
Color Gamut
174
Y=0.11B+0.3R+0.59G
Colorimetry
SD and HD comparison
SD
HD SD
Colorimetry
SD and HD comparison
175
Colorimetry
SD and HD comparison
176
HD
SD
HD
SD
177
Colorimetry
SD and HD comparison
178
Colorimetry
SD and HD comparison
Colorimetry
Color bars
Y
Cr
Cb
SD BT 709
179
SD
HD
Colorimetry
SD and HD comparison
180
SD
HD
Colorimetry
Color bars: How does it look, what does it matter
Out of Gamut
181
SD
HD
Colorimetry
Color bars: How does it look, what does it matter
− Correct colour bars will look wrong!
− Incorrect conversion/colourimetry will produce
• Oversaturation
• Undersaturation
• Gamut alarms
− HD has a redefined colourimetry
− Wider range of reproducible colours
− More colourful pictures
− Colour bars are different
182
Decibels (dB)
Decibels are defined by the following equation:
dB = 20 × log10(v′/v)
(v′: value to express in decibels; v: well-known reference value = 1.0 V)
 This can be rearranged as:
v′ = v × 10^(dB/20)
 Since the relative value is being discussed, substituting v = 1.0 (volt) gives:
v′ = 10^(dB/20)
183
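The dB relations above, and their inverse, can be sketched as (function names are mine):

```python
import math

def to_db(v_ratio: float) -> float:
    """dB = 20 * log10(v'/v) for a voltage ratio."""
    return 20 * math.log10(v_ratio)

def from_db(db: float) -> float:
    """v' = 10^(dB/20) relative to a 1.0 V reference."""
    return 10 ** (db / 20)

print(round(to_db(10.0), 1))   # x10 gain = 20.0 dB
print(round(from_db(-6.0), 2)) # -6 dB is roughly one half (0.5)
```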
Decibels (dB)
− The decibel values that most often need to be remembered are shown in the following table. Referring to this table:
 A 20 dB signal gain means the signal level has been boosted by 10 times.
 A 6 dB signal drop (= minus 6 dB) means the signal level has fallen to one half.
184
S/N (Signal-to-Noise) Ratio
 Noise refers to the distortion of the original signal due to external electrical or mechanical factors.
(S/N) ratio = 20 × log10(Vp-p / Nrms) (dB)
 A 60 dB S/N means that the noise level is one-thousandth of the signal level.
 For video, the signal amplitude (Vp-p) is calculated as 1 volt, which is the voltage (potential difference)
from the bottom of the H-sync signal (sync tip) to the white video level.
 Noise level changes over time, and amplitude cannot be used to express its amount. Root mean square
(rms) is a statistical measure for expressing a group of varying values, and allows the magnitude of noise to
be expressed with a single value. Root mean square can be considered a kind of average of temporal
values.
185
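The S/N definition above, including the "60 dB means one-thousandth" example, can be sketched as:

```python
import math

def snr_db(v_pp: float, n_rms: float) -> float:
    """(S/N) = 20 * log10(Vp-p / Nrms) in dB."""
    return 20 * math.log10(v_pp / n_rms)

# Noise one-thousandth of a 1 Vp-p video signal -> 60 dB:
print(round(snr_db(1.0, 0.001), 1))  # 60.0
```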
4:4:4 vs. 4:2:2 vs. 4:2:0 vs. 4:1:1
186
Color Sampling and Sub-Sampling
4:4:4 line structure (full color resolution)
8 × 3 × 720 × 576 × 25 ≈ 250 Mbps
8-bit SD system: 250 Mbps
8-bit HD system: 1.25 Gbps
187
Color Sampling and Sub-Sampling
4:2:2 line structure (official broadcast specification)
Half horizontal color resolution
8-bit SD system: 168 Mbps
8-bit HD system: 830 Mbps
188
Color Sampling and Sub-Sampling
4:1:1 line structure (quarter horizontal color resolution)
8-bit SD system: 126 Mbps
8-bit HD system: 519 Mbps
189
Color Sampling and Sub-Sampling
4:2:0 line structure (half vertical & horizontal color resolution)
8-bit SD system: 126 Mbps
8-bit HD system: 519 Mbps
190
Color Sampling and Sub-Sampling
191
Color Sampling and Sub-Sampling
192
Color Sampling and Sub-Sampling
4:2:0 line structure (interlaced video)
193
Color Sampling and Sub-Sampling
Comparison
Sampling | Y   | R-Y   | B-Y
4:4:4    | 720 | 720   | 720     (samples on every line)
4:2:2    | 720 | 360   | 360     (samples on every line)
4:1:1    | 720 | 180   | 180     (samples on every line)
4:2:0    | 720 | 360/0 | 360/0   (samples on alternate lines)
194
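The bit-rate figures quoted for the sampling structures can be reproduced with simple arithmetic. A sketch of mine, counting only the 720 × 576 active samples at 25 fps and 8 bits; the slides' round figures (250/168/126 Mbps) land a few Mbps higher than this active-picture-only accounting:

```python
def bitrate_mbps(y_per_line, c_per_line, lines=576, fps=25, bits=8,
                 chroma_lines_factor=1.0):
    """Luma plus two chroma components; chroma_lines_factor = 0.5 for
    4:2:0, where chroma is carried on alternate lines only."""
    luma = y_per_line * lines
    chroma = 2 * c_per_line * lines * chroma_lines_factor
    return (luma + chroma) * fps * bits / 1e6

print(round(bitrate_mbps(720, 720)))                           # 4:4:4 -> 249
print(round(bitrate_mbps(720, 360)))                           # 4:2:2 -> 166
print(round(bitrate_mbps(720, 180)))                           # 4:1:1 -> 124
print(round(bitrate_mbps(720, 360, chroma_lines_factor=0.5)))  # 4:2:0 -> 124
```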
Color Sampling and Sub-Sampling
Rec BT-601/656
– Digital Standard for Component Video
– 27 MHz stream of 8 / 10 bit 4:2:2 Samples
– 8 bit range 219 levels black to white (16-235)
– Sync/Blanking replaced by SAV & EAV signals
– Ancillary data can be sent during Blanking
195
[Diagram: 8-bit quantization ranges. Y runs from 16 (black) to 235 (white); U and V are centred on code 128; codes 0 and 255 are reserved.]
Decoding Rec BT-601
196
Rec BT-601/656
– Multiple A/D and D/A conversion generations should be avoided
197
198
An image along with its
Y′, U, and V components.
A color image and its Y,
CB and CR components.
Colour Space Recommended by CCIR-601
Colour Space Recommended by CCIR-601
– The colour space in PAL is represented by YUV, where Y represents the luminance and U and V represent the two colour components. The basic YUV colour space can be generated from gamma-corrected RGB (referred to in equations as R′G′B′) components as follows:
Y = 0.299R′ + 0.587G′ + 0.114B′
U = 0.493(B′ − Y)
V = 0.877(R′ − Y)
– It should be noted that the colour space recommended by CCIR-601 is very close to that of the PAL system.
– The precise luminance and chrominance equations under this recommendation are as follows:
199
Colour Space Recommended by CCIR-601
– The slight departure from the PAL parameters is due to the requirement that in the digital range, Y should
take values in the range of 16–235 quantum levels.
– Also, the normally AC chrominance components of U and V are centred on the grey level 128, and the
range is defined from 16 to 240.
– The reasons for these modifications are
– to reduce the granular noise of all three signals in later stages of processing
– to make chrominance values positive to ease processing operations (e.g. storage)
200
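The quantization ranges described above can be sketched as follows (function name and the normalised inputs are my assumptions):

```python
# Rec. 601 8-bit quantization: Y scaled into 16-235 (219 steps black to
# white), chroma centred on 128 within 16-240 (224-step excursion).

def quantize_601(y, u, v):
    """y in [0, 1]; u, v in [-0.5, 0.5] (normalised colour differences)."""
    y_code = round(16 + 219 * y)   # 16 = black, 235 = white
    cb = round(128 + 224 * u)      # AC chroma centred on grey level 128
    cr = round(128 + 224 * v)
    return y_code, cb, cr

print(quantize_601(0.0, 0.0, 0.0))  # black -> (16, 128, 128)
print(quantize_601(1.0, 0.0, 0.0))  # white -> (235, 128, 128)
```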
What Is Analogue?
– Analogue signals are described over an infinite number of values.
Advantages
• Simpler
• Need less bandwidth
Disadvantages
• Affected by noise and distortion.
• Quality tends to drop over time and with each new generation.
• Cannot be stored directly in computer-based systems.
201
signal
Signal + noise
What Is Digital?
– Digital signals are described as a series of definite individual numbers.
• These numbers are often called samples.
• In video the samples are often called pixels.
Advantages
• Less affected by noise or distortion.
• Quality remains the same.
• Easy to store on computer systems and transmit on networks.
Disadvantages
• May be more complex.
• Needs more bandwidth
202
Why Digital?
– Higher Quality
– More reproducible results
– Greater reliability
– Lower cost
– Less maintenance
– Greater functionality
– Powerful self diagnostic systems
– Computer control
– ……
203
Analog to Digital Conversion
204
Fs = f, T = 1/f
Analog to Digital Conversion
Sampling Frequency
205
Analog to Digital Conversion
206
Analog to Digital Conversion
207
Sampling Frequency
Fs = 2f, T = 1/(2f)
Minimum Fs restriction:
– Nyquist law (Fs ≥ 2 × signal bandwidth)
Maximum Fs restriction:
– Channel bandwidth (bit rate)
Ex: for the Y signal we select Fs = 13.5 MHz because:
– 13.5 MHz ≥ 2 × 5 MHz
– 13.5 MHz = 864 × 15625 Hz (an integer number of samples per line)
Fs Restriction in Analog to Digital Conversion
208
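The two constraints on the 13.5 MHz choice can be checked in a few lines (constant names are mine):

```python
FS = 13_500_000       # chosen luma sampling frequency (Hz)
BW = 5_000_000        # luminance bandwidth (Hz)
LINE_RATE = 15_625    # 625-line/50 Hz line frequency (Hz)

assert FS >= 2 * BW           # Nyquist: Fs >= 2 x signal bandwidth
assert FS % LINE_RATE == 0    # an integer number of samples per line
print(FS // LINE_RATE)        # 864 samples per total line
```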
Bit Resolution Effect (B)
209
Bit resolution = 2 → 4 different digital levels
Bit Resolution Effect (B)
210
Bit resolution = 3 → 8 different digital levels
Bit Resolution Effect (B)
211
212
Minimum bit resolution restriction:
Signal-to-noise ratio:
Signal to Quantization Noise Ratio: (SQNR)dB ≈ 6.02B + 1.76
Peak Signal to Noise Ratio: (PSNR)dB ≈ 6.02B + 11
Maximum bit resolution restriction:
Channel bandwidth (bit rate)
Ex: for video B = 8, 10, 12 or 14 bits
Bit Resolution Restriction in Analog to Digital Conversion
213
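The rule of thumb that each extra bit buys about 6 dB can be tabulated (a sketch using the standard 6.02B + 1.76 dB full-scale sine approximation):

```python
def sqnr_db(bits: int) -> float:
    """SQNR for a full-scale sine wave quantized with `bits` bits."""
    return 6.02 * bits + 1.76

# Typical video bit depths:
for b in (8, 10, 12, 14):
    print(b, round(sqnr_db(b), 1))
```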
Analog to Digital Conversion Summary
Sampling Frequency (Fs)
Bit Resolution or Bit Depth (B)
The quality of the digital signal can be increased in two ways (both increase the required bandwidth, storage capacity and complexity):
• Increase the number of samples per second.
• Increase the bit resolution for each sample.
214
Sampling frequency & bit resolution
215
216
Sampling frequency & bit resolution
217
The quality of the digital signal can be increased in two ways:
− Increase the number of samples per second.
− Increase the bit resolution for each sample.
Both of these increase bandwidth, storage capacity and complexity.
Sampling frequency & bit resolution
4 levels (2 bits) 16 levels (4 bits) 256 levels (8 bits)
Bit Resolution Effect (B)
218
Bit = binary digit
Graphics can also be described by the number of bits used to represent each pixel’s color depth:
1-bit = monochrome
8-bit = 256 colors
24-bit ≈ 16.7 million colors
32-bit ≈ 4.3 billion colors
219
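The colour counts above follow directly from 2^B:

```python
# A B-bit pixel can take 2^B distinct values.
def colors(bits: int) -> int:
    return 2 ** bits

print(colors(1))   # 2 (monochrome)
print(colors(8))   # 256
print(colors(24))  # 16777216   (~16.7 million)
print(colors(32))  # 4294967296 (~4.3 billion)
```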
Bit Resolution Effect (B)
Questions??
Discussion!!
Suggestions!!
Criticism!!
220
Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 1Dr. Mohieddin Moradi
 
An Introduction to Audio Principles
An Introduction to Audio Principles An Introduction to Audio Principles
An Introduction to Audio Principles Dr. Mohieddin Moradi
 
Video Compression, Part 4 Section 1, Video Quality Assessment
Video Compression, Part 4 Section 1,  Video Quality Assessment Video Compression, Part 4 Section 1,  Video Quality Assessment
Video Compression, Part 4 Section 1, Video Quality Assessment Dr. Mohieddin Moradi
 
Video Compression, Part 4 Section 2, Video Quality Assessment
Video Compression, Part 4 Section 2,  Video Quality Assessment Video Compression, Part 4 Section 2,  Video Quality Assessment
Video Compression, Part 4 Section 2, Video Quality Assessment Dr. Mohieddin Moradi
 

More from Dr. Mohieddin Moradi (8)

An Introduction to HDTV Principles-Part 3
An Introduction to HDTV Principles-Part 3An Introduction to HDTV Principles-Part 3
An Introduction to HDTV Principles-Part 3
 
An Introduction to HDTV Principles-Part 2
An Introduction to HDTV Principles-Part 2An Introduction to HDTV Principles-Part 2
An Introduction to HDTV Principles-Part 2
 
Broadcast Camera Technology, Part 3
Broadcast Camera Technology, Part 3Broadcast Camera Technology, Part 3
Broadcast Camera Technology, Part 3
 
Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 2
Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 2Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 2
Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 2
 
Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 1
Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 1Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 1
Serial Digital Interface (SDI), From SD-SDI to 24G-SDI, Part 1
 
An Introduction to Audio Principles
An Introduction to Audio Principles An Introduction to Audio Principles
An Introduction to Audio Principles
 
Video Compression, Part 4 Section 1, Video Quality Assessment
Video Compression, Part 4 Section 1,  Video Quality Assessment Video Compression, Part 4 Section 1,  Video Quality Assessment
Video Compression, Part 4 Section 1, Video Quality Assessment
 
Video Compression, Part 4 Section 2, Video Quality Assessment
Video Compression, Part 4 Section 2,  Video Quality Assessment Video Compression, Part 4 Section 2,  Video Quality Assessment
Video Compression, Part 4 Section 2, Video Quality Assessment
 

Recently uploaded

High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMSHigh Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMSsandhya757531
 
Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________Romil Mishra
 
Triangulation survey (Basic Mine Surveying)_MI10412MI.pptx
Triangulation survey (Basic Mine Surveying)_MI10412MI.pptxTriangulation survey (Basic Mine Surveying)_MI10412MI.pptx
Triangulation survey (Basic Mine Surveying)_MI10412MI.pptxRomil Mishra
 
ADM100 Running Book for sap basis domain study
ADM100 Running Book for sap basis domain studyADM100 Running Book for sap basis domain study
ADM100 Running Book for sap basis domain studydhruvamdhruvil123
 
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.elesangwon
 
Livre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdf
Livre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdfLivre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdf
Livre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdfsaad175691
 
Introduction to Machine Learning Part1.pptx
Introduction to Machine Learning Part1.pptxIntroduction to Machine Learning Part1.pptx
Introduction to Machine Learning Part1.pptxPavan Mohan Neelamraju
 
22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...
22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...
22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...KrishnaveniKrishnara1
 
KCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosKCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosVictor Morales
 
Turn leadership mistakes into a better future.pptx
Turn leadership mistakes into a better future.pptxTurn leadership mistakes into a better future.pptx
Turn leadership mistakes into a better future.pptxStephen Sitton
 
March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...
March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...
March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...gerogepatton
 
input buffering in lexical analysis in CD
input buffering in lexical analysis in CDinput buffering in lexical analysis in CD
input buffering in lexical analysis in CDHeadOfDepartmentComp1
 
Comprehensive energy systems.pdf Comprehensive energy systems.pdf
Comprehensive energy systems.pdf Comprehensive energy systems.pdfComprehensive energy systems.pdf Comprehensive energy systems.pdf
Comprehensive energy systems.pdf Comprehensive energy systems.pdfalene1
 
A brief look at visionOS - How to develop app on Apple's Vision Pro
A brief look at visionOS - How to develop app on Apple's Vision ProA brief look at visionOS - How to develop app on Apple's Vision Pro
A brief look at visionOS - How to develop app on Apple's Vision ProRay Yuan Liu
 
Robotics Group 10 (Control Schemes) cse.pdf
Robotics Group 10  (Control Schemes) cse.pdfRobotics Group 10  (Control Schemes) cse.pdf
Robotics Group 10 (Control Schemes) cse.pdfsahilsajad201
 
TEST CASE GENERATION GENERATION BLOCK BOX APPROACH
TEST CASE GENERATION GENERATION BLOCK BOX APPROACHTEST CASE GENERATION GENERATION BLOCK BOX APPROACH
TEST CASE GENERATION GENERATION BLOCK BOX APPROACHSneha Padhiar
 
1- Practice occupational health and safety procedures.pptx
1- Practice occupational health and safety procedures.pptx1- Practice occupational health and safety procedures.pptx
1- Practice occupational health and safety procedures.pptxMel Paras
 
Prach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism CommunityPrach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism Communityprachaibot
 

Recently uploaded (20)

High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMSHigh Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
High Voltage Engineering- OVER VOLTAGES IN ELECTRICAL POWER SYSTEMS
 
Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________
 
Versatile Engineering Construction Firms
Versatile Engineering Construction FirmsVersatile Engineering Construction Firms
Versatile Engineering Construction Firms
 
Triangulation survey (Basic Mine Surveying)_MI10412MI.pptx
Triangulation survey (Basic Mine Surveying)_MI10412MI.pptxTriangulation survey (Basic Mine Surveying)_MI10412MI.pptx
Triangulation survey (Basic Mine Surveying)_MI10412MI.pptx
 
ADM100 Running Book for sap basis domain study
ADM100 Running Book for sap basis domain studyADM100 Running Book for sap basis domain study
ADM100 Running Book for sap basis domain study
 
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
 
ASME-B31.4-2019-estandar para diseño de ductos
ASME-B31.4-2019-estandar para diseño de ductosASME-B31.4-2019-estandar para diseño de ductos
ASME-B31.4-2019-estandar para diseño de ductos
 
Livre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdf
Livre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdfLivre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdf
Livre Implementing_Six_Sigma_and_Lean_A_prac([Ron_Basu]_).pdf
 
Introduction to Machine Learning Part1.pptx
Introduction to Machine Learning Part1.pptxIntroduction to Machine Learning Part1.pptx
Introduction to Machine Learning Part1.pptx
 
22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...
22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...
22CYT12 & Chemistry for Computer Systems_Unit-II-Corrosion & its Control Meth...
 
KCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosKCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitos
 
Turn leadership mistakes into a better future.pptx
Turn leadership mistakes into a better future.pptxTurn leadership mistakes into a better future.pptx
Turn leadership mistakes into a better future.pptx
 
March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...
March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...
March 2024 - Top 10 Read Articles in Artificial Intelligence and Applications...
 
input buffering in lexical analysis in CD
input buffering in lexical analysis in CDinput buffering in lexical analysis in CD
input buffering in lexical analysis in CD
 
Comprehensive energy systems.pdf Comprehensive energy systems.pdf
Comprehensive energy systems.pdf Comprehensive energy systems.pdfComprehensive energy systems.pdf Comprehensive energy systems.pdf
Comprehensive energy systems.pdf Comprehensive energy systems.pdf
 
A brief look at visionOS - How to develop app on Apple's Vision Pro
A brief look at visionOS - How to develop app on Apple's Vision ProA brief look at visionOS - How to develop app on Apple's Vision Pro
A brief look at visionOS - How to develop app on Apple's Vision Pro
 
Robotics Group 10 (Control Schemes) cse.pdf
Robotics Group 10  (Control Schemes) cse.pdfRobotics Group 10  (Control Schemes) cse.pdf
Robotics Group 10 (Control Schemes) cse.pdf
 
TEST CASE GENERATION GENERATION BLOCK BOX APPROACH
TEST CASE GENERATION GENERATION BLOCK BOX APPROACHTEST CASE GENERATION GENERATION BLOCK BOX APPROACH
TEST CASE GENERATION GENERATION BLOCK BOX APPROACH
 
1- Practice occupational health and safety procedures.pptx
1- Practice occupational health and safety procedures.pptx1- Practice occupational health and safety procedures.pptx
1- Practice occupational health and safety procedures.pptx
 
Prach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism CommunityPrach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism Community
 

Video Compression Part 1 Video Principles

  • 11. Q2: High Frame Rate (HFR)
  − Conventional frame rates suffer from motion blur and motion judder.
  − Wider viewing angles increase perceived motion artifacts, so higher frame rates are needed.
  − 50 fps is the minimum; 100 fps is being vetted.
  • 12. Q3: Wide Color Gamut
  − Deeper colors, more realistic pictures, more colorful.
  − The wide color space (ITU-R Rec. BT.2020) covers 75.8% of the CIE 1931 color space.
  − The BT.709 color space (ITU-R Rec. BT.709) covers 35.9% of the CIE 1931 color space.
  • 13. Q4: High Dynamic Range
  − Standard Dynamic Range vs. High Dynamic Range (more vivid, more detail).
  • 14. Q5: Quantization (Bit Depth)
  − 8 bits give 256 levels; 10 bits give 1024 levels.
  − More bits (10-bit) yield more colours and reduce banding and contouring.
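The relationship between bit depth and level count on this slide can be sketched in a few lines; the function names here are illustrative, not from the deck, and the signal is assumed normalized to [0, 1].

```python
# Sketch: bit depth sets the number of code levels (2^bits) and the
# quantization of a normalized [0, 1] sample (illustrative helpers).

def levels(bits: int) -> int:
    """Number of representable levels for a given bit depth."""
    return 2 ** bits

def quantize(x: float, bits: int) -> int:
    """Map a normalized sample x in [0, 1] to the nearest code level."""
    n = levels(bits) - 1
    return round(x * n)

print(levels(8), levels(10))  # -> 256 1024
print(quantize(0.5, 8))       # mid-gray at 8 bits -> 128
```

Fewer levels mean larger steps between adjacent codes, which is what becomes visible as banding or contouring in smooth gradients.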
  • 15. Brief Summary of ITU-R BT.709, BT.2020, and BT.2100
  − ITU-R BT.709, BT.2020 and BT.2100 address transfer function, color space, matrix coefficients, and more.
  − The table below is a summary comparison of those three documents.
  Parameter             | ITU-R BT.709           | ITU-R BT.2020                 | ITU-R BT.2100
  Spatial Resolution    | HD                     | UHD, 8K                       | HD, UHD, 8K
  Frame rates           | 24, 25, 30, 50, 60     | 24, 25, 30, 50, 60, 100, 120  | 24, 25, 30, 50, 60, 100, 120
  Interlace/Progressive | Interlace, Progressive | Progressive                   | Progressive
  Color Space           | BT.709                 | BT.2020                       | BT.2020
  Dynamic Range         | SDR (BT.1886)          | SDR (BT.1886)                 | HDR (PQ, HLG)
  Bit Depth             | 8, 10                  | 10, 12                        | 10, 12
  Color Representation  | RGB, YCBCR             | RGB, YCBCR                    | RGB, YCBCR, ICTCP
  • 18. Radiometry and Photometry
  Radiometry
  − The science of measuring light in any portion of the electromagnetic spectrum, including the infrared, visible, and ultraviolet regions.
  − Covers wavelengths from 1000 to 0.01 micrometers (1 micrometer = 10^-6 meter = 10^-3 millimeter).
  Photometry
  − Like radiometry, except that everything is weighted by the sensitivity of the human eye.
  − Deals only with the visible spectrum (visible band), a wavelength range of about 380 to 780 nanometers (1 nanometer = 10^-9 meter).
  − Does not deal with the perception of color itself, but rather with the perceived strength of various wavelengths.
  • 19. Radiometry and Photometry
  Radiant Flux (Radiant Power) – The total power of emitted electromagnetic radiation, in any portion of the spectrum. Unit: watt (W, or J/s).
  Luminous Flux (Luminous Power) – Emitted electromagnetic power weighted by the "luminosity function", a model of the human eye's sensitivity to various wavelengths (visible light). Unit: lumen (lm).
  Luminous Intensity – The quantity of visible light emitted by a light source in a given direction per unit solid angle. Unit: candela (1 cd = 1 lm/sr).
  Illuminance – The amount of light, or luminous flux, falling on a surface. Units: lux (lumens per square meter, 1 lx = 1 lm/m²) and foot-candle (lumens per square foot, 1 fc = 1 lm/ft²).
  Luminance – The luminous intensity that is reflected or emitted from an object per unit area in a specific direction. Unit: candela per square meter (cd/m², or nit).
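The unit relationships in this table lend themselves to a small worked example; this is a minimal sketch using the standard lux/foot-candle conversion factor and the inverse-square law for an idealized point source (function names are illustrative).

```python
# Sketch of the photometric unit relationships above.

FT2_PER_M2 = 10.7639  # square feet per square meter

def lux_to_footcandles(lx: float) -> float:
    # 1 fc = 1 lm/ft^2 and 1 lx = 1 lm/m^2, so divide by ft^2 per m^2.
    return lx / FT2_PER_M2

def illuminance_from_point_source(candela: float, distance_m: float) -> float:
    """Inverse-square law: E (lux) = I (cd) / d^2 for an ideal point source."""
    return candela / distance_m ** 2

print(round(lux_to_footcandles(1076.39), 2))    # -> 100.0 fc
print(illuminance_from_point_source(100, 2.0))  # 100 cd at 2 m -> 25.0 lux
```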
  • 20. Brightness
  − Radiance is the total amount of energy that flows from the light source; it is usually measured in watts (W).
  − Luminance, measured in lumens (lm), is a measure of the amount of energy that an observer perceives from a light source. For example, light emitted from a source operating in the far-infrared region of the spectrum could have significant energy (radiance), but an observer would hardly perceive it; its luminance would be almost zero.
  − Brightness is a subjective descriptor that is practically impossible to measure. It embodies the achromatic notion of intensity and is one of the key factors in describing color sensation.
  • 21. Light and Color
  − A piece of white paper reflects almost all light colors emitted from the light source and thus looks white.
  − In contrast, a pure green object reflects only green light (spectrum) and absorbs all other light colors.
  − The light colors that each object reflects are governed by the characteristics of the object's surface.
  • 23. Human Visual System
  (Eye-anatomy diagram: cornea, retina, sclera, pupil, choroid.)
  • 24. Human Visual System
  Image formation – cornea, sclera, pupil, iris, lens, retina, fovea.
  Transduction – retina, rods, and cones.
  Processing – optic nerve, brain.
  • 25. Human Visual System
  (Eye-anatomy diagram: cornea, retina, sclera, pupil, choroid.)
  • 26. Human Visual System
  Structure of the retina layers.
  • 27. Human Visual System
  The human eye has one lens (used to focus), an iris (used to adjust the light level), and a retina (used to sense the image). The retina is made up of rod- and cone-shaped cells.
  • About 120,000,000 rods, used for black & white vision.
  • About 7,000,000 cones, used for colour.
  S = Short-wavelength cone: peak at 430 nm (blue), about 2% of cones.
  M = Medium-wavelength cone: peak at 535 nm (green), about 33%.
  L = Long-wavelength cone: peak at 590 nm (red), about 65%.
  • 28. Human Visual System
  − The highest point on each curve is called the "peak wavelength", indicating the wavelength of radiation to which the cone is most sensitive.
  (Figure: normalized human cone sensitivity.)
  • 29. Human Visual System
  Fovea – a small region (1–2°) at the center of the visual field containing the highest density of cones (and no rods).
  • The centre of the image falls on the fovea.
    – The fovea sees colour only.
  • The optic nerve leaves the eye at the blind spot.
  • 30. Human Visual System
  − The fovea is a small, dense region of receptors containing only cones (no rods); it provides visual acuity.
  − Outside the fovea there are fewer receptors overall, with a larger proportion of rods.
  • 31. Human Visual System
  Retina – the retina has photosensitive receptors at the back of the eye.
  • 32. Human Visual System
  Rods
  • Contain photo-pigment
  • Respond to low energy
  • Enhance sensitivity
  • Concentrated in the retina, but outside of the fovea
  • One type, sensitive to grayscale changes
  Cones
  • Contain photo-pigment
  • Respond to high energy
  • Enhance perception
  • Concentrated in the fovea, exist sparsely in the retina
  • Three types, sensitive to different wavelengths
  • 33. Hue, Saturation and Luminosity
  (Figure: a very bright and shiny, clear and lively color vs. a color that is not bright or shiny.)
  • 35. Saturation
  Hue = 156, Luminance = 150; saturation ranges from 255 down to 0.
  • 36. Luminance
  Hue = 156, Saturation = 200; luminance ranges from 255 down to 0.
  • 37. Hue, Saturation and Luminosity
  • Hue is a measure of the colour.
    − Sometimes called "chroma phase".
  • Saturation is a measure of colour intensity.
    − Sometimes simply called "color intensity".
  • Luminosity (luminance; intensity, or gray level) is a measure of brightness.
    − Sometimes simply called "brightness" or "lightness".
  • 38. Hue, Saturation and Luminosity
  Hue is an attribute associated with the dominant wavelength in a mixture of light waves.
  − Hue represents the dominant color as perceived by an observer.
  − Thus, when we call an object red, orange, or yellow, we are referring to its hue.
  (Figure: photon count vs. wavelength; the mean of the distribution, from blue through green to yellow, determines the hue.)
  • 39. Hue, Saturation and Luminosity
  Saturation refers to the relative purity, or the amount of white light mixed with a hue.
  − The pure spectrum colors are fully saturated.
  − Colors such as pink (red and white) and lavender (violet and white) are less saturated, with the degree of saturation being inversely proportional to the amount of white light added.
  (Figure: photon count vs. wavelength; a high, medium, or low variance of the distribution corresponds to low, medium, or high saturation.)
  • 40. Hue, Saturation and Luminosity
  Brightness is a subjective descriptor that is practically impossible to measure.
  − It embodies the achromatic (gray-level) notion of intensity and is one of the key factors in describing color sensation.
  (Figure: photon count vs. wavelength; the area under the distribution determines lightness, from dark to bright.)
  • 41. Hue, Saturation and Luminosity
  Hue is an attribute associated with the dominant wavelength in a mixture of light waves.
  − Hue represents the dominant color as perceived by an observer.
  − Thus, when we call an object red, orange, or yellow, we are referring to its hue.
  Saturation refers to the relative purity, or the amount of white light mixed with a hue.
  − The pure spectrum colors are fully saturated.
  − Colors such as pink (red and white) and lavender (violet and white) are less saturated, with the degree of saturation being inversely proportional to the amount of white light added.
  − Hue and saturation taken together are called chromaticity; therefore, a color may be characterized by its brightness and chromaticity.
  Brightness is a subjective descriptor that is practically impossible to measure.
  − It embodies the achromatic (gray-level) notion of intensity and is one of the key factors in describing color sensation.
  • 42. Hue, Saturation and Luminosity
  • 43. Hue, Saturation and Luminosity
  (Figure: the model represented in a cylindrical coordinate system.)
  • 47. Hue, Saturation and Intensity (HSI Colour Model)
  − Brightness is a subjective descriptor that is practically impossible to measure.
  − It embodies the achromatic notion of intensity and is one of the key factors in describing color sensation.
  − We do know that intensity (gray level) is a most useful descriptor of achromatic images. This quantity definitely is measurable and easily interpretable.
  The HSI components are computed from RGB as:
  H = θ if B ≤ G, or H = 360° − θ if B > G,
  where θ = cos⁻¹ { (1/2)[(R−G) + (R−B)] / [(R−G)² + (R−B)(G−B)]^(1/2) }
  S = 1 − 3·min(R, G, B) / (R + G + B)
  I = (R + G + B) / 3
  • 48. Hue, Saturation and Intensity
  (Figure: the H component shown with S=1, V=1; the S component with H=1, V=1; the I component with H=1, S=0.)
  H = θ if B ≤ G, or H = 360° − θ if B > G,
  where θ = cos⁻¹ { (1/2)[(R−G) + (R−B)] / [(R−G)² + (R−B)(G−B)]^(1/2) }
  S = 1 − 3·min(R, G, B) / (R + G + B)
  I = (R + G + B) / 3
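The HSI equations on these two slides translate directly into code. This is a minimal sketch for a single pixel with R, G, B assumed normalized to [0, 1]; the function name and the zero-denominator handling are my additions.

```python
import math

# RGB -> HSI for one pixel, following the slide equations.
def rgb_to_hsi(r: float, g: float, b: float):
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - 3.0 * min(r, g, b) / (r + g + b)
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    # den == 0 happens for pure grays, where hue is undefined; use 0.
    theta = math.degrees(math.acos(num / den)) if den != 0 else 0.0
    h = theta if b <= g else 360.0 - theta
    return h, s, i

print(rgb_to_hsi(1.0, 0.0, 0.0))  # pure red -> H = 0, S = 1, I = 1/3
```

Pure green and pure blue land at H = 120° and H = 240° respectively, which matches the hue circle described on the following slides.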
  • 49. Various Colour Space Components
  • 51. Additive vs. Subtractive Color Mixing
  Subtractive color mixing – The paint absorbs (subtracts out) wavelengths, and the color you see is made of the wavelengths that were reflected back to you (not absorbed).
  Additive color mixing – The wavelengths are added together, so the final color you see is the sum of the wavelengths.
  • 52. Additive Primary Colours
  • Red, green & blue are the additive primaries, used for light.
  • 53. Additive Color Mixing
  Colors combine by adding their color spectra.
  (Figure: the red and green spectra sum to a yellow spectrum – red and green make yellow.)
  • 54. Subtractive Primary Colours
  − A subtractive color model explains the mixing of a limited set of dyes, inks, or paint pigments to create a wider range of colors, each the result of partially or completely subtracting (that is, absorbing) some wavelengths of light and not others.
  − The color that a surface displays depends on which parts of the visible spectrum are not absorbed and therefore remain visible.
  • 55. Subtractive Color Mixing
  Colors combine by multiplying their color spectra.
  (Figure: the cyan and yellow spectra multiply to a green spectrum – cyan and yellow make green.)
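The contrast between the two mixing rules on slides 53 and 55 can be sketched with toy three-band spectra (the band values below are made up for illustration; real spectra are continuous):

```python
# Toy spectra as [blue, green, red] band intensities.
# Additive mixing sums spectra; subtractive mixing multiplies
# transmittance/reflectance spectra.

def additive_mix(s1, s2):
    return [a + b for a, b in zip(s1, s2)]

def subtractive_mix(s1, s2):
    return [a * b for a, b in zip(s1, s2)]

red_light   = [0.0, 0.0, 1.0]
green_light = [0.0, 1.0, 0.0]
print(additive_mix(red_light, green_light))   # [0.0, 1.0, 1.0] -> yellow

cyan_ink   = [1.0, 1.0, 0.0]   # absorbs red
yellow_ink = [0.0, 1.0, 1.0]   # absorbs blue
print(subtractive_mix(cyan_ink, yellow_ink))  # [0.0, 1.0, 0.0] -> green
```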
  • 57. Additive vs. Subtractive Color Primaries
  − All colour images can be broken down into 3 primary colours.
  − Subtractive primaries: magenta, yellow & cyan.
  − Additive primaries: red, green & blue.
  • 58. Secondary and Tertiary Colours
  − Secondary additive colours: cyan, yellow, magenta.
  − Secondary subtractive colours: red, green, blue.
  − Secondary additive colours are the primary subtractive colours, and vice versa.
  − Additive tertiary: white.
  − Subtractive tertiary: black.
  • 59. Using Subtractive and Additive Primaries
  Using subtractive primaries:
  • Colour printers have cyan, magenta & yellow pigments.
  • Black is often included.
  Using additive primaries:
  • The colour primaries are red, green & blue.
  • Film and drama set lighting uses additive primaries.
  • Video uses additive primaries.
  • The camera splits the image into 3 primaries.
  • Television builds the image from 3 primaries.
  • 60. Colour Circle (Colour Wheel)
  − The color circle (color wheel) originated with Sir Isaac Newton, who in the seventeenth century created its first form by joining the ends of the color spectrum.
  − The color circle is a visual representation of colors arranged according to the chromatic relationships between them.
  • 61. Colour Circle (Colour Wheel)
  − Based on the color wheel, the proportion of any color can be increased by decreasing the amount of the opposite (complementary) color in the image.
  − Similarly, it can be increased by raising the proportion of the two immediately adjacent colors, or by decreasing the percentage of the two colors adjacent to the complement.
  − Suppose, for instance, that there is too much magenta in an RGB image. It can be decreased (1) by removing both red and blue, or (2) by adding green.
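The magenta example above can be sketched on a single RGB pixel; the correction amount and helper names are illustrative, and values are assumed normalized to [0, 1].

```python
# Two equivalent color-wheel corrections for a magenta cast, applied
# to one RGB pixel (illustrative helpers, values clipped to [0, 1]).

def clip(x: float) -> float:
    return max(0.0, min(1.0, x))

def reduce_magenta_by_removing(pixel, amount):
    """Option 1: remove both red and blue."""
    r, g, b = pixel
    return (clip(r - amount), g, clip(b - amount))

def reduce_magenta_by_adding_green(pixel, amount):
    """Option 2: add green, magenta's complement."""
    r, g, b = pixel
    return (r, clip(g + amount), b)

magenta_cast = (0.9, 0.4, 0.9)
print(reduce_magenta_by_removing(magenta_cast, 0.2))
print(reduce_magenta_by_adding_green(magenta_cast, 0.2))
```

Both moves push the pixel away from magenta on the wheel; which one to use in practice depends on whether overall brightness should drop (option 1) or rise (option 2).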
  • 62. Human Cone Sensitivity
  − The highest point on each curve is called the "peak wavelength", indicating the wavelength of radiation to which the cone is most sensitive.
  (Figure: normalized human cone sensitivity.)
  • 63. Spectral Distribution of CIE Illuminants
  • 64. Emission and Reflectance Spectra
  • 65. Emission and Reflectance Spectra
  − For any given object, we can measure its emission (or reflectance) spectrum S(λ) and use it to precisely identify a color.
  − If we can reproduce the spectrum, we can certainly reproduce the color!
  − The sunlight reflected from a point on a lemon might have a reflectance spectrum like the one shown.
  • 66. Spectral Power Distribution of Light Reflected from a Specimen
  • 67. Normalized Excitation of the S, M and L Cones
  Example: cone excitations for a point on the lemon.
  − By looking at the normalized areas under the curves, we can see how much the radiation reflected from the real lemon excites each of the cones.
  − In this case, the normalized excitations of the S, M, and L cones are 0.02, 0.12, and 0.16 respectively.
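The "area under spectrum × sensitivity" computation behind this slide can be sketched numerically. The coarse spectra below are made-up illustrative samples (not real lemon or cone data); a real computation would integrate finely sampled CIE curves.

```python
# Toy cone-excitation computation: for each cone, sum the product of
# the reflected spectrum and the cone's sensitivity across wavelengths
# (a crude Riemann-sum approximation of the area under the curve).

wavelengths = [430, 480, 530, 580, 630]   # nm, very coarse sampling
reflectance = [0.1, 0.2, 0.6, 0.9, 0.8]   # hypothetical "lemon" spectrum
s_cone      = [1.0, 0.4, 0.05, 0.0, 0.0]  # hypothetical sensitivities
m_cone      = [0.1, 0.5, 1.0, 0.6, 0.1]
l_cone      = [0.05, 0.3, 0.8, 1.0, 0.5]

def excitation(spectrum, sensitivity):
    """Approximate area under spectrum(λ) · sensitivity(λ)."""
    return sum(s * c for s, c in zip(spectrum, sensitivity))

for name, cone in [("S", s_cone), ("M", m_cone), ("L", l_cone)]:
    print(name, round(excitation(reflectance, cone), 3))
```

With these made-up curves, the L cone ends up most excited, mirroring the slide's point that a yellow surface drives L and M far more than S.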
  • 68. Perceived Brightness vs. Intensity
  Two phenomena demonstrate that perceived brightness is not a simple function of intensity.
  − Mach band effect: the visual system tends to undershoot or overshoot around the boundaries of regions of different intensities.
  − Simultaneous contrast: a region's perceived brightness does not depend only on its intensity.
  (Figures: the Mach band effect – perceived intensity is not a simple function of actual intensity; simultaneous contrast – all the inner squares have the same intensity, but they appear progressively darker as the background becomes lighter.)
  • 69. Masking Recall
  − The term masking usually refers to a destructive interaction or interference among stimuli that are closely coupled in time or space.
  − This may result in a failure of detection or errors in recognition.
  − Here, we are mainly concerned with the detectability of one stimulus when another stimulus is present simultaneously.
  − The effect of one stimulus on the detectability of another, however, does not have to decrease detectability.
  (Setup: I is the gray level (intensity value); the background I₂ is the masker stimulus, and the disk I₁ is the other stimulus. At ΔI = I₂ − I₁, the object can be noticed by the HVS with a 50% chance.)
  • 70. Masking Recall
  − Under what circumstances can the disk-shaped object be discriminated from the background (the masker stimulus) by the HVS?
  Weber's law:
  − Weber's law states that, over a very wide range of masker intensities I, the threshold for disc discrimination ΔI is directly proportional to I:
    ΔI / I = constant (≈ 0.02)
  • Bright background: a larger difference in gray levels is needed for the HVS to discriminate the object from the background.
  • Dark background: the required intensity difference can be smaller.
  (The contrast sensitivity function (CSF) characterizes this behavior; at ΔI = I₂ − I₁ the object is noticed with a 50% chance.)
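Weber's law as stated above is a one-line model; this sketch assumes the ≈0.02 fraction from the slide and uses illustrative function names.

```python
# Minimal Weber's-law sketch: the just-noticeable intensity difference
# scales linearly with the background (masker) intensity.

WEBER_FRACTION = 0.02  # constant from the slide (approximate)

def discrimination_threshold(background: float) -> float:
    """Smallest ΔI the HVS notices (50% chance) on this background."""
    return WEBER_FRACTION * background

def is_noticeable(background: float, delta: float) -> bool:
    return delta >= discrimination_threshold(background)

print(discrimination_threshold(50))    # dark background -> small ΔI (1.0)
print(discrimination_threshold(200))   # bright background -> larger ΔI (4.0)
print(is_noticeable(200, 2.0))         # ΔI = 2 < 4, so not noticeable
```

This is exactly why the luminance-masking slides that follow argue that bright regions can hide more quantization noise than dark ones.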
  • 71. Luminance Masking
  − The HVS exhibits light adaptation and, as a consequence, is sensitive to relative changes in brightness. This effect is referred to as "luminance masking".
  "Luminance masking: the perception of brightness is not a linear function of the luminance."
  − In fact, the threshold of visibility of a brightness pattern is a linear function of the background luminance.
  − In other words, brighter regions in an image can tolerate more noise from distortions before it becomes visually annoying.
  − The direct impact of luminance masking on image and video compression is related to quantization.
  − Luminance masking suggests a non-uniform quantization scheme that takes the contrast sensitivity function into consideration.
  • 72. Luminance Masking
  − The noise is more visible in the dark areas than in the bright areas; compare, for instance, the dark portion and the bright portion of the cloud above the bridge.
  (Figure: the bridge in Vancouver – (a) original and (b) uniformly corrupted by AWGN.)
  • 73. Luminance Masking − The perception of brightness is not a linear function of the luminance. − The HVS demonstrates light adaptation characteristics and as a consequence of that it is sensitive to relative changes in brightness. Contrast Masking − The changes in contrast are less noticeable when the base contrast is higher than when it is low. − The visibility of certain image components is reduced due to the presence of other strong image components with similar spatial frequencies and orientations at neighboring spatial locations. Contrast Masking 73
• 74. With the same MSE: • The distortions are clearly visible in the ‘‘Caps’’ image. • The distortions are hardly noticeable in the ‘‘Buildings’’ image. • The strong edges and structure in the ‘‘Buildings’’ image effectively mask the distortion, while it is clearly visible in the smooth ‘‘Caps’’ image. This is a consequence of the contrast masking property of the HVS, i.e.: • The visibility of certain image components is reduced due to the presence of other strong image components with similar spatial frequencies and orientations at neighboring spatial locations. Contrast Masking 74 (a) Original ‘‘Caps’’ image (b) Original ‘‘Buildings’’ image (c) JPEG compressed image, MSE = 160 (d) JPEG compressed image, MSE = 165 (e) JPEG 2000 compressed image, MSE = 155 (f) AWGN corrupted image, MSE = 160.
  • 75. − In developing a quality metric, a signal is first decomposed into several frequency bands and the HVS model specifies the maximum possible distortion that can be introduced in each frequency component before the distortion becomes visible. − This is known as the Just Noticeable Difference (JND). − The final stage in the quality evaluation involves combining the errors in the different frequency components, after normalizing them with the corresponding sensitivity thresholds, using some metric such as the Minkowski error. − The final output of the algorithm is either • a spatial map showing the image quality at different spatial locations • a single number describing the overall quality of the image. Developing a Quality Metric Using Just Noticeable Difference (JND) 75
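The normalize-then-pool step described above can be sketched in a few lines. This is a generic illustration, not any standardized metric: the band errors, the all-ones JND thresholds, and the exponent p = 4 are hypothetical placeholders:

```python
import math

def minkowski_pool(band_errors, jnd_thresholds, p=4):
    """Normalize each frequency-band error by its JND threshold, then
    combine them with a Minkowski sum of exponent p into one score.
    A normalized error above 1.0 means that band's distortion is visible."""
    normalized = [e / t for e, t in zip(band_errors, jnd_thresholds)]
    return sum(n ** p for n in normalized) ** (1.0 / p)

# Hypothetical errors and thresholds for four frequency bands;
# only the second band exceeds its visibility threshold:
score = minkowski_pool([0.5, 1.2, 0.8, 0.3], [1.0, 1.0, 1.0, 1.0])
print(round(score, 3))
```

A large p makes the pooled score track the worst band, which matches the intuition that one clearly visible artifact dominates perceived quality.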
• 76. Frequency Response of the HVS − Spatial Frequency Response − Temporal Frequency Response and Flicker − Spatio-temporal Response − Smooth Pursuit Eye Movement 76
Y(x, y, t) = Y0 + m sin[2π(fx·x + fy·y) + φ0] sin(2πft·t + φ′0)
Y(x, y, t) = Y0 + m sin(2πft·t + φ0)
Y(x, y, t) = Y0 + m sin[2π(fx·x + fy·y) + φ0]
• 77. Spatial Frequency − Spatial frequency measures how fast the image intensity changes in the image plane − Spatial frequency can be completely characterized by the variation frequencies in two orthogonal directions (horizontal and vertical) − fx: cycles/horizontal unit distance − fy: cycles/vertical unit distance − It can also be specified by the magnitude and angle of change 77
fs = √(fx² + fy²), φ = tan⁻¹(fy/fx)
fx = fs·cos φ, fy = fs·sin φ
• 78. Spatial Frequency 78 − Two-dimensional signals: (a) (fx, fy) = (5, 0): fs = √25 = 5, φ = tan⁻¹(0/5) = 0° (b) (fx, fy) = (5, 10): fs = √125 ≈ 11.2, φ = tan⁻¹(10/5) ≈ 63.4° − The horizontal and vertical units are the width and height of the image, respectively. − Therefore, fx = 5 means that there are five cycles along each row.
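The two example gratings can be generated directly from the sinusoidal definition. A small sketch (the function name, image size, and the 0.5/0.5 mean/amplitude are our choices); x and y are normalized to the image so fx and fy count cycles per image, as on the slide:

```python
import numpy as np

def grating(fx, fy, size=256, y0=0.5, m=0.5):
    """Sinusoidal luminance grating Y(x, y) = Y0 + m*sin(2*pi*(fx*x + fy*y)),
    with x, y normalized to [0, 1) so fx, fy are cycles per image."""
    coords = np.arange(size) / size
    xx, yy = np.meshgrid(coords, coords)  # xx varies along rows, yy down columns
    return y0 + m * np.sin(2 * np.pi * (fx * xx + fy * yy))

g = grating(5, 0)        # case (a): five vertical-bar cycles along each row
fs = np.hypot(5, 10)     # case (b) magnitude: sqrt(125)
print(round(float(fs), 3))
```

Plotting `grating(5, 0)` and `grating(5, 10)` with `matplotlib.pyplot.imshow` reproduces the two slide patterns.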
• 79. − The spatial frequency defined previously depends on viewing distance. − Angular frequency is what matters to the eye (viewing distance is built into it). 79
θ = 2 tan⁻¹(h/2d) rad ≈ h/d rad = (180/π)·(h/d) degrees
fs[cpd] = fs[cycles/unit distance] / θ = (π/180)·(d/h)·fs[cycles/unit distance]
Spatial Frequency (cycles per degree)
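The conversion to cycles per degree follows directly from the small-angle formula above. A sketch assuming the picture height h and viewing distance d are in the same units; the example numbers (540 cycles per picture height, a 0.6 m screen viewed at three picture heights) are ours:

```python
import math

def cycles_per_degree(f_cycles_per_height, height, distance):
    """Convert spatial frequency in cycles per picture height to cycles per
    degree of visual angle, using the small-angle approximation
    theta [deg] = (180/pi) * h/d."""
    theta_deg = math.degrees(height / distance)  # angle subtended by the picture
    return f_cycles_per_height / theta_deg

# 540 cycles per picture height (the Nyquist limit of 1080 lines),
# on a 0.6 m tall screen viewed from 1.8 m (3 picture heights):
print(round(cycles_per_degree(540, 0.6, 1.8), 2))
```

Moving farther away shrinks θ and raises the frequency in cpd, which is why fine detail becomes invisible at a distance.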
• 80. − If the stimulus is a spatially periodic pattern (or grating), it is defined by its spatial frequency, the number of cycles per unit of subtended angle. − The luminance profile of a sinusoidal grating of frequency f oriented along a spatial direction defined by the angle φ can be written as:
Y(x, y) = Y0 + m sin[2π(fx·x + fy·y)]
− where Y0 is the mean luminance of the grating, m is its amplitude, and fx and fy are its spatial frequencies along the x and y directions, respectively (measured in cycles per degree, cpd); that is:
fs = √(fx² + fy²), φ = tan⁻¹(fy/fx), fx = fs·cos φ, fy = fs·sin φ
80 Spatial Frequency (cycles per degree)
• 81. − Ex: The value of the phase, φ, determines the luminance at the origin of coordinates (x = 0, y = 0). • If φ is zero or a multiple of π rad, the sinusoidal modulation is zero at the origin and the pattern has odd symmetry. • If φ is π/2 rad or an odd multiple of π/2 rad, the modulation is maximum at the origin and the pattern has even symmetry. 81
fs = √(fx² + fy²), φ = tan⁻¹(fy/fx), fx = fs·cos φ, fy = fs·sin φ
Spatial Frequency (cycles per degree)
  • 82. Contrast Measurement − Contrast is the physical parameter describing the magnitude of the luminance variations around the mean in a scene. − Types of stimulus to measure contrast sensitivity • Aperiodic stimulus • Periodic stimulus (Sinusoidal grating, Square grating) 82
• 83. Contrast Measurement Aperiodic stimulus and Weber’s contrast − If the stimulus is aperiodic, contrast is simply defined as
C = △Y / Y0
− where △Y is the luminance amplitude of the stimulus placed against the background luminance Y0, provided that Y0 ≠ 0. 83
• 84. Contrast Measurement Michelson contrast − Note that for the grating shown in the figure, fy = 0 and φ = π/2 rad. − Contrast is defined in this case as follows:
C = m / Y0 = (Ymax − Ymin) / (Ymax + Ymin)
− This definition can be applied to any periodical pattern, whether sinusoidal, square or any other type. 84
Y(x, y) = Y0 + m sin(2πfx·x)
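Michelson contrast is easy to verify numerically: for a sinusoid of mean Y0 and amplitude m it must equal m/Y0. A minimal sketch with an illustrative grating (Y0 = 100, m = 20, 5 cycles over 1000 samples):

```python
import math

def michelson_contrast(luminance):
    """C = (Ymax - Ymin) / (Ymax + Ymin) for a periodic luminance pattern."""
    y_max, y_min = max(luminance), min(luminance)
    return (y_max - y_min) / (y_max + y_min)

# A sinusoidal grating Y(x) = Y0 + m*sin(2*pi*fx*x) with Y0 = 100, m = 20:
samples = [100 + 20 * math.sin(2 * math.pi * 5 * x / 1000) for x in range(1000)]
print(round(michelson_contrast(samples), 3))  # 0.2, i.e. m / Y0
```

The same function applies unchanged to square gratings, since the definition only uses the extremes.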
• 85. Contrast Sensitivity Function (CSF) Contrast Sensitivity − Contrast sensitivity is the inverse of the minimum contrast necessary to detect an object against a background or to distinguish a spatially modulated pattern from a uniform stimulus (threshold contrast):
Cmin = mmin / Y0 = (Ymax − Ymin) / (Ymax + Ymin)
S = 1/Cmin = (Ymax + Ymin) / (Ymax − Ymin)
Contrast Sensitivity Function (CSF) − Contrast sensitivity is a function of the spatial frequency (f) and the orientation (θ or φ) of the stimulus; that is why we talk of the Contrast Sensitivity Function, or CSF. 85
Y(x, y) = Y0 + m sin(2πfx·x)
  • 86. Different Spatial Frequency in x Direction 86
• 87. 87 Spatial Contrast Sensitivity Function (Figure: a grating whose spatial frequency increases from left to right while its contrast decreases from bottom to top; the boundary between the visible and invisible regions traces the contrast sensitivity S = 1/Cmin.)
• 88. 88 − Similar to a band-pass filter • Most sensitive to mid frequencies • Least sensitive to high frequencies − Also depends on the orientation of the grating • Most sensitive to horizontal and vertical ones Spatial Contrast Sensitivity Function (Figure: frequency increases to the right, contrast decreases upward; the visibility boundary traces S = 1/Cmin.)
  • 89. 89 Chrominance Spatial Contrast Sensitivity Function
• 90. 90 − Essentially, the HVS is more sensitive to lower spatial frequencies and less sensitive to high spatial frequencies. • The sensitivity to luminance information peaks at around 5–8 cycles/deg. This corresponds to a contrast grid with a stripe width of 1.8 mm at a distance of 1 m. • Luminance sensitivity falls off on either side of this peak, with little sensitivity above 50 cycles/deg. • The peak of the chrominance sensitivity curves occurs at a lower spatial frequency than that for luminance, and the response falls off rapidly beyond about 2 cycles/deg. It should also be noted that our sensitivity to luminance information is about three times that for Red-Green, and that the Red-Green sensitivity is about twice that of Blue-Yellow. Chrominance Spatial Contrast Sensitivity Function
• 91. 91 Saccades are the rapid eye movements that allow us to quickly scan a visual scene. − Different stabilization settings were used to remove the effect of saccadic eye movements. − Stabilization allowed the experimenters to control retinal image motion independently of eye movements. − Specifically, the image of the stimulus display was slaved to the subject's eye movements. The image of the display screen moved in synchrony with the eye movement so that its image remained stable on the retina, irrespective of eye velocity. − With this technique, the retinal velocity could be controlled by manipulating the movement of the stimulus on the display. Saccadic Eye Movements and Stabilization
• 92. 92 Saccadic eye movement enhances (increases) the sensitivity but lowers the frequency at which the peak occurs (the peak shifts to the left). • Filled circles: obtained under normal, unstabilized conditions. • Open squares: with the optimal gain setting for stabilization (to control retinal image motion independently of eye movements). • Open circles: with the gain changed by about 5 percent. (Figure: spatial contrast sensitivity, log scale, versus spatial frequency in cpd.) Spatial Contrast Sensitivity Function
• 94. Modulation Transfer Function (MTF) − In optics, a well-known function of spatial frequency used to characterize the quality of an imaging system is the Modulation Transfer Function (MTF). − It can be measured by obtaining the image (output) that the system produces from a sinusoidal pattern of frequency f, orientation θ and contrast Cin(f, θ) (input):
MTF(f, θ) = Cout(f, θ) / Cin(f, θ)
− where Cout(f, θ) is the contrast of the image, which is also a sinusoid, provided the system is linear and spatially invariant. − This formula may be read as the contrast transmittance of a filter. 94
• 95. Modulation Transfer Function (MTF) − If the visual system is treated globally as a linear, spatially invariant, imaging system and if it is assumed that, at threshold, the output contrast must be constant, i.e. independent of spatial frequency, it is easy to demonstrate that the MTF and the CSF of the visual system must be proportional. − In fact, if the input is a threshold-contrast grating, the MTF can be written as:
MTF(f, θ) = Cthres,out(f, θ) / Cthres,in(f, θ)
− Assuming that Cthres,out(f, θ) is an unknown constant, say R0:
MTF(f, θ) = R0 / Cthres,in(f, θ) = R0 · CSF(f, θ)
− Thus, both functions have the same shape and differ only by an unknown global factor. 95
  • 96. Temporal Frequency Temporal frequency measures temporal variation (cycles/s) − In a video, the temporal frequency is actually 2-dimensional; each point in space has its own temporal frequency − Non-zero temporal frequency can be caused by camera or object motion 96
• 97. Temporal Contrast Sensitivity Function: TCSF − A spatially uniform pattern whose luminance is time-modulated by a sinusoidal function is mathematically described as follows:
Y(t) = Y0 + m sin(2πft·t)
− where ft is the temporal frequency. − Again we have
S = 1/Cmin = (Ymax + Ymin) / (Ymax − Ymin)
TCSF: the function relating contrast sensitivity to temporal frequency. 97
• 98. The responses obtained with different mean brightness levels, B, measured in trolands. Critical flicker frequency: the lowest refresh rate at which the eye no longer perceives flicker. – Provides a guideline for determining the frame rate when designing a video system. – The critical flicker frequency depends on the mean brightness of the display (brightness level ↑ ⇒ the curve shifts to the right). – 60 Hz is typically sufficient for watching TV. – Watching a movie needs a lower frame rate than TV. Temporal Contrast Sensitivity Function: TCSF (Figure: temporal contrast sensitivity, log scale, versus flicker frequency in Hz, for mean brightness levels from 0.06 to 9300 trolands.) 98
  • 99. Contrast Sensitivity In The Spatio-temporal Domain A luminance pattern that changes as a function of position (x,y) and time, t, is a spatio-temporal pattern. − Spatiotemporal patterns usually employed as stimuli are counterphase gratings and travelling gratings. 99 𝒀 𝒙, 𝒚, 𝒕 = 𝒀 𝟎 + 𝒎𝒔𝒊𝒏 [𝟐𝝅 𝒇 𝒙 𝒙 + 𝒇 𝒚 𝒚 ] 𝒔𝒊𝒏 (𝟐𝝅𝒇 𝒕 𝒕)
• 100. Contrast Sensitivity In The Spatio-temporal Domain − In counterphase sine gratings, luminance is sinusoidally modulated both in space (with frequencies fx, fy) and in time (with frequency ft):
Y(x, y, t) = Y0 + m sin[2π(fx·x + fy·y)] sin(2πft·t)
− If fy = 0, the corresponding luminance profile would be:
Y(x, y, t) = Y0 + m sin(2πfx·x) sin(2πft·t)
− By the product-to-sum identity, such a counterphase grating is equivalent to the sum of two half-amplitude gratings travelling in opposite directions, with arguments 2π(fx·x ± ft·t). 100
• 101. Contrast Sensitivity In The Spatio-temporal Domain A travelling grating is a spatial pattern that moves with a given velocity, v. − If we assume fy = 0, the luminance profile would be:
Y(x, y, t) = Y0 + m sin[2πfx(x ± v·t)] = Y0 + m sin[2π(fx·x ± ft·t)]
− where ft = v × fx is the temporal frequency of the luminance modulation caused by the motion at each point (x, y) of the pattern. − The sign (±) accompanying the variable v indicates whether the grating is moving towards the left or the right, respectively. 101
  • 102. Contrast Sensitivity In The Spatio-temporal Domain (Spatiotemporal CSF) − The contrast sensitivity becomes here a 3D function, or 2D if the spatial pattern is 1D, that is, if the modulation occurs only along the x or the y directions. 102 The spatio-temporal CSF surface Cross-sections of the 2D spatio-temporal CSF surface at different values of the temporal (left) or the spatial temporal frequency (right), to obtain, respectively, the spatial and the temporal CSFs.
• 103. Spatiotemporal Response 103 Spatiotemporal frequency response of the HVS. − The reciprocal relation between spatial and temporal sensitivity was used in TV system design: spatial frequency ↑ ⇒ temporal sensitivity ↓, and temporal frequency ↑ ⇒ spatial sensitivity ↓. − Interlaced scan provides a tradeoff between spatial and temporal resolution. (Left figure: spatial frequency responses for different temporal frequencies: 1 Hz, 6 Hz, 16 Hz, 22 Hz. Right figure: temporal frequency responses for different spatial frequencies: 0.5 cpd, 4 cpd, 16 cpd, 22 cpd.)
• 104. 104 − Suppose a single object with constant velocity (temporal frequency caused by linear motion). − Illustration of the constant-intensity assumption under motion. − Every point (x, y) at t = 0 is shifted by (vx·t, vy·t) to (x + vx·t, y + vy·t) at time t, without change in color or intensity. − Alternatively, a point (x, y) at time t corresponds to the point (x − vx·t, y − vy·t) at time zero. Relation between Motion, Spatial and Temporal Frequency
• 105. Relation between Motion, Spatial and Temporal Frequency Consider an object moving with speed (vx, vy). Assume the image pattern at t = 0 is ψ0(x, y). The image pattern at time t is:
ψ(x, y, t) = ψ0(x − vx·t, y − vy·t)
Taking the Continuous Space Fourier Transform (CSFT):
Ψ(fx, fy, ft) = Ψ0(fx, fy) δ(ft + vx·fx + vy·fy)
Relation between motion, spatial, and temporal frequency:
ft = −(vx·fx + vy·fy)
This means that the temporal frequency ft of the image of a moving object depends on the motion as well as on the spatial frequency of the object. Example: a plane with a vertical bar pattern moving vertically causes no temporal change; moving horizontally, it causes the fastest temporal change. 105
• 106. Smooth Pursuit Eye Movement − Smooth pursuit: the eye tracks moving objects. − Net effect: it reduces the velocity of moving objects on the retinal plane, so that the eye can perceive much higher raw temporal frequencies than indicated by the temporal frequency response. With ft the temporal frequency caused by object motion at (vx, vy), the observed temporal frequency at the retina when the eye moves at (ṽx, ṽy) is:
f̃t = ft + (ṽx·fx + ṽy·fy)
f̃t = 0 if ṽx = vx and ṽy = vy 106
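The two relations, the temporal frequency induced by object motion and the one actually observed under eye movement, can be checked with a couple of one-line functions. A sketch with illustrative numbers (fx = 10 cycles/unit, speeds of 2 units/s); the function names are ours:

```python
def temporal_frequency(vx, vy, fx, fy):
    """ft = -(vx*fx + vy*fy): temporal frequency induced by a pattern with
    spatial frequencies (fx, fy) translating at velocity (vx, vy)."""
    return -(vx * fx + vy * fy)

def retinal_temporal_frequency(vx, vy, ex, ey, fx, fy):
    """Observed frequency when the eye itself moves at (ex, ey):
    ft_retina = ft + (ex*fx + ey*fy). Perfect pursuit gives zero."""
    return temporal_frequency(vx, vy, fx, fy) + (ex * fx + ey * fy)

# Vertical bars (fx = 10, fy = 0):
print(temporal_frequency(0, 2, 10, 0))   # 0: vertical motion, no temporal change
print(temporal_frequency(2, 0, 10, 0))   # -20: horizontal motion, fastest change
print(retinal_temporal_frequency(2, 0, 2, 0, 10, 0))  # 0: pursuit cancels it
```

This is exactly why a tracked object looks sharp while the untracked background smears: pursuit drives the retinal temporal frequency of the target toward zero.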
  • 107. A Review on Standard Definition (SD) − Television is transmitted and recorded as frames. • Similar to film. − Each frame is scanned in the camera or camcorder. • This is called a raster scan. • Raster scan scans line by line from top to bottom. • Each line is scanned from left to right. 107
• 108. Field, Frame, Progressive and Interlaced Scan − Continuous scan is called progressive scan. − Progressive scans tend to flicker at low refresh rates. − Television splits each frame into two scans. • One for the odd lines and another for the even lines. • Each interlaced scan is called a field. • Therefore odd lines (odd field) + even lines (even field) = 1 frame. − This is called interlaced scan. Interlace benefits: I. The bandwidth needed for the odd field plus the even field equals the bandwidth needed for one frame (e.g. 50i vs. 25p). II. Interlaced scans flicker a lot less than progressive scans of the same bandwidth (e.g. 50i vs. 25p). 108
  • 119. The 720 Standard (SMPTE 269M) 119
  • 120. The 1080 Standard (SMPTE 274M) 120
  • 121. Image Formats for High Definition 121
• 122. High Definition Formats There are three factors in defining High Definition formats:
Resolution: 1. 1920 × 1080 2. 1280 × 720
Scanning method (i/p): 1. Interlaced 2. Progressive
Frame rate (fps): 1. 23.98 (24) 2. 25 3. 29.97 (30) 4. 50 5. 59.94 (60) 122
• 123. 1080i vs. 1080p vs. 720p 1080i – Widely used format, defined as 1080 lines, 1920 pixels per line, interlace scan. – The 1080i statement alone does not specify the field or frame rate, which can be: 25 or 30 frames per second (50 or 60 fields per second). 1080p – 1920 × 1080 pictures, progressively scanned. Frame rates can be: 24, 25, 30, 50, or 60 fps. 720p – 1280 × 720 pictures, progressively scanned: 24, 25, 30, 50, or 60 fps. − Progressive scan at a high picture refresh rate portrays action well, such as in sporting events, for smoother slow-motion replays, etc. 123
  • 124. The High Definition Signal Horizontal Interval SMPTE 274M (The 1080 standard) 124
  • 125. The High Definition Signal Vertical Blanking 125
• 126. 126 The High Definition Signal Vertical Blanking (Figure: line numbering of the 1125-line raster — in interlaced scan, Field 1 active lines 21–560 and Field 2 active lines 584–1123, with the remaining lines forming the vertical blanking; in the progressive frame, active lines 42–1121.)
  • 127. The High Definition Signal Vertical Blanking 127
  • 128. Progressive scan − Delivers higher spatial resolution for a given frame size (better detail) – Has the same (temporal) look as film – Good for post and transfer to film – No motion tear Interlaced scan − Delivers higher temporal resolution for a given frame size (better motion portrayal) – Has the same (temporal) look as video – Shooting is easier – Post production on video is easier – Interlacing causes motion tears and ‘video’ look 128 Scanning Techniques Pros and Cons
• 129. 129 Odd and even lines are in different places when there is fast motion. (Figure: odd field, even field, and odd + even overlay, for no motion versus fast motion.) Scanning Techniques Pros and Cons
  • 130. Progressive or Interlace Shooting? 130
  • 132. Interlaced (field 1) (50i) 132
  • 133. Interlaced (field 2) (50i) 133
  • 135. Interlace (50i)Progressive (25p) Delivers higher spatial resolution for a given frame size (better detail) Delivers higher temporal resolution for a given frame size (better motion portrayal) 135 Interlaced Frame (50i) and Progressive Frame (25p)
  • 136. In High Speed Progressive (50p) Interlace (50i) 136
• 137. Standard Monochrome Signals First commercial standards were 60 lines. The original ‘high definition’ was 405 lines monochrome. Later standards were 525 and 625 lines. • Half the number of lines in each field. Each line consists of an active line and horizontal blanking. • The active line carries the picture; horizontal blanking covers the flyback interval. • Horizontal syncs in the horizontal blanking lock the picture horizontally. Each field and frame consists of active video and vertical blanking. • Active video is all the lines within the picture. • Vertical blanking comprises the lines that are not seen. • Vertical syncs in the vertical blanking lock the picture vertically. The signal is “zero” for black. The signal increases as the brightness increases. A negative signal is used for synchronisation. 137
• 138. Composite Video Signal (Monochrome) Composite video signal = video signal + blanking + sync pulse. (Figure: the front porch and sync sit in the 12 µs blanking interval; the 52 µs vision period carries the picture.) 138
  • 140. The Basic Television Signal 140
  • 142. The basic television signal 142
  • 143. The basic television signal Short white areas of the line for the sails produce sharp white spikes in the signal. This part of the line with black shadows produces a low signal. Trees and bushes with light and dark areas produce an undulating signal. The sky is bright and produces a high signal almost as high as the white sails. Shadows in the trees produce a low signal. Very small bright area between the trees produces a very sharp spike in the signal 143
• 145. Color Video Signal Formats 145 − Colour pictures can be broken down into three primaries: red, green, blue. − The original plan was to use these primaries in colour television. − These colours are called components.
  • 146. RGB  RGB signals offer the most faithful reproduction in both image brightness and color depth. This is because they are obtained right after the R, G, and B imagers with minimum video processing in between.  Each one of the R, G, and B signals contains information on brightness and color in the same channel.  RGB signals are also called full bandwidth signals because they are not limited in bandwidth, which is the case with other signal formats. This is another reason why RGB signals provide the best video quality. 146 Color Video Signal Formats
• 151. Problem with Red, Green & Blue components There were many existing black-&-white television customers. • These customers needed to be kept happy when colour TV was introduced. The old black-and-white signal was still needed. A matrix in the camera converts from RGB to Y, (R-Y), (B-Y). • Y is the black-&-white signal. • (R-Y) and (B-Y) are the two colour difference signals. 151
• 152. Y, R, G and B Relative Human Sensitivity If we measure a human eye’s sensitivity to every wavelength, we get a luminosity function. Over the 400–700 nm range the relative sensitivities are roughly 47% for red, 92% for green (peaking near 550 nm) and 17% for blue. Normalizing these weights gives the luminance equation:
0.47 / (0.17 + 0.92 + 0.47) = 0.3
0.92 / (0.17 + 0.92 + 0.47) = 0.59
0.17 / (0.17 + 0.92 + 0.47) = 0.11
Y = 0.11B + 0.3R + 0.59G 152
  • 153. Video Signal Formats Y/R-Y/B-Y – The Y/R-Y/B-Y signal is called the component signal. – The Y/R-Y/B-Y signal is obtained by feeding the RGB signal to a matrix circuit, which separates it into color information and brightness information. – This makes the signal easier to process. – Information on the total brightness of the three RGB signals is combined into one signal called the luminance signal (Y), while information on color is packed into two signals called the color difference signals (R-Y/B-Y). – Information on luminance is not bandwidth-restricted and is equivalent to that of the RGB signal. – Information on color (R-Y/B-Y) is bandwidth-limited to a certain extent, but kept sufficient for the human eye’s sensitivity to fine color detail, which is less than that to brightness. 153
  • 155. Matrix R BG Y R-Y B-Y Old Black & White televisions ignore the colour components and only use the monochrome component 155
• 156. Video Signal Formats Composite Video – The composite signal is obtained by adding the luminance (Y) and chrominance (C) signals of the Y/C signal to form one signal, which contains both brightness and color information. – This is achieved in such a way that the bandwidth of the chrominance signal overlaps with that of the luminance signal. This allows the composite signal to provide both luminance and chrominance information (color images) using the same bandwidth as the black and white signal. – Technically, this is achieved by modulating the color signals onto a carrier signal (the color subcarrier) that does not interfere with the luminance signal’s spectrum. – The frequency of the color subcarrier is chosen so that its spectrum interleaves with the spectrum of the luminance signal.  For NTSC video, it is approximately 3.58 MHz.  For PAL video, it is approximately 4.43 MHz. – This prevents the chrominance (C) and luminance (Y) signals from mixing with each other when they are added together to form the composite signal. – The composite signal can be separated back into its luminance and chrominance components using special filters, known as comb filters. 156
  • 158. PAL Colour Signal − Improved European colour television standard. − Co-designed in Germany and England. − More complex than NTSC, but better colours. − 625 total lines in each frame. − 576 picture lines. − Interlaced scanning at 25 frames per second. − 50 fields per second. PAL = Phase Alternation by Line 158
  • 160. 160 The Color Bars Test Signal
  • 161. U=0.493 (B’-Y’) V=0.877 (R’-Y’) These particular weighting factors ensure that the subcarrier excursions are around 33% above white level for saturated yellow and cyan color bars and 33% maximum below black level for red and blue bars. 161
  • 162. NTSC Color Signal  NTSC is a standard-definition composite video signal format primarily used in North America, Japan, Korea, Taiwan, and parts of South America.  Its name is an acronym for National Television Systems Committee.  Tends to suffer from bad colours. • Nicknamed “Never The Same Colour”!  525 total lines in each frame. • 483 picture lines.  Interlaced scanning at 30 frames per second. • Actually 29.97 frames per second to be exact. • 60 fields per second.  Color information is encoded on a 3.58-MHz sub-carrier, which is transmitted together with the luminance information. 162
• 163. I: Orange-Cyan Q: Green-Purple Hue = arctan(Q/I) Saturation = √(I² + Q²) 163
• 164. • The positive polarity of Q is purple, the negative is green. The positive polarity of I is orange, the negative is cyan. Thus, Q is often called the "green-purple" or "purple-green" axis information and I is often called the "orange-cyan" or "cyan-orange" axis information. • The human eye is more sensitive to spatial variations in the "orange-cyan" range (the color of faces!) than in the "green-purple". Thus, the "orange-cyan" or I signal has a maximum bandwidth of 1.5 MHz and the "purple-green" Q signal only has a maximum bandwidth of 0.5 MHz. • The Q and I signals are both modulated onto a 3.58 MHz carrier wave. However, they are modulated 90 degrees out of phase (QAM). These two signals are then summed together to make the C, or chrominance, signal. • The nomenclature of the two signals aids in remembering what is going on. The I signal is In-phase with the 3.58 MHz carrier wave. The Q signal is in Quadrature (i.e. 1/4 of the way around the circle, 90 degrees out of phase, or orthogonal) with the 3.58 MHz carrier wave. 164 NTSC Color Signal
  • 165. • Position the band limited chrominance at the high end of the luminance spectrum, where the luminance is weak, but still sufficiently lower than the audio (at 4.5 MHz). • The two chrominance components (I and Q) are multiplexed onto the same sub- carrier using QAM. • The resulting video signal including the baseband luminance signal plus the chrominance components modulated to fc is called composite video signal. 165 NTSC Color Signal
• 166. Colour Television Standards PAL 1963, NTSC 1953, SECAM 1958~1967
Color System | Frames per Second | Lines | Bandwidth (MHz) | B&W Modulation | Color Modulation | Audio Modulation
NTSC | 30 | 525 | 6 | AM | QAM | FM
PAL | 25 | 625 | 7-8 | AM | QAM | FM
SECAM | 25 | 625 | 7-8 | AM | FM | FM
166
• 167. The chromaticity of a color is specified by two derived parameters, x and y. 167 CIE xy Chromaticity Diagram (The CIE 1931 color space chromaticity diagram) locus of non-spectral purples X, Y, and Z are imaginary primaries: Y represents luminance, Z is roughly equal to blue, and X is a mix of cone response curves chosen to be orthogonal to luminance and non-negative.
• 168. A color gamut is the complete range of colors allowed for a specific color space. No video, film or printing technology is able to fill this space. • NTSC and PAL lie well inside the range of natural colours. • The extent of any technology is called its gamut. The outside edge of the diagram defines the fully saturated colours. Spectrally pure purple is “impossible” (the line of purples is non-spectral). Each corner of the gamut triangle defines a primary colour. Color Gamut 168
  • 172. (Inner triangle: HDTV primaries, Outer triangle: UHDTV primaries) 0 .1 .2 .3 .4 .5 .6 .7 .8 0 .1 .2 .3 .4 .5 .6 .7 .8 y 0 .1 .2 .3 .4 .5 .6 .7 .8 0 .1 .2 .3 .4 .5 .6 .7 .8 y (a) Carnation x (b) Geranium and marigold x Wide Color Gamut Makes Deeper Colors Available 172
  • 175. HD SD Colorimetry SD and HD comparison 175
  • 176. Colorimetry SD and HD comparison 176 HD SD
  • 180. Colorimetry SD and HD comparison 180 SD HD
  • 181. Colorimetry Color bars: How does it look, what does it matter Out of Gamut 181 SD HD
  • 182. Colorimetry Color bars: How does it look, what does it matter − Correct colour bars will look wrong! − Incorrect conversion/colourimetry will produce • Oversaturation • Undersaturation • Gamut alarms − HD has a redefined colourimetry − Wider range of reproducible colours − More colourful pictures − Colour bars are different 182
• 183. Decibels (dB) Decibels are defined by the following equation:
dB = 20 × log(v’/v)  (v’: value to express in decibels; v: reference value = 1.0 (V))
This can be rearranged as:
v’ = v × 10^(dB/20)
Since a relative value is being discussed, substituting v = 1.0 (volt) gives:
v’ = 10^(dB/20)
183
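Both directions of the conversion can be written as one-liners. A minimal sketch (the function names are ours); note that the 20× form applies to voltage/amplitude ratios:

```python
import math

def to_decibels(v_ratio):
    """dB = 20 * log10(v'/v) for a voltage (amplitude) ratio."""
    return 20 * math.log10(v_ratio)

def from_decibels(db):
    """v'/v = 10**(dB/20), the rearranged form."""
    return 10 ** (db / 20)

print(round(to_decibels(10), 1))    # 20.0: a 10x gain is +20 dB
print(round(from_decibels(-6), 2))  # 0.5: -6 dB is (almost exactly) one half
```

The -6 dB case is really 10^(-0.3) ≈ 0.501, which is why "minus 6 dB is one half" is the usual rule of thumb.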
• 184. Decibels (dB) − The decibel values most worth remembering are shown in the following table. Referring to this table:  A 20 dB signal gain means the signal level has been boosted by 10 times. A 6 dB signal drop (= minus 6 dB) means the signal level has fallen to one half. 184
• 185. S/N (Signal-to-Noise) Ratio  Noise refers to the distortion of the original signal due to external electrical or mechanical factors.
(S/N) ratio = 20 × log(Vp-p / Nrms)  (dB)
 A 60 dB S/N means that the noise level is one-thousandth of the signal level.  For video, the signal amplitude (Vp-p) is taken as 1 volt, which is the voltage (potential difference) from the bottom of the H-sync signal (sync tip) to the white video level.  The noise level changes over time, so its instantaneous amplitude cannot express its amount. Root mean square (rms) is a statistical measure for expressing a group of varying values, and allows the magnitude of noise to be expressed with a single value. Root mean square can be considered a kind of average of temporal values. 185
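The rms measure and the S/N formula above can be sketched together. The helper names and the sample noise values are illustrative:

```python
import math

def rms(samples):
    """Root mean square: sqrt of the mean of the squared values."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def snr_db(v_pp, noise_rms):
    """S/N = 20 * log10(Vp-p / Nrms) in dB; for video, Vp-p is the 1 V
    span from sync tip to white level."""
    return 20 * math.log10(v_pp / noise_rms)

# Noise whose rms is one-thousandth of a 1 V signal gives 60 dB:
print(round(snr_db(1.0, 0.001)))  # 60
print(round(rms([0.001, -0.001, 0.001, -0.001]), 4))  # 0.001
```

Note that rms, not peak amplitude, is used for the noise because its level varies over time.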
• 187. 4:4:4 Full color resolution. 8 × 3 × 720 × 576 × 25 = 250 Mbps. 8-bit SD system: 250 Mbps; 8-bit HD system: 1.25 Gbps. 187 Color Sampling and Sub-Sampling
• 188. 4:2:2 Official broadcast specification. Half horizontal color resolution. 8-bit SD system: 168 Mbps; 8-bit HD system: 830 Mbps. 188 Color Sampling and Sub-Sampling
• 189. 4:1:1 Quarter horizontal color resolution. 8-bit SD system: 126 Mbps; 8-bit HD system: 519 Mbps. 189 Color Sampling and Sub-Sampling
• 190. 4:2:0 Half vertical & horizontal color resolution. 8-bit SD system: 126 Mbps; 8-bit HD system: 519 Mbps. 190 Color Sampling and Sub-Sampling
  • 191. 191 Color Sampling and Sub-Sampling
  • 192. 192 Color Sampling and Sub-Sampling
  • 193. 4:2:0 Line structureInterlaced video 193 Color Sampling and Sub-Sampling
• 194. Comparison
Sampling | Y | R-Y | B-Y | Lines
4:4:4 | 720 | 720 | 720 | Samples on every line
4:2:2 | 720 | 360 | 360 | Samples on every line
4:1:1 | 720 | 180 | 180 | Samples on every line
4:2:0 | 720 | 360/0 | 0/360 | Samples on alternate lines
194 Color Sampling and Sub-Sampling
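The data-rate figures quoted on the preceding slides follow from counting samples per pixel: 4:4:4 carries 3 full-resolution components, 4:2:2 carries 2, and 4:1:1 and 4:2:0 each carry 1.5. A sketch of that arithmetic (active picture only, so the results differ slightly from the slides' rounded numbers):

```python
def data_rate_mbps(width, height, fps, bit_depth, sampling):
    """Uncompressed video data rate in Mbps for common Y/colour-difference
    samplings, counting active-picture samples only."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0,
                         "4:1:1": 1.5, "4:2:0": 1.5}[sampling]
    return width * height * fps * bit_depth * samples_per_pixel / 1e6

# 8-bit SD (720x576 at 25 fps):
print(round(data_rate_mbps(720, 576, 25, 8, "4:4:4")))  # 249 (slide: ~250 Mbps)
print(round(data_rate_mbps(720, 576, 25, 8, "4:2:2")))  # 166 (slide: ~168 Mbps)
# 8-bit HD (1920x1080 at 25 fps):
print(round(data_rate_mbps(1920, 1080, 25, 8, "4:2:2")))  # 829 (slide: ~830 Mbps)
```

Halving the colour-difference samples (4:4:4 to 4:2:2) saves a third of the data; 4:1:1 and 4:2:0 halve the colour data again, just along different axes.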
• 195. Rec BT-601/656 – Digital standard for component video – 27 MHz stream of 8/10-bit 4:2:2 samples – 8-bit range: black at level 16, white at level 235 (a 219-step excursion) – Sync/blanking replaced by SAV & EAV timing references – Ancillary data can be sent during blanking 195 (Figure: Y spans 16–235, U and V are centred on 128; codes 0 and 255 are reserved.)
  • 197. Rec BT-601/656 – Multiple A/D and D/A conversion generations should be avoided 197
  • 198. 198 An image along with its Y′, U, and V components. A color image and its Y, Cb and Cr components. Colour Space Recommended by CCIR-601
  • 199. Colour Space Recommended by CCIR-601 – The colour space in PAL is represented by YUV, where Y represents the luminance and U and V represent the two colour components. The basic YUV colour space can be generated from gamma-corrected RGB (referred to in equations as R′G′B′) components as follows: Y = 0.299R′ + 0.587G′ + 0.114B′, U = 0.493(B′ − Y), V = 0.877(R′ − Y). – It should be noted that the colour space recommended by CCIR-601 is very close to the PAL system. – The precise luminance and chrominance equations under this recommendation (8-bit digital form) are: Y = 0.257R′ + 0.504G′ + 0.098B′ + 16, Cb = −0.148R′ − 0.291G′ + 0.439B′ + 128, Cr = 0.439R′ − 0.368G′ − 0.071B′ + 128. 199
  • 200. Colour Space Recommended by CCIR-601 – The slight departure from the PAL parameters is due to the requirement that in the digital range, Y should take values in the range of 16–235 quantum levels. – Also, the normally AC chrominance components of U and V are centred on the grey level 128, and the range is defined from 16 to 240. – The reasons for these modifications are – to reduce the granular noise of all three signals in later stages of processing – to make chrominance values positive to ease processing operations (e.g. storage) 200
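The CCIR-601 digital equations can be sketched directly in code (a minimal illustration assuming gamma-corrected 8-bit R′G′B′ inputs in 0–255; the function name is illustrative):

```python
def rgb_to_ycbcr_601(r: int, g: int, b: int):
    """Gamma-corrected 8-bit R'G'B' (0-255) to 8-bit Y'CbCr with the
    CCIR-601 ranges: Y in 16-235, Cb/Cr in 16-240 centred on 128."""
    y  =  16 + 0.257 * r + 0.504 * g + 0.098 * b
    cb = 128 - 0.148 * r - 0.291 * g + 0.439 * b
    cr = 128 + 0.439 * r - 0.368 * g - 0.071 * b
    return tuple(round(v) for v in (y, cb, cr))

print(rgb_to_ycbcr_601(255, 255, 255))  # white -> (235, 128, 128)
print(rgb_to_ycbcr_601(0, 0, 0))        # black -> (16, 128, 128)
```

Note how any neutral grey (R′ = G′ = B′) lands on Cb = Cr = 128, which is exactly the "chrominance centred on grey level 128" property the slide describes.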
  • 201. What Is Analogue? – Analogue signals can take on an infinite number of values over a continuous range. Advantages • Simpler • Needs less bandwidth Disadvantages • Affected by noise and distortion. • Quality tends to drop over time and with each new generation (copy). • Cannot be stored directly in computer-based systems. 201 [Diagram: signal vs. signal + noise]
  • 202. What Is Digital? – Digital signals are described by a series of discrete, individual numbers. • These numbers are often called samples. • In video the samples are often called pixels. Advantages • Less affected by noise and distortion. • Quality remains the same over time and across generations. • Easy to store on computer systems and transmit over networks. Disadvantages • May be more complex. • Needs more bandwidth. 202
  • 203. Why Digital? – Higher quality – More reproducible results – Greater reliability – Lower cost – Less maintenance – Greater functionality – Powerful self-diagnostic systems – Computer control – … 203
  • 204. Analog to Digital Conversion 204
  • 205. Analog to Digital Conversion – Sampling Frequency [Diagram: sampling at Fs = f, sample period T = 1/f] 205
  • 206. Analog to Digital Conversion 206
  • 207. Analog to Digital Conversion 207 – Sampling Frequency [Diagram: sampling at Fs = 2f, sample period T = 1/(2f)]
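The difference between the two diagrams (sampling at Fs = f vs. Fs = 2f) can be sketched numerically; this is an illustrative example, with a cosine phase chosen so the samples are non-zero:

```python
import math

f = 1.0  # tone frequency in Hz (illustrative value)

def sample_tone(fs: float, n: int):
    """Sample a cosine of frequency f at sampling rate fs, n samples."""
    return [round(math.cos(2 * math.pi * f * k / fs), 6) for k in range(n)]

# At Fs = f every sample lands at the same phase: the tone aliases to DC.
print(sample_tone(f, 4))      # [1.0, 1.0, 1.0, 1.0]
# At Fs = 2f the alternation of the waveform is still visible.
print(sample_tone(2 * f, 4))  # [1.0, -1.0, 1.0, -1.0]
```

This is the intuition behind the Nyquist requirement on the next slide: below two samples per cycle, the samples can no longer distinguish the tone from a lower-frequency (here, constant) signal.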
  • 208. Minimum Fs restriction: – Nyquist law (Fs ≥ 2 × signal bandwidth) Maximum Fs restriction: – Channel bandwidth (bit rate) Ex: for the Y signal we select Fs = 13.5 MHz because: – 13.5 MHz ≥ 2 × 5 MHz – 13.5 MHz = 864 × 15625 Hz (an integer number of samples per line) Fs Restriction in Analog to Digital Conversion 208
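Both constraints on the 13.5 MHz choice can be checked directly (a sketch using the 625-line/25 fps figures from the slide):

```python
FS = 13_500_000     # chosen luma sampling frequency, Hz
BW = 5_000_000      # analogue luma bandwidth, Hz
LINE_RATE = 15_625  # 625-line/25 fps line frequency, Hz

assert FS >= 2 * BW         # Nyquist: Fs >= 2 x signal bandwidth
assert FS % LINE_RATE == 0  # an exact integer number of samples per line
print(FS // LINE_RATE)      # 864 samples per total line
```

Line-locked sampling (864 samples per total line) keeps the sample grid aligned from line to line, which is why 13.5 MHz was preferred over the bare Nyquist minimum of 10 MHz.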
  • 210. Bit Resolution Effect (B) – Bit resolution B = 2 → 2² = 4 different digital levels 210
  • 211. Bit Resolution Effect (B) – Bit resolution B = 3 → 2³ = 8 different digital levels 211
  • 213. Minimum bit resolution restriction: signal-to-noise ratio – Signal-to-Quantization-Noise Ratio: SQNR (dB) ≈ 6.02B + 1.76 – Peak Signal-to-Noise Ratio: PSNR (dB) ≈ 6.02B + 11 Maximum bit resolution restriction: channel bandwidth (bit rate) Ex: for video B = 8, 10, 12, 14 bits Bit Resolution Restriction in Analog to Digital Conversion 213
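The two rule-of-thumb formulas can be tabulated for the video bit depths the slide lists (a sketch of the approximations, not exact measured figures):

```python
# Quantization-noise approximations from the slide:
# SQNR ~ 6.02*B + 1.76 dB, PSNR ~ 6.02*B + 11 dB.

def sqnr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

def psnr_db(bits: int) -> float:
    return 6.02 * bits + 11

for b in (8, 10, 12, 14):
    print(f"{b:2d} bits: SQNR ~ {sqnr_db(b):.1f} dB, PSNR ~ {psnr_db(b):.1f} dB")
```

The ~6 dB gained per extra bit is why moving from 8-bit to 10-bit video buys roughly 12 dB of quantization-noise headroom.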
  • 214. Analog to Digital Conversion Summary  Sampling Frequency (Fs)  Bit Resolution or Bit Depth (B)  The quality of the digital signal can be increased in two ways (both increase bandwidth, storage capacity, and complexity): • Increase the number of samples per second. • Increase the bit resolution of each sample. 214
  • 215. Sampling frequency & bit resolution 215
  • 216. 216 Sampling frequency & bit resolution
  • 217. 217 Quality of the digital signal can be increased in two ways. − Increase the number of samples per second. − Increase the bit resolution of each sample. Both of these increase bandwidth, storage capacity, and complexity. Sampling frequency & bit resolution
  • 218. 4 levels (2 bits) 16 levels (4 bits) 256 levels (8 bits) Bit Resolution Effect (B) 218
  • 219. Bit = binary digit  Graphics can also be described by the number of bits used to represent each pixel's color (the color depth). 1-bit = monochrome 8-bit = 256 colors 24-bit ≈ 16.7 million colors 32-bit ≈ 4.3 billion colors 219 Bit Resolution Effect (B)
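The color counts on the slide all follow from the same rule, 2^B values for B bits (a minimal sketch):

```python
# Number of representable colors for a given color depth B is 2**B.
def colors(bit_depth: int) -> int:
    return 2 ** bit_depth

print(colors(1))   # 2 (monochrome)
print(colors(8))   # 256
print(colors(24))  # 16777216  (~16.7 million)
print(colors(32))  # 4294967296 (~4.3 billion)
```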