Digital Image Processing
Gopal Krishan Prajapat
Image Processing Books
• Gonzalez, R. C. and Woods, R. E., "Digital Image Processing", Prentice Hall.
• Jain, A. K., "Fundamentals of Digital Image Processing", PHI Learning, 1st Ed.
• Bernd, J., "Digital Image Processing", Springer, 6th Ed.
• Burger, W. and Burge, M. J., "Principles of Digital Image Processing", Springer.
• Scherzer, O., "Handbook of Mathematical Methods in Imaging", Springer.
Sunday, December 18, 2022 2
Why do we need Image Processing?
• Improvement of pictorial information for human perception
• Image processing for autonomous machine applications
• Efficient storage and transmission
What is digital image processing?
• An image may be defined as a two-dimensional function f(x,y), where ‘x’ and ‘y’ are spatial (plane) coordinates and the amplitude of ‘f’ at any pair of coordinates (x,y) is called the intensity or gray level of the image at that point.
• When x, y, and the amplitude values of ‘f’ are all finite, discrete quantities, we call the image a digital image.
• The field of digital image processing refers to processing digital images by means of digital computers.
Image Processing Applications
• Automobile driver assistance
– Lane departure warning
– Adaptive cruise control
– Obstacle warning
• Digital Photography
– Image Enhancement
– Compression
– Color manipulation
– Image editing
– Digital cameras
• Sports analysis
– sports refereeing and commentary
– 3D visualization and tracking sports actions
Image Processing Applications(Cont…)
• Film and Video
– Editing
– Special effects
• Image Database
– Content based image retrieval
– visual search of products
– Face recognition
• Industrial Automation and Inspection
– vision-guided robotics
– Inspection systems
• Medical and Biomedical
– Surgical assistance
– Sensor fusion
– Vision based diagnosis
• Astronomy
– Astronomical Image Enhancement
– Chemical/Spectral Analysis
Image Processing Applications(Cont...)
• Aerial Photography
– Image Enhancement
– Missile Guidance
– Geological Mapping
• Robotics
– Autonomous Vehicles
• Security and Safety
– Biometry verification (face, iris)
– Surveillance (fences, swimming pools)
• Military
– Tracking and localizing
– Detection
– Missile guidance
• Traffic and Road Monitoring
– Traffic monitoring
– Adaptive traffic lights
Brief History of IP
• In the 1920s, submarine cables were used to transmit digitized newspaper pictures between London & New York, using the Bartlane cable picture transmission system.
• Specialized printing equipment (e.g., a telegraphic printer) was used to code the picture for cable transmission and to reproduce it at the receiving end.
• In 1921, the printing procedure was changed to photographic reproduction from tapes perforated at telegraph receiving terminals.
• This improved both tonal quality & resolution.
Brief History of IP(Cont…)
• The Bartlane system was capable of coding 5 distinct brightness levels. This was increased to 15 levels by 1929.
• Improvement of processing techniques continued for the next 35 years.
• In 1964, computer processing techniques were used at the Jet Propulsion Laboratory to improve pictures of the Moon transmitted by Ranger 7.
• This was the basis of modern image processing techniques.
Image Processing Steps
Components of IP System
Image Acquisition Process
Image Sensing and Acquisition
Image Sensing and Acquisition(Cont…)
• Image acquisition using a single sensor
Image Sensing and Acquisition(Cont…)
• Using sensor strips
Image Representation
An image is a 2-D light intensity function F(X,Y):
F(X,Y) = R(X,Y) · I(X,Y), where
R(X,Y) = reflectivity of the surface at the corresponding image point,
I(X,Y) = intensity of the incident light.
A digital image F(X,Y) is discretized both in spatial coordinates and in brightness. It can be considered as a matrix whose row and column indices specify a point in the image & whose element value gives the gray-level value at that point, known as a pixel (or pel).
Image Representation (Cont..)
f(x,y) =
[ f(0,0)      f(0,1)      ...   f(0,N-1)
  f(1,0)      f(1,1)      ...   f(1,N-1)
  ...         ...         ...   ...
  f(M-1,0)    f(M-1,1)    ...   f(M-1,N-1) ]
Image Representation in Matrix form
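The matrix view above can be sketched in a few lines of Python (the 3×3 gray values below are invented purely for illustration):

```python
# Viewing a digital image as an M x N matrix of gray levels.
image = [
    [0,   64, 128],
    [32,  96, 160],
    [255, 200, 10],
]

M = len(image)       # number of rows
N = len(image[0])    # number of columns

def f(x, y):
    """Gray level (pixel value) at row x, column y."""
    return image[x][y]

print(M, N, f(2, 0))  # 3 3 255
```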
Image Representation (Cont..)
f(x,y) = i(x,y) r(x,y)

f(x,y): intensity at the point (x,y)
i(x,y): illumination at the point (x,y)
(the amount of source illumination incident on the scene)
r(x,y): reflectance/transmissivity at the point (x,y)
(the amount of illumination reflected/transmitted by the object)

where 0 < i(x,y) < ∞ and 0 < r(x,y) < 1
Image Representation (Cont..)
• By the theory of real numbers, between any two given points there are infinitely many points.
• By this theory, an image would have to be represented by an infinite number of points.
• Each such image point may contain one of infinitely many possible intensity/color values, needing an infinite number of bits.
• Obviously, such a representation is not possible in any digital computer.
Image Sampling and Quantization
• From the above slides we know that we need to find some other way to represent an image in digital format.
• So we consider a discrete set of points, known as a grid, and in each rectangular grid cell record the intensity of a particular point. This process is known as sampling.
• Representing the image by a 2-D finite matrix: sampling.
• Representing each matrix element by one of a finite set of discrete values: quantization.
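The two steps can be sketched as follows, assuming a uniform quantizer and an invented continuous "scene" function:

```python
# Sampling and quantization sketch: sample a continuous scene f(x, y)
# on a rows x cols grid, then quantize each sample to one of L levels.
def quantize(intensity, L):
    # uniform quantizer: [0.0, 1.0] -> integer level 0..L-1
    level = int(intensity * (L - 1) + 0.5)
    return max(0, min(L - 1, level))

def sample(scene, rows, cols, L):
    return [[quantize(scene(x / rows, y / cols), L) for y in range(cols)]
            for x in range(rows)]

ramp = lambda x, y: y                # brightness grows left to right
img = sample(ramp, 2, 4, L=8)
print(img)  # [[0, 2, 4, 5], [0, 2, 4, 5]]
```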
Colour Image Processing
• Why do we need CIP when we can get information from a black-and-white image itself?
1. Colour is a very powerful descriptor; using the colour information we can extract the objects of interest from an image very easily, which in some cases is not so easy using a black & white or simple gray-level image.
2. Human eyes can distinguish between thousands of colours & colour shades, whereas in a black-and-white or gray-scale image we can distinguish only about a few dozen different gray levels.
Color Image processing(Cont…)
• The colour that humans perceive in an object is the light reflected from the object: illumination source → scene → reflection → human eye.
Colour Image Processing(Cont...)
• In CIP there are 2 major areas:
1. FULL CIP: the images are acquired by a full-colour TV camera or a full-colour scanner, so all the colours you perceive are present in the images.
2. PSEUDO CIP: here we try to assign certain colours to ranges of gray levels. Pseudo CIP is mostly used for human interpretation, because it is very difficult for us to distinguish between two gray-level ranges whose intensity values are very near to each other.
Colour Image Processing(Cont...)
• Problem with CIP
Interpretation of colour by the human eye is a psychophysiological problem, and we do not yet fully understand the mechanism by which we really interpret a colour.
Colour Image Processing(Cont...)
• In 1666, Isaac Newton discovered the colour spectrum using an optical prism.
Colour Image Processing(Cont...)
• We perceive colour depending on the nature of the light reflected by the object's surface.
• The visible range of the light (energy) spectrum, within which we are able to perceive colour, spans roughly 400 nm to 700 nm.
Colour Image Processing(Cont...)
• Attribute of Light
Achromatic light: light which has no color component, i.e., the only attribute which describes it is its intensity.
Chromatic light: contains a color component.
• 3 quantities that describe the quality of light:
Radiance
Luminance
Brightness
Colour Image Processing(Cont...)
• Radiance: the total amount of energy that flows from the light source (unit: watts).
• Luminance: the amount of energy perceived by an observer (unit: lumens).
• Brightness: a subjective notion; practically, we cannot measure brightness.
We have 3 primary colors:
Red
Blue
Green
Colour Image Processing(Cont...)
• Newton identified 7 different colors, but only 3 of them, red, green and blue, are the primary colors. Why?
Because by mixing these 3 colors in suitable proportions we can get all the other colors.
There are around 6-7 million cone cells in our eyes which are responsible for color sensation.
Around 65% of the cone cells are sensitive to red.
Around 33% of the cone cells are sensitive to green.
Around 2% of the cone cells are sensitive to blue.
Colour Image Processing(Cont...)
• According to the CIE standard:
Red has wavelength 700 nm
Green has wavelength 546.1 nm
Blue has wavelength 435.8 nm
But, practically:
Red sensitivity spans roughly 450 nm to 700 nm
Green sensitivity spans roughly 400 nm to 650 nm
Blue sensitivity spans roughly 400 nm to 550 nm
Colour Image Processing(Cont...)
• Note: in practice no single wavelength specifies any particular color.
• The spectrum also shows that there are no clear-cut boundaries between any two colors: one color merges slowly and smoothly into the next.
• So we can say that bands of color give the red, green and blue sensations respectively.
Colour Image Processing(Cont...)
• Mixing the primary colors generates the secondary colors:
 RED + BLUE = MAGENTA
 GREEN + BLUE = CYAN
 RED + GREEN = YELLOW
• Here red, green and blue are the primary colors and magenta, cyan and yellow are the secondary colors.
• Pigments: a primary color of a pigment is defined by the wavelengths it absorbs; it reflects the other wavelengths.
Colour Image Processing(Cont...)
• The primary colors of pigments are the opposites of the primary colors of light, i.e., magenta, cyan and yellow are the primary colors of pigments.
• If we mix red, green and blue light in appropriate proportions we get white light; similarly, when we mix magenta, cyan and yellow pigments we get black.
Colour Image Processing(Cont...)
• For hardware, i.e., cameras, printers, display devices and scanners, this concept of primary color components is used.
• But when we humans perceive a color, we do not think about how much red, green and blue is mixed into that particular color.
• So the attributes by which we humans differentiate, recognize or distinguish colors are: brightness, hue and saturation.
Colour Image Processing(Cont...)
• Spectrum colors are not diluted, i.e., they are fully saturated: no white light (white component) is added to them.
• Example: pink is not a spectrum color.
Red + white = pink
Here the red is fully saturated; the pink is not.
• So hue + saturation together indicate the chromaticity of light, and brightness gives the sensation of intensity.
Colour Image Processing(Cont...)
• Brightness: the achromatic notion of intensity.
• Hue: the dominant wavelength present in a mixture of colors.
• Saturation: e.g., when we say a color is red, we may have various shades of red. Saturation indicates the purity of the red, i.e., how much white light has been mixed into the color to dilute it.
Colour Image Processing(Cont...)
• The amounts of red, green and blue needed to form a particular color are known as its tristimulus values, (X, Y, Z).
• The trichromatic coefficients are x = X/(X+Y+Z) for red, y = Y/(X+Y+Z) for green and z = Z/(X+Y+Z) for blue.
• Note that x + y + z = 1.
• So any color can be specified by its trichromatic coefficients, or equivalently by a point on a chromaticity diagram.
Colour Image Processing(Cont...)
• Since z = 1 - (x + y), a chromaticity diagram plots x against y. Around the boundary of the diagram lie all the spectrum colors, and the point of equal energy corresponds to white.
Colour Image Processing(Cont...)
• Color Models: a coordinate system within which a specified color is represented by a single point.
• RGB, CMY, CMYK: hardware oriented.
• HSI (Hue, Saturation, Intensity): application oriented / perception oriented.
• In the HSI model, the I component gives the gray-scale information, while H & S taken together give the chromatic information.
Colour Image Processing(Cont...)
• RGB Color Model: here a color is represented by its 3 primary components, red, green and blue.
• A 24-bit RGB image allows 2^24 = 16,777,216 different color combinations; in practice, only 216 of these are the "safe" colors that are reproduced reliably across different systems.
• The RGB color model is based on the Cartesian coordinate system.
• This is an additive color model.
• Active displays, such as computer monitors and television sets, emit combinations of red, green and blue light.
Colour Image Processing(Cont...)
• RGB Color Model
Colour Image Processing(Cont...)
• RGB Color Model
• RGB 24-bit color cube is shown below
Colour Image Processing(Cont...)
• RGB example: an original image and its red, green and blue bands.
Colour Image Processing(Cont...)
• CMY Color Model: the secondary colors of light, or the primary colors of pigments; used to generate hardcopy output.
Passive displays, such as colour inkjet printers, absorb light instead of emitting it. Combinations of cyan, magenta and yellow inks are used. This is a subtractive colour model. (Source: www.hp.com)
Colour Image Processing(Cont...)
• Equal proportions of CMY give a muddy black, not a pure black. So, to get a pure black with CMY, another component is specified, known as the black component, giving the CMYK model.
• In CMYK, "K" is the black component.
[C]   [1]   [R]
[M] = [1] - [G]
[Y]   [1]   [B]

i.e., C = 1 - R, M = 1 - G, Y = 1 - B, with R, G, B normalized to [0, 1].
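The conversion can be sketched directly from the relation above (channels assumed normalized to [0, 1]):

```python
# RGB -> CMY conversion sketch: C = 1 - R, M = 1 - G, Y = 1 - B.
def rgb_to_cmy(r, g, b):
    return (1.0 - r, 1.0 - g, 1.0 - b)

# Pure red contains no cyan but full magenta and yellow:
print(rgb_to_cmy(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0)
# White maps to no ink at all:
print(rgb_to_cmy(1.0, 1.0, 1.0))  # (0.0, 0.0, 0.0)
```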
Colour Image Processing(Cont...)
• HSI Color Model (based on human perception of colors)
• H = the dominant color perceived in a particular color; a subjective measure of color.
• S = how much a pure spectrum color is diluted by mixing white light into it, i.e., mixing more "white" with a color reduces its saturation. Mixing white in different proportions with a color gives different shades of that color.
• I = the achromatic notion of brightness, as in a black-and-white image, i.e., the brightness or darkness of an object.
Colour Image Processing(Cont...)
• HSI Color Model
H: dominant wavelength; S: purity (% white); I: intensity.
Colour Image Processing(Cont...)
• HSI Color Model
RGB -> HSI model
Colour Image Processing(Cont...)
• Pseudo-color Image Processing
Assign colors to gray values based on a specified
criterion
For human visualization and interpretation of
gray-scale events
Intensity slicing
Gray level to color transformations
Colour Image Processing(Cont...)
• Pseudo-color Image Processing(cont…)
Intensity slicing
 First consider the intensity image as a 3-D surface, with intensity plotted above the xy-plane.
 Place a plane parallel to the xy-plane; it slices the surface into two halves.
 We can assign a different color to each side of the plane, i.e., any pixel whose intensity level is above the plane is coded with one color and any pixel below the plane is coded with the other.
 Levels that lie on the plane itself may be arbitrarily assigned one of the two colors.
Colour Image Processing(Cont...)
Intensity slicing
 Geometric interpretation of the intensity slicing
technique
Colour Image Processing(Cont...)
Intensity slicing
 Let the image have ‘L’ intensity values in total: 0 to (L-1).
 l0 corresponds to black [f(x,y) = 0].
 lL-1 corresponds to white [f(x,y) = L-1].
 Suppose ‘P’ planes perpendicular to the intensity axis (i.e., parallel to the image plane) are placed at the intensity values l1, l2, l3, ..., lP.
 where 0 < P < L-1.
Colour Image Processing(Cont...)
• Intensity slicing
 The P planes partition the gray scale (intensity) into (P+1) intervals, V1, V2, V3, ..., VP+1.
 The color assigned to location (x,y) is given by the relation
f(x,y) = ck if f(x,y) ∈ Vk
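The slicing rule can be sketched as follows; the slicing level 127 and the colour codes 0/1 are illustrative assumptions:

```python
# Intensity slicing sketch: P slicing levels partition the gray scale
# into P + 1 intervals; every pixel gets the colour code of its interval.
def intensity_slice(image, levels, colors):
    # levels: sorted slicing values l1..lP; colors: P + 1 colour codes
    def code(pixel):
        k = sum(1 for l in levels if pixel > l)  # index of interval V(k+1)
        return colors[k]
    return [[code(p) for p in row] for row in image]

img = [[10, 100, 200]]
print(intensity_slice(img, [127], [0, 1]))  # [[0, 0, 1]]
```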
Colour Image Processing(Cont...)
• Intensity slicing
 Give the ROI (region of interest) one color and the rest another color.
 Keep the ROI as it is and assign one color to the rest.
 Keep the rest as it is and give the ROI one color.
Colour Image Processing(Cont...)
• Pseudo-coloring is also used for gray-to-color image transformation.
• Gray level to color transformation:
fR(x,y) = f(x,y)
fG(x,y) = 0.33 f(x,y)
fB(x,y) = 0.11 f(x,y)
 Combining these 3 planes we get the pseudo-color image.
 Application of pseudo CIP: baggage-screening machines used at railways and airports.
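The three channel maps above can be sketched directly (the 1×2 gray image is invented for the example):

```python
# Gray-to-colour transformation sketch using the channel weights from
# the slide: fR = f, fG = 0.33*f, fB = 0.11*f.
def gray_to_pseudocolor(gray):
    red   = [[p        for p in row] for row in gray]
    green = [[0.33 * p for p in row] for row in gray]
    blue  = [[0.11 * p for p in row] for row in gray]
    return red, green, blue

r, g, b = gray_to_pseudocolor([[100, 200]])
print(r[0], g[0], b[0])
```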
Image Enhancement
• Intensity Transformation Functions
• Enhancing an image provides better contrast and more visible detail compared to the non-enhanced image. Image enhancement has many applications: it is used to enhance medical images, images captured in remote sensing, satellite images, etc.
• The transformation function is given below:
s = T(r)
• where r is a pixel value of the input image and s is the corresponding pixel value of the output image. T is a transformation function that maps each value of r to a value of s.
Image Enhancement(Cont…)
• Image enhancement can be done through the gray level transformations discussed below.
• There are three basic gray level transformations:
• Linear
• Logarithmic
• Power-law
Image Enhancement(Cont…)
• Linear Transformation
 Linear transformation includes the simple identity and negative transformations.
 The identity transformation is shown by a straight line: each value of the input image is mapped directly to the same value of the output image, so the input and output images are identical; hence it is called the identity transformation.
• Negative Transformation
 The second linear transformation is the negative transformation, which is the inverse of the identity transformation. In the negative transformation, each value of the input image is subtracted from L-1 and mapped onto the output image.
Image Enhancement(Cont…)
Image Enhancement(Cont…)
• Negative Transformation
s = (L - 1) - r
For an 8-bit image (L = 256): s = 255 - r
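A minimal sketch of the negative transformation for an L-level image:

```python
# Image negative sketch: s = (L - 1) - r for every pixel r.
def negative(image, L=256):
    return [[(L - 1) - r for r in row] for row in image]

print(negative([[0, 100, 255]]))  # [[255, 155, 0]]
```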
Image Enhancement(Cont…)
• Logarithmic Transformations
 The log transformation is defined by the formula
s = c log(r + 1)
 where s and r are the pixel values of the output and the input image and c is a constant. The value 1 is added to each pixel value of the input image because if a pixel has intensity 0, log(0) is undefined; adding 1 makes the argument of the logarithm at least 1.
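A sketch of the formula, assuming c is chosen as (L-1)/log(L) so the full 8-bit range maps back onto 0..255:

```python
import math

# Log transformation sketch: s = c * log(1 + r).
def log_transform(image, L=256):
    c = (L - 1) / math.log(L)
    return [[round(c * math.log(1 + r)) for r in row] for row in image]

print(log_transform([[0, 1, 255]]))  # [[0, 32, 255]] -- dark values expand
```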
Image Enhancement(Cont…)
• Logarithmic Transformations
 The log transformation expands the dynamic range of the dark intensities (the dark pixel values are increased, revealing detail we could not otherwise see) while compressing the range of the bright values.
 The log transformation is mainly applied in the frequency domain, e.g., to display Fourier spectra, where the maximum information is concentrated near the center.
Image Enhancement(Cont…)
• Power-Law Transformations
s = c r^γ, where c and γ are positive constants.
• The symbol γ is called gamma, which is why this transformation is also known as the gamma transformation.
• It curves the grayscale components either to brighten the intensity (when γ < 1) or to darken the intensity (when γ > 1).
Image Enhancement(Cont…)
• Power-Law Transformations
• Varying the value of γ varies the enhancement of the image. Different display devices/monitors have their own gamma correction, which is why they display the same image at different intensities.
• This type of transformation is used to prepare images for different types of display devices, since the gamma of each display device is different. For example, the gamma of a CRT lies between 1.8 and 2.5, which means an uncorrected image displayed on a CRT appears dark.
Image Enhancement(Cont…)
• Power-Law Transformations
 Gamma Correction
 Different cameras and video recorders do not capture luminance correctly (they are not linear), and different display devices (monitor, phone screen, TV) do not display luminance correctly either. So one needs to correct for this, and the gamma correction function is used to correct an image's luminance:
s = c r^γ, e.g., s = c r^(1/2.5) to pre-correct for a display whose gamma is 2.5.
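A sketch of the correction, with intensities normalized to [0, 1] and the display gamma of 2.5 taken as an assumption:

```python
# Gamma correction sketch: s = c * r**(1/gamma); exponent 1/2.5
# pre-compensates a display whose gamma is assumed to be 2.5.
def gamma_correct(intensity, gamma=2.5, c=1.0):
    return c * intensity ** (1.0 / gamma)

mid = gamma_correct(0.25)
print(mid > 0.25)  # True -- mid-dark inputs are brightened before display
```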
Sunday, December 18, 2022 74
Image Enhancement(Cont…)
Sunday, December 18, 2022 75
Image Enhancement(Cont…)
Sunday, December 18, 2022 76
Image Enhancement(Cont…)
• Piecewise-Linear Transformation Functions
 Three types:
 Contrast Stretching
 Intensity Level Slicing
 Bit-Plane Slicing
Image Enhancement(Cont…)
• Contrast stretching
 Aims to increase the dynamic range of the gray levels in the image being processed.
 Contrast stretching is a process that expands the range of intensity levels in an image so that it spans the full intensity range of the recording medium or display device.
 Contrast-stretching transformations increase the contrast between the darks and the lights.
Image Enhancement(Cont…)
• Contrast stretching
 The locations of (r1, s1) and (r2, s2) control the shape of the transformation function.
– If r1 = s1 and r2 = s2, the transformation is a linear (identity) function and produces no change.
– If r1 = r2, s1 = 0 and s2 = L-1, the transformation becomes a thresholding function that creates a binary image.
– Intermediate values of (r1, s1) and (r2, s2) produce various degrees of spread in the gray levels of the output image, thus affecting its contrast.
– Generally, r1 ≤ r2 and s1 ≤ s2 is assumed.
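The three-segment mapping can be sketched as follows (the middle branch assumes r1 < r2; the control points in the calls are illustrative):

```python
# Piecewise-linear contrast stretching with control points
# (r1, s1) and (r2, s2) over an L-level gray scale.
def stretch(r, r1, s1, r2, s2, L=256):
    if r < r1:
        return s1 * r / r1 if r1 > 0 else 0.0
    if r <= r2:
        return s1 + (s2 - s1) * (r - r1) / (r2 - r1)
    return s2 + (L - 1 - s2) * (r - r2) / (L - 1 - r2)

# r1 = s1 and r2 = s2 gives the identity mapping (no change):
print(stretch(100, 64, 64, 192, 192))   # 100.0
# s1 = 0, s2 = L-1 with a narrow [r1, r2] pushes values toward binary:
print(stretch(200, 120, 0, 136, 255))   # 255.0
```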
Thresholding function
Image Enhancement(Cont…)
• Intensity-level slicing
 Highlights a specific range of gray levels in an image.
 One way is to display a high value for all gray levels in the range of interest and a low value for all other gray levels (producing a binary image).
 The second approach is to brighten the desired range of gray levels but preserve the background and gray-level tonalities of the image.
Image Enhancement(Cont…)
• Bit-Plane Slicing
• Highlights the contribution made to the total image appearance by specific bits.
– E.g., assuming each pixel is represented by 8 bits, the image is composed of eight 1-bit planes.
– Plane 0 contains the least significant bit and plane 7 contains the most significant bit.
– Only the higher-order bits (the top four) contain visually significant data; the other bit planes contribute the more subtle details.
– Plane 7 corresponds exactly to the image thresholded at gray level 128.
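Extracting a bit plane is a single shift-and-mask per pixel; a minimal sketch:

```python
# Bit-plane slicing sketch for an 8-bit image: plane k keeps bit k of
# every pixel (plane 7 is the most significant bit).
def bit_plane(image, k):
    return [[(pixel >> k) & 1 for pixel in row] for row in image]

img = [[128, 127, 200]]
print(bit_plane(img, 7))  # [[1, 0, 1]] -- same as thresholding at 128
print(bit_plane(img, 0))  # [[0, 1, 0]] -- least significant bit
```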
Image Enhancement(Cont…)
• Histogram Processing
 Two types: (a) histogram stretching, (b) histogram equalization.
 Histogram Stretching
 Contrast is the difference between the maximum and minimum pixel intensity.
 A histogram is a pictorial representation of the distribution of pixel values, telling us the frequency of each intensity.
Image Enhancement(Cont…)
• The histogram of a digital image with gray values r0, r1, ..., rL-1 is the discrete function

p(rk) = nk / n

nk: number of pixels with gray value rk
n: total number of pixels in the image
The function p(rk) represents the fraction of the total number of pixels with gray value rk.
The shape of a histogram provides useful information for contrast enhancement.
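Computing the normalized histogram is a straightforward counting pass; a minimal sketch (the tiny 1×4 image is invented for the example):

```python
# Normalized histogram sketch: p(rk) = nk / n for gray values 0..L-1.
def histogram(image, L):
    counts = [0] * L
    n = 0
    for row in image:
        for pixel in row:
            counts[pixel] += 1
            n += 1
    return [c / n for c in counts]

p = histogram([[0, 0, 1, 7]], L=8)
print(p)  # [0.5, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25]
```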
Image Enhancement(Cont…)
• Histogram Processing
Dark image
Bright image
Image Enhancement(Cont…)
• Histogram Processing
Low contrast image
High contrast image
Image Enhancement(Cont…)
• Histogram Stretching
Image Enhancement(Cont…)
• Histogram Stretching (cont…)
• The stretching formula maps the input range [rmin, rmax] onto the output range [smin, smax]:
(s - smin) = ((smax - smin) / (rmax - rmin)) (r - rmin)
• In the example, smin = 0 and smax = 8, with rmin = 0 and rmax = 4 given:
s - 0 = ((8 - 0) / (4 - 0)) (r - 0)
s = (8/4) r = 2r
• Now we have a relation between r and s, so we can compute the stretched value of ‘s’ for every ‘r’ from rmin to rmax.
Image Enhancement(Cont…)
• Histogram Equalization
– Recalculate the picture gray levels to make the
distribution more equalized
– Used widely in image editing tools and computer
vision algorithms
– Can also be applied to color images
Objective of histogram equalization
• We want to find T(r) so that ps(s) is a flat line.
• The original image has gray levels r in [0, L-1] with some random input distribution pr(r): the probability of some levels is higher, of others lower. After the transformation s = T(r), the equalized distribution should satisfy ps(s) = a constant, i.e., the probability of all levels is the same.

We want to prove ps(s) = constant when s = T(r):
• Basic probability theory: since ∫ ps(s) ds = ∫ pr(r) dr = 1, differentiating both sides gives
ps(s) = pr(r) (dr/ds)          (1)
• The transformation used for histogram equalization is
s = T(r) = (L-1) ∫₀ʳ pr(w) dw          (2)
• Fundamental theorem of calculus:
(d/dx) ∫ₐˣ f(t) dt = f(x)          (3)
• Exercise: continue with formulas (1), (2) and (3) to show ps(s) = 1/(L-1), a constant:
ds/dr = dT(r)/dr = (L-1) (d/dr) ∫₀ʳ pr(w) dw = (L-1) pr(r)
so ps(s) = pr(r) / [(L-1) pr(r)] = 1/(L-1) = constant.
Image Enhancement(Cont…)
• Histogram Equalization
• Let rk, k ∈ [0..L-1], be the intensity levels and let p(rk) be the normalized histogram function.
• Histogram equalization applies a transformation s = T(r) to each gray level r in [0, L-1].
• As T(r) is continuous & differentiable,
∫ ps(s) ds = ∫ pr(r) dr = 1;
differentiating both sides w.r.t. ‘s’ gives ps(s) = pr(r) (dr/ds).
Image Enhancement(Cont…)
• Histogram Equalization(cont…)
 So, eq. (1): ps(s) = pr(r) (dr/ds)
 The transformation function T(r) for histogram equalization is:
s = T(r) = (L-1) ∫₀ʳ pr(w) dw
 Differentiating w.r.t. ‘r’:
ds/dr = dT(r)/dr = (L-1) (d/dr) ∫₀ʳ pr(w) dw = (L-1) pr(r)
 From eq. (1) we get ps(s) = 1/(L-1), which is a constant.
Histogram Equalization : Discrete form for practical use
• From the continuous form (1) to the discrete form:
sk = T(rk) = (L-1) Σj=0..k pr(rj) = ((L-1)/MN) Σj=0..k nj ,  k = 0, 1, 2, ..., L-1
• Recall that to obtain a normalized histogram we take pr(rk) = nk/MN, where nk is the number of pixels with gray value rk and MN is the total number of pixels, so that Σ pr(rk) = 1.
Histogram Equalization - Example
• Let f be an image of size 64×64 pixels (MN = 4096) with L = 8, and let f have the intensity distribution shown in the table:

rk   nk     pr(rk) = nk/MN
0    790    0.19
1    1023   0.25
2    850    0.21
3    656    0.16
4    329    0.08
5    245    0.06
6    122    0.03
7    81     0.02

• Applying sk = T(rk) = 7 Σj=0..k pr(rj):
s0 = 7 pr(r0) = 1.33
s1 = 7 (pr(r0) + pr(r1)) = 3.08
s2 = 4.55, s3 = 5.67, s4 = 6.23, s5 = 6.65, s6 = 6.86, s7 = 7.00
• Round the values to the nearest integer: s0 = 1, s1 = 3, s2 = 5, s3 = 6, s4 = 6, s5 = 7, s6 = 7, s7 = 7.
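The discrete form is easy to implement as a running cumulative sum; a minimal sketch, applied to the 64×64, L = 8 distribution above:

```python
# Discrete histogram equalization sketch:
# sk = round((L - 1) * sum_{j<=k} nj / MN).
def equalize_levels(counts, L):
    MN = sum(counts)
    s, cum = [], 0
    for nk in counts:
        cum += nk
        s.append(round((L - 1) * cum / MN))
    return s

counts = [790, 1023, 850, 656, 329, 245, 122, 81]
print(equalize_levels(counts, 8))  # [1, 3, 5, 6, 6, 7, 7, 7]
```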
Filtering
• Image filtering is used to:
 Remove noise
 Sharpen contrast
 Highlight contours
 Detect edges
 Image filters can be classified as linear or nonlinear.
 Linear filters are also known as convolution filters, as they can be represented using a matrix multiplication.
 Thresholding and image equalisation are examples of nonlinear operations, as is the median filter.
Filtering(cont…)
• There are two types of processing:
• Point Processing (eg. Histogram equalization)
• Mask Processing
 Two types of filtering methods:
• Smoothing
Linear (Average Filter) and Non-Linear (Median
Filter)
• Sharpening
Laplacian
Gradient
Filtering(Cont…)
• Correlation [ 1-D & 2-D]
• Convolution [ 1-D & 2-D]
• In correlation we apply the weight mask directly to get the output image; for convolution we first rotate the weight mask by 180 degrees.
• E.g., the 1-D weight [1 2 3] becomes [3 2 1] after 180-degree rotation.
• The 2-D weight
1 2 3
4 5 6
7 8 9
becomes, after 180-degree rotation,
9 8 7
6 5 4
3 2 1
Filtering(Cont…)
• 1-D Correlation
• I = [1 2 3 4]
• W = [1 2 3]
• Output (with zero padding at the borders):
[(2*1)+(3*2)] = 8
[(1*1)+(2*2)+(3*3)] = 14
[(1*2)+(2*3)+(3*4)] = 20
[(1*3)+(2*4)] = 11
so the output is [8 14 20 11].
• For convolution, just rotate the mask 180 degrees first.
Filtering(Cont…)
• A filtering method is linear when the output is a
weighted sum of the input pixels. Eg. Average filter
• Methods that do not satisfy the above property are
called non-linear. Eg. Median filter
• Average (or mean) filtering is a method of ‘smoothing’
images by reducing the amount of intensity variation
between neighbouring pixels.
• The average filter works by moving through the image
pixel by pixel, replacing each value with the average value
of neighbouring pixels, including itself.
Filtering(Cont…)
• Average filter mask (2-D): a 3×3 mask whose nine coefficients are all 1/9.
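A minimal sketch of the 3×3 average filter; border pixels are left unchanged here for simplicity (real implementations pad or replicate the edges):

```python
# 3x3 average (mean) filter sketch.
def average_filter(image):
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    for x in range(1, rows - 1):
        for y in range(1, cols - 1):
            s = sum(image[x + dx][y + dy]
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1))
            out[x][y] = s // 9
    return out

img = [[10, 10, 10],
       [10, 100, 10],
       [10, 10, 10]]
print(average_filter(img)[1][1])  # 20 -- the noise spike is smoothed
```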
Filtering(Cont…)
• When we apply the average filter, noise is removed but blurring is introduced; to reduce the blurring we use a weighted average filter.
Filtering(Cont…)
• Median Filter (non-linear filter)
• Very effective in removing salt and pepper or impulsive noise
while preserving image detail
• Disadvantages: computational complexity; it is a non-linear filter
• The median filter works by moving through the image pixel by
pixel, replacing each value with the median value of
neighbouring pixels.
• The pattern of neighbours is called the "window", which slides,
pixel by pixel, over the entire image.
• The median is calculated by first sorting all the pixel values
from the window into numerical order, and then replacing the
pixel being considered with the middle (median) pixel value.
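The sliding-window procedure above can be sketched as follows (edge values replicated at the borders; an illustrative sketch):

```python
import numpy as np

def median_filter(img, size=3):
    """Slide a size x size window over the image and replace each pixel
    with the median of the window (edge values replicated)."""
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(padded[y:y + size, x:x + size])
    return out

noisy = np.array([[10, 10, 10],
                  [10, 255, 10],   # salt (impulse) noise
                  [10, 10, 10]])
print(median_filter(noisy))  # the impulse is removed entirely
```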
Filtering(Cont…)
• Median Filter Example:
• From left to right: the results of a 3 x 3, 5 x 5 and 7 x 7 median filter
Filtering(Cont…)
 Sharpening (high-pass filtering) is performed by noting only the
gray-level changes in the image, i.e., by differentiation.
• Sharpening is used for edge detection, line detection and point
detection, and it also highlights changes.
 Operation of Image Differentiation
• Enhance edges and discontinuities (magnitude of output gray
level >>0)
• De-emphasize areas with slowly varying gray-level values
(output gray level: 0)
 Mathematical Basis of Filtering for Image Sharpening
• First-order and second-order derivatives
• Approximation in discrete-space domain
• Implementation by mask filtering
Filtering(Cont…)
 Common sharpening filters:
• Gradient (1st order derivative)
• Laplacian (2nd order derivative)
• Taking the derivative of an image results in sharpening
the image.
• The derivative of an image (i.e., 2D function) can be
computed using the gradient.
Filtering(Cont…)
 Gradient (rotation variant or non-isotropic)
One mask is sensitive to vertical edges, the other to horizontal edges.
Filtering(Cont…)
 Gradient
Kernels used in Prewitt edge detection
Filtering(Cont…)
• Laplacian
Original mask, with C = +1 or C = −1
Filtering(Cont…)
• Laplacian(rotation invariant or isotropic)
(b) The extended Laplacian mask increases sharpness: it also covers the
diagonal neighbours, so it gives better results.
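The Laplacian sharpening above, g = f + C·∇²f, can be sketched as follows (an illustrative sketch using the 4-neighbour mask with a negative centre coefficient, so C = −1):

```python
import numpy as np

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]])

def laplacian_sharpen(img, c=-1):
    """Sharpen by g = f + c * (Laplacian of f); c = -1 because the mask's
    centre coefficient is negative (borders replicated)."""
    padded = np.pad(img.astype(float), 1, mode='edge')
    lap = np.zeros(img.shape)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            lap[y, x] = (padded[y:y + 3, x:x + 3] * LAPLACIAN).sum()
    return img + c * lap
```

On a perfectly flat region the Laplacian is zero, so the image is returned unchanged; near edges the filter adds over/undershoot, which is what makes them look sharper.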
Image Transforms
• Many times, image processing tasks are best
performed in a domain other than the spatial
domain.
• Key steps
(1) Transform the image
(2) Carry out the task(s) in the transformed domain.
(3) Apply inverse transform to return to the spatial
domain.
Math Review - Complex numbers
• Real numbers: 1, −5.2, π
• Complex numbers: 4.2 + 3.7i, 9.4447 − 6.7i, −5.2 (= −5.2 + 0i),
where i = √−1
(In EE, i is often denoted by j.)
Math Review - Complex numbers
• Complex numbers
4.2 + 3.7i
9.4447 – 6.7i
-5.2 (-5.2 + 0i)
• General Form
Z = a + bi
Re(Z) = a
Im(Z) = b
• Amplitude
A = | Z | = √(a² + b²)
• Phase
φ = ∠Z = tan⁻¹(b/a)
Re(Z) and Im(Z) are the real and imaginary parts.
Math Review – Complex Numbers
• Polar Coordinate
Z = a + bi
• Amplitude
A = √(a² + b²)
• Phase
φ = tan⁻¹(b/a)
Math Review – Complex Numbers and
Cosine Waves
• Cosine wave has three properties
– Frequency
– Amplitude
– Phase
• Complex number has two properties
– Amplitude
– Phase
• Complex numbers to represent cosine waves at varying frequency
– Frequency 1: Z1 = 5 +2i
– Frequency 2: Z2 = -3 + 4i
– Frequency 3: Z3 = 1.3 – 1.6i
Simple but great idea !!
Fourier Transforms & its Properties
• Jean Baptiste Joseph Fourier (1768-1830)
• Had crazy idea (1807):
• Any periodic function can be
rewritten as a weighted sum of
Sines and Cosines of different
frequencies.
• Don’t believe it?
– Neither did Lagrange,
Laplace, Poisson and other
big wigs
– Not translated into English
until 1878!
• But it’s true!
– called Fourier Series
– Possibly the greatest tool
used in Engineering
Fourier Transforms & its Properties
• In image processing:
– Instead of time domain: spatial domain (normal image
space)
– frequency domain: space in which each image value at
image position F represents the amount that the
intensity values in image I vary over a specific distance
related to F
Fourier Transforms & its Properties
• Fourier Transforms & Inverse Fourier Transforms
Fourier Transforms & its Properties
• As we deal with 2-D discrete images, we need the 2-D discrete
Fourier transform.
Fourier Transforms & its Properties
• Inverse F.T
Fourier Transforms & its Properties
• If the image is a square array, i.e., M = N, then the F.T. and
I.F.T. are given by:
Fourier Transforms & its Properties
• Separability property
Fourier Transforms & its Properties
• Periodicity: The DFT and its inverse are periodic with
period N.
Fourier Transforms & its Properties
• Scaling: If a signal is multiplied by a scalar quantity ‘a’,
then its Fourier transform is also multiplied by the same
scalar quantity ‘a’.
Fourier Transforms & its Properties
• Distributivity: the DFT is distributive over addition,
F{f1 + f2} = F{f1} + F{f2}, but not over multiplication:
F{f1 · f2} ≠ F{f1} · F{f2}
Fourier Transforms & its Properties
• Average (with the 1/MN factor in the forward transform):
f̄ = (1/MN) ΣΣ f(x,y)
F(u,v) at u = 0, v = 0:
F(0,0) = (1/MN) ΣΣ f(x,y)
So: f̄ = F(0,0)
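The average property can be checked numerically; note that numpy's DFT convention puts no 1/MN factor in the forward transform, so there F(0,0) equals the plain sum of the pixels and must be divided by MN:

```python
import numpy as np

# Sketch using numpy's unnormalized DFT convention, where F(0,0) is the
# plain sum of the pixels, so mean = F(0,0) / (M*N).
img = np.arange(16, dtype=float).reshape(4, 4)
F = np.fft.fft2(img)
mean_from_dft = F[0, 0].real / img.size
print(mean_from_dft, img.mean())  # both 7.5
```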
Frequency Domain Filters
• Low-pass filter: allows the low-frequency range of the signal to pass
to the output (useful for noise suppression).
• High-pass filter: allows the high-frequency range to pass to the
output (useful for edge detection).
• D(u,v) is the distance of (u,v) in the frequency domain from the
origin of the frequency rectangle.
• D0 is the cut-off frequency: all frequencies with D(u,v) ≤ D0 are
passed to the output; the rest are blocked.
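An ideal low-pass filter can be sketched as below (an illustrative sketch using a centred spectrum):

```python
import numpy as np

def ideal_lowpass(img, d0):
    """Ideal low-pass filter: keep frequencies with D(u,v) <= d0,
    zero the rest (spectrum centred with fftshift)."""
    M, N = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))
    u = np.arange(M) - M // 2
    v = np.arange(N) - N // 2
    D = np.sqrt(u[:, None]**2 + v[None, :]**2)  # distance from origin
    H = (D <= d0).astype(float)                 # 1 inside cut-off, 0 outside
    return np.fft.ifft2(np.fft.ifftshift(F * H)).real
```

A perfectly flat image has only a DC component, so it passes through unchanged for any d0 ≥ 0; on a real image the sharp cut-off produces the ringing discussed below.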
Frequency Domain Filters
• In the above example, for the same cut-off frequency the blurring is
greater with the ideal low-pass filter than with the Butterworth
filter, and the ringing artifacts (undesired lines) are more
pronounced in the ideal low-pass filter than in the Butterworth
filter.
Image Restoration
• Image restoration and image enhancement share a
common goal: to improve image for human perception
• Image enhancement is mainly a subjective process in
which individuals’ opinions are involved in process
design.
• Image restoration is mostly an objective process which:
• utilizes a priori knowledge of the degradation phenomenon to
recover the image;
• models the degradation and then recovers the original
image.
• The objective of restoration is to obtain an image
estimate which is as close as possible to the original
input image.
Image Restoration
If H is a linear, position-invariant process (filter), the degraded
image is given in the spatial domain by:
g(x,y) = f(x,y)*h(x,y) + η(x,y)
whose equivalent frequency-domain representation is:
G(u,v) = F(u,v)H(u,v) + N(u,v)
where h(x,y) is the system that causes image distortion and η(x,y) is additive noise.
Image Restoration
g(x,y) = f(x,y) + η(x,y)
G(u,v) = F(u,v) + N(u,v)
Image Restoration
• Homomorphic Filter
• In some images, the quality of the image has reduced
because of non-uniform illumination.
• Homomorphic filtering can be used to perform illumination
correction.
 We can view an image f(x,y) as the product of two components:
 f(x,y) = i(x,y) · r(x,y), where
 r(x,y) = reflectivity of the surface of the corresponding image
point;
 i(x,y) = intensity of the incident light.
 The above equation is known as the illumination-reflectance
model.
Image Restoration
• The illumination-reflectance model can be used to
address the problem of improving the quality of
an image that has been acquired under poor
illumination conditions.
• For many images, the illumination is the primary
contributor to the dynamic range and varies
slowly in space. While reflectance component
r(x,y) represents the details of object and varies
rapidly in space.
Image Restoration
• The illumination and reflectance components are to be handled
separately. The logarithm of the input function f(x,y) is taken
because f(x,y) is the product of i(x,y) and r(x,y); the log of
f(x,y) separates the components as illustrated below:
ln[f(x,y)] = ln[i(x,y) · r(x,y)]
ln[f(x,y)] = ln[i(x,y)] + ln[r(x,y)]
• Taking Fourier transforms of the above equation:
F(u,v) = FI(u,v) + FR(u,v)
where FI(u,v) & FR(u,v) are the Fourier transforms of the (log)
illumination and reflectance components respectively.
Image Restoration
• Then the desired filter function H(u,v) can be applied
separately to the illumination and reflectance components as
shown below:
F(u,v)·H(u,v) = FI(u,v)·H(u,v) + FR(u,v)·H(u,v)
• In order to visualize the image, the inverse Fourier transform is
applied:
s(x,y) = F⁻¹[F(u,v)·H(u,v)] = F⁻¹[FI(u,v)·H(u,v)] + F⁻¹[FR(u,v)·H(u,v)]
• The desired enhanced image is obtained by taking the
exponential operation: g(x,y) = e^s(x,y)
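The whole homomorphic pipeline (log → FFT → filter → inverse FFT → exp) can be sketched as below; the Gaussian high-emphasis filter and the parameters gamma_l, gamma_h, d0 are illustrative assumptions, not values from the slides:

```python
import numpy as np

def homomorphic(img, gamma_l=0.5, gamma_h=2.0, d0=10.0):
    """Homomorphic filter sketch: gamma_l < 1 suppresses the slowly varying
    illumination, gamma_h > 1 boosts the rapidly varying reflectance."""
    M, N = img.shape
    Z = np.fft.fftshift(np.fft.fft2(np.log1p(img.astype(float))))
    u = np.arange(M) - M // 2
    v = np.arange(N) - N // 2
    D2 = u[:, None]**2 + v[None, :]**2
    # Gaussian high-emphasis transfer function: gamma_l at DC, gamma_h far out
    H = gamma_l + (gamma_h - gamma_l) * (1 - np.exp(-D2 / (2 * d0**2)))
    out = np.fft.ifft2(np.fft.ifftshift(Z * H)).real
    return np.expm1(out)  # undo the log taken at the start
```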
Inverse Filter
From the degradation model:
G(u,v) = F(u,v)H(u,v) + N(u,v)
After we obtain H(u,v), we can estimate F(u,v) by the inverse filter:
F̂(u,v) = G(u,v)/H(u,v) = F(u,v) + N(u,v)/H(u,v)
Noise is amplified when H(u,v) is small.
In practice, the inverse filter is not popularly used.
Inverse Filter: Example
H(u,v) = e^(−0.0025 (u² + v²)^(5/6))
Original image
Blurred image
Due to Turbulence
Result of applying
the full filter
Result of applying
the filter with D0=70
Result of applying
the filter with D0=40
Result of applying
the filter with D0=85
Wiener Filter: Minimum Mean Square Error Filter
Objective: minimize the mean square error: e² = E{(f − f̂)²}
Wiener filter formula:
F̂(u,v) = [ H*(u,v) Sf(u,v) / ( Sf(u,v) |H(u,v)|² + Sη(u,v) ) ] · G(u,v)
        = [ H*(u,v) / ( |H(u,v)|² + Sη(u,v)/Sf(u,v) ) ] · G(u,v)
        = (1/H(u,v)) · [ |H(u,v)|² / ( |H(u,v)|² + Sη(u,v)/Sf(u,v) ) ] · G(u,v)
where
H(u,v) = degradation function
Sη(u,v) = power spectrum of the noise
Sf(u,v) = power spectrum of the undegraded image
Approximation of Wiener Filter
Wiener filter formula:
F̂(u,v) = (1/H(u,v)) · [ |H(u,v)|² / ( |H(u,v)|² + Sη(u,v)/Sf(u,v) ) ] · G(u,v)
The ratio Sη(u,v)/Sf(u,v) is difficult to estimate, so it is replaced by a constant K.
Approximated formula:
F̂(u,v) = (1/H(u,v)) · [ |H(u,v)|² / ( |H(u,v)|² + K ) ] · G(u,v)
Practically, K is chosen manually to obtain the best visual result!
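The approximated Wiener filter can be sketched as below; note that (1/H)·|H|²/(|H|²+K) simplifies to H*/(|H|²+K), which also avoids dividing by small H (an illustrative sketch):

```python
import numpy as np

def wiener_deconvolve(g, h, K=0.01):
    """Approximate Wiener filter: F_hat = H* / (|H|^2 + K) * G.
    g: degraded image; h: point spread function zero-padded to g's shape;
    K: manually chosen constant replacing S_eta/S_f."""
    G = np.fft.fft2(g)
    H = np.fft.fft2(h)
    F_hat = np.conj(H) / (np.abs(H)**2 + K) * G
    return np.fft.ifft2(F_hat).real
```

As a sanity check, with an identity PSF (a single 1 at the origin) and K = 0 the filter returns the input unchanged; increasing K trades restoration sharpness for noise suppression.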
Wiener Filter: Example
Original image
Result of the inverse
filter with D0=70
Result of the
Wiener filter
Blurred image
Due to Turbulence
Wiener Filter
• It is better than the inverse filter.
• It incorporates both the degradation function and the
statistical characteristics of the noise (mean, spectrum, etc.)
into the restoration process.
• Here we consider the image and noise as random processes.
• The objective is to find an estimate f̂ of the uncorrupted image f
such that the mean square error between them is minimized.
• Assumptions:
• Image and noise are uncorrelated.
• One of them has zero mean.
• Gray levels in the estimate are linear functions of the
degraded image.
Image Compression
Huffman Coding
010100111100 = a3a1a2a2a6
• a2,a1,a3,a5,a4 = (0.25,0.25,0.2,0.15,0.15)
• Consider a five symbol sequence
{a1,a2,a3,a3,a4} from a four symbol source
code. Generate the arithmetic code for the
same.
Image segmentation
&
Representation
• Image segmentation
– ex: edge-based, region-based
• Image representation (boundary representation)
– ex: chain code, polygonal approximation
• Image description (boundary descriptors)
– ex: boundary-based, regional-based
OUTLINE
Image Segmentation
Image segmentation(cont…)
• Segmentation is used to subdivide an image into
its constituent parts or objects.
• This step determines the eventual success or
failure of image analysis.
• Generally, segmentation is carried out only until the objects
of interest are isolated, e.g., face detection.
• The goal of segmentation is to simplify and/or
change the representation of an image into
something that is more meaningful and easier to
analyse.
Classification of the Segmentation techniques
Image segmentation techniques are based on either:
• Discontinuity, e.g.:
– Point Detection
– Line Detection
– Edge Detection
• Similarity, e.g.:
– Thresholding
– Region Growing
– Region splitting & merging
edge-based segmentation(1)
• There are three basic types of gray-level discontinuities in a
digital image: points, lines, and edges.
• The most common way to look for discontinuities is to run a
mask through the image.
• We say that a point, line, or edge has been detected at the
location on which the mask is centered if |R| ≥ T, where T is a
nonnegative threshold and
R = w1·z1 + w2·z2 + … + w9·z9
edge-based segmentation(2)
• Point detection
a point detection mask
• Line detection
a line detection mask
edge-based segmentation(3)
• Edge detection: Gradient operation
∇f = [Gx, Gy]ᵀ = [∂f/∂x, ∂f/∂y]ᵀ
|∇f| = mag(∇f) = (Gx² + Gy²)^(1/2)
α(x,y) = tan⁻¹(Gy / Gx)
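The gradient magnitude with Prewitt masks can be sketched as below (an illustrative sketch with replicated borders):

```python
import numpy as np

PREWITT_X = np.array([[-1, 0, 1]] * 3)  # sensitive to vertical edges
PREWITT_Y = PREWITT_X.T                 # sensitive to horizontal edges

def gradient_magnitude(img):
    """|grad f| approximated by sqrt(Gx^2 + Gy^2) using Prewitt masks."""
    p = np.pad(img.astype(float), 1, mode='edge')
    gx = np.zeros(img.shape)
    gy = np.zeros(img.shape)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            win = p[y:y + 3, x:x + 3]
            gx[y, x] = (win * PREWITT_X).sum()
            gy[y, x] = (win * PREWITT_Y).sum()
    return np.sqrt(gx**2 + gy**2)
```

On a vertical step edge the response comes entirely from Gx (Gy is zero), illustrating why each mask is sensitive to one edge orientation.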
edge-based segmentation(4)
• Edge detection: Laplacian operation
∇²f = ∂²f/∂x² + ∂²f/∂y²
• Laplacian of Gaussian (LoG):
∇²h(r) = −[(r² − σ²)/σ⁴] · e^(−r²/2σ²)
Region Based Segmentation
Region Growing
Region growing techniques start with one pixel of a potential region and try to
grow it by adding adjacent pixels till the pixels being compared are too dissimilar.
• The first pixel selected can be just the first unlabeled pixel in the image or a set of
seed pixels can be chosen from the image.
• Usually a statistical test is used to decide which pixels can be added to a region.
• Region Growing technique:
• Assign a seed point.
• Assign a threshold value.
• Compare the seed point with the pixel values around it; pixels
within the threshold are added to the region.
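The region-growing steps above can be sketched as below (an illustrative sketch using 4-connectivity and comparison against the seed value):

```python
import numpy as np
from collections import deque

def region_grow(img, seed, threshold):
    """Grow a region from `seed`: add 4-connected neighbours whose value
    differs from the seed pixel by less than `threshold`."""
    h, w = img.shape
    seed_val = img[seed]
    region = np.zeros((h, w), dtype=bool)
    region[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not region[ny, nx]
                    and abs(int(img[ny, nx]) - int(seed_val)) < threshold):
                region[ny, nx] = True
                q.append((ny, nx))
    return region
```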
• Eg. Threshold < 3
• Answer of the above image through region
growing technique is:
• Region Splitting and Merging
• Threshold <=3
• Split the image into equal parts.
• If (maximum pixel value − minimum pixel value) does not satisfy
the threshold constraint, then split the region again.
Boundary Representation
• Image regions (including segments) can be represented by
either the border or the pixels of the region. These can be
viewed as external or internal characteristics, respectively.
• Chain codes
Boundary Representation
Chain Codes
• Chain codes can be based on either 4-connectedness
or 8-connectedness.
• The first difference of the chain code:
– This difference is obtained by counting the number of
direction changes (in a counterclockwise direction)
– For example, the first difference of the 4-direction chain
code 10103322 is 3133030.
• Assuming the first difference code represents a closed
path, rotation normalization can be achieved by
circularly shifting the code so that the
sequence of numbers forms the smallest possible integer.
• Size normalization can be achieved by adjusting the
size of the resampling grid.
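The first difference of a chain code (the count of counterclockwise direction changes between consecutive elements) can be computed as below; the function name is illustrative:

```python
def first_difference(code, directions=4):
    """First difference of a chain code: for each consecutive pair,
    the counterclockwise change of direction, modulo the number
    of directions (4 or 8)."""
    return [(b - a) % directions for a, b in zip(code, code[1:])]

# The slides' example: 4-direction chain code 10103322
print(first_difference([1, 0, 1, 0, 3, 3, 2, 2]))  # [3, 1, 3, 3, 0, 3, 0]
```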
Boundary Representation
Polygonal Approximations
• Polygonal approximations: to represent a boundary by straight line
segments, and a closed path becomes a polygon.
• The number of straight line segments used determines the accuracy of the
approximation.
• Only the minimum required number of sides necessary to preserve the
needed shape information should be used (Minimum perimeter polygons).
• A larger number of sides will only add noise to the model.
Boundary Representation
Polygonal Approximations
• Minimum perimeter polygons: (Merging and splitting)
– Merging and splitting are often used together to ensure that
vertices appear where they would naturally in the boundary.
– A least-squares straight-line fit criterion is used to decide
when to stop the processing.
Hough Transform
• The Hough transform is a method for detecting
lines or curves specified by a parametric function.
• If the parameters are p1, p2, … pn, then the Hough
procedure uses an n-dimensional accumulator array
in which it accumulates votes for the correct parameters
of the lines or curves found on the image.
• A line y = mx + b in image space maps to a single point in the
(m, b) accumulator (parameter) space, so collinear image points
vote for the same accumulator cell.
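A toy (m, b)-space Hough transform for the exercise above can be sketched as below (an illustrative sketch over a small discrete parameter grid):

```python
import numpy as np

def hough_lines(points, m_range, b_range):
    """Toy Hough transform in (m, b) space: each point (x, y) votes for
    every slope m with intercept b = y - m*x that falls on the grid."""
    acc = np.zeros((len(m_range), len(b_range)), dtype=int)
    for x, y in points:
        for i, m in enumerate(m_range):
            b = y - m * x
            j = np.argmin(np.abs(np.array(b_range) - b))
            if abs(b_range[j] - b) < 1e-9:  # vote only on exact grid hits
                acc[i, j] += 1
    return acc

pts = [(1, 1), (2, 2), (3, 3)]
ms = [-1, 0, 1, 2]
bs = [-2, -1, 0, 1, 2]
acc = hough_lines(pts, ms, bs)
i, j = np.unravel_index(acc.argmax(), acc.shape)
print(ms[i], bs[j])  # the peak is at m=1, b=0, i.e. the line y = x
```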
Q. Given 3 points, Use Hough Transform to draw a line joining
these points : (1,1), (2,2) & (3,3).
Question. Given 5 points, use Hough transform to draw a line joining the points (1,4) , (2,3),
(3,1), (4,1), (5,0). (RTU-2016)
Boundary Descriptors
• There are several simple geometric measures
that can be useful for describing a boundary.
– The length of a boundary: the number of pixels
along a boundary gives a rough approximation of
its length.
– Curvature: the rate of change of slope
• To measure a curvature accurately at a point in a digital
boundary is difficult
• The difference between the slopes of adjacent boundary
segments is used as a descriptor of curvature at the
point of intersection of the segments.
Boundary Descriptors
Shape Numbers
• The shape number of a boundary is defined as the first
difference of smallest magnitude.
• The order n of a shape number is defined as the number of
digits in its representation.
Boundary Descriptors
Fourier Descriptors
• This is a way of using the Fourier transform to
analyze the shape of a boundary.
– The x-y coordinates of the boundary are treated as the real
and imaginary parts of a complex number.
– Then the list of coordinates is Fourier transformed using
the DFT (chapter 4).
– The Fourier coefficients are called the Fourier descriptors.
– The basic shape of the region is determined by the first
several coefficients, which represent lower frequencies.
– Higher frequency terms provide information on the fine
detail of the boundary.
Boundary Descriptors
Fourier Descriptors
Regional Descriptors
• Some simple descriptors
– The area of a region: the number of pixels in the
region
– The perimeter of a region: the length of its
boundary
– The compactness of a region: (perimeter)²/area
– The mean and median of the gray levels
– The minimum and maximum gray-level values
– The number of pixels with values above and below
the mean
Regional Descriptors
Example
Regional Descriptors
Topological Descriptors
Topological property 1:
the number of holes (H)
Topological property 2:
the number of connected
components (C)
Regional Descriptors
Topological Descriptors
Topological property 3:
Euler number: the number of connected components minus the number of holes,
E = C − H
(e.g., E = 0 and E = −1 for the two example figures)
Regional Descriptors
Topological Descriptors
Topological
property 4:
the largest
connected
component.
More Related Content

Similar to DIP-CHAPTERs-GOPAL SIR.pptx

Week06 bme429-cbir
Week06 bme429-cbirWeek06 bme429-cbir
Week06 bme429-cbir
Ikram Moalla
 
Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...
Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...
Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...
MonikaSharma151258
 
Computer Vision - Image Formation.pptx
Computer Vision - Image Formation.pptxComputer Vision - Image Formation.pptx
Computer Vision - Image Formation.pptx
AmmarahMajeed
 
1 [Autosaved].pptx
1 [Autosaved].pptx1 [Autosaved].pptx
1 [Autosaved].pptx
SsdSsd5
 

Similar to DIP-CHAPTERs-GOPAL SIR.pptx (20)

Week06 bme429-cbir
Week06 bme429-cbirWeek06 bme429-cbir
Week06 bme429-cbir
 
Chapter 1 and 2 gonzalez and woods
Chapter 1 and 2 gonzalez and woodsChapter 1 and 2 gonzalez and woods
Chapter 1 and 2 gonzalez and woods
 
CHAPTER_1_updated_8_aug.ppt
CHAPTER_1_updated_8_aug.pptCHAPTER_1_updated_8_aug.ppt
CHAPTER_1_updated_8_aug.ppt
 
DIP Notes Unit-1 PPT.pdf
DIP Notes Unit-1 PPT.pdfDIP Notes Unit-1 PPT.pdf
DIP Notes Unit-1 PPT.pdf
 
Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...
Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...
Unit-1Chapter-1Lecture-2.pptxkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk...
 
Chapter01 (2)
Chapter01 (2)Chapter01 (2)
Chapter01 (2)
 
DIP Notes Unit-1 PPT , engineering, computer Science
DIP Notes Unit-1 PPT , engineering, computer ScienceDIP Notes Unit-1 PPT , engineering, computer Science
DIP Notes Unit-1 PPT , engineering, computer Science
 
It 603
It 603It 603
It 603
 
Chap_1_Digital_Image_Fundamentals_DD (2).pdf
Chap_1_Digital_Image_Fundamentals_DD (2).pdfChap_1_Digital_Image_Fundamentals_DD (2).pdf
Chap_1_Digital_Image_Fundamentals_DD (2).pdf
 
Computer Vision - Image Formation.pptx
Computer Vision - Image Formation.pptxComputer Vision - Image Formation.pptx
Computer Vision - Image Formation.pptx
 
computervision1.pdf it is about computer vision
computervision1.pdf it is about computer visioncomputervision1.pdf it is about computer vision
computervision1.pdf it is about computer vision
 
Intro+Imaging.ppt
Intro+Imaging.pptIntro+Imaging.ppt
Intro+Imaging.ppt
 
Lecture 1 for Digital Image Processing (2nd Edition)
Lecture 1 for Digital Image Processing (2nd Edition)Lecture 1 for Digital Image Processing (2nd Edition)
Lecture 1 for Digital Image Processing (2nd Edition)
 
IT6005 digital image processing question bank
IT6005   digital image processing question bankIT6005   digital image processing question bank
IT6005 digital image processing question bank
 
1 [Autosaved].pptx
1 [Autosaved].pptx1 [Autosaved].pptx
1 [Autosaved].pptx
 
Human Visual System in Digital Image Processing.ppt
Human Visual System in Digital Image Processing.pptHuman Visual System in Digital Image Processing.ppt
Human Visual System in Digital Image Processing.ppt
 
It 603
It 603It 603
It 603
 
It 603
It 603It 603
It 603
 
Poster rough draft
Poster rough draftPoster rough draft
Poster rough draft
 
Introduction to digital image processing
Introduction to digital image processingIntroduction to digital image processing
Introduction to digital image processing
 

Recently uploaded

DeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakesDeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakes
MayuraD1
 
Digital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptxDigital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptx
pritamlangde
 
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
HenryBriggs2
 
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
ssuser89054b
 
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak HamilCara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Kandungan 087776558899
 
Standard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayStandard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power Play
Epec Engineered Technologies
 

Recently uploaded (20)

Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
 
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best ServiceTamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
 
Computer Networks Basics of Network Devices
Computer Networks  Basics of Network DevicesComputer Networks  Basics of Network Devices
Computer Networks Basics of Network Devices
 
PE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and propertiesPE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and properties
 
Jaipur ❤CALL GIRL 0000000000❤CALL GIRLS IN Jaipur ESCORT SERVICE❤CALL GIRL IN...
Jaipur ❤CALL GIRL 0000000000❤CALL GIRLS IN Jaipur ESCORT SERVICE❤CALL GIRL IN...Jaipur ❤CALL GIRL 0000000000❤CALL GIRLS IN Jaipur ESCORT SERVICE❤CALL GIRL IN...
Jaipur ❤CALL GIRL 0000000000❤CALL GIRLS IN Jaipur ESCORT SERVICE❤CALL GIRL IN...
 
💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...
💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...
💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...
 
COST-EFFETIVE and Energy Efficient BUILDINGS ptx
COST-EFFETIVE  and Energy Efficient BUILDINGS ptxCOST-EFFETIVE  and Energy Efficient BUILDINGS ptx
COST-EFFETIVE and Energy Efficient BUILDINGS ptx
 
Thermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - VThermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - V
 
Thermal Engineering Unit - I & II . ppt
Thermal Engineering  Unit - I & II . pptThermal Engineering  Unit - I & II . ppt
Thermal Engineering Unit - I & II . ppt
 
School management system project Report.pdf
School management system project Report.pdfSchool management system project Report.pdf
School management system project Report.pdf
 
DeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakesDeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakes
 
AIRCANVAS[1].pdf mini project for btech students
AIRCANVAS[1].pdf mini project for btech studentsAIRCANVAS[1].pdf mini project for btech students
AIRCANVAS[1].pdf mini project for btech students
 
Unleashing the Power of the SORA AI lastest leap
Unleashing the Power of the SORA AI lastest leapUnleashing the Power of the SORA AI lastest leap
Unleashing the Power of the SORA AI lastest leap
 
Digital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptxDigital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptx
 
Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)
 
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
 
data_management_and _data_science_cheat_sheet.pdf
data_management_and _data_science_cheat_sheet.pdfdata_management_and _data_science_cheat_sheet.pdf
data_management_and _data_science_cheat_sheet.pdf
 
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
 
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak HamilCara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
 
Standard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayStandard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power Play
 

DIP-CHAPTERs-GOPAL SIR.pptx

  • 2. Image Processing Books • Gonzalez, R. C. and Woods, R. E., "Digital Image Processing", Prentice Hall. • Jain, A. K., "Fundamentals of Digital Image Processing", PHI Learning, 1 st Ed. • Bernd, J., "Digital Image Processing", Springer, 6 th Ed. • Burger, W. and Burge, M. J., "Principles of Digital Image Processing", Springer • Scherzer, O., " Handbook of Mathematical Methods in Imaging", Springer Sunday, December 18, 2022 2
  • 3. Why we need Image Processing? • Improvement of pictorial information for human perception • Image processing for autonomus machine applications • Efficient storage and transmission Sunday, December 18, 2022 3
  • 4. What is digital image processing? • An image may be defined as a two dimensional function f(x,y), where ‘x’ and ‘y’ are spatial(plane) coordinates and the amplitude of ‘f’ at any pair of coordinates (x,y) is called the intensity or gray level of the image at that point. • When x,y, and the amplitude values of ‘f’ are all finite, descrete quantities we call the image a digital image. • The field of digital image processing refers to processing digital images by means of digital computers. Sunday, December 18, 2022 4
  • 5. What is digital image processing? (Cont…) Sunday, December 18, 2022 5
  • 6. Image Processing Applications • Automobile driver assistance – Lane departure warning – Adaptive cruise control – Obstacle warning • Digital Photography – Image Enhancement – Compression – Color manipulation – Image editing – Digital cameras • Sports analysis – sports refereeing and commentary – 3D visualization and tracking sports actions Sunday, December 18, 2022 6
  • 7. Image Processing Applications(Cont…) • Film and Video – Editing – Special effects • Image Database – Content based image retrieval – visual search of products – Face recognition • Industrial Automation and Inspection – vision-guided robotics – Inspection systems • Medical and Biomedical – Surgical assistance – Sensor fusion – Vision based diagnosis • Astronomy – Astronomical Image Enhancement – Chemical/Spectral Analysis Sunday, December 18, 2022 7
  • 8. Image Processing Applications(Cont...) • Aerial Photography – Image Enhancement – Missile Guidance – Geological Mapping • Robotics – Autonomous Vehicles • Security and Safety – Biometry verification (face, iris) – Surveillance (fences, swimming pools) • Military – Tracking and localizing – Detection – Missile guidance • Traffic and Road Monitoring – Traffic monitoring – Adaptive traffic lights Sunday, December 18, 2022 8
  • 9. Brief History of IP • In the 1920s, submarine cables were used to transmit digitized newspaper pictures between London & New York, using the Bartlane cable picture transmission system. • Specialized printing equipment (e.g., a telegraphic printer) was used to code the picture for cable transmission and reproduce it at the receiving end. • In 1921, the printing procedure was changed to photographic reproduction from tapes perforated at the telegraph receiving terminals. • This improved both tonal quality & resolution. Sunday, December 18, 2022 9
  • 10. Brief History of IP(Cont…) Sunday, December 18, 2022 10
  • 11. Brief History of IP(Cont…) • The Bartlane system was capable of coding 5 distinct brightness levels. This was increased to 15 levels by 1929. • Improvement of processing techniques continued for the next 35 years. • In 1964, computer processing techniques were used at the Jet Propulsion Laboratory to improve the pictures of the moon transmitted by Ranger 7. • This was the basis of modern image processing techniques. Sunday, December 18, 2022 11
  • 12. Image Processing Steps Sunday, December 18, 2022 12
  • 13. Components of IP System Sunday, December 18, 2022 13
  • 14. Image Acquisition Process Sunday, December 18, 2022 14
  • 15. Image Sensing and Acquisition Sunday, December 18, 2022 15
  • 16. Image Sensing and Acquisition(Cont…) • Image acquisition using a single sensor Sunday, December 18, 2022 16
  • 17. Image Sensing and Acquisition(Cont…) • Using sensor strips Sunday, December 18, 2022 17
  • 18. Image Representation Sunday, December 18, 2022 18 An image is a 2-D light intensity function f(x,y): f(x,y) = r(x,y) · i(x,y), where r(x,y) = reflectivity of the surface at the corresponding image point and i(x,y) = intensity of the incident light. A digital image f(x,y) is discretized both in spatial coordinates and in brightness. It can be considered as a matrix whose row and column indices specify a point in the image and whose element value identifies the gray level at that point; the elements are known as pixels or pels.
  • 19. Image Representation (Cont..) Sunday, December 18, 2022 19 Image representation in matrix form:
        f(x,y) = [ f(0,0)      f(0,1)      ...  f(0,N-1)
                   f(1,0)      f(1,1)      ...  f(1,N-1)
                   ...         ...         ...  ...
                   f(M-1,0)    f(M-1,1)    ...  f(M-1,N-1) ]
  • 22. Image Representation (Cont..) Sunday, December 18, 2022 22 f(x,y) = i(x,y) · r(x,y), where f(x,y) is the intensity at the point (x,y); i(x,y) is the illumination at the point (x,y) (the amount of source illumination incident on the scene), with 0 < i(x,y) < ∞; and r(x,y) is the reflectance/transmissivity at the point (x,y) (the amount of illumination reflected/transmitted by the object), with 0 < r(x,y) < 1.
  • 23. Image Representation (Cont..) • By the theory of real numbers: between any two given points there is an infinite number of points. • By this theory, an image would have to be represented by an infinite number of points, and each such image point may contain one of infinitely many possible intensity/color values, needing an infinite number of bits. Obviously such a representation is not possible in any digital computer. Sunday, December 18, 2022 23
  • 24. Image Sampling and Quantization • From the above slides we know that we need some other way to represent an image in digital format. • So we consider a discrete set of points known as a grid, and in each rectangular grid cell consider the intensity of a particular point. This process is known as sampling. • Image representation by a 2-D finite matrix – Sampling • Each matrix element represented by one of a finite set of discrete values – Quantization Sunday, December 18, 2022 24
  • 25. Image Sampling and Quantization Sunday, December 18, 2022 25
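The two steps above (sampling a continuous scene on a grid, then quantizing each sample to a finite set of levels) can be sketched in a few lines of NumPy. The synthetic scene function and the grid/level counts below are illustrative assumptions, not from the slides:

```python
import numpy as np

# Hypothetical continuous scene f(x, y), returning intensities in [0, 1]
def scene(x, y):
    return (np.sin(10 * x) * np.cos(10 * y) + 1) / 2

def sample_and_quantize(rows, cols, levels):
    # Sampling: evaluate the continuous scene on a discrete rows x cols grid
    ys, xs = np.meshgrid(np.linspace(0, 1, rows),
                         np.linspace(0, 1, cols), indexing="ij")
    samples = scene(xs, ys)
    # Quantization: map each sample to one of `levels` discrete gray values
    return np.clip((samples * levels).astype(int), 0, levels - 1)

img = sample_and_quantize(4, 4, 8)  # a 4x4 digital image with 8 gray levels
```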
  • 26. Colour Image Processing • Why do we need CIP when we get information from the black-and-white image itself? 1. Colour is a very powerful descriptor & using colour information we can extract objects of interest from an image very easily, which is not so easy in some cases using a black & white or simple gray-level image. 2. Human eyes can distinguish between thousands of colours & colour shades, whereas in a black-and-white or gray-scale image we can distinguish only about a couple of dozen different gray levels. Sunday, December 18, 2022 26
  • 27. Color Image processing(Cont…) • The color that humans perceive in an object = the light reflected from the object. [Figure: illumination source → scene reflection → human eye]
  • 28. Colour Image Processing(Cont...) • In CIP there are 2 major areas: 1. FULL CIP : Images acquired by a full-colour TV camera or a full-colour scanner; here all the colours you perceive are present in the images. 2. PSEUDO CIP : Here we try to assign certain colours to ranges of gray levels. Pseudo CIP is mostly used for human interpretation, since it is very difficult to distinguish between two gray-level ranges whose intensity values are very near to each other. Sunday, December 18, 2022 28
  • 29. Colour Image Processing(Cont...) • Problem with CIP: Interpretation of color by the human eye is a psychophysiological problem, and we do not yet fully understand the mechanism by which we really interpret a color. Sunday, December 18, 2022 29
  • 30. Colour Image Processing(Cont...) • In 1666 Isaac Newton discovered the color spectrum using an optical prism. Sunday, December 18, 2022 30
  • 31. Colour Image Processing(Cont...) • We perceive the color of an object depending on the nature of the light reflected by its surface. • It is the spectrum of light, or spectrum of energy in the visible range (400 nm to 700 nm), that enables us to perceive color. Sunday, December 18, 2022 31
  • 32. Colour Image Processing(Cont...) • Attributes of Light Achromatic Light : A light which has no color component, i.e., the only attribute which describes that particular light is its intensity. Chromatic Light : Contains a color component. • 3 quantities that describe the quality of light: Radiance, Luminance, Brightness Sunday, December 18, 2022 32
  • 33. Colour Image Processing(Cont...) • Radiance : the total amount of energy that flows from a light source (unit: watts) • Luminance : the amount of energy perceived by an observer (unit: lumens) • Brightness : a subjective quantity; practically we can’t measure brightness. We have 3 primary colors: Red, Blue, Green Sunday, December 18, 2022 33
  • 34. Colour Image Processing(Cont...) • Newton discovered 7 different colors, but only 3 of them, i.e., red, green and blue, are the primary colors. Why? Because by mixing these 3 colors in some proportion we can get all other colors. There are around 6-7 million cone cells in our eyes which are responsible for color sensation. Around 65% of cone cells are sensitive to red. Around 33% of cone cells are sensitive to green. Around 2% of cone cells are sensitive to blue. Sunday, December 18, 2022 34
  • 35. Colour Image Processing(Cont...) • According to the CIE standard Red has wavelength : 700 nm Green has wavelength : 546.1 nm Blue has wavelength : 435.8 nm But, practically : Red is sensitive to 450 nm to 700 nm Green is sensitive to 400 nm to 650 nm Blue is sensitive to 400 nm to 550 nm Sunday, December 18, 2022 35
  • 36. Colour Image Processing(Cont...) • Note : In practice no single wavelength specifies any particular color. • From the spectrum we can also see that there are no clear-cut boundaries between any two colors. • One color slowly and smoothly merges into another, i.e., there is no clear-cut boundary between color transitions in the spectrum. • So we can say that bands of wavelengths give the red, green and blue color sensations respectively. Sunday, December 18, 2022 36
  • 37. Colour Image Processing(Cont...) • Mixing the primary colors generates the secondary colors, i.e.,  RED + BLUE = Magenta  GREEN + BLUE = Cyan  RED + GREEN = Yellow • Here red, green and blue are the primary colors and magenta, cyan and yellow are the secondary colors. • Pigments : A primary color of a pigment is defined by the wavelengths which are absorbed by the pigment; it reflects the other wavelengths. Sunday, December 18, 2022 37
  • 38. Colour Image Processing(Cont...) • The primary colors of pigment are the opposite of the primary colors of light, i.e., magenta, cyan and yellow are the primary colors of pigment. • If we mix red, green and blue in appropriate proportions we get white light; similarly, when we mix magenta, cyan and yellow we get black. Sunday, December 18, 2022 38
  • 39. Colour Image Processing(Cont...) • For hardware, i.e., cameras, printers, display devices and scanners, the above concept of color is used, i.e., the concept of primary color components. • But when we humans perceive a color we don’t think about how much red, green and blue are mixed in that particular color. • So the attributes by which we humans differentiate, recognize or distinguish colors are : Brightness, Hue and Saturation. Sunday, December 18, 2022 39
  • 40. Colour Image Processing(Cont...) • Spectrum colors are not diluted, i.e., spectrum colors are fully saturated. It means no white light or white component is added to them. • Example: Pink is not a spectrum color. Red + white = pink. Here red is fully saturated. • So, Hue + Saturation indicates the chromaticity of light and Brightness gives a sensation of intensity. Sunday, December 18, 2022 40
  • 41. Colour Image Processing(Cont...) • Brightness : Achromatic notion of Intensity. • Hue : It represents the dominant wavelength present in a mixture of colors. • Saturation : e.g., when we say a color is red, we may have various shades of red. Saturation indicates the purity of red, i.e., the amount of white light that has been mixed into that particular color to dilute it. Sunday, December 18, 2022 41
  • 42. Colour Image Processing(Cont...) • The amounts of the red, green and blue components needed to produce another color are known as the tristimulus values. • Tristimulus = (X, Y, Z) • Chromaticity coefficients: for red x = X/(X+Y+Z), for green y = Y/(X+Y+Z), for blue z = Z/(X+Y+Z). • Here x + y + z = 1. • So any color can be specified by its chromaticity coefficients, or equivalently by a point in a chromaticity diagram. Sunday, December 18, 2022 42
  • 43. Colour Image Processing(Cont...) • Here z = 1 − (x + y). In the chromaticity diagram the spectrum colors lie around the boundary, and the point of equal energy corresponds to white. Sunday, December 18, 2022 43
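As a worked check of the chromaticity coefficients above (the tristimulus values here are arbitrary, chosen only for illustration):

```python
# Chromaticity coefficients x, y, z from tristimulus values (X, Y, Z)
def chromaticity(X, Y, Z):
    s = X + Y + Z
    return X / s, Y / s, Z / s

x, y, z = chromaticity(0.2, 0.3, 0.5)
# By construction x + y + z = 1, so z = 1 - (x + y)
```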
  • 44. Colour Image Processing(Cont...) • Color Models : A coordinate system within which a specified color will be represented by a single point. • RGB , CMY , CMYK : Hardware oriented • HSI : Hue , Saturation and Intensity : Application oriented / Perception oriented • In HSI model : I part gives you gray scale information. H & S taken together gives us chromatic information. Sunday, December 18, 2022 44
  • 45. Colour Image Processing(Cont...) • RGB Color Model : Here a color is represented by 3 primary colors, i.e., red, green and blue. • In the 24-bit RGB color model we can have 2^24 different color combinations, but practically only 216 "safe" colors are reproduced consistently across devices. • The RGB color model is based on the Cartesian coordinate system. • This is an additive color model. • Active displays, such as computer monitors and television sets, emit combinations of red, green and blue light. Sunday, December 18, 2022 45
  • 46. Colour Image Processing(Cont...) • RGB Color Model Sunday, December 18, 2022 46
  • 47. Colour Image Processing(Cont...) • RGB Color Model • RGB 24-bit color cube is shown below Sunday, December 18, 2022 47
  • 48. Colour Image Processing(Cont...) • RGB example: Sunday, December 18, 2022 48 Original Green Band Blue Band Red Band
  • 49. Colour Image Processing(Cont...) • CMY Color Model : secondary colors of light, or primary colors of pigments & Used to generate hardcopy output Sunday, December 18, 2022 49 Source: www.hp.com Passive displays, such as colour inkjet printers, absorb light instead of emitting it. Combinations of cyan, magenta and yellow inks are used. This is a subtractive colour model.
  • 50. Colour Image Processing(Cont...) • Equal proportions of CMY give a muddy black color, i.e., not a pure black. So, to get pure black with CMY, another component is specified, known as the black component, giving the CMYK model. • In CMYK “K” is the black component. • With RGB values normalized to [0, 1]: C = 1 − R, M = 1 − G, Y = 1 − B. Sunday, December 18, 2022 50
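With RGB values normalized to [0, 1], the CMY conversion above is a one-liner; a minimal sketch:

```python
import numpy as np

# CMY from RGB: [C, M, Y] = [1, 1, 1] - [R, G, B], values in [0, 1]
def rgb_to_cmy(rgb):
    return 1.0 - np.asarray(rgb, dtype=float)

cmy = rgb_to_cmy([1.0, 0.0, 0.0])  # pure red
```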
  • 51. Colour Image Processing(Cont...) • HSI Color Model (based on human perception of colors) • H = the dominant spectrum color present in a particular color. It is a subjective measure of color. • S = how much a pure spectrum color is diluted by mixing white into it, i.e., mixing more “white” with a color reduces its saturation. If we mix white in different proportions with a color, we get different shades of that color. • I = the achromatic notion of brightness, i.e., the brightness or darkness of an object. Sunday, December 18, 2022 51
  • 52. Colour Image Processing(Cont...) • HSI Color Model Sunday, December 18, 2022 52 H dominant wavelength S purity % white I Intensity
  • 53. Colour Image Processing(Cont...) • HSI Color Model Sunday, December 18, 2022 53 RGB -> HSI model
  • 54. Colour Image Processing(Cont...) • HSI Color Model Sunday, December 18, 2022 54
  • 55. Colour Image Processing(Cont...) • Pseudo-color Image Processing Assign colors to gray values based on a specified criterion For human visualization and interpretation of gray-scale events Intensity slicing Gray level to color transformations Sunday, December 18, 2022 55
  • 56. Colour Image Processing(Cont...) • Pseudo-color Image Processing(cont…) Intensity slicing  Here we first consider the intensity image as a 3D surface.  Place a plane parallel to the XY plane (it will slice the surface into two parts).  We can assign different colors to the two sides of the plane, i.e., any pixel whose intensity level is above the plane will be coded with one color and any pixel below the plane will be coded with the other.  Levels that lie on the plane itself may be arbitrarily assigned one of the two colors. Sunday, December 18, 2022 56
  • 57. Colour Image Processing(Cont...) Intensity slicing  Geometric interpretation of the intensity slicing technique Sunday, December 18, 2022 57
  • 58. Colour Image Processing(Cont...) Intensity slicing  Let there be ‘L’ intensity values in total: 0 to (L-1)  l = 0 corresponds to black [ f(x , y) = 0 ]  l = L-1 corresponds to white [ f(x , y) = L-1 ]  Suppose ‘P’ planes perpendicular to the intensity axis, i.e., parallel to the image plane, are placed at the intensity values l1, l2, l3, ………, lP.  Where 0 < P < L-1. Sunday, December 18, 2022 58
  • 59. Colour Image Processing(Cont...) • Intensity slicing  The P planes partition the gray scale(intensity) into (P+1) intervals, V1,V2,V3,………,VP+1.  Color assigned to location (x,y) is given by the relation f(x , y) = Ck if f(x , y) ∈ Vk Sunday, December 18, 2022 59
  • 60. Colour Image Processing(Cont...) • Intensity slicing  Give ROI(region of interest) one color and rest part other color  Keep ROI as it is and rest assign one color  Keep rest as it is and give ROI one color Sunday, December 18, 2022 60
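The first variant above (one color for the region of interest, another for the rest) can be sketched with a single plane/threshold range; the range and the two colors below are illustrative assumptions:

```python
import numpy as np

def intensity_slice(gray, lo, hi, in_color=(255, 0, 0), out_color=(0, 0, 255)):
    gray = np.asarray(gray)
    out = np.empty(gray.shape + (3,), dtype=np.uint8)
    mask = (gray >= lo) & (gray <= hi)  # region of interest
    out[mask] = in_color
    out[~mask] = out_color
    return out

img = np.array([[10, 200], [120, 90]], dtype=np.uint8)
sliced = intensity_slice(img, 100, 255)
```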
  • 61. Colour Image Processing(Cont...) • Pseudo-coloring is also used from gray to color image transformation. • Gray level to color transformation Sunday, December 18, 2022 61
  • 62. Colour Image Processing(Cont...) • Gray level to color transformation fR(x,y) = f(x,y) fG(x,y) = 0.33f(x,y) fB(x,y) = 0.11f(x,y)  Combining these 3 planes we get the pseudo-color image.  Application of Pseudo CIP : baggage-screening machines used at railways and airports. Sunday, December 18, 2022 62
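The three transfer functions above can be applied per pixel and stacked into an RGB image; a minimal NumPy sketch:

```python
import numpy as np

# Pseudo-coloring with fR = f, fG = 0.33 f, fB = 0.11 f, as on the slide
def gray_to_pseudocolor(gray):
    f = np.asarray(gray, dtype=float)
    return np.stack([f, 0.33 * f, 0.11 * f], axis=-1).astype(np.uint8)

gray = np.array([[0, 100], [200, 255]], dtype=np.uint8)
rgb = gray_to_pseudocolor(gray)
```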
  • 63. Image Enhancement • Intensity Transformation Functions • Enhancing an image provides better contrast and a more detailed image compared to the non-enhanced image. Image enhancement has many applications: it is used to enhance medical images, images captured in remote sensing, images from satellites, etc. • The transformation function is given by s = T ( r ) • where r is a pixel value of the input image and s is the corresponding pixel value of the output image. T is a transformation function that maps each value of r to a value of s. Sunday, December 18, 2022 63
  • 64. Image Enhancement(Cont…) • Image enhancement can be done through gray level transformations which are discussed below. • There are three basic gray level transformation. • Linear • Logarithmic • Power – law Sunday, December 18, 2022 64
  • 65. Image Enhancement(Cont…) • Linear Transformation  Linear transformation includes the simple identity and negative transformations.  The identity transformation is shown by a straight line. In this transformation, each value of the input image is directly mapped to the same value of the output image, resulting in an output image identical to the input; hence it is called the identity transformation. • Negative Transformation  The second linear transformation is the negative transformation, which is the inverse of the identity transformation. In the negative transformation, each value of the input image is subtracted from L-1 and mapped onto the output image. Sunday, December 18, 2022 65
  • 67. Image Enhancement(Cont…) • Negative Transformation s = (L – 1) – r s = 255 – r Sunday, December 18, 2022 67
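For an 8-bit image (L = 256) the negative above is simply s = 255 − r:

```python
import numpy as np

# Image negative: s = (L - 1) - r with L = 256
def negative(img):
    return 255 - np.asarray(img, dtype=np.uint8)

img = np.array([[0, 100], [200, 255]], dtype=np.uint8)
neg = negative(img)
```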
  • 68. Image Enhancement(Cont…) • Logarithmic Transformations  The log transformation can be defined by the formula s = c log(r + 1),  where s and r are the pixel values of the output and the input image and c is a constant. The value 1 is added to each pixel value of the input image because if there is a pixel intensity of 0 in the image, then log(0) is undefined. So 1 is added, to make the minimum argument at least 1. Sunday, December 18, 2022 68
  • 69. Image Enhancement(Cont…) • Logarithmic Transformations  The log transformation compresses the overall dynamic range: the dark pixel intensities, which carry the information we require, are expanded, while the high intensities are compressed.  Log transformation is mainly applied in the frequency domain, where the maximum information is contained around the center of the spectrum. Sunday, December 18, 2022 69
  • 70. Image Enhancement(Cont…) • Logarithmic Transformation Sunday, December 18, 2022 70
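A common way to apply s = c log(1 + r) is to choose c so that the output spans the full 8-bit range; this scaling choice is an assumption, not stated on the slides:

```python
import numpy as np

def log_transform(img):
    r = np.asarray(img, dtype=float)
    c = 255.0 / np.log(1.0 + r.max())   # scale output to [0, 255]
    return (c * np.log(1.0 + r)).round().astype(np.uint8)

img = np.array([[0, 1], [100, 255]], dtype=np.uint8)
s = log_transform(img)  # dark values are expanded, bright ones compressed
```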
  • 71. Image Enhancement(Cont…) • Power – Law transformations • This symbol γ is called gamma, due to which this transformation is also known as the gamma transformation. Sunday, December 18, 2022 71 s = c·r^γ, with c, γ positive constants; the curve maps the grayscale components either to brighten the intensity (when γ < 1) or darken the intensity (when γ > 1).
  • 72. Image Enhancement(Cont…) • Power – Law transformations Sunday, December 18, 2022 72
  • 73. Image Enhancement(Cont…) • Power – Law transformations • Variation in the value of γ varies the enhancement of the image. Different display devices / monitors have their own gamma correction, which is why they display images at different intensities. • This type of transformation is used for enhancing images for different types of display devices. The gamma of different display devices is different. For example, the gamma of a CRT lies between 1.8 and 2.5, which means the image displayed on a CRT is dark. Sunday, December 18, 2022 73
  • 74. Image Enhancement(Cont…) • Power – Law transformations  Gamma Correction  Different cameras or video recording devices do not capture luminance correctly (they are not linear), and different display devices (monitor, phone screen, TV) do not display luminance correctly either. So one needs to correct them; therefore the gamma correction function is needed. The gamma correction function is used to correct an image's luminance. s = c·r^γ, e.g., s = c·r^(1/2.5) Sunday, December 18, 2022 74
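A sketch of gamma correction for the CRT example above (γ = 2.5), normalizing to [0, 1] before applying s = r^(1/γ):

```python
import numpy as np

def gamma_correct(img, gamma=2.5):
    r = np.asarray(img, dtype=float) / 255.0
    s = r ** (1.0 / gamma)              # 1/gamma < 1, so the image brightens
    return (255.0 * s).round().astype(np.uint8)

img = np.array([[0, 64], [128, 255]], dtype=np.uint8)
out = gamma_correct(img)
```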
  • 77. Image Enhancement(Cont…) • Piecewise-Linear Transformation Functions  Three types:  Contrast Stretching  Intensity Level Slicing  Bit-Plane Slicing Sunday, December 18, 2022 77
  • 78. Image Enhancement(Cont…) • Contrast stretching  Aims to increase the dynamic range of the gray levels in the image being processed.  Contrast stretching is a process that expands the range of intensity levels in an image so that it spans the full intensity range of the recording medium or display device.  Contrast-stretching transformations increase the contrast between the darks and the lights. Sunday, December 18, 2022 78
  • 79. Image Enhancement(Cont…) • Contrast stretching Sunday, December 18, 2022 79
  • 80. Image Enhancement(Cont…) • Contrast stretching  The locations of (r1,s1) and (r2,s2) control the shape of the transformation function. – If r1= s1 and r2= s2 the transformation is a linear function and produces no changes. – If r1=r2, s1=0 and s2=L-1, the transformation becomes a thresholding function that creates a binary image. – Intermediate values of (r1,s1) and (r2,s2) produce various degrees of spread in the gray levels of the output image, thus affecting its contrast. – Generally, r1≤r2 and s1≤s2 is assumed. Sunday, December 18, 2022 80
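The common min-max special case of the piecewise-linear function above (mapping [r_min, r_max] onto the full [0, 255] range) can be sketched as:

```python
import numpy as np

def stretch(img):
    r = np.asarray(img, dtype=float)
    lo, hi = r.min(), r.max()
    return ((r - lo) * 255.0 / (hi - lo)).round().astype(np.uint8)

img = np.array([[50, 100], [150, 200]], dtype=np.uint8)
out = stretch(img)
```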
  • 81. Image Enhancement(Cont…) Sunday, December 18, 2022 81 Thresholding function
  • 82. Image Enhancement(Cont…) • Intensity-level slicing  Highlighting a specific range of gray levels in an image.  One way is to display a high value for all gray levels in the range of interest and a low value for all other gray levels (binary image).  The second approach is to brighten the desired range of gray levels but preserve the background and gray-level tonalities in the image Sunday, December 18, 2022 82
  • 83. Image Enhancement(Cont…) • Intensity Level Slicing Sunday, December 18, 2022 83
  • 84. Image Enhancement(Cont…) • Bit-Plane Slicing • To highlight the contribution made to the total image appearance by specific bits. – i.e. Assuming that each pixel is represented by 8 bits, the image is composed of 8 1-bit planes. – Plane 0 contains the least significant bit and plane 7 contains the most significant bit. – Only the higher order bits (top four) contain visually significant data. The other bit planes contribute the more subtle details. – Plane 7 corresponds exactly with an image thresholded at gray level 128. Sunday, December 18, 2022 84
  • 85. Image Enhancement(Cont…) • Bit-Plane Slicing Sunday, December 18, 2022 85
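Extracting bit plane k from an 8-bit image is a shift and a mask; plane 7 is equivalent to thresholding at gray level 128:

```python
import numpy as np

# Bit-plane slicing: plane 0 is the LSB, plane 7 the MSB
def bit_plane(img, k):
    return (np.asarray(img, dtype=np.uint8) >> k) & 1

img = np.array([[128, 127], [255, 0]], dtype=np.uint8)
plane7 = bit_plane(img, 7)
```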
  • 86. Image Enhancement(Cont…) • Histogram Processing  Two Types : (a) Histogram Stretching (b) Histogram Equalization  Histogram Stretching  Contrast is the difference between the maximum and minimum pixel intensity.  A histogram is a pictorial representation of the distribution of pixel values, telling the frequency of each value. Sunday, December 18, 2022 86
  • 87. Image Enhancement(Cont…) • The histogram of a digital image with gray values r0, r1, …, rL-1 is the discrete function p(rk) = nk / n, where nk is the number of pixels with gray value rk and n is the total number of pixels in the image. Sunday, December 18, 2022 87 The function p(rk) represents the fraction of the total number of pixels with gray value rk. The shape of a histogram provides useful information for contrast enhancement.
  • 88. Image Enhancement(Cont…) • Histogram Processing Sunday, December 18, 2022 88 Dark image Bright image
  • 89. Image Enhancement(Cont…) • Histogram Processing Sunday, December 18, 2022 89 Low contrast image High contrast image
  • 90. Image Enhancement(Cont…) • Histogram Stretching Sunday, December 18, 2022 90
  • 91. Image Enhancement(Cont…) • Histogram Stretching (cont…) • In the above example smin = 0 and smax = 8, and rmin = 0, rmax = 4 are given. • s − 0 = (8 − 0) / (4 − 0) · (r − 0) • s = (8/4) r • s = 2r • Now we have a relation between r and s. • So we get the different values of ‘s’ for the given rmin to rmax. Sunday, December 18, 2022 91
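The worked example above (r in [0, 4] stretched to s in [0, 8]) can be checked directly:

```python
# Histogram stretching: s = (smax - smin)/(rmax - rmin) * (r - rmin) + smin
def stretch_level(r, r_min=0, r_max=4, s_min=0, s_max=8):
    return (s_max - s_min) * (r - r_min) // (r_max - r_min) + s_min

stretched = [stretch_level(r) for r in range(5)]  # s = 2r
```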
  • 92. Image Enhancement(Cont…) • Histogram Stretching (cont…) Sunday, December 18, 2022 92
  • 93. Image Enhancement(Cont…) • Histogram Equalization – Recalculate the picture gray levels to make the distribution more equalized – Used widely in image editing tools and computer vision algorithms – Can also be applied to color images Sunday, December 18, 2022 93
  • 94. Objective of histogram equalization • We want to find T(r) so that ps(s) is a flat line. Histogram, color v.4e 94 • Objective: find the relation s = T(r) that maps an input random distribution pr(r) (in which some levels are more probable than others) to an equalized distribution ps(s) = a constant, in which all levels are equally probable. [Figure: pr(r) vs. r and the flat ps(s) vs. s, both over 0 to L−1, linked by the transformation curve s = T(r).]
  • 95. We want to prove ps(s) = constant • Basic probability theory: ps(s) ds = pr(r) dr; differentiating on both sides gives ps(s) = pr(r) (dr/ds) ... (2) • Fundamental theorem of calculus: d/dx ∫a^x f(t) dt = f(x) ... (3) • Since s = T(r) = (L−1) ∫0^r pr(w) dw ... (1), we get ds/dr = dT(r)/dr = (L−1) d/dr ∫0^r pr(w) dw = (L−1) pr(r). • Exercise: continue with formulas (1), (2) and (3) to show ps(s) = pr(r) · 1/[(L−1) pr(r)] = 1/(L−1) = constant. 95
  • 96. Image Enhancement(Cont…) • Histogram Equalization Sunday, December 18, 2022 96 • Let rk, k ∈ [0..L−1], be the intensity levels and let p(rk) be the normalized histogram function. • Histogram equalization applies a transformation of ‘r’ to get ‘s’, where ‘r’ belongs to 0 to L−1. • As T(r) is continuous & differentiable, ∫ ps(s) ds = ∫ pr(r) dr = 1; differentiating w.r.t. ‘s’ we get:
  • 97. Image Enhancement(Cont…) • Histogram Equalization(cont…) Sunday, December 18, 2022 97  So, ps(s) = pr(r) (dr/ds) ... eq. (1)  The transformation function T(r) for histogram equalization is: s = T(r) = (L−1) ∫0^r pr(w) dw  Differentiating w.r.t. ‘r’: ds/dr = dT(r)/dr = (L−1) d/dr ∫0^r pr(w) dw  As we know, d/dr ∫0^r pr(w) dw = pr(r), so ds/dr = (L−1) pr(r)  From eq. (1) we get ps(s) = pr(r) / [(L−1) pr(r)] = 1/(L−1), which is a constant.
  • 98. Histogram Equalization : Discrete form for practical use • From the continuous form s = T(r) = (L−1) ∫0^r pr(w) dw to the discrete form: sk = T(rk) = (L−1) Σj=0..k pr(rj) = ((L−1)/MN) Σj=0..k nj, for k = 0, 1, 2, …, L−1. • Recall that to obtain a normalized histogram we take pr(rk) = nk/MN, which makes the probabilities sum to 1. 98
  • 99. Histogram Equalization - Example • Let f be an image with size 64x64 pixels and L = 8, and let f have the intensity distribution shown in the table:
        rk : nk : pr(rk) = nk/MN
        r0 : 790 : 0.19
        r1 : 1023 : 0.25
        r2 : 850 : 0.21
        r3 : 656 : 0.16
        r4 : 329 : 0.08
        r5 : 245 : 0.06
        r6 : 122 : 0.03
        r7 : 81 : 0.02
    s0 = 7·pr(r0) = 1.33, s1 = 7·(pr(r0)+pr(r1)) = 3.08, s2 = 4.55, s3 = 5.67, s4 = 6.23, s5 = 6.65, s6 = 6.86, s7 = 7.00; round the values to the nearest integer.
  • 100. Sunday, December 18, 2022 100
  • 101. Histogram Equalization - Example Sunday, December 18, 2022 101
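The discrete form and the 64x64, L = 8 example above can be reproduced with a cumulative sum; a minimal sketch:

```python
import numpy as np

# s_k = (L - 1) * sum_{j<=k} n_j / MN, rounded to the nearest integer level
L, MN = 8, 64 * 64
nk = np.array([790, 1023, 850, 656, 329, 245, 122, 81])
sk = np.round((L - 1) * np.cumsum(nk / MN)).astype(int)
```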
  • 102. Filtering • Image filtering is used to:  Remove noise  Sharpen contrast  Highlight contours  Detect edges  Image filters can be classified as linear or nonlinear.  Linear filters are also known as convolution filters as they can be represented using a matrix multiplication.  Thresholding and image equalization are examples of nonlinear operations, as is the median filter. Sunday, December 18, 2022 102
  • 103. Filtering(cont…) • There are two types of processing: • Point Processing (eg. Histogram equalization) • Mask Processing  Two types of filtering methods: • Smoothing Linear (Average Filter) and Non-Linear (Median Filter) • Sharpening Laplacian Gradient Sunday, December 18, 2022 103
  • 104. Filtering(Cont…) Sunday, December 18, 2022 104 output image
  • 105. Filtering(Cont…) • Correlation [ 1-D & 2-D ] • Convolution [ 1-D & 2-D ] • In correlation we apply the weights directly to get the output image; to apply convolution we first rotate the weights 180 degrees. • E.g., the 1-D weight [1 2 3] becomes [3 2 1] after 180-degree rotation, and the 2-D weight [[1 2 3], [4 5 6], [7 8 9]] becomes [[9 8 7], [6 5 4], [3 2 1]] after 180-degree rotation. Sunday, December 18, 2022 105
  • 106. Filtering(Cont…) • 1-D Correlation • I = [1 2 3 4], W = [1 2 3] • Output (with zero padding at the borders): [(2·1)+(3·2)] = 8, [(1·1)+(2·2)+(3·3)] = 14, [(1·2)+(2·3)+(3·4)] = 20, [(1·3)+(2·4)] = 11, i.e., [8 14 20 11]. • For convolution, just rotate the mask 180 degrees. Sunday, December 18, 2022 106
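The 1-D example above is 'same'-size correlation with zero padding at the borders; convolution reuses the same machinery with the weights reversed. A minimal sketch:

```python
import numpy as np

def correlate1d(signal, weights):
    pad = len(weights) // 2
    padded = np.pad(np.asarray(signal, dtype=int), pad)  # zero padding
    return np.array([np.dot(padded[i:i + len(weights)], weights)
                     for i in range(len(signal))])

out = correlate1d([1, 2, 3, 4], [1, 2, 3])
# Convolution = correlation with the mask rotated 180 degrees
conv = correlate1d([1, 2, 3, 4], [1, 2, 3][::-1])
```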
  • 107. Filtering(Cont…) • A filtering method is linear when the output is a weighted sum of the input pixels, e.g., the average filter. • Methods that do not satisfy the above property are called non-linear, e.g., the median filter. • Average (or mean) filtering is a method of ‘smoothing’ images by reducing the amount of intensity variation between neighbouring pixels. • The average filter works by moving through the image pixel by pixel, replacing each value with the average value of neighbouring pixels, including itself. Sunday, December 18, 2022 107
  • 108. Filtering(Cont…) • Average filter mask (2-D): Sunday, December 18, 2022 108
  • 114. Filtering(Cont…) • When we apply the average filter, noise is removed but blurring is introduced; to reduce the blurring we use a weighted average filter. Sunday, December 18, 2022 114
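A minimal sketch of the 3x3 average filter with zero padding at the borders (the border handling is an assumption; real implementations offer several padding modes):

```python
import numpy as np

def average_filter(img):
    f = np.pad(np.asarray(img, dtype=float), 1)  # zero-pad one pixel all round
    out = np.zeros(np.asarray(img).shape)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = f[i:i + 3, j:j + 3].mean()  # mean of the 3x3 window
    return out

smoothed = average_filter(np.full((3, 3), 9.0))
```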
  • 115. Filtering(Cont…) • Median Filter (non-linear filter) • Very effective in removing salt-and-pepper or impulsive noise while preserving image detail • Disadvantages: computational complexity; it is a non-linear filter • The median filter works by moving through the image pixel by pixel, replacing each value with the median value of neighbouring pixels. • The pattern of neighbours is called the "window", which slides, pixel by pixel, over the entire image. • The median is calculated by first sorting all the pixel values from the window into numerical order, and then replacing the pixel being considered with the middle (median) value. Sunday, December 18, 2022 115
  • 116. Filtering(Cont…) • Median Filter Example: Sunday, December 18, 2022 116
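A sketch of the 3x3 median filter (interior pixels only, for brevity), showing how a single impulse is removed while the background is preserved:

```python
import numpy as np

def median_filter(img):
    f = np.asarray(img, dtype=float)
    out = f.copy()
    for i in range(1, f.shape[0] - 1):
        for j in range(1, f.shape[1] - 1):
            # sort the 3x3 window and take the middle value
            out[i, j] = np.median(f[i - 1:i + 2, j - 1:j + 2])
    return out

img = np.full((3, 3), 10.0)
img[1, 1] = 255.0            # "salt" impulse noise
cleaned = median_filter(img)
```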
  • 121. Filtering(Cont…) Sunday, December 18, 2022 121 • From left to right: the results of a 3 x 3, 5 x 5 and 7 x 7 median filter
  • 122. Filtering(Cont…)  Sharpening (high-pass filtering) is performed by noting only the gray-level changes in the image, that is, by differentiation. • Sharpening is used for edge detection, line detection and point detection, and it also highlights changes.  Operation of Image Differentiation • Enhance edges and discontinuities (magnitude of output gray level >> 0) • De-emphasize areas with slowly varying gray-level values (output gray level: 0)  Mathematical Basis of Filtering for Image Sharpening • First-order and second-order derivatives • Approximation in the discrete-space domain • Implementation by mask filtering Sunday, December 18, 2022 122
  • 123. Filtering(Cont…)  Common sharpening filters: • Gradient (1st order derivative) • Laplacian (2nd order derivative) • Taking the derivative of an image results in sharpening the image. • The derivative of an image (i.e., 2D function) can be computed using the gradient. Sunday, December 18, 2022 123
  • 124. Filtering(Cont…)  Gradient (rotation variant or non-isotropic) Sunday, December 18, 2022 124 or Sensitive to vertical edges Sensitive to horizontal edges
• 125. Filtering(Cont…) • Gradient • Kernels used in Prewitt edge detection
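A minimal sketch of the Prewitt gradient at a single pixel (our own transcription of the kernels; sign conventions vary between texts):

```python
# Prewitt kernels: one responds to vertical edges, the other to horizontal.
PREWITT_X = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
PREWITT_Y = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]]

def gradient_at(img, y, x):
    """Gx, Gy and the gradient magnitude at an interior pixel."""
    gx = sum(PREWITT_X[j][i] * img[y - 1 + j][x - 1 + i]
             for j in range(3) for i in range(3))
    gy = sum(PREWITT_Y[j][i] * img[y - 1 + j][x - 1 + i]
             for j in range(3) for i in range(3))
    return gx, gy, (gx * gx + gy * gy) ** 0.5

# A vertical step edge: dark left half, bright right half.
img = [[0, 0, 9, 9] for _ in range(4)]
gx, gy, mag = gradient_at(img, 1, 1)
print(gx, gy, mag)   # large Gx, zero Gy: a vertical edge
```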
  • 126. Filtering(Cont…) • Laplacian Sunday, December 18, 2022 126 Original Mask , C = +1 or C= -1
• 127. Filtering(Cont…) • Laplacian (rotation invariant or isotropic) • (b) The extended Laplacian mask increases sharpness; it also covers the diagonal neighbours and so gives better results.
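The composite sharpening masks (original image plus the Laplacian term) can be sketched as follows; the mask values are the standard 4- and 8-neighbour forms, while the helper names are ours:

```python
# Composite sharpening masks: f(x,y) minus the Laplacian response.
LAPLACIAN_SHARPEN = [[0, -1, 0], [-1, 5, -1], [0, -1, 0]]    # 4-neighbour
EXTENDED_SHARPEN = [[-1, -1, -1], [-1, 9, -1], [-1, -1, -1]]  # covers diagonals

def apply_mask(img, mask, y, x):
    """Mask response at one interior pixel."""
    return sum(mask[j][i] * img[y - 1 + j][x - 1 + i]
               for j in range(3) for i in range(3))

# A pixel slightly brighter than its flat surroundings gets boosted.
img = [[10, 10, 10], [10, 12, 10], [10, 10, 10]]
print(apply_mask(img, LAPLACIAN_SHARPEN, 1, 1))   # 12*5 - 4*10 = 20
print(apply_mask(img, EXTENDED_SHARPEN, 1, 1))    # 12*9 - 8*10 = 28
```

The extended mask amplifies the local detail more strongly because it also differences against the diagonal neighbours.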
• 128. Image Transforms • Many times, image processing tasks are best performed in a domain other than the spatial domain. • Key steps: (1) Transform the image. (2) Carry out the task(s) in the transformed domain. (3) Apply the inverse transform to return to the spatial domain.
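The three key steps can be demonstrated with a tiny 1-D DFT round trip (pure Python; a real pipeline would use an FFT, and `dft`/`idft` are our own names):

```python
import cmath

def dft(x):
    """1-D DFT: F(u) = sum_n f(n) * exp(-j*2*pi*u*n / N)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * u * n / N) for n in range(N))
            for u in range(N)]

def idft(X):
    """Inverse 1-D DFT with the 1/N normalization."""
    N = len(X)
    return [sum(X[u] * cmath.exp(2j * cmath.pi * u * n / N)
                for u in range(N)) / N
            for n in range(N)]

signal = [1.0, 2.0, 3.0, 4.0]
spectrum = dft(signal)        # (1) transform
# (2) the task would operate on `spectrum` here (e.g., zero out frequencies)
restored = idft(spectrum)     # (3) inverse transform back to the signal
print([round(v.real, 6) for v in restored])
```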
• 129. Math Review - Complex numbers • Real numbers: 1, -5.2, π • Complex numbers: 4.2 + 3.7i, 9.4447 – 6.7i, -5.2 (= -5.2 + 0i), where i = √-1 • In electrical engineering, i is often denoted j.
• 130. Math Review - Complex numbers • Complex numbers: 4.2 + 3.7i, 9.4447 – 6.7i, -5.2 (= -5.2 + 0i) • General form: Z = a + bi, with real and imaginary parts Re(Z) = a and Im(Z) = b • Amplitude: A = |Z| = √(a² + b²) • Phase: φ = ∠Z = tan⁻¹(b/a)
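A quick check of these formulas with Python's `cmath` module:

```python
import cmath

z = 4.2 + 3.7j
amplitude = abs(z)          # |Z| = sqrt(a^2 + b^2)
phase = cmath.phase(z)      # atan2(b, a), in radians
print(round(amplitude, 4), round(phase, 4))

# Polar round trip: A * exp(i*phase) recovers a + bi.
w = cmath.rect(amplitude, phase)
print(round(w.real, 6), round(w.imag, 6))
```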
• 131. Math Review – Complex Numbers • Polar coordinates: Z = a + bi • Amplitude: A = √(a² + b²) • Phase: φ = tan⁻¹(b/a) • Equivalently, a = A cos φ and b = A sin φ
• 132. Math Review – Complex Numbers and Cosine Waves • A cosine wave has three properties – Frequency – Amplitude – Phase • A complex number has two properties – Amplitude – Phase • Complex numbers can thus represent cosine waves at varying frequencies – Frequency 1: Z1 = 5 + 2i – Frequency 2: Z2 = -3 + 4i – Frequency 3: Z3 = 1.3 – 1.6i Simple but great idea !!
• 133. Fourier Transforms & its Properties • Jean Baptiste Joseph Fourier (1768-1830) • Had a crazy idea (1807): • Any periodic function can be rewritten as a weighted sum of sines and cosines of different frequencies. • Don’t believe it? – Neither did Lagrange, Laplace, Poisson and other big wigs – Not translated into English until 1878! • But it’s true! – Called the Fourier series – Possibly the greatest tool used in engineering
  • 134. Fourier Transforms & its Properties • In image processing: – Instead of time domain: spatial domain (normal image space) – frequency domain: space in which each image value at image position F represents the amount that the intensity values in image I vary over a specific distance related to F Sunday, December 18, 2022 134
  • 135. Fourier Transforms & its Properties • Fourier Transforms & Inverse Fourier Transforms Sunday, December 18, 2022 135
  • 136. Fourier Transforms & its Properties Sunday, December 18, 2022 136
  • 137. Fourier Transforms & its Properties Sunday, December 18, 2022 137
  • 138. Fourier Transforms & its Properties Sunday, December 18, 2022 138
• 139. Fourier Transforms & its Properties • Since we deal with 2-D discrete images, we need to discuss the 2-D discrete Fourier transform.
  • 140. Fourier Transforms & its Properties • Inverse F.T Sunday, December 18, 2022 140
• 141. Fourier Transforms & its Properties • If the image is represented as a square array, i.e., M = N, then the F.T. and I.F.T. are given by the equations:
  • 142. Fourier Transforms & its Properties Sunday, December 18, 2022 142
  • 143. Fourier Transforms & its Properties • Separability property Sunday, December 18, 2022 143
  • 144. Fourier Transforms & its Properties Sunday, December 18, 2022 144
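Separability means the 2-D DFT can be computed as 1-D DFTs along the rows followed by 1-D DFTs along the columns; a minimal pure-Python sketch (our own function names):

```python
import cmath

def dft1(x):
    """1-D DFT of a sequence."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * u * n / N) for n in range(N))
            for u in range(N)]

def dft2_separable(f):
    """2-D DFT via separability: 1-D DFTs on rows, then on columns."""
    rows = [dft1(row) for row in f]                 # transform each row
    cols = list(zip(*rows))                         # transpose
    out_cols = [dft1(list(c)) for c in cols]        # transform each column
    return [list(r) for r in zip(*out_cols)]        # transpose back: F[u][v]

f = [[1.0, 2.0], [3.0, 4.0]]
F = dft2_separable(f)
# F(0,0) is the sum of all samples (M*N times the average gray level).
print(round(F[0][0].real, 6))
```

This is also why the 2-D transform of an N×N image costs only 2N 1-D transforms rather than one monolithic 2-D sum.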
• 145. Fourier Transforms & its Properties • Periodicity: The DFT and its inverse are periodic with period N.
• 146. Fourier Transforms & its Properties • Scaling: If a signal is multiplied by a scalar quantity ‘a’, then its Fourier transform is also multiplied by the same scalar ‘a’.
  • 147. Fourier Transforms & its Properties • Distributivity: Sunday, December 18, 2022 147 but …
  • 148. Fourier Transforms & its Properties • Average: Sunday, December 18, 2022 148 Average: F(u,v) at u=0, v=0: So:
  • 149. Fourier Transforms & its Properties Sunday, December 18, 2022 149
• 153. Frequency Domain Filters • Low-pass filter: allows the low-frequency range of the signal to pass to the output (useful for noise suppression). • High-pass filter: allows the high-frequency range to pass to the output (useful for edge detection). • D(u,v) is the distance of (u,v) from the origin of the frequency rectangle. • D0 is the cutoff: in a low-pass filter, all frequencies with D(u,v) <= D0 are passed to the output and the rest are blocked.
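A minimal sketch of the ideal low-pass transfer function, together with the Butterworth variant H(u,v) = 1/(1 + (D/D0)^(2n)) discussed on the following slides (function names are ours; the spectrum is assumed centred):

```python
def ideal_lowpass(M, N, D0):
    """H(u,v) = 1 if D(u,v) <= D0 else 0, origin at the centre of the
    M x N frequency rectangle (i.e., a shifted spectrum is assumed)."""
    H = [[0.0] * N for _ in range(M)]
    for u in range(M):
        for v in range(N):
            D = ((u - M / 2) ** 2 + (v - N / 2) ** 2) ** 0.5
            H[u][v] = 1.0 if D <= D0 else 0.0
    return H

def butterworth_lowpass(M, N, D0, n=2):
    """H(u,v) = 1 / (1 + (D/D0)^(2n)): smooth roll-off, far less ringing."""
    H = [[0.0] * N for _ in range(M)]
    for u in range(M):
        for v in range(N):
            D = ((u - M / 2) ** 2 + (v - N / 2) ** 2) ** 0.5
            H[u][v] = 1.0 / (1.0 + (D / D0) ** (2 * n))
    return H

Hi = ideal_lowpass(8, 8, 2.0)
Hb = butterworth_lowpass(8, 8, 2.0)
print(Hi[4][4], Hi[0][0])   # 1.0 at the centre, 0.0 at the far corner
print(round(Hb[4][4], 3))   # the Butterworth response decays gradually
```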
• 156. Frequency Domain Filters • In the above example, for the same cutoff frequency the blurring is greater in the ideal low-pass filter than in the Butterworth filter, and as the cutoff frequency decreases the ringing artifacts (undesired lines) in the ideal low-pass filter become more pronounced, while the Butterworth filter shows little ringing.
• 159. Image Restoration • Image restoration and image enhancement share a common goal: to improve an image for human perception. • Image enhancement is mainly a subjective process, in which individual opinions enter the process design. • Image restoration is mostly an objective process, which: • utilizes a priori knowledge of the degradation phenomenon to recover the image; • models the degradation and then recovers the original image. • The objective of restoration is to obtain an image estimate that is as close as possible to the original input image.
• 161. Image Restoration • If H is a linear, position-invariant process (filter), the degraded image is given in the spatial domain by g(x,y) = f(x,y) * h(x,y) + η(x,y), whose equivalent frequency-domain representation is G(u,v) = F(u,v)H(u,v) + N(u,v), where h(x,y) is the system that causes the image distortion and η(x,y) is the noise.
• 162. Image Restoration • Degradation by noise only: g(x,y) = f(x,y) + η(x,y), G(u,v) = F(u,v) + N(u,v)
• 176. Image Restoration • Homomorphic Filter • In some images, the quality has been reduced by non-uniform illumination. • Homomorphic filtering can be used to perform illumination correction. • We can view an image f(x,y) as a product of two components: f(x,y) = i(x,y) · r(x,y), where • i(x,y) = intensity of the incident light (illumination), and • r(x,y) = reflectivity of the surface at the corresponding image point. • The above equation is known as the illumination-reflectance model.
• 177. Image Restoration • The illumination-reflectance model can be used to address the problem of improving the quality of an image acquired under poor illumination conditions. • For many images, the illumination is the primary contributor to the dynamic range and varies slowly in space, while the reflectance component r(x,y) represents the details of objects and varies rapidly in space.
• 178. Image Restoration • The illumination and reflectance components are to be handled separately. The logarithm of the input function f(x,y) is taken because f(x,y) is the product of i(x,y) and r(x,y); the log of f(x,y) separates the components as illustrated below: ln[f(x,y)] = ln[i(x,y) · r(x,y)] ln[f(x,y)] = ln[i(x,y)] + ln[r(x,y)] • Taking Fourier transforms of the above equation: F(u,v) = FI(u,v) + FR(u,v), where FI(u,v) and FR(u,v) are the Fourier transforms of the (log) illumination and reflectance components, respectively.
• 179. Image Restoration • The desired filter function H(u,v) can then be applied separately to the illumination and reflectance components, as shown below: F(u,v)·H(u,v) = FI(u,v)·H(u,v) + FR(u,v)·H(u,v) • In order to visualize the image, an inverse Fourier transform followed by the exponential function is applied: F⁻¹[F(u,v)·H(u,v)] = F⁻¹[FI(u,v)·H(u,v)] + F⁻¹[FR(u,v)·H(u,v)] • The desired enhanced image is obtained by taking the exponential of the result.
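The key algebraic step, that the logarithm turns the product i·r into a sum which a linear filter can act on separately, can be checked numerically (the illumination and reflectance values below are made up for illustration):

```python
import math

# Illumination-reflectance model: f = i * r. Taking logs turns the
# product into a sum, so a frequency-domain filter can act on each part.
i, r = 50.0, 0.8          # hypothetical illumination and reflectance
f = i * r
log_f = math.log(f)
# ln[f] = ln[i] + ln[r]
print(round(log_f - (math.log(i) + math.log(r)), 12))

# After filtering ln f, exponentiating recovers the enhanced image.
# With an identity filter the round trip returns the original value:
g = math.exp(log_f)
print(round(g, 6))
```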
• 180. Inverse Filter • From the degradation model: G(u,v) = F(u,v)H(u,v) + N(u,v) • After we obtain H(u,v), we can estimate F(u,v) by the inverse filter: F̂(u,v) = G(u,v)/H(u,v) = F(u,v) + N(u,v)/H(u,v) • The noise is enhanced when H(u,v) is small; in practice, the inverse filter is therefore not widely used.
• 181. Inverse Filter: Example • Degradation due to atmospheric turbulence, modelled by H(u,v) = e^(−0.0025(u² + v²)^(5/6)) • Shown: the original image, the blurred image, the result of applying the full inverse filter, and the results of applying the filter cut off at radii D0 = 40, 70 and 85.
• 182. Wiener Filter: Minimum Mean Square Error Filter • Objective: minimize the mean square error e² = E{(f − f̂)²} • Wiener filter formula: F̂(u,v) = [ H*(u,v)·Sf(u,v) / ( |H(u,v)|²·Sf(u,v) + Sη(u,v) ) ]·G(u,v) = [ (1/H(u,v)) · |H(u,v)|² / ( |H(u,v)|² + Sη(u,v)/Sf(u,v) ) ]·G(u,v) where H(u,v) = degradation function, Sη(u,v) = power spectrum of the noise, and Sf(u,v) = power spectrum of the undegraded image.
• 183. Approximation of Wiener Filter • Wiener filter formula: F̂(u,v) = [ (1/H(u,v)) · |H(u,v)|² / ( |H(u,v)|² + Sη(u,v)/Sf(u,v) ) ]·G(u,v) • The ratio Sη(u,v)/Sf(u,v) is difficult to estimate, so it is replaced by a constant K: F̂(u,v) = [ (1/H(u,v)) · |H(u,v)|² / ( |H(u,v)|² + K ) ]·G(u,v) • In practice, K is chosen manually to obtain the best visual result.
  • 184. Wiener Filter: Example Original image Result of the inverse filter with D0=70 Result of the Wiener filter Blurred image Due to Turbulence
• 185. Wiener Filter • It performs better than the inverse filter. • It incorporates both the degradation function and the statistical characteristics of the noise (mean, spectrum, etc.) into the restoration process. • Here we consider the image and the noise as random functions. • The objective is to find an estimate f̂ of the uncorrupted image f such that the mean square error between them is minimized. • Assumptions: • The image and the noise are uncorrelated. • One of them has zero mean. • The gray levels in the estimate are a linear function of the degraded image.
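A per-frequency sketch of the approximated Wiener formula (our own helper name; K plays the role of Sη/Sf):

```python
def wiener(H, G, K):
    """Apply F_hat = [ (1/H) * |H|^2 / (|H|^2 + K) ] * G at one frequency.
    K approximates the noise-to-signal power ratio S_eta / S_f."""
    H2 = abs(H) ** 2
    return (1 / H) * (H2 / (H2 + K)) * G

# With no noise (K = 0) the Wiener filter reduces to the inverse filter:
H, F = 0.5 + 0j, 10 + 0j
G = H * F                      # degradation without noise
print(wiener(H, G, 0.0))       # recovers F exactly
# With K > 0, frequencies where |H| is tiny are damped instead of
# amplified, which is how the Wiener filter avoids the 1/H noise blow-up:
print(abs(wiener(1e-4 + 0j, G, 0.01)))
```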
• 200. 010100111100 = a3a1a2a2a6
• 203. • Consider the five-symbol sequence {a1, a2, a3, a3, a4} from a source with symbol probabilities a2, a1, a3, a5, a4 = (0.25, 0.25, 0.2, 0.15, 0.15). Generate the arithmetic code for this sequence.
• 212. Entropy encoder
• 217. OUTLINE • Image segmentation – e.g., edge-based, region-based • Image representation (boundary representation) – e.g., chain code, polygonal approximation • Image description (boundary descriptors) – e.g., boundary-based, region-based
• 219. Image segmentation(cont…) • Segmentation is used to subdivide an image into its constituent parts or objects. • This step determines the eventual success or failure of image analysis. • Generally, segmentation is carried out only until the objects of interest are isolated, e.g., face detection. • The goal of segmentation is to simplify and/or change the representation of an image into something that is more meaningful and easier to analyse.
• 220. Classification of the Segmentation techniques • Image segmentation is based on either: – Discontinuity, e.g., point detection, line detection, edge detection – Similarity, e.g., thresholding, region growing, region splitting & merging
• 221. edge-based segmentation(1) • There are three basic types of gray-level discontinuities in a digital image: points, lines, and edges. • The most common way to look for discontinuities is to run a mask through the image. • We say that a point, line, or edge has been detected at the location on which the mask is centered if |R| ≥ T, where T is a nonnegative threshold and R = w1·z1 + w2·z2 + … + w9·z9 is the mask response (wi are the mask coefficients and zi the gray levels under the mask).
• 222. edge-based segmentation(2) • Point detection: a point detection mask • Line detection: a line detection mask
• 223. edge-based segmentation(3) • Edge detection: gradient operation ∇f = [Gx, Gy]ᵀ = [∂f/∂x, ∂f/∂y]ᵀ • Magnitude: ∇f ≈ mag(∇f) = (Gx² + Gy²)^(1/2) • Direction: α(x,y) = tan⁻¹(Gy/Gx)
• 224. edge-based segmentation(4) • Edge detection: Laplacian operation ∇²f = ∂²f/∂x² + ∂²f/∂y² • Laplacian of Gaussian: ∇²h(r) = −[(r² − σ²)/σ⁴]·e^(−r²/2σ²)
• 225. Region Based Segmentation • Region Growing • Region growing techniques start with one pixel of a potential region and try to grow it by adding adjacent pixels until the pixels being compared become too dissimilar. • The first pixel selected can be just the first unlabeled pixel in the image, or a set of seed pixels can be chosen from the image. • Usually a statistical test is used to decide which pixels can be added to a region.
• 226. • Region growing technique: • Assign some seed point(s). • Assign a threshold value (a limit on how much a neighbouring pixel value may differ from the seed's pixel value). • Compare the seed point with the pixel values around it, adding the pixels that satisfy the threshold.
• 227. • E.g., threshold < 3 • The result for the above image using the region growing technique is:
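A minimal sketch of the region-growing procedure described above (4-connectivity, absolute difference from the seed value; the names and the tiny test image are ours):

```python
from collections import deque

def region_grow(img, seed, threshold):
    """Grow a region from `seed`, adding 4-neighbours whose value differs
    from the seed pixel by less than `threshold`."""
    h, w = len(img), len(img[0])
    sy, sx = seed
    seed_val = img[sy][sx]
    region = {seed}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in region
                    and abs(img[ny][nx] - seed_val) < threshold):
                region.add((ny, nx))
                queue.append((ny, nx))
    return region

img = [[1, 1, 8],
       [1, 2, 8],
       [9, 8, 8]]
# Seed at the top-left, threshold 3: only the low-valued blob is grown.
print(sorted(region_grow(img, (0, 0), 3)))
```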
• 228. • Region Splitting and Merging • Threshold <= 3 • Split the image into equal parts (quadrants). • If (maximum pixel value − minimum pixel value) within a region does not satisfy the threshold constraint, then split that region again.
• 232. Boundary Representation • Image regions (including segments) can be represented by either the border or the pixels of the region. These can be viewed as external or internal characteristics, respectively. • Chain codes
• 234. Boundary Representation Chain Codes • Chain codes can be based on either 4-connectedness or 8-connectedness. • The first difference of the chain code: – This difference is obtained by counting the number of direction changes (in a counterclockwise direction) between successive code elements. – For example, the first difference of the 4-direction chain code 10103322 is 3133030. • Assuming the first difference code represents a closed path, rotation normalization can be achieved by circularly shifting the digits of the code so that the list of numbers forms the smallest possible integer. • Size normalization can be achieved by adjusting the size of the resampling grid.
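The first difference and the rotation-normalized shape number can be sketched as follows (our own function names; the example reproduces the 10103322 → 3133030 case above):

```python
def first_difference(chain, conn=4, closed=False):
    """Counterclockwise direction changes between successive code elements;
    with closed=True the chain wraps around to its first element."""
    seq = chain + chain[:1] if closed else chain
    return [(b - a) % conn for a, b in zip(seq, seq[1:])]

def shape_number(chain, conn=4):
    """First difference of smallest magnitude: the circular shift of the
    closed first difference forming the smallest digit sequence."""
    d = first_difference(chain, conn, closed=True)
    return min(d[i:] + d[:i] for i in range(len(d)))

print(first_difference([1, 0, 1, 0, 3, 3, 2, 2]))   # [3, 1, 3, 3, 0, 3, 0]
print(shape_number([0, 3, 2, 1]))                    # a square's shape number
```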
  • 237. Boundary Representation Polygonal Approximations • Polygonal approximations: to represent a boundary by straight line segments, and a closed path becomes a polygon. • The number of straight line segments used determines the accuracy of the approximation. • Only the minimum required number of sides necessary to preserve the needed shape information should be used (Minimum perimeter polygons). • A larger number of sides will only add noise to the model.
  • 238. Boundary Representation Polygonal Approximations • Minimum perimeter polygons: (Merging and splitting) – Merging and splitting are often used together to ensure that vertices appear where they would naturally in the boundary. – A least squares criterion to a straight line is used to stop the processing.
  • 239. 239 Hough Transform • The Hough transform is a method for detecting lines or curves specified by a parametric function. • If the parameters are p1, p2, … pn, then the Hough procedure uses an n-dimensional accumulator array in which it accumulates votes for the correct parameters of the lines or curves found on the image. y = mx + b image m b accumulator
  • 240. Q. Given 3 points, Use Hough Transform to draw a line joining these points : (1,1), (2,2) & (3,3). Sunday, December 18, 2022 240
  • 241. Sunday, December 18, 2022 241 Question. Given 5 points, use Hough transform to draw a line joining the points (1,4) , (2,3), (3,1), (4,1), (5,0). (RTU-2016)
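As a check on the first exercise, a coarse (m, b)-space accumulator in pure Python (the slope quantization and the names are ours):

```python
from collections import Counter

def hough_lines(points, slopes):
    """Vote in (m, b) space: each point votes for every line y = m*x + b
    passing through it; the best line collects the most votes."""
    acc = Counter()
    for x, y in points:
        for m in slopes:
            b = y - m * x
            acc[(m, b)] += 1
    return acc

points = [(1, 1), (2, 2), (3, 3)]
slopes = [-2, -1, 0, 1, 2]          # coarse quantization of m
acc = hough_lines(points, slopes)
best, votes = acc.most_common(1)[0]
print(best, votes)   # (1, 0) with 3 votes: the line y = x
```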
• 242. Boundary Descriptors • There are several simple geometric measures that can be useful for describing a boundary. – The length of a boundary: the number of pixels along a boundary gives a rough approximation of its length. – Curvature: the rate of change of slope • Measuring curvature accurately at a point on a digital boundary is difficult. • The difference between the slopes of adjacent boundary segments is used as a descriptor of curvature at the point of intersection of the segments.
  • 243. Boundary Descriptors Shape Numbers First difference • The shape number of a boundary is defined as the first difference of smallest magnitude. • The order n of a shape number is defined as the number of digits in its representation.
  • 246. Boundary Descriptors Fourier Descriptors • This is a way of using the Fourier transform to analyze the shape of a boundary. – The x-y coordinates of the boundary are treated as the real and imaginary parts of a complex number. – Then the list of coordinates is Fourier transformed using the DFT (chapter 4). – The Fourier coefficients are called the Fourier descriptors. – The basic shape of the region is determined by the first several coefficients, which represent lower frequencies. – Higher frequency terms provide information on the fine detail of the boundary.
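A minimal sketch of Fourier descriptors for a toy boundary (pure-Python DFT; the names and the square example are ours):

```python
import cmath

def fourier_descriptors(boundary):
    """DFT of the boundary treated as complex numbers s(k) = x(k) + j*y(k)."""
    s = [complex(x, y) for x, y in boundary]
    N = len(s)
    return [sum(s[k] * cmath.exp(-2j * cmath.pi * u * k / N) for k in range(N))
            for u in range(N)]

# A unit square boundary, traversed counterclockwise.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
a = fourier_descriptors(square)
# The zero-frequency term a[0]/N is the centroid of the boundary points;
# the low-order terms capture the coarse shape, higher ones the fine detail.
centroid = a[0] / len(square)
print(round(centroid.real, 6), round(centroid.imag, 6))
```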
  • 248. Regional Descriptors • Some simple descriptors – The area of a region: the number of pixels in the region – The perimeter of a region: the length of its boundary – The compactness of a region: (perimeter)2/area – The mean and median of the gray levels – The minimum and maximum gray-level values – The number of pixels with values above and below the mean
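The simple regional descriptors above can be computed directly; the 4-pixel region and its boundary length below are made-up illustration values:

```python
def regional_descriptors(gray_levels, perimeter_len):
    """Simple descriptors for a region given as its pixel gray levels and
    a precomputed boundary length."""
    area = len(gray_levels)
    mean = sum(gray_levels) / area
    return {
        "area": area,
        "perimeter": perimeter_len,
        "compactness": perimeter_len ** 2 / area,   # (perimeter)^2 / area
        "mean": mean,
        "min": min(gray_levels),
        "max": max(gray_levels),
        "above_mean": sum(1 for v in gray_levels if v > mean),
        "below_mean": sum(1 for v in gray_levels if v < mean),
    }

d = regional_descriptors([10, 20, 20, 30], perimeter_len=8)
print(d)
```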
  • 250. Regional Descriptors Topological Descriptors Topological property 1: the number of holes (H) Topological property 2: the number of connected components (C)
• 251. Regional Descriptors Topological Descriptors • Topological property 3: Euler number, the number of connected components minus the number of holes: E = C − H (the figures shown have E = 0 and E = −1).