Digital Image Processing:
Introduction
Introduction
“One picture is worth more than ten thousand
words”
Anonymous
Contents
This lecture will cover:
◦ What is a digital image?
◦ What is digital image processing?
◦ History of digital image processing
◦ State of the art examples of digital image
processing
◦ Key stages in digital image processing
What is a Digital Image?
A digital image is a representation of a two-
dimensional image as a finite set of digital
values, called picture elements or pixels
Pixel values typically represent gray levels,
colours, heights, opacities, etc.
Remember digitization implies that a digital
image is an approximation of a real scene
Common image formats include:
◦ 1 sample per point (B&W or Grayscale)
◦ 3 samples per point (Red, Green, and Blue)
◦ 4 samples per point (Red, Green, Blue, and “Alpha”,
a.k.a. Opacity)
128 230 123
232 123 321
123 77 89
80 255 255
The figure is an example of a digital image
like the one you are now viewing on your
computer screen. In reality, this image is
nothing but a two-dimensional array of
numbers ranging between 0 and 255.
Each number represents the value of the
function f(x,y) at that point; here the values
128, 230 and 123 each represent an individual
pixel value. The dimensions of the picture are
simply the dimensions of this two-dimensional
array.
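As a minimal sketch of this idea (Python with NumPy assumed; the array values here are illustrative, not the ones in the figure), an image is simply a two-dimensional array whose entries are the values of f(x,y):

```python
import numpy as np

# A tiny 3x4 "image": each entry is the gray level f(x, y), stored as an 8-bit value.
img = np.array([[128, 230, 123, 232],
                [123, 221, 123,  77],
                [ 89,  80, 255, 255]], dtype=np.uint8)

print(img.shape)   # (3, 4) -> the dimensions of the picture are the array dimensions
print(img[1, 2])   # 123    -> the pixel value f at row 1, column 2
```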
What is Digital Image Processing?
Digital image processing focuses on two
major tasks
◦ Improvement of pictorial information for
human interpretation
◦ Processing of image data for storage,
transmission and representation for
autonomous machine perception.
How it works
In the figure above, an image has been captured by a camera and sent to a
digital system that removes all the other detail and focuses on the water drop
by zooming in on it, in such a way that the quality of the image remains the
same.
Introduction
 Signal processing is a discipline in electrical engineering
and in mathematics that deals with the analysis and processing
of analog and digital signals: storing, filtering and other
operations on signals.
 These signals include transmission signals, sound or voice
signals, image signals and many others.
 Among these, the field that deals with signals whose input
is an image and whose output is also an image is image
processing. As the name suggests, it deals with processing
images.
 It can be further divided into analog image processing and
digital image processing.
 Analog image processing
Analog image processing is performed on analog signals. It
involves processing two-dimensional analog signals. In
this type of processing, the images are manipulated by
electrical means by varying the electrical signal. A
common example is the television image.
 Digital image processing
Digital image processing deals with developing a digital
system that performs operations on a digital image.
Over time, digital image processing has dominated analog image
processing because of its wider range of applications.
The continuum from image processing to
computer vision can be broken up into low-,
mid- and high-level processes
Low Level Process
Input: Image
Output: Image
Examples: Noise
removal, image
sharpening
Mid Level Process
Input: Image
Output: Attributes
Examples: Object
recognition, segmentation
High Level Process
Input: Attributes
Output: Understanding
Examples: Scene
understanding,
autonomous navigation
In this course we will stop here
History of Digital Image Processing
Early 1920s: One of the first applications of
digital imaging was in the news-
paper industry
◦ The Bartlane cable picture
transmission service
◦ Images were transferred by submarine cable
between London and New York
◦ Pictures were coded for cable transfer and
reconstructed at the receiving end on a telegraph
printer
Early digital image
History of DIP (cont…)
Mid to late 1920s: Improvements to the
Bartlane system resulted in higher quality
images
◦ New reproduction
processes based
on photographic
techniques
◦ Increased number
of tones in
reproduced images
Improved digital image
Early 15 tone digital image
History of DIP (cont…)
1960s: Improvements in computing
technology and the onset of the space race led
to a surge of work in digital image processing
◦ 1964: Computers used to
improve the quality of
images of the moon taken
by the Ranger 7 probe
◦ Such techniques were used
in other space missions
including the Apollo landings
A picture of the moon taken by
the Ranger 7 probe minutes
before landing
History of DIP (cont…)
1970s: Digital image processing begins to be
used in medical applications
◦ 1979: Sir Godfrey N.
Hounsfield & Prof. Allan M.
Cormack share the Nobel
Prize in medicine for the
invention of tomography,
the technology behind
Computerised Axial
Tomography (CAT) scans
Typical head slice CAT image
History of DIP (cont…)
1980s - Today: The use of digital image
processing techniques has exploded and they
are now used for all kinds of tasks in all
kinds of areas
◦ Image enhancement/restoration
◦ Artistic effects
◦ Medical visualisation
◦ Industrial inspection
◦ Law enforcement
◦ Human computer interfaces
Examples: Image Enhancement
One of the most common uses of DIP
techniques: improve quality, remove noise
etc
Examples: The Hubble Telescope
Launched in 1990 the Hubble
telescope can take images of
very distant objects
However, a flaw in its
mirror made many of Hubble’s
images useless
Image processing
techniques were
used to fix this
Examples: Artistic Effects
Artistic effects are
used to make images
more visually
appealing, to add
special effects and to
make composite
images
Examples: Medicine
Take slice from MRI scan of canine heart, and find
boundaries between types of tissue
◦ Image with gray levels representing tissue density
◦ Use a suitable filter to highlight edges
Original MRI Image of a Dog Heart Edge Detection Image
Examples: GIS
Geographic Information Systems
◦ Digital image processing techniques are used
extensively to manipulate satellite imagery
◦ Terrain classification
◦ Meteorology
Examples: GIS (cont…)
Night-Time Lights of
the World data set
◦ Global inventory of
human settlement
◦ Not hard to imagine
the kind of analysis
that might be done
using this data
Examples: Industrial Inspection
Human operators are
expensive, slow and
unreliable
Make machines do the
job instead
Industrial vision systems
are used in all kinds of
industries
Can we trust them?
Examples: PCB Inspection
Printed Circuit Board (PCB) inspection
◦ Machine inspection is used to determine that
all components are present and that all solder
joints are acceptable
◦ Both conventional imaging and x-ray imaging
are used
Examples: Law Enforcement
Image processing
techniques are used
extensively by law
enforcers
◦ Number plate
recognition for speed
cameras/automated toll
systems
◦ Fingerprint recognition
◦ Enhancement of CCTV
images
Examples: HCI
Try to make human
computer interfaces more
natural
◦ Face recognition
◦ Gesture recognition
Applications of Digital Image
Processing
 Image sharpening and restoration
 Medical field
 Remote sensing
 Transmission and encoding
 Machine/Robot vision
 Color processing
 Pattern recognition
 Video processing
 Microscopic Imaging
 Others
Image sharpening and restoration
 Image sharpening and restoration here refers to processing images
captured by a modern camera to make them better, or to manipulating
those images to achieve a desired result; roughly what Photoshop is
commonly used for.
 This includes zooming, blurring, sharpening, gray-scale to colour
conversion and vice versa, edge detection, image retrieval and image
recognition. Common example panels: Original, Zoomed, Blurred,
Edges, Sharpened image.
UV imaging
 In the field of remote sensing, an area of the earth is scanned by a
satellite or from very high ground and then analysed to obtain
information about it. One particular application of digital image
processing in remote sensing is detecting the infrastructure
damage caused by an earthquake.
Hurdle detection
 Hurdle detection is one of the common tasks performed with
image processing: identifying the different types of objects in
the image and then calculating the distance between the robot
and the hurdles.
Line follower robot
 Many robots today work by following a line and are therefore
called line follower robots. This helps a robot move along its path
and perform its tasks, and it too is achieved through image
processing.
Fundamental Steps in Digital Image Processing:
Block diagram, starting from the problem domain:
Image Acquisition → Image Enhancement → Image Restoration →
Colour Image Processing → Wavelets & Multiresolution Processing →
Image Compression → Morphological Processing → Segmentation →
Representation & Description → Object Recognition,
all supported by a Knowledge Base.
Outputs of these processes generally are images.
Step 1: Image Acquisition
The image is captured by a sensor (e.g. a camera)
and digitized, using an analogue-to-digital
converter, if the output of the camera or sensor
is not already in digital form.
Step 2: Image Enhancement
The process of manipulating an image so that the result
is more suitable than the original for a specific
application.
The idea behind enhancement techniques is to bring out
details that are hidden, or simply to highlight certain
features of interest in an image.
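As one illustrative enhancement operation (a sketch only; the slides do not prescribe a specific technique), a simple linear contrast stretch rescales the gray levels to span the full 0–255 range and so brings out hidden detail:

```python
import numpy as np

def contrast_stretch(img: np.ndarray) -> np.ndarray:
    """Linearly rescale gray levels so they span the full 0..255 range."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:                      # flat image: nothing to stretch
        return img.astype(np.uint8)
    out = (img - lo) / (hi - lo) * 255.0
    return out.astype(np.uint8)
```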
Step 3: Image Restoration
- Also improves the appearance of an image.
- Restoration techniques tend to be based on mathematical or
probabilistic models of image degradation.
Enhancement, on the other hand, is based on human
subjective preferences regarding what constitutes a
“good” enhancement result.
Step 4: Colour Image Processing
Use the colour of the image to extract features of interest
in an image.
Colour modeling and processing in a digital domain etc.
Step 5: Wavelets
Wavelets are the foundation for representing images at various
degrees of resolution. They are used for image data
compression, where images are subdivided into smaller
regions.
Step 6: Compression
Techniques for reducing the storage required to
save an image or the bandwidth required to
transmit it.
Tools for extracting image
components that are useful in
the representation and
description of shape.
In this step, there would be a
transition from processes that
output images, to processes that
output image attributes.
Step 7: Morphological Processing
Step 8: Image Segmentation
Segmentation procedures partition an image into its
constituent parts or objects.
Important Tip: The more accurate the segmentation,
the more likely recognition is to succeed.
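A minimal segmentation sketch (assuming a simple global threshold is adequate, which real applications rarely are) partitions an 8-bit image into object and background pixels:

```python
import numpy as np

def threshold_segment(img: np.ndarray, t: int = 128) -> np.ndarray:
    """Partition an 8-bit image into two classes: 1 = object (>= t), 0 = background."""
    return (img >= t).astype(np.uint8)
```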
Step 9: Representation and Description
- Representation: Decide whether the data should
be represented as a boundary or as a complete region. This
almost always follows the output of a segmentation stage.
- Boundary Representation: Focus on external shape
characteristics, such as corners and inflections.
- Region Representation: Focus on internal properties,
such as texture or skeleton shape.
Transforming raw data into a form suitable for subsequent
computer processing. Description deals with extracting
attributes that result in some quantitative information of
interest or are basic for differentiating one class of objects
from another.
Step 10: Object Recognition
Recognition is the process that assigns a label, such as
“vehicle”, to an object based on the information provided
by its descriptors.
Components of an Image Processing
System
A typical general-purpose DIP system consists of: image sensors,
specialized image processing hardware, a computer, image processing
software, mass storage, image displays, hardcopy devices and a
network, all applied to a problem domain.
Components of an Image Processing
System
1. Image Sensors
Two elements are required to acquire
digital images. The first is the physical
device that is sensitive to the energy
radiated by the object we wish to image
(Sensor). The second, called a digitizer,
is a device for converting the output of
the physical sensing device into digital
form.
Components of an Image Processing
System
2. Specialized Image Processing Hardware
Usually consists of the digitizer, mentioned before, plus
hardware that performs other primitive operations, such as an
arithmetic logic unit (ALU), which performs arithmetic and
logical operations in parallel on entire images.
This type of hardware sometimes is called a front-end
subsystem, and its most distinguishing characteristic is speed.
In other words, this unit performs functions that require fast
data throughputs that the typical main computer cannot handle.
Components of an Image Processing
System
4. Image Processing Software
Software for image processing consists of specialized
modules that perform specific tasks. A well-designed
package also includes the capability for the user to write
code that, as a minimum, utilizes the specialized modules.
Components of an Image Processing
System
5. Mass Storage Capability
Mass storage capability is a must in image processing
applications. An image of size 1024 × 1024 pixels, at 8 bits
per pixel, requires one megabyte of storage space if the image
is not compressed.
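The one-megabyte figure follows directly from the pixel count; a quick check, assuming 8 bits per pixel and no compression:

```python
width, height, bits_per_pixel = 1024, 1024, 8
bytes_needed = width * height * bits_per_pixel // 8
print(bytes_needed)   # 1048576 bytes = 1 MB (uncompressed)
```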
Digital storage for image processing applications falls
into three principal categories:
1. Short-term storage for use during processing
2. On-line storage for relatively fast recall
3. Archival storage, characterized by infrequent access
Components of an Image Processing
System
5. Mass Storage Capability
One method of providing short-term storage is computer memory.
Another is specialized boards, called frame buffers, that store
one or more images and can be accessed rapidly, allowing virtually
instantaneous image zoom, as well as scroll (vertical shifts) and
pan (horizontal shifts).
On-line storage generally takes the form of magnetic disks and
optical-media storage. The key factor characterizing on-line
storage is frequent access to the stored data.
Finally, archival storage is characterized by massive storage
requirements but infrequent need for access.
Components of an Image Processing
System
6. Image Displays
The displays in use today are mainly color (preferably
flat screen) TV monitors. Monitors are driven by the
outputs of the image and graphics display cards that are
an integral part of a computer system.
Components of an Image Processing
System
7. Hardcopy devices
Used for recording images; these include laser
printers, film cameras, heat-sensitive
devices, inkjet units and digital units,
such as optical and CD-ROM disks.
Components of an Image Processing
System
8. Networking
Networking is almost a default function in any computer
system in use today. Because of the large
amount of data inherent in image processing
applications, the key consideration in image
transmission is bandwidth.
In dedicated networks, this typically is not a
problem, but communications with remote sites
via the internet are not always as efficient.
Summary
We have looked at:
◦ What is a digital image?
◦ What is digital image processing?
◦ History of digital image processing
◦ State of the art examples of digital image
processing
◦ Key stages in digital image processing
Next time we will start to see how it all
works…
Digital image = a multidimensional
array of numbers (such as intensity image)
or vectors (such as color image)
Each component in the image, called a pixel, is
associated with a pixel value (a single number in
the case of intensity images, or a vector in the
case of color images).
For example, three 4×4 arrays of pixel values:

  39  87  15  32      39  65  65  54      99  87  65  32
  22  13  25  15      42  47  54  21      92  43  85  85
  37  26   6   9      67  96  54  32      67  96  90  60
  28  16  10  10      43  56  70  65      78  56  70  99
Visual Perception: Human Eye
(Picture from Microsoft Encarta 2000)
Cross Section of the Human Eye
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
1. The lens contains 60-70% water and about 6% fat.
2. The iris diaphragm controls the amount of light that enters the eye.
3. Light receptors in the retina:
- About 6-7 million cones for bright-light (photopic) vision.
- The density of cones is about 150,000 elements/mm2.
- Cones are involved in color vision.
- Cones are concentrated in the fovea, an area of about 1.5 x 1.5 mm2.
- About 75-150 million rods for dim-light (scotopic) vision.
- Rods are sensitive to low levels of light and are not involved in
color vision.
4. The blind spot is the region where the optic nerve emerges from the eye.
Visual Perception: Human Eye (cont.)
Blind-Spot Experiment
Draw an image similar to that below on a
piece of paper (the dot and cross are about 6
inches apart)
Close your right eye and focus on the cross
with your left eye
Hold the image about 20 inches away from
your face and move it slowly towards you
The dot should disappear!
Image Formation In The Eye
Muscles within the eye can be used to
change the shape of the lens, allowing us
to focus on objects that are near or far away.
An image is focused onto the retina, causing
rods and cones to become excited, which
ultimately send signals to the brain.
Brightness Adaptation of Human Eye : Mach Band Effect
Mach Band Effect
The intensities of surrounding points affect the
perceived brightness at each point.
In this image, edges between bars
appear brighter on the right side and
darker on the left side.
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
In area A the perceived brightness is darker, while in area B it is
brighter. This phenomenon is called the Mach Band Effect.
(Plot: intensity versus position, with regions A and B marked.)
Mach Band Effect (Cont)
Brightness Adaptation of Human Eye : Simultaneous Contrast
Simultaneous contrast: all the small squares have exactly the same intensity,
but they appear progressively darker as the background becomes lighter.
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Optical illusion
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Visible Spectrum
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Image Sensors
Single sensor
Line sensor
Array sensor
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Image Sensors : Single Sensor
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Image Sensors : Line Sensor
Fingerprint sweep sensor
Computerized Axial Tomography
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Image “After snow storm”
Fundamentals of Digital Images
f(x,y)
x
y
◦ An image: a multidimensional function of spatial coordinates.
◦ Spatial coordinate: (x,y) for the 2D case such as a photograph,
(x,y,z) for the 3D case such as CT scan images,
(x,y,t) for movies.
◦ The function f may represent intensity (for monochrome images),
color (for color images) or other associated values.
Origin
Conventional Coordinate for Image Representation
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Digital Image Types : Intensity Image
Intensity image or monochrome image:
each pixel corresponds to a light intensity,
normally represented as a gray level.
Gray scale values (example 4×4 array):

  39  87  15  32
  22  13  25  15
  37  26   6   9
  28  16  10  10
Digital Image Types : RGB Image
Color image or RGB image:
each pixel contains a vector
representing the red, green and
blue components.
RGB components (example 4×4 arrays, one per channel):

  39  87  15  32      39  65  65  54      99  87  65  32
  22  13  25  15      42  47  54  21      92  43  85  85
  37  26   6   9      67  96  54  32      67  96  90  60
  28  16  10  10      43  56  70  65      78  56  70  99
Image Types : Binary Image
Binary image or black and white image:
each pixel contains one bit,
1 represents white,
0 represents black.
Binary data (example 4×4 array):

  1 1 1 1
  1 1 1 1
  0 0 0 0
  0 0 0 0
Image Types : Index Image
Index image:
each pixel contains an index number
pointing to a colour in a colour table.
Index values (example 3×3 array):

  2 5 6
  7 4 6
  9 4 1

Color Table:

  Index No. | Red component | Green component | Blue component
  1         | 0.1           | 0.5             | 0.3
  2         | 1.0           | 0.0             | 0.0
  3         | 0.0           | 1.0             | 0.0
  4         | 0.5           | 0.5             | 0.5
  5         | 0.2           | 0.8             | 0.9
  …         | …             | …               | …
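A minimal sketch of the lookup an index image implies (assuming NumPy; the table rows and the index values are illustrative, following the example above):

```python
import numpy as np

# Colour table: row i gives the (R, G, B) colour for index i (index 0 unused here).
color_table = np.array([[0.0, 0.0, 0.0],   # 0 (placeholder)
                        [0.1, 0.5, 0.3],   # 1
                        [1.0, 0.0, 0.0],   # 2
                        [0.0, 1.0, 0.0],   # 3
                        [0.5, 0.5, 0.5],   # 4
                        [0.2, 0.8, 0.9]])  # 5

index_img = np.array([[2, 5, 4],
                      [5, 4, 1]])          # each pixel stores an index, not a colour

rgb_img = color_table[index_img]           # fancy indexing: shape (2, 3, 3)
print(rgb_img[0, 0])                       # [1. 0. 0.] -> the colour for index 2
```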
Digital Image Acquisition Process
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Generating a Digital Image
Basic Relationship of Pixels
Conventional indexing method: the x and y axes start at the origin (0,0).
A pixel (x,y) and its surrounding neighbours:

  (x-1,y-1)  (x,y-1)  (x+1,y-1)
  (x-1,y)    (x,y)    (x+1,y)
  (x-1,y+1)  (x,y+1)  (x+1,y+1)
Neighbors of a Pixel
4-neighbors of p at (x,y):
N4(p) = { (x-1,y), (x+1,y), (x,y-1), (x,y+1) }
The neighborhood relation is used to tell which pixels are adjacent to one
another. It is useful for analyzing regions.
Note: q ∈ N4(p) implies p ∈ N4(q).
The 4-neighborhood relation considers only vertical and horizontal neighbors.
Neighbors of a Pixel (cont.)
8-neighbors of p at (x,y):
N8(p) = { (x-1,y-1), (x,y-1), (x+1,y-1), (x-1,y), (x+1,y),
          (x-1,y+1), (x,y+1), (x+1,y+1) }
The 8-neighborhood relation considers all neighbor pixels.
Diagonal neighbors of p:
ND(p) = { (x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1) }
Neighbors of a Pixel (cont.)
The diagonal-neighborhood relation considers only the diagonal
neighbor pixels.
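These neighbourhoods can be written down directly (a small Python sketch; pixels on the image border, where some neighbours fall outside the image, are ignored here):

```python
def n4(x, y):
    """4-neighbours of pixel p = (x, y): horizontal and vertical only."""
    return {(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)}

def nd(x, y):
    """Diagonal neighbours of p."""
    return {(x - 1, y - 1), (x + 1, y - 1), (x - 1, y + 1), (x + 1, y + 1)}

def n8(x, y):
    """8-neighbours of p: union of the 4-neighbours and the diagonal neighbours."""
    return n4(x, y) | nd(x, y)
```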
Connectivity
Connectivity is adapted from the neighborhood relation.
Two pixels are connected if they are in the same class (i.e. the same
color or the same range of intensity) and they are neighbors of one
another.
For p and q from the same class:
◦ 4-connectivity: p and q are 4-connected if q ∈ N4(p)
◦ 8-connectivity: p and q are 8-connected if q ∈ N8(p)
◦ mixed-connectivity (m-connectivity):
p and q are m-connected if q ∈ N4(p), or
q ∈ ND(p) and N4(p) ∩ N4(q) = ∅
Adjacency
A pixel p is adjacent to a pixel q if they are connected.
Two image subsets S1 and S2 are adjacent if some pixel in S1 is adjacent
to some pixel in S2.
We can define the type of adjacency (4-adjacency, 8-adjacency or
m-adjacency) depending on the type of connectivity.
Types of Adjacency
1. 4-adjacency: Two pixels p and q with
values from V are 4-adjacent if q is in the
set N4(p).
2. 8-adjacency: Two pixels p and q with
values from V are 8-adjacent if q is in the
set N8(p).
3. m-adjacency (mixed)
Types of Adjacency
 m-adjacency:
Two pixels p and q with values from V
are m-adjacent if :
 q is in N4(p) or
 q is in ND(p) and the set N4(p) ∩ N4(q) has no
pixel whose values are from V (no intersection)
 Important Note: the type of adjacency
used must be specified
Types of Adjacency
 Mixed adjacency is a modification of 8-
adjacency. It is introduced to eliminate the
ambiguities that often arise when 8-adjacency
is used.
 For example:
Types of Adjacency
 In this example, note that when connecting two pixels
(finding a path between them):
◦ with 8-adjacency, you can find multiple paths between
the two pixels,
◦ while with m-adjacency, you can find only one path
between them.
 So m-adjacency eliminates the multiple-path ambiguity
that arises with 8-adjacency.
 Two subsets S1 and S2 are adjacent, if some pixel in S1
is adjacent to some pixel in S2. Adjacent means, either
4-, 8- or m-adjacency.
Path
A path from pixel p at (x,y) to pixel q at (s,t) is a sequence of distinct
pixels:
(x0,y0), (x1,y1), (x2,y2),…, (xn,yn)
such that
(x0,y0) = (x,y) and (xn,yn) = (s,t)
and
(xi,yi) is adjacent to (xi-1,yi-1), i = 1,…,n
p
q
We can define type of path: 4-path, 8-path or m-path depending on type of
adjacency.
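As a sketch of how a path can be searched for in code (assuming 4-adjacency between pixels whose values lie in V; this is plain breadth-first search, not a method prescribed by the slides):

```python
from collections import deque

def has_4_path(img, p, q, V={1}):
    """Return True if a 4-path of pixels whose values are in V joins p and q."""
    rows, cols = len(img), len(img[0])
    if img[p[0]][p[1]] not in V or img[q[0]][q[1]] not in V:
        return False
    seen, todo = {p}, deque([p])
    while todo:
        x, y = todo.popleft()
        if (x, y) == q:
            return True
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):  # 4-neighbours
            if 0 <= nx < rows and 0 <= ny < cols and (nx, ny) not in seen \
                    and img[nx][ny] in V:
                seen.add((nx, ny))
                todo.append((nx, ny))
    return False

# Example: a 4-path of 1-valued pixels joins (0,0) and (2,2) in this image.
img = [[1, 0, 0],
       [1, 1, 0],
       [0, 1, 1]]
print(has_4_path(img, (0, 0), (2, 2)))   # True
```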
Path (cont.)
An 8-path from p to q results in some ambiguity;
an m-path from p to q resolves this ambiguity.
(Figure: the same pixel arrangement shown with an 8-path and with an m-path.)
Distance
 For pixels p, q and z, with coordinates (x,y), (s,t) and (v,w),
respectively, D is a distance function if:
(a) D (p,q) ≥ 0 (D (p,q) = 0 iff p = q),
(b) D (p,q) = D (q, p), and
(c) D (p,z) ≤ D (p,q) + D (q,z).
Distance (cont.)
The D4 distance (city-block distance) is defined as
D4(p,q) = |x - s| + |y - t|
Pixels with D4 = 1 from p are the 4-neighbors of p. The pixels with
D4 ≤ 2 form diamond-shaped contours of constant distance:

        2
      2 1 2
    2 1 0 1 2
      2 1 2
        2
Distance (cont.)
The D8 distance (chessboard distance) is defined as
D8(p,q) = max(|x - s|, |y - t|)
Pixels with D8 = 1 from p are the 8-neighbors of p. The pixels with
D8 ≤ 2 form square contours of constant distance:

  2 2 2 2 2
  2 1 1 1 2
  2 1 0 1 2
  2 1 1 1 2
  2 2 2 2 2
Distance Measures
 The Euclidean Distance between p and q
is defined as:
De(p,q) = [(x - s)^2 + (y - t)^2]^(1/2)
Pixels having a distance less than or equal
to some value r from (x,y) are the points
contained in a disk of
radius r centered at (x,y)
p (x,y)
q (s,t)
Distance Measures
 The D4 distance (also called city-block distance)
between p and q is defined as:
D4 (p,q) = | x – s | + | y – t |
Pixels having a D4 distance from
(x,y), less than or equal to some
value r form a Diamond
centered at (x,y)
p (x,y)
q (s,t)
D4
Distance Measures
Example:
The pixels with distance D4 ≤ 2 from (x,y) form the
following contours of constant distance.
The pixels with D4 = 1 are
the 4-neighbors of (x,y)
Distance Measures
 The D8 distance (also called chessboard distance)
between p and q is defined as:
D8 (p,q) = max(| x – s |,| y – t |)
Pixels having a D8 distance from
(x,y), less than or equal to some
value r form a square
Centered at (x,y)
p (x,y)
q (s,t)
D8(b)
D8(a)
D8 = max(D8(a) , D8(b))
Distance Measures
Example:
D8 distance ≤ 2 from (x,y) form the
following contours of constant distance.
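The three metrics can be compared side by side (a small sketch; p and q are (x, y) coordinate pairs):

```python
def d_e(p, q):
    """Euclidean distance."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def d_4(p, q):
    """City-block distance."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d_8(p, q):
    """Chessboard distance."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

print(d_e((0, 0), (3, 4)), d_4((0, 0), (3, 4)), d_8((0, 0), (3, 4)))   # 5.0 7 4
```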
Distance Measures
 Dm distance:
is defined as the shortest m-path between the
points.
In this case, the distance between two pixels
will depend on the values of the pixels along
the path, as well as the values of their
neighbors.
Distance Measures
 Example:
Consider the following arrangement of pixels
and assume that p, p2, and p4 have value 1 and
that p1 and p3 can have a value of 0 or 1.
Suppose that we consider the adjacency of
pixels with value 1 (i.e. V = {1})
Distance Measures
 Cont. Example:
Now, to compute the Dm between points p
and p4
Here we have 4 cases:
Case1: If p1 =0 and p3 = 0
The length of the shortest m-path
(the Dm distance) is 2 (p, p2, p4)
Distance Measures
 Cont. Example:
Case 2: If p1 = 1 and p3 = 0,
then p and p2 are no longer m-adjacent
(see the m-adjacency definition),
so the length of the shortest
m-path becomes 3 (p, p1, p2, p4)
Distance Measures
 Cont. Example:
Case 3: If p1 = 0 and p3 = 1
The same reasoning applies here, and the shortest
m-path will be 3 (p, p2, p3, p4)
Distance Measures
 Cont. Example:
Case4: If p1 =1 and p3 = 1
The length of the shortest m-path will be
4 (p, p1 , p2, p3, p4)
Image Sampling and Quantization
Image sampling: discretize an image in the spatial domain
Spatial resolution / image resolution: pixel size or number of pixels
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
How to choose the spatial resolution
= Sampling locations
Original
image
Sampled
image
Under sampling, we lost some image details!
Spatial resolution
How to choose the spatial resolution : Nyquist Rate
Original
image
= Sampling locations
Minimum
Period
Spatial resolution
(sampling rate)
Sampled image
No detail is lost!
Nyquist Rate:
The sampling interval (spatial resolution) must be less
than or equal to half of the minimum period in the
image; equivalently, the sampling frequency must be
greater than or equal to twice the maximum frequency.
2mm
1mm
Aliased Frequency
x1(t) = sin(2πt), f = 1 Hz
x2(t) = sin(12πt), f = 6 Hz
Sampling rate: 5 samples/sec
Two different frequencies, but the same sampled results!
(Plots: both signals over 0 to 2 seconds, amplitude -1 to 1, sampled at the
same 5 samples/sec locations.)
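The two signals in this example really do produce identical samples at 5 samples/sec; a quick numerical check (assuming NumPy):

```python
import numpy as np

t = np.arange(0, 2, 1 / 5)        # sampling at 5 samples/sec for 2 seconds
x1 = np.sin(2 * np.pi * 1 * t)    # 1 Hz signal
x2 = np.sin(2 * np.pi * 6 * t)    # 6 Hz signal
print(np.allclose(x1, x2))        # True: the 6 Hz signal aliases onto 1 Hz
```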
Effect of Spatial Resolution
256x256 pixels
64x64 pixels
128x128 pixels
32x32 pixels
Effect of Spatial Resolution
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Moire Pattern Effect : Special Case of Sampling
Moire patterns occur when frequencies of two superimposed
periodic patterns are close to each other.
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Effect of Spatial Resolution
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Can we increase spatial resolution by interpolation ?
Down sampling is an irreversible process.
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
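A small sketch of why down-sampling is irreversible (assuming NumPy; the "zoom back up" here is simple pixel replication): once samples are discarded, interpolation can only guess at the missing values:

```python
import numpy as np

img = np.arange(64, dtype=np.uint8).reshape(8, 8)          # a toy 8x8 image
small = img[::2, ::2]                                       # down-sample: keep every 2nd pixel
back = np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)    # zoom back by replication
print(np.array_equal(img, back))                            # False: the discarded detail is gone
```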
Image Quantization
Image quantization:
discretize continuous pixel values into discrete numbers
Color resolution / color depth / levels:
- No. of colors or gray levels, or
- No. of bits representing each pixel value.
The number of colors or gray levels Nc is given by
Nc = 2^b
where b = no. of bits.
Quantization function: light intensity, from darkest to brightest, is
mapped onto the quantization levels 0, 1, 2, …, Nc-2, Nc-1.
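A sketch of uniform quantization to b bits (assuming an 8-bit input image and NumPy):

```python
import numpy as np

def quantize(img: np.ndarray, b: int) -> np.ndarray:
    """Reduce an 8-bit image to Nc = 2**b gray levels (uniform quantization)."""
    levels = 2 ** b                      # Nc = 2^b
    step = 256 // levels
    return (img // step) * step          # map each pixel to the bottom of its bin

img = np.array([[0, 50, 100], [150, 200, 255]], dtype=np.uint8)
print(quantize(img, 2))                  # only 4 distinct gray levels remain
```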
Effect of Quantization Levels
256 levels 128 levels
32 levels
64 levels
Effect of Quantization Levels (cont.)
16 levels 8 levels
2 levels
4 levels
In this image, it is easy to see false contouring.
How to select the suitable size and pixel depth of images
Low detail image Medium detail image High detail image
Lena image Cameraman image
To satisfy the human viewer:
1. For images of the same size, a low-detail image may need more pixel depth.
2. As the image size increases, fewer gray levels may be needed.
The word “suitable” is subjective: it depends on the “subject”.
(Images from Rafael C. Gonzalez and Richard E.
Wood, Digital Image Processing, 2nd Edition.
Human vision: Spatial Frequency vs Contrast
Human vision: Distinguish ability for Difference in brightness
Regions with 5% brightness difference

More Related Content

Similar to DIPsadasdasfsdfsdfdfasdfsdfsdgsdgdsfgdfgfdg

EC4160-lect 1,2.ppt
EC4160-lect 1,2.pptEC4160-lect 1,2.ppt
EC4160-lect 1,2.pptssuser812128
 
application of digital image processing and methods
application of digital image processing and methodsapplication of digital image processing and methods
application of digital image processing and methodsSIRILsam
 
Dip unit-i-ppt academic year(2016-17)
Dip unit-i-ppt academic year(2016-17)Dip unit-i-ppt academic year(2016-17)
Dip unit-i-ppt academic year(2016-17)RagavanK6
 
Digital_image_processing_-Vijaya_Raghavan.pdf
Digital_image_processing_-Vijaya_Raghavan.pdfDigital_image_processing_-Vijaya_Raghavan.pdf
Digital_image_processing_-Vijaya_Raghavan.pdfVaideshSiva1
 
BEC007 -Digital image processing.pdf
BEC007  -Digital image processing.pdfBEC007  -Digital image processing.pdf
BEC007 -Digital image processing.pdfgopikahari7
 
Digital image processing using matlab
Digital image processing using matlab Digital image processing using matlab
Digital image processing using matlab Amr Rashed
 
introdaction.pptx
introdaction.pptxintrodaction.pptx
introdaction.pptxDekebatufa
 
Presentation on Digital Image Processing
Presentation on Digital Image ProcessingPresentation on Digital Image Processing
Presentation on Digital Image ProcessingSalim Hosen
 
Digital Image Processing presentation
Digital Image Processing presentationDigital Image Processing presentation
Digital Image Processing presentationOmkarDattatrayKanase
 
ARKA RAJ SAHA-27332020003..pptx
ARKA RAJ SAHA-27332020003..pptxARKA RAJ SAHA-27332020003..pptx
ARKA RAJ SAHA-27332020003..pptxAdharchandsaha
 
Digital image processing
Digital image processingDigital image processing
Digital image processingDeevena Dayaal
 
Optical Watermarking Literature survey....
Optical Watermarking Literature survey....Optical Watermarking Literature survey....
Optical Watermarking Literature survey....Arif Ahmed
 

Similar to DIPsadasdasfsdfsdfdfasdfsdfsdgsdgdsfgdfgfdg (20)

1_unit-1.1_Introduction to DIP.pptx
1_unit-1.1_Introduction to DIP.pptx1_unit-1.1_Introduction to DIP.pptx
1_unit-1.1_Introduction to DIP.pptx
 
1st section
1st section1st section
1st section
 
EC4160-lect 1,2.ppt
EC4160-lect 1,2.pptEC4160-lect 1,2.ppt
EC4160-lect 1,2.ppt
 
application of digital image processing and methods
application of digital image processing and methodsapplication of digital image processing and methods
application of digital image processing and methods
 
Dip unit-i-ppt academic year(2016-17)
Dip unit-i-ppt academic year(2016-17)Dip unit-i-ppt academic year(2016-17)
Dip unit-i-ppt academic year(2016-17)
 
Digital_image_processing_-Vijaya_Raghavan.pdf
Digital_image_processing_-Vijaya_Raghavan.pdfDigital_image_processing_-Vijaya_Raghavan.pdf
Digital_image_processing_-Vijaya_Raghavan.pdf
 
BEC007 -Digital image processing.pdf
BEC007  -Digital image processing.pdfBEC007  -Digital image processing.pdf
BEC007 -Digital image processing.pdf
 
Dip review
Dip reviewDip review
Dip review
 
Digital image processing using matlab
Digital image processing using matlab Digital image processing using matlab
Digital image processing using matlab
 
introdaction.pptx
introdaction.pptxintrodaction.pptx
introdaction.pptx
 
Presentation on Digital Image Processing
Presentation on Digital Image ProcessingPresentation on Digital Image Processing
Presentation on Digital Image Processing
 
Digital Image Processing presentation
Digital Image Processing presentationDigital Image Processing presentation
Digital Image Processing presentation
 
ARKA RAJ SAHA-27332020003..pptx
ARKA RAJ SAHA-27332020003..pptxARKA RAJ SAHA-27332020003..pptx
ARKA RAJ SAHA-27332020003..pptx
 
Image processing
Image processingImage processing
Image processing
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Dip sdit 7
Dip sdit 7Dip sdit 7
Dip sdit 7
 
Jc3416551658
Jc3416551658Jc3416551658
Jc3416551658
 
Application of image processing
Application of image processingApplication of image processing
Application of image processing
 
Ch1.pptx
Ch1.pptxCh1.pptx
Ch1.pptx
 
Optical Watermarking Literature survey....
Optical Watermarking Literature survey....Optical Watermarking Literature survey....
Optical Watermarking Literature survey....
 

More from MrVMNair

Chapter_01_Introduction Two differen.ppt
Chapter_01_Introduction Two differen.pptChapter_01_Introduction Two differen.ppt
Chapter_01_Introduction Two differen.pptMrVMNair
 
webpack introductionNotice Demystifyingf
webpack introductionNotice Demystifyingfwebpack introductionNotice Demystifyingf
webpack introductionNotice DemystifyingfMrVMNair
 
Lecture05.pptx
Lecture05.pptxLecture05.pptx
Lecture05.pptxMrVMNair
 
Event Loop Node js.pptx
Event Loop Node js.pptxEvent Loop Node js.pptx
Event Loop Node js.pptxMrVMNair
 
EN Childhood Anxiety Disorder by Slidesgo.pptx
EN Childhood Anxiety Disorder by Slidesgo.pptxEN Childhood Anxiety Disorder by Slidesgo.pptx
EN Childhood Anxiety Disorder by Slidesgo.pptxMrVMNair
 
15998595.ppt
15998595.ppt15998595.ppt
15998595.pptMrVMNair
 
6.3 c-Functions.ppt
6.3 c-Functions.ppt6.3 c-Functions.ppt
6.3 c-Functions.pptMrVMNair
 
Chapter 1_NG_2020.ppt
Chapter 1_NG_2020.pptChapter 1_NG_2020.ppt
Chapter 1_NG_2020.pptMrVMNair
 

More from MrVMNair (8)

Chapter_01_Introduction Two differen.ppt
Chapter_01_Introduction Two differen.pptChapter_01_Introduction Two differen.ppt
Chapter_01_Introduction Two differen.ppt
 
webpack introductionNotice Demystifyingf
webpack introductionNotice Demystifyingfwebpack introductionNotice Demystifyingf
webpack introductionNotice Demystifyingf
 
Lecture05.pptx
Lecture05.pptxLecture05.pptx
Lecture05.pptx
 
Event Loop Node js.pptx
Event Loop Node js.pptxEvent Loop Node js.pptx
Event Loop Node js.pptx
 
EN Childhood Anxiety Disorder by Slidesgo.pptx
EN Childhood Anxiety Disorder by Slidesgo.pptxEN Childhood Anxiety Disorder by Slidesgo.pptx
EN Childhood Anxiety Disorder by Slidesgo.pptx
 
15998595.ppt
15998595.ppt15998595.ppt
15998595.ppt
 
6.3 c-Functions.ppt
6.3 c-Functions.ppt6.3 c-Functions.ppt
6.3 c-Functions.ppt
 
Chapter 1_NG_2020.ppt
Chapter 1_NG_2020.pptChapter 1_NG_2020.ppt
Chapter 1_NG_2020.ppt
 

Recently uploaded

Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...jaredbarbolino94
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatYousafMalik24
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxUnboundStockton
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxDr.Ibrahim Hassaan
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
MICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptxMICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptxabhijeetpadhi001
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 

Recently uploaded (20)

Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docx
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptx
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
MICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptxMICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptx
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 

DIPsadasdasfsdfsdfdfasdfsdfsdgsdgdsfgdfgfdg

  • 2. Introduction “One picture is worth more than ten thousand words” Anonymous
  • 3. Contents This lecture will cover: ◦ What is a digital image? ◦ What is digital image processing? ◦ History of digital image processing ◦ State of the art examples of digital image processing ◦ Key stages in digital image processing
  • 4. What is a Digital Image? A digital image is a representation of a two- dimensional image as a finite set of digital values, called picture elements or pixels
  • 5. Pixel values typically represent gray levels, colours, heights, opacities etc Remember digitization implies that a digital image is an approximation of a real scene 1 pixel
  • 6. Common image formats include: ◦ 1 sample per point (B&W or Grayscale) ◦ 3 samples per point (Red, Green, and Blue) ◦ 4 samples per point (Red, Green, Blue, and “Alpha”, a.k.a. Opacity)
  • 7. 128 230 123 232 123 321 123 77 89 80 255 255 The figure is an example of digital image that you are now viewing on your computer screen. But actually , this image is nothing but a two dimensional array of numbers ranging between 0 and 255. Each number represents the value of the function f(x,y) at any point. In this case the value 128 , 230 ,123 each represents an individual pixel value. The dimensions of the picture is actually the dimensions of this two dimensional array.
  • 8. What is Digital Image Processing? Digital image processing focuses on two major tasks ◦ Improvement of pictorial information for human interpretation ◦ Processing of image data for storage, transmission and representation for autonomous machine perception.
  • 9. How it works In the above figure , an image has been captured by a camera and has been sent to a digital system to remove all the other details , and just focus on the water drop by zooming it in such a way that the quality of the image remains the same.
  • 10. Introduction  Signal processing is a discipline in electrical engineering and in mathematics that deals with analysis and processing of analog and digital signals , and deals with storing , filtering , and other operations on signals.  These signals include transmission signals , sound or voice signals , image signals , and other signals e.t.c.  Out of all these signals , the field that deals with the type of signals for which the input is an image and the output is also an image is done in image processing. As it name suggests, it deals with the processing on images.  It can be further divided into analog image processing and digital image processing.
  • 11.  Analog image processing Analog image processing is done on analog signals. It includes processing on two dimensional analog signals. In this type of processing, the images are manipulated by electrical means by varying the electrical signal. The common example include is the television image.  Digital image processing The digital image processing deals with developing a digital system that performs operations on an digital image. Digital image processing has dominated over analog image processing with the passage of time due its wider range of applications.
  • 12. The continuum from image processing to computer vision can be broken up into low-, mid- and high-level processes Low Level Process Input: Image Output: Image Examples: Noise removal, image sharpening Mid Level Process Input: Image Output: Attributes Examples: Object recognition, segmentation High Level Process Input: Attributes Output: Understanding Examples: Scene understanding, autonomous navigation In this course we will stop here
  • 13. History of Digital Image Processing Early 1920s: One of the first applications of digital imaging was in the news- paper industry ◦ The Bartlane cable picture transmission service ◦ Images were transferred by submarine cable between London and New York ◦ Pictures were coded for cable transfer and reconstructed at the receiving end on a telegraph printer Early digital image
  • 14. History of DIP (cont…) Mid to late 1920s: Improvements to the Bartlane system resulted in higher quality images ◦ New reproduction processes based on photographic techniques ◦ Increased number of tones in reproduced images Improved digital image Early 15 tone digital image
  • 15. History of DIP (cont…) 1960s: Improvements in computing technology and the onset of the space race led to a surge of work in digital image processing ◦ 1964: Computers used to improve the quality of images of the moon taken by the Ranger 7 probe ◦ Such techniques were used in other space missions including the Apollo landings A picture of the moon taken by the Ranger 7 probe minutes before landing
  • 16. History of DIP (cont…) 1970s: Digital image processing begins to be used in medical applications ◦ 1979: Sir Godfrey N. Hounsfield & Prof. Allan M. Cormack share the Nobel Prize in medicine for the invention of tomography, the technology behind Computerised Axial Tomography (CAT) scans Typical head slice CAT image
  • 17. History of DIP (cont…) 1980s - Today: The use of digital image processing techniques has exploded and they are now used for all kinds of tasks in all kinds of areas ◦ Image enhancement/restoration ◦ Artistic effects ◦ Medical visualisation ◦ Industrial inspection ◦ Law enforcement ◦ Human computer interfaces
  • 18. Examples: Image Enhancement One of the most common uses of DIP techniques: improve quality, remove noise etc
  • 19. Examples: The Hubble Telescope Launched in 1990 the Hubble telescope can take images of very distant objects However, an incorrect mirror made many of Hubble’s images useless Image processing techniques were used to fix this
  • 20. Examples: Artistic Effects Artistic effects are used to make images more visually appealing, to add special effects and to make composite images
  • 21. Examples: Medicine Take slice from MRI scan of canine heart, and find boundaries between types of tissue ◦ Image with gray levels representing tissue density ◦ Use a suitable filter to highlight edges Original MRI Image of a Dog Heart Edge Detection Image
  • 22. Examples: GIS Geographic Information Systems ◦ Digital image processing techniques are used extensively to manipulate satellite imagery ◦ Terrain classification ◦ Meteorology
  • 23. Examples: GIS (cont…) Night-Time Lights of the World data set ◦ Global inventory of human settlement ◦ Not hard to imagine the kind of analysis that might be done using this data
  • 24. Examples: Industrial Inspection Human operators are expensive, slow and unreliable Make machines do the job instead Industrial vision systems are used in all kinds of industries Can we trust them?
  • 25. Examples: PCB Inspection Printed Circuit Board (PCB) inspection ◦ Machine inspection is used to determine that all components are present and that all solder joints are acceptable ◦ Both conventional imaging and x-ray imaging are used
  • 26. Examples: Law Enforcement Image processing techniques are used extensively by law enforcers ◦ Number plate recognition for speed cameras/automated toll systems ◦ Fingerprint recognition ◦ Enhancement of CCTV images
  • 27. Examples: HCI Try to make human computer interfaces more natural ◦ Face recognition ◦ Gesture recognition
  • 28. Applications of Digital Image Processing  Image sharpening and restoration  Medical field  Remote sensing  Transmission and encoding  Machine/Robot vision  Color processing  Pattern recognition  Video processing  Microscopic Imaging  Others
  • 29. Image sharpening and restoration  Image sharpening and restoration refers here to process images that have been captured from the modern camera to make them a better image or to manipulate those images in way to achieve desired result. It refers to do what Photoshop usually does.  This includes Zooming, blurring , sharpening , gray scale to color conversion, detecting edges and vice versa , Image retrieval and Image recognition. The common examples are: Original Zoomed Blurr
  • 31. UV imaging  In the field of remote sensing , the area of the earth is scanned by a satellite or from a very high ground and then it is analyzed to obtain information about it. One particular application of digital image processing in the field of remote sensing is to detect infrastructure damages caused by an earthquake.
  • 32. Hurdle detection  Hurdle detection is one of the common task that has been done through image processing, by identifying different type of objects in the image and then calculating the distance between robot and hurdles.
  • 33. Line follower robot  Most of the robots today work by following the line and thus are called line follower robots. This help a robot to move on its path and perform some tasks. This has also been achieved through image processing.
  • 34. Fundamental Steps in Digital Image Processing: Image Acquisition Image Restoration Morphological Processing Segmentation Object Recognition Image Enhancement Representation & Description Problem Domain Colour Image Processing Image Compression Wavelets & Multiresolution processing Outputs of these processes generally are images Knowledge Base
  • 35. Step 1: Image Acquisition The image is captured by a sensor (eg. Camera), and digitized if the output of the camera or sensor is not already in digital form, using analogue-to-digital convertor
  • 36. Step 2: Image Enhancement The process of manipulating an image so that the result is more suitable than the original for specific applications. The idea behind enhancement techniques is to bring out details that are hidden, or simple to highlight certain features of interest in an image.
  • 37. Step 3: Image Restoration - Improving the appearance of an image - Tend to be mathematical or probabilistic models. Enhancement, on the other hand, is based on human subjective preferences regarding what constitutes a “good” enhancement result.
  • 38. Step 4: Colour Image Processing Use the colour of the image to extract features of interest in an image. Colour modeling and processing in a digital domain etc.
  • 39. Step 5: Wavelets Are the foundation of representing images in various degrees of resolution. It is used for image data compression where images are subdivided into smaller regions.
  • 40. Step 6: Compression Techniques for reducing the storage required to save an image or the bandwidth required to transmit it.
  • 41. Tools for extracting image components that are useful in the representation and description of shape. In this step, there would be a transition from processes that output images, to processes that output image attributes. Step 7: Morphological Processing
  • 42. Step 8: Image Segmentation Segmentation procedures partition an image into its constituent parts or objects. Important Tip: The more accurate the segmentation, the more likely recognition is to succeed.
  • 43. Step 9: Representation and Description - Representation: Make a decision whether the data should be represented as a boundary or as a complete region. It is almost always follows the output of a segmentation stage. - Boundary Representation: Focus on external shape characteristics, such as corners and inflections. - Region Representation: Focus on internal properties, such as texture or skeleton shape. Transforming raw data into a form suitable for subsequent computer processing. Description deals with extracting attributes that result in some quantitative information of interest or are basic for differentiating one class of objects from another.
  • 44. Step 10: Object Recognition Recognition: the process that assigns label to an object based on the information provided by its description. Recognition is the process that assigns a label, such as, “vehicle” to an object based on its descriptors.
  • 45. Components of an Image Processing System Network Image displays Computer Mass storage Hardcopy Specialized image processing hardware Image processing software Image sensors Problem Domain Typical general- purpose DIP system
  • 46. Components of an Image Processing System 1. Image Sensors Two elements are required to acquire digital images. The first is the physical device that is sensitive to the energy radiated by the object we wish to image (Sensor). The second, called a digitizer, is a device for converting the output of the physical sensing device into digital form.
  • 47. Components of an Image Processing System 2. Specialized Image Processing Hardware Usually consists of the digitizer, mentioned before, plus hardware that performs other primitive operations, such as an arithmetic logic unit (ALU), which performs arithmetic and logical operations in parallel on entire images. This type of hardware sometimes is called a front-end subsystem, and its most distinguishing characteristic is speed. In other words, this unit performs functions that require fast data throughputs that the typical main computer cannot handle.
  • 48. Components of an Image Processing System 4. Image Processing Software Software for image processing consists of specialized modules that perform specific tasks. A well-designed package also includes the capability for the user to write code that, as a minimum, utilizes the specialized modules.
  • 49. Components of an Image Processing System 5. Mass Storage Capability - Mass storage capability is a must in image processing applications. An image of size 1024 x 1024 pixels, in which each pixel is stored as 8 bits, requires one megabyte of storage space if the image is not compressed. Digital storage for image processing applications falls into three principal categories: 1. Short-term storage for use during processing. 2. On-line storage for relatively fast recall. 3. Archival storage, characterized by infrequent access.
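  As a quick sanity check of that one-megabyte figure (taking the usual assumption of 8 bits = 1 byte per pixel): 1024 x 1024 pixels x 1 byte/pixel = 1,048,576 bytes = 2^20 bytes = 1 MB.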
  • 50. Components of an Image Processing System 5. Mass Storage Capability - One method of providing short-term storage is computer memory. Another is specialized boards, called frame buffers, that store one or more images and can be accessed rapidly; they allow virtually instantaneous image zoom, as well as scroll (vertical shifts) and pan (horizontal shifts). On-line storage generally takes the form of magnetic disks and optical-media storage; the key factor characterizing on-line storage is frequent access to the stored data. Finally, archival storage is characterized by massive storage requirements but infrequent need for access.
  • 51. Components of an Image Processing System 6. Image Displays The displays in use today are mainly color (preferably flat screen) TV monitors. Monitors are driven by the outputs of the image and graphics display cards that are an integral part of a computer system.
  • 52. Components of an Image Processing System 7. Hardcopy Devices - Used for recording images; they include laser printers, film cameras, heat-sensitive devices, inkjet units and digital units such as optical and CD-ROM disks.
  • 53. Components of an Image Processing System 8. Networking - Networking is almost a default function in any computer system in use today. Because of the large amount of data inherent in image processing applications, the key consideration in image transmission is bandwidth. In dedicated networks this typically is not a problem, but communication with remote sites via the Internet is not always as efficient.
  • 54. Summary We have looked at: ◦ What is a digital image? ◦ What is digital image processing? ◦ History of digital image processing ◦ State of the art examples of digital image processing ◦ Key stages in digital image processing Next time we will start to see how it all works…
  • 55. Digital image = a multidimensional array of numbers (such as an intensity image) or vectors (such as a color image). Each component of the image, called a pixel, is associated with a pixel value (a single number in the case of intensity images, or a vector in the case of color images). Example 4x4 arrays of pixel values:
  39 87 15 32    39 65 65 54    99 87 65 32
  22 13 25 15    42 47 54 21    92 43 85 85
  37 26  6  9    67 96 54 32    67 96 90 60
  28 16 10 10    43 56 70 65    78 56 70 99
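  As a concrete sketch of this array view (NumPy is an assumed choice; the slides do not name any particular library), the following builds a small intensity image and a small color image:

```python
# A minimal sketch: an intensity image is a 2-D array of numbers,
# a color image a 3-D array whose last axis holds an (R, G, B) vector.
import numpy as np

gray = np.array([[39, 87, 15, 32],
                 [22, 13, 25, 15],
                 [37, 26,  6,  9],
                 [28, 16, 10, 10]], dtype=np.uint8)   # one number per pixel

color = np.zeros((4, 4, 3), dtype=np.uint8)           # one (R, G, B) vector per pixel
color[0, 0] = (255, 0, 0)                             # top-left pixel set to pure red

print(gray.shape, gray[1, 2])    # (4, 4) 25
print(color.shape, color[0, 0])  # (4, 4, 3) [255   0   0]
```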
  • 56. Visual Perception: Human Eye (Picture from Microsoft Encarta 2000)
  • 57. Cross Section of the Human Eye (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 58. 1. The lens contains 60-70% water and about 6% fat. 2. The iris diaphragm controls the amount of light that enters the eye. 3. Light receptors in the retina: - About 6-7 million cones for bright-light vision, called photopic vision. - The density of cones is about 150,000 elements/mm². - Cones are involved in color vision. - Cones are concentrated in the fovea, an area of about 1.5 x 1.5 mm². - About 75-150 million rods for dim-light vision, called scotopic vision. - Rods are sensitive to low levels of light and are not involved in color vision. 4. The blind spot is the region where the optic nerve emerges from the eye. Visual Perception: Human Eye (cont.)
  • 59. Blind-Spot Experiment Draw an image similar to that below on a piece of paper (the dot and cross are about 6 inches apart) Close your right eye and focus on the cross with your left eye Hold the image about 20 inches away from your face and move it slowly towards you The dot should disappear!
  • 61. Image Formation in the Eye - Muscles within the eye can change the shape of the lens, allowing us to focus on objects that are near or far away. An image is focused onto the retina, causing rods and cones to become excited; these ultimately send signals to the brain.
  • 62. Brightness Adaptation of the Human Eye: Mach Band Effect (plot of perceived intensity versus position)
  • 63. Mach Band Effect - Intensities of surrounding points affect the perceived brightness at each point. In this image, edges between bars appear brighter on the right side and darker on the left side. (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.)
  • 64. Mach Band Effect (cont.) - In area A the perceived brightness is darker, while in area B it is brighter. This phenomenon is called the Mach band effect. (Plot of intensity versus position, with areas A and B marked.)
  • 65. Brightness Adaptation of the Human Eye: Simultaneous Contrast - Simultaneous contrast: all the small squares have exactly the same intensity, but they appear progressively darker as the background becomes lighter.
  • 66. Simultaneous Contrast (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 67. Optical illusion (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 68. Visible Spectrum (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 69. Image Sensors Single sensor Line sensor Array sensor (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 70. Image Sensors : Single Sensor (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 71. Image Sensors : Line Sensor Fingerprint sweep sensor Computerized Axial Tomography (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 72. Fundamentals of Digital Images - An image is a multidimensional function f of spatial coordinates. - Spatial coordinate: (x,y) for the 2D case such as a photograph, (x,y,z) for the 3D case such as CT scan images, (x,y,t) for movies. - The function f may represent intensity (for monochrome images), color (for color images) or other associated values. (Example image “After snow storm”, shown with the origin and the x and y axes marked.)
  • 73. Conventional Coordinate for Image Representation (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 74. Digital Image Types: Intensity Image - Intensity image or monochrome image: each pixel corresponds to light intensity, normally represented in gray scale (gray levels). Example gray-scale values:
  39 87 15 32
  22 13 25 15
  37 26  6  9
  28 16 10 10
  • 76. Image Types: Binary Image - Binary image or black-and-white image: each pixel contains one bit, where 1 represents white and 0 represents black. Example binary data:
  1 1 1 1
  1 1 1 1
  0 0 0 0
  0 0 0 0
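  A small sketch (NumPy assumed, not part of the slides) of how a binary image can be produced from the 4x4 gray-scale array two slides earlier by thresholding:

```python
# Thresholding an intensity image to obtain a binary image (1 = white, 0 = black).
import numpy as np

gray = np.array([[39, 87, 15, 32],
                 [22, 13, 25, 15],
                 [37, 26,  6,  9],
                 [28, 16, 10, 10]], dtype=np.uint8)

binary = (gray > 20).astype(np.uint8)   # threshold value 20 chosen arbitrarily
print(binary)
# [[1 1 0 1]
#  [1 0 1 0]
#  [1 1 0 0]
#  [1 0 0 0]]
```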
  • 77. Image Types: Index Image - Index image: each pixel contains an index number pointing to a color in a color table. Example index values:
  2 5 6
  7 4 6
  9 4 1
  Color table:
  Index No. | Red component | Green component | Blue component
  1 | 0.1 | 0.5 | 0.3
  2 | 1.0 | 0.0 | 0.0
  3 | 0.0 | 1.0 | 0.0
  4 | 0.5 | 0.5 | 0.5
  5 | 0.2 | 0.8 | 0.9
  … | … | … | …
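  A minimal sketch of the index-image idea (NumPy assumed; the table entries for indices 0 and 6-9 are invented here purely for illustration):

```python
# Recovering an RGB image from an index image and a color table via table lookup.
import numpy as np

index_image = np.array([[2, 5, 6],
                        [7, 4, 6],
                        [9, 4, 1]])                    # index values from the slide

color_table = np.array([[0.0, 0.0, 0.0],               # index 0 (placeholder)
                        [0.1, 0.5, 0.3],               # index 1
                        [1.0, 0.0, 0.0],               # index 2
                        [0.0, 1.0, 0.0],               # index 3
                        [0.5, 0.5, 0.5],               # index 4
                        [0.2, 0.8, 0.9],               # index 5
                        [0.3, 0.3, 0.7],               # indices 6-9: assumed values
                        [0.9, 0.1, 0.4],
                        [0.6, 0.6, 0.1],
                        [0.0, 0.4, 0.8]])

rgb = color_table[index_image]   # NumPy fancy indexing performs the lookup
print(rgb.shape)                 # (3, 3, 3): one (R, G, B) triple per pixel
```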
  • 78. Digital Image Acquisition Process (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 80. Basic Relationship of Pixels - Conventional indexing method: the origin (0,0) is at the top-left corner, with x and y increasing along the image axes. A pixel at (x,y) is surrounded by its horizontal and vertical neighbors (x±1,y), (x,y±1) and its diagonal neighbors (x±1,y±1).
  • 81. Neighbors of a Pixel - The neighborhood relation is used to tell which pixels are adjacent to one another; it is useful for analyzing regions. 4-neighbors of p at (x,y): N4(p) = { (x-1,y), (x+1,y), (x,y-1), (x,y+1) }. The 4-neighborhood relation considers only the vertical and horizontal neighbors. Note: q ∈ N4(p) implies p ∈ N4(q).
  • 82. Neighbors of a Pixel (cont.) - 8-neighbors of p: N8(p) = { (x-1,y-1), (x,y-1), (x+1,y-1), (x-1,y), (x+1,y), (x-1,y+1), (x,y+1), (x+1,y+1) }. The 8-neighborhood relation considers all eight neighboring pixels.
  • 83. Neighbors of a Pixel (cont.) - Diagonal neighbors of p: ND(p) = { (x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1) }. The diagonal-neighborhood relation considers only the diagonal neighbors.
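  A small sketch of the three neighborhood sets just defined (the function names are assumptions, and image-boundary handling is ignored for brevity):

```python
# N4, ND and N8 of a pixel p = (x, y).
def n4(p):
    x, y = p
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def nd(p):
    x, y = p
    return [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]

def n8(p):
    return n4(p) + nd(p)   # the 8-neighborhood is the union of N4 and ND

print(n4((2, 3)))   # 4-neighbors of (2, 3)
print(n8((2, 3)))   # all 8 neighbors of (2, 3)
```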
  • 84. Connectivity - Connectivity is adapted from the neighborhood relation. Two pixels are connected if they are in the same class (i.e. the same color or the same range of intensity) and they are neighbors of one another. For p and q from the same class: - 4-connectivity: p and q are 4-connected if q ∈ N4(p). - 8-connectivity: p and q are 8-connected if q ∈ N8(p). - Mixed connectivity (m-connectivity): p and q are m-connected if q ∈ N4(p), or q ∈ ND(p) and N4(p) ∩ N4(q) = ∅.
  • 85. Adjacency - A pixel p is adjacent to a pixel q if they are connected. Two image subsets S1 and S2 are adjacent if some pixel in S1 is adjacent to some pixel in S2. We can define the type of adjacency (4-adjacency, 8-adjacency or m-adjacency) depending on the type of connectivity.
  • 86. Types of Adjacency 1. 4-adjacency: Two pixels p and q with values from V are 4-adjacent if q is in the set N4(p). 2. 8-adjacency: Two pixels p and q with values from V are 8-adjacent if q is in the set N8(p). 3. m-adjacency (mixed adjacency), defined on the next slide.
  • 87. Types of Adjacency  m-adjacency: Two pixels p and q with values from V are m-adjacent if :  q is in N4(p) or  q is in ND(p) and the set N4(p) ∩ N4(q) has no pixel whose values are from V (no intersection)  Important Note: the type of adjacency used must be specified
  • 88. Types of Adjacency  Mixed adjacency is a modification of 8- adjacency. It is introduced to eliminate the ambiguities that often arise when 8-adjacency is used.  For example:
  • 89. Types of Adjacency - In this example, note what happens when we try to connect two pixels (i.e. find a path between them): ◦ with 8-adjacency you can find multiple paths between the two pixels, ◦ while with m-adjacency you can find only one path. So m-adjacency eliminates the multiple-path ambiguity created by 8-adjacency. - Two subsets S1 and S2 are adjacent if some pixel in S1 is adjacent to some pixel in S2, where “adjacent” means 4-, 8- or m-adjacency.
  • 90. Path - A path from pixel p at (x,y) to pixel q at (s,t) is a sequence of distinct pixels (x0,y0), (x1,y1), (x2,y2), …, (xn,yn) such that (x0,y0) = (x,y), (xn,yn) = (s,t), and (xi,yi) is adjacent to (xi-1,yi-1) for i = 1, …, n. We can define the type of path (4-path, 8-path or m-path) depending on the type of adjacency.
  • 91. Path (cont.) - An 8-path from p to q can be ambiguous (several 8-paths may exist between the same two pixels); the m-path from p to q resolves this ambiguity.
  • 92. Distance  For pixels p, q and z, with coordinates (x,y), (s,t) and (v,w), respectively, D is a distance function if: (a) D (p,q) ≥ 0 (D (p,q) = 0 iff p = q), (b) D (p,q) = D (q, p), and (c) D (p,z) ≤ D (p,q) + D (q,z).
  • 93. Distance (cont.) - The D4 distance (city-block distance) is defined as D4(p,q) = |x − s| + |y − t|. Pixels with D4 = 1 are the 4-neighbors of p. The pixels with D4 ≤ 2 form a diamond of constant-distance contours around p:
      2
    2 1 2
  2 1 0 1 2
    2 1 2
      2
  • 94. Distance (cont.) - The D8 distance (chessboard distance) is defined as D8(p,q) = max(|x − s|, |y − t|). Pixels with D8 = 1 are the 8-neighbors of p. The pixels with D8 ≤ 2 form a square of constant-distance contours around p:
  2 2 2 2 2
  2 1 1 1 2
  2 1 0 1 2
  2 1 1 1 2
  2 2 2 2 2
  • 95. Distance Measures - The Euclidean distance between p and q is defined as: De(p,q) = [(x − s)² + (y − t)²]^(1/2). Pixels having a distance less than or equal to some value r from (x,y) are the points contained in a disk of radius r centered at (x,y).
  • 96. Distance Measures - The D4 distance (also called city-block distance) between p and q is defined as: D4(p,q) = |x − s| + |y − t|. Pixels having a D4 distance from (x,y) less than or equal to some value r form a diamond centered at (x,y).
  • 97. Distance Measures Example: The pixels with distance D4 ≤ 2 from (x,y) form the following contours of constant distance. The pixels with D4 = 1 are the 4-neighbors of (x,y)
  • 98. Distance Measures - The D8 distance (also called chessboard distance) between p and q is defined as: D8(p,q) = max(|x − s|, |y − t|). Pixels having a D8 distance from (x,y) less than or equal to some value r form a square centered at (x,y).
  • 99. Distance Measures Example: D8 distance ≤ 2 from (x,y) form the following contours of constant distance.
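  A minimal sketch of the three distance measures (the function names are assumptions, not from the slides):

```python
# Euclidean (De), city-block (D4) and chessboard (D8) distances between pixels.
import math

def d_euclidean(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])     # De: contours are disks

def d4(p, q):
    return abs(p[0] - q[0]) + abs(p[1] - q[1])      # D4: contours are diamonds

def d8(p, q):
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))  # D8: contours are squares

p, q = (0, 0), (2, 1)
print(d_euclidean(p, q), d4(p, q), d8(p, q))        # 2.236..., 3, 2
```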
  • 100. Distance Measures  Dm distance: is defined as the shortest m-path between the points. In this case, the distance between two pixels will depend on the values of the pixels along the path, as well as the values of their neighbors.
  • 101. Distance Measures - Example: Consider the following arrangement of pixels and assume that p, p2, and p4 have value 1 and that p1 and p3 can each have a value of 0 or 1. Suppose that we consider adjacency of pixels with value 1 (i.e. V = {1}).
  • 102. Distance Measures  Cont. Example: Now, to compute the Dm between points p and p4 Here we have 4 cases: Case1: If p1 =0 and p3 = 0 The length of the shortest m-path (the Dm distance) is 2 (p, p2, p4)
  • 103. Distance Measures - Cont. Example: Case 2: If p1 = 1 and p3 = 0, then p and p2 are no longer m-adjacent (see the m-adjacency definition), so the length of the shortest m-path becomes 3: (p, p1, p2, p4).
  • 104. Distance Measures - Cont. Example: Case 3: If p1 = 0 and p3 = 1, the same reasoning applies (p2 and p4 are no longer m-adjacent), and the shortest m-path will be 3: (p, p2, p3, p4).
  • 105. Distance Measures  Cont. Example: Case4: If p1 =1 and p3 = 1 The length of the shortest m-path will be 4 (p, p1 , p2, p3, p4)
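  The four cases above can be checked mechanically. The sketch below (the grid coordinates and function names are assumptions made for illustration) runs a breadth-first search over m-adjacency for each combination of p1 and p3 and reports the Dm distance from p to p4:

```python
# Dm distance = length of the shortest m-path, found by BFS over m-adjacency.
# Assumed grid layout (x = column, y = row):
#     .  p3 p4
#     p1 p2 .
#     p  .  .
from collections import deque
from itertools import product

def n4(p):
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def m_adjacent(p, q, img, V):
    """p, q are m-adjacent if both values are in V and either q is a 4-neighbor
    of p, or q is a diagonal neighbor of p and N4(p) ∩ N4(q) contains no pixel
    whose value is in V."""
    if img.get(p) not in V or img.get(q) not in V:
        return False
    if q in n4(p):
        return True
    if q in nd(p):
        return all(img.get(r) not in V for r in n4(p) & n4(q))
    return False

def dm_distance(img, start, goal, V):
    """Length of the shortest m-path from start to goal (breadth-first search)."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        p, d = frontier.popleft()
        if p == goal:
            return d
        for q in (n4(p) | nd(p)) - seen:
            if m_adjacent(p, q, img, V):
                seen.add(q)
                frontier.append((q, d + 1))
    return None

# p=(0,2), p1=(0,1), p2=(1,1), p3=(1,0), p4=(2,0)
for v1, v3 in product([0, 1], repeat=2):
    img = {(0, 2): 1, (0, 1): v1, (1, 1): 1, (1, 0): v3, (2, 0): 1}
    print(f"p1={v1}, p3={v3}: Dm =", dm_distance(img, (0, 2), (2, 0), V={1}))
# Prints Dm = 2, 3, 3 and 4 across the four combinations, matching the cases above.
```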
  • 106. Image Sampling and Quantization Image sampling: discretize an image in the spatial domain Spatial resolution / image resolution: pixel size or number of pixels (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 107. How to choose the spatial resolution - (Figure: the original image with the sampling locations marked, and the resulting sampled image.) With under-sampling, some image details are lost!
  • 108. How to choose the spatial resolution: Nyquist Rate - (Figure: original image with sampling locations; the minimum period of the image detail is 2 mm and the sampling interval is 1 mm, so no detail is lost in the sampled image.) Nyquist rate: the sampling interval (spatial resolution) must be less than or equal to half of the minimum period in the image; equivalently, the sampling frequency must be greater than or equal to twice the maximum frequency.
  • 109. Aliased Frequency - Example: x1(t) = sin(2πt), with f = 1 Hz, and x2(t) = sin(12πt), with f = 6 Hz. Sampling rate: 5 samples/sec. Two different frequencies, but the sampled values are identical!
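  A minimal sketch of this aliasing example (NumPy assumed): sampling both sinusoids at 5 samples/sec produces exactly the same sample values, so the 6 Hz tone is indistinguishable from the 1 Hz tone.

```python
# Aliasing: two sinusoids of different frequency give identical samples at fs = 5.
import numpy as np

fs = 5.0                          # sampling rate (samples/sec)
t = np.arange(0, 2, 1 / fs)       # sample instants over 2 seconds

x1 = np.sin(2 * np.pi * 1 * t)    # f = 1 Hz
x2 = np.sin(2 * np.pi * 6 * t)    # f = 6 Hz, aliases to 6 - fs = 1 Hz

print(np.allclose(x1, x2))        # True: the sample values coincide
```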
  • 110. Effect of Spatial Resolution 256x256 pixels 64x64 pixels 128x128 pixels 32x32 pixels
  • 111. Effect of Spatial Resolution (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 112. Moire Pattern Effect : Special Case of Sampling Moire patterns occur when frequencies of two superimposed periodic patterns are close to each other. (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 113. Effect of Spatial Resolution (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.
  • 114. Can we increase spatial resolution by interpolation? No: down-sampling is an irreversible process, so interpolation cannot recover the lost detail. (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.)
  • 115. Image Quantization - Image quantization: discretize continuous pixel values into discrete numbers. - Color resolution / color depth / levels: the number of colors or gray levels, or the number of bits representing each pixel value. - The number of colors or gray levels Nc is given by Nc = 2^b, where b = number of bits.
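  For example, b = 8 bits gives Nc = 2^8 = 256 gray levels. A small sketch (assumed helper name, NumPy) of quantizing values in [0, 1] to 2^b levels:

```python
# Uniform quantization of values in [0, 1] into 2**b discrete levels.
import numpy as np

def quantize(img, b):
    levels = 2 ** b                                   # Nc = 2^b
    return np.floor(img * (levels - 1) + 0.5) / (levels - 1)

x = np.linspace(0, 1, 6)
print(quantize(x, 2))   # 4 levels: [0. 0.333... 0.333... 0.666... 0.666... 1.]
```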
  • 117. Effect of Quantization Levels 256 levels 128 levels 32 levels 64 levels
  • 118. Effect of Quantization Levels (cont.) 16 levels 8 levels 2 levels 4 levels At these low numbers of levels, false contouring is easy to see.
  • 119. How to select the suitable size and pixel depth of images - (Examples: low-detail, medium-detail and high-detail images, e.g. the Lena and Cameraman images.) To satisfy the human visual system: 1. For images of the same size, a low-detail image may need more pixel depth. 2. As image size increases, fewer gray levels may be needed. The word “suitable” is subjective: it depends on the “subject”. (Images from Rafael C. Gonzalez and Richard E. Wood, Digital Image Processing, 2nd Edition.)
  • 120. Human vision: Spatial Frequency vs Contrast
  • 121. Human vision: ability to distinguish differences in brightness (regions with a 5% brightness difference).