The document discusses the fundamentals of digital image processing. It defines a digital image as a 2D function where amplitude at each point represents intensity or gray level. A digital image is composed of pixels which are discrete image elements. Image processing includes low-level tasks like noise reduction, mid-level tasks like segmentation, and high-level tasks like object recognition. Mathematical representation of a digital image involves illumination and reflectance components. Intensity at each point in a monochrome image represents its gray level value within the gray scale range from minimum to maximum.
2. Introduction to Digital Image Processing
• An image may be represented as a 2-dimensional function, f(x,y), where x and y are spatial coordinates.
• The amplitude of f at any pair of coordinates (x,y) is called the intensity or gray level of the image at that point.
• When x, y and the amplitude of f are all finite, discrete quantities, we call it a Digital Image.
• The field of Digital Image Processing (DIP) refers to processing the digital image by means of a digital computer.
[Fig: Representation of a digital image as a grid of samples f(x,y) with x and y axes]
3. • Digital Image is composed of a finite number
of elements, each of which has a particular
location and value.
• These elements are referred to as picture
elements, image elements, pels and pixels.
• Pixel is the term most widely used to denote
the elements of a digital image.
4. Types of Image Processing
1. Low Level Processing - involves primitive
operations like noise reduction, contrast
enhancement and image sharpening.
Both its inputs and outputs are images.
2. Mid Level Processing - involves tasks such as
segmentation and classification (recognition) of
individual objects.
Its inputs are generally images, but its outputs
are attributes extracted from those images.
Examples: edges, contours, identity of individual objects.
5. (contd...)
3. High Level Processing - involves “making
sense” of an ensemble of recognized
objects, performing the cognitive functions
normally associated with vision.
6. Fundamental steps in DIP
[Fig: Fundamental steps in digital image processing]
• Starting from the problem domain, the chain of steps is: image acquisition → image enhancement → image restoration → color image processing → wavelets and multiresolution processing → compression → morphological processing → segmentation → representation and description → object recognition.
• Outputs of the processes up through morphological processing are generally images.
• Outputs of the processes from segmentation onward are generally image attributes.
• A knowledge base interacts with all of these processing modules.
7. Components of an Image Processing System
[Fig: Components of a general-purpose image processing system]
• Image sensors (acquire images from the problem domain)
• Specialized image processing hardware
• Computer
• Image processing software
• Mass storage
• Image displays
• Hardcopy devices
• Network
8. Image Formation in the Eye
[Fig: Similar-triangles geometry of image formation in the eye, with optical centre c]
d1/d = h/λ
where, c = optical centre of the lens
d = distance between the object and the optical centre
d1 = height of the object
h = height of the image formed in the eye
λ = focal length, ≈17 mm when focusing on an object more than 3 m away
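The relation above can be rearranged to estimate the size of the retinal image. The following is a minimal sketch; the function name and the sample object (a 15 m tree viewed from 100 m) are illustrative assumptions, not from the slides.

```python
def retinal_image_height(object_height_m, object_distance_m, focal_length_m=0.017):
    """Return h, the image height in the eye, from d1/d = h/λ.

    Rearranged: h = d1 * λ / d. The 17 mm focal length applies when
    the object is more than about 3 m away, as stated in the slide.
    """
    return object_height_m * focal_length_m / object_distance_m

# Example (assumed values): a 15 m tree viewed from 100 m
# forms an image about 2.55 mm tall on the retina.
h = retinal_image_height(15.0, 100.0)
```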
9. Mathematical Representation of a Digital Image
Function f(x,y) may be characterized by 2 components:
1. the amount of source illumination incident on the
scene being viewed, i(x,y), i.e. the illumination component.
2. the amount of illumination reflected by the objects
in the scene, r(x,y), i.e. the reflectance component.
therefore,
f(x,y) = i(x,y) * r(x,y)
where, 0 < i(x,y) < ∞
0 < r(x,y) < 1 (r = 0: total absorption, r = 1: total reflectance)
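The illumination-reflectance product can be sketched numerically. The array values below are made up purely for illustration; they only need to respect the bounds stated above.

```python
import numpy as np

# Assumed sample values: i(x,y) in (0, inf), r(x,y) in (0, 1).
illumination = np.array([[90.0, 100.0],
                         [110.0, 120.0]])   # i(x,y)
reflectance = np.array([[0.05, 0.50],
                        [0.80, 0.95]])      # r(x,y)

# Sanity-check the model's bounds before forming the image.
assert (illumination > 0).all()
assert ((reflectance > 0) & (reflectance < 1)).all()

# f(x,y) = i(x,y) * r(x,y), computed element-wise.
f = illumination * reflectance
```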
10. • Consider the intensity of a monochrome image at
any coordinate (x0, y0); the gray level (l) of the
image at that point is
l = f(x0, y0)
• ‘l’ lies in the range Lmin ≤ l ≤ Lmax
where, Lmin = imin * rmin
Lmax = imax * rmax
• The interval [Lmin, Lmax] is called the gray scale.
• Common practice is to shift this interval
numerically to the interval [0, L-1],
where, l = 0 is considered black
l = L-1 is considered white
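The shift from [Lmin, Lmax] to [0, L-1] can be sketched as a linear rescaling. A minimal sketch follows, assuming L = 256 and sample Lmin/Lmax values chosen for illustration only.

```python
def to_gray_level(l, l_min, l_max, L=256):
    """Linearly map an intensity l in [Lmin, Lmax] to an integer in [0, L-1]."""
    scaled = (l - l_min) / (l_max - l_min) * (L - 1)
    return int(round(scaled))

# Lmin maps to 0 (black) and Lmax maps to L-1 = 255 (white);
# the endpoint values 0.5 and 120.0 are assumed for the example.
black = to_gray_level(0.5, 0.5, 120.0)    # -> 0
white = to_gray_level(120.0, 0.5, 120.0)  # -> 255
```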