Elements of Digital Image Processing Systems
- Raksha Bala Bhale
Image:
An image may be defined as a two-dimensional function
f(x,y), where x and y are spatial (plane) coordinates, and the
amplitude of f at any pair of coordinates (x,y) is called the intensity of
the image at that point. The term gray level is often used to refer to
the intensity of monochrome images.
An image may be continuous with respect to the x- and y-
coordinates and also in amplitude. Converting such an image to digital
form requires that the coordinates, as well as the amplitude, be
digitized.
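To make the definition concrete, the short sketch below (a hypothetical Python/NumPy example, not part of the original slides) represents a digitized image as a small 2-D array: the index pair plays the role of the spatial coordinates (x, y) and the stored value is the gray level.

```python
import numpy as np

# A minimal sketch: a hypothetical 4x4 grayscale image with 8-bit values.
# Indexing the array at (x, y) returns the intensity f(x, y) at that point.
f = np.array([
    [ 12,  40,  88, 120],
    [ 35,  70, 140, 200],
    [ 60, 110, 180, 230],
    [ 90, 150, 210, 255],
], dtype=np.uint8)

x, y = 2, 3                        # spatial (plane) coordinates
intensity = f[x, y]                # gray level of the monochrome image at (x, y)
print(f"f({x},{y}) = {intensity}") # -> f(2,3) = 230
```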
Components of Image Processing Systems
Image sensors
• Consist of millions of photosites or pixels
• Photosites convert the incoming light into charge
Two types of image sensors:
1. CCD (Charge-Coupled Device)
2. CMOS (Complementary Metal Oxide Semiconductor)
Functions of image sensors:
Light-to-charge conversion → Charge accumulation → Transfer → Charge-to-voltage conversion → Amplification
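As a rough illustration of this chain, the toy calculation below walks a single photosite through the same stages. All numbers (photon rate, quantum efficiency, conversion gain, amplifier gain) are assumed for illustration only and do not come from the slides or any real sensor datasheet.

```python
# A simplified toy model of one photosite:
# light -> charge -> accumulation -> transfer -> voltage -> amplification.

photons_per_second = 50_000      # incoming light at one photosite (assumed)
quantum_efficiency = 0.6         # fraction of photons converted to electrons (assumed)
exposure_time_s = 0.01           # charge accumulates over the exposure (assumed)
conversion_gain_uV_per_e = 5.0   # charge-to-voltage conversion, microvolts per electron (assumed)
amplifier_gain = 8.0             # analog amplification before digitization (assumed)

electrons = photons_per_second * quantum_efficiency * exposure_time_s  # light-to-charge + accumulation
voltage_uV = electrons * conversion_gain_uV_per_e                      # charge-to-voltage conversion
output_uV = voltage_uV * amplifier_gain                                # amplification

print(f"accumulated charge: {electrons:.0f} e-, output: {output_uV / 1000:.1f} mV")
# -> accumulated charge: 300 e-, output: 12.0 mV
```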
CCD (Charge-Coupled Device): the accumulated charge is shifted across the chip and converted to voltage at a single output node.
CMOS (Complementary Metal Oxide Semiconductor): each photosite has its own charge-to-voltage conversion circuitry, so pixels can be read out individually.
Image sampling and Quantization:
To create a digital image, we need to convert the continuous
sensed data into digital form. This involves two processes.
1. Sampling
2. Quantization
• To convert it to digital form, we have to sample the function in both coordinates and in amplitude.
• Digitizing the coordinate values is called Sampling.
• Digitizing the amplitude values is called Quantization.
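The sketch below (a hypothetical 1-D example with assumed parameters, not from the slides) keeps the two steps separate: sampling evaluates a "continuous" intensity profile at discrete coordinates, and quantization maps each sampled amplitude to one of L discrete gray levels.

```python
import numpy as np

# Sampling and quantization of a 1-D scan line (assumed signal and parameters).

# Sampling: keep the intensity profile only at discrete coordinate values.
num_samples = 16
x_sampled = np.linspace(0, 1, num_samples)
f_sampled = 0.5 + 0.5 * np.sin(2 * np.pi * 3 * x_sampled)   # amplitudes in [0, 1]

# Quantization: map each sampled amplitude to one of L discrete gray levels.
L = 8                                                        # number of gray levels (assumed)
f_quantized = np.round(f_sampled * (L - 1)).astype(int)

print("sampled amplitudes :", np.round(f_sampled, 2))
print("quantized levels   :", f_quantized)
```

Increasing the number of samples and the number of gray levels brings the digital image closer to the original continuous scene.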
Image processing hardware:
I. Image processing hardware is dedicated hardware that processes the raw data obtained from the image sensors, e.g. a central processing unit (CPU), graphics processing unit (GPU), or field programmable gate array (FPGA).
II. It passes the result to a general-purpose computer.
Mass Storage:
This capability is a must in image processing applications. It falls into three categories:
1. Short-term storage for use during processing
2. Online storage for relatively fast retrieval
3. Archival storage, such as magnetic tapes and disks
• Image Display: Image displays in use today are mainly color TV
monitors. These monitors are driven by the outputs of image and
graphics display cards that are an integral part of the computer
system.
• Hardcopy Devices: The devices used for recording images include laser
printers, film cameras, heat-sensitive devices, inkjet units, and
digital units such as optical and CD-ROM disks. Film provides the
highest possible resolution, but paper is the obvious medium of
choice for written applications.
• Networking: It is almost a default function in any computer
system in use today because of the large amount of data
inherent in image processing applications. The key consideration
in image transmission is bandwidth.
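As a back-of-the-envelope illustration (with assumed image size and link speed), the snippet below estimates how long one uncompressed image takes to transmit, which is why bandwidth dominates the networking discussion.

```python
# Rough transfer-time estimate for one uncompressed image (assumed numbers).

width, height = 2048, 2048       # pixels (assumed)
bits_per_pixel = 8               # 8-bit grayscale (assumed)
link_mbps = 10.0                 # transmission bandwidth in megabits per second (assumed)

image_bits = width * height * bits_per_pixel
image_megabytes = image_bits / 8 / 1_000_000
transfer_seconds = image_bits / (link_mbps * 1_000_000)

print(f"image size: {image_megabytes:.1f} MB, transfer time: {transfer_seconds:.1f} s")
# -> image size: 4.2 MB, transfer time: 3.4 s
```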
