2. WHAT IS THE DIFFERENCE?
Image Processing
Computer Vision
Robot Vision
Aditya@poltekom.ac.id
3. IMAGE PROCESSING
A process applied to an image, focusing on transforming, encoding, and transmitting the image.
IMAGE → IMAGE
4. COMPUTER VISION
Computer Vision => a means of knowing the world visually, supported by knowledge and computational instruments.
IMAGE → Description / human-readable information
5. ROBOT VISION
Robot Vision => a machine with the ability to see its environment, designed with a workflow algorithm so that it can make decisions and complete the job automatically.
IMAGE → Action
8. INTRODUCTION
Modern digital technology has made it possible to
manipulate multi-dimensional signals with systems that range
from simple digital circuits to advanced parallel computers.
The goal of this manipulation can be divided into three
categories:
Image Processing: image in → image out
Image Analysis: image in → measurements out
Image Understanding: image in → high-level description out
12. INTRODUCTION
We begin with certain basic definitions.
An image defined in the “real world” is
considered to be a function of two real variables, for
example, a(x,y) with a as the amplitude (e.g. brightness)
of the image at the real coordinate position (x,y).
13. INTRODUCTION
An image may be considered to contain sub-images, sometimes referred to as regions of interest (ROIs), or simply regions. This concept reflects the fact that images frequently contain collections of objects, each of which can be the basis for a region.
15. INTRODUCTION
[Figure: a region of interest inside an image, bounded by rows X1, X2 and columns Y1, Y2]
16. INTRODUCTION
[Figure: multiple regions of interest inside an image, bounded by rows X1, X2 and columns Y1 through Y4]
17. INTRODUCTION
In a sophisticated image processing system it should be possible to
apply specific image processing operations to selected regions. Thus one part
of an image (region) might be processed to suppress motion blur while
another part might be processed to improve color rendition.
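The idea of applying a specific operation to a selected region can be sketched with NumPy slicing. This is only an illustration under stated assumptions: the contrast stretch and brightness boost stand in for the slide's examples of motion-blur suppression and color-rendition improvement, and the region coordinates passed in are hypothetical.

```python
import numpy as np

def process_regions(image, roi_a, roi_b):
    """Apply a different operation to each selected region of one image.

    roi_a and roi_b are (row_slice, col_slice) pairs. The two
    operations below are simple stand-ins for the slide's examples.
    """
    out = image.astype(float)

    # Region A: linear contrast stretch to the full 0-255 range.
    a = out[roi_a]
    lo, hi = a.min(), a.max()
    if hi > lo:
        out[roi_a] = (a - lo) / (hi - lo) * 255.0

    # Region B: simple brightness boost, clipped to the valid range.
    out[roi_b] = np.clip(out[roi_b] + 40.0, 0, 255)

    return out.astype(np.uint8)
```

Because each region is just a slice of the same array, any per-image operation can be redirected to a region without changing the rest of the picture.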
19. DIGITAL IMAGE DEFINITIONS
A digital image a[m,n] described in a 2D discrete space is derived
from an analog image a(x,y) in a 2D continuous space through a sampling
process that is frequently referred to as digitization. For now we will look at
some basic definitions associated with the digital image. The effect of
digitization is shown in Figure 1.
20. DIGITAL IMAGE DEFINITIONS
[Figure: an analog range labelled 0.2 nM to 3.2 nM]
A sample range of colour in the real world is an analog signal.
21. DIGITAL IMAGE DEFINITIONS
[Figure: the 0.2 nM to 3.2 nM analog range]
The idea of digitization is taking samples of a range of an analog value.
22. DIGITAL IMAGE DEFINITIONS
[Figure: the 0.2 nM to 3.2 nM range divided into four bins labelled 00, 01, 10, 11, a 2-bit colour representation]
The idea of digitization is taking samples of a range of an analog value.
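A minimal sketch of this 2-bit quantization: an analog value somewhere in the slide's 0.2 to 3.2 range is mapped to one of the four codes 00, 01, 10, 11. The equal-width bins are an assumption, since the slide does not say how the range is split.

```python
def quantize_2bit(value, lo=0.2, hi=3.2):
    """Quantize an analog value in [lo, hi] to a 2-bit code string.

    The endpoints mirror the slide's example; equal-width bins are
    an assumption made for this sketch.
    """
    bins = 4                      # 2 bits -> 2**2 = 4 levels
    step = (hi - lo) / bins       # width of one bin
    v = min(max(value, lo), hi)   # clamp into the valid range
    code = min(int((v - lo) / step), bins - 1)  # top edge -> last bin
    return format(code, "02b")    # e.g. 2 -> "10"
```

With more bits the same scheme simply uses more, narrower bins, which is why bit depth controls how finely the analog range is represented.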
24. DIGITAL IMAGE DEFINITIONS
The 2D continuous image a(x,y) is divided into N
rows and M columns. The intersection of a row and a
column is termed a pixel (pixel comes from “picture element”).
The value assigned to the integer coordinates [m,n] with
{m = 0, 1, 2, …, M-1} and {n = 0, 1, 2, …, N-1} is a[m,n]. In
fact, in most cases a(x,y), which we might consider to be
the physical signal that impinges on the face of a 2D
sensor, is actually a function of many variables, including
depth (z), color (l), and time (t). Unless otherwise stated,
we will consider the case of 2D, monochromatic, static
images in this chapter.
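The sampling step described above can be sketched as follows. This is a simplification made for illustration: each pixel takes the value of a(x, y) at the centre of its sampling cell, whereas a real sensor integrates the signal over the whole cell; the brightness function used in the example is synthetic.

```python
import math

def digitize(a, width, height, M, N):
    """Sample a continuous image a(x, y) into an M-column, N-row
    digital image a[m, n].

    `a` is any function of two real variables; point sampling at
    each cell centre is a simplification for this sketch.
    """
    dx, dy = width / M, height / N        # size of one sampling cell
    return [[a((m + 0.5) * dx, (n + 0.5) * dy) for m in range(M)]
            for n in range(N)]

# Example: a synthetic brightness function sampled on a 4x4 grid.
img = digitize(lambda x, y: math.sin(x) * math.sin(y), 3.14, 3.14, 4, 4)
```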
25. DIGITAL IMAGE DEFINITIONS
A pixel contains this information: a(x, y, z, l, t)
a = illumination / light exposure at a given pixel
x = horizontal coordinate
y = vertical coordinate
z = depth
l = colour
t = time frame
26. DIGITAL IMAGE DEFINITIONS
A pixel contains this information: a(x, y, z, l, t)
a = illumination / light exposure at a given pixel
[Figure: image patches contrasting high light exposure with low light exposure]
27. DIGITAL IMAGE DEFINITIONS
A pixel contains this information: a(x, y, z, l, t)
x, y = 2-dimensional coordinates
[Figure: an image grid with rows X1, X2 and columns Y1, Y2]
28. DIGITAL IMAGE DEFINITIONS
A pixel contains this information: a(x, y, z, l, t)
z = depth
[Figure: depth measured from the surface down to the bottom]
29. DIGITAL IMAGE DEFINITIONS
A pixel contains this information: a(x, y, z, l, t)
l = colour
[Figure: regions of yellow colour and red colour]
30. DIGITAL IMAGE DEFINITIONS
A pixel contains this information: a(x, y, z, l, t)
t = time frame
[Figure: pictures taken at different time frames t1, t2, t3, t4]
31. DIGITAL IMAGE DEFINITIONS
The image shown in Figure 1 has been divided into N = 16 rows and
M = 16 columns. The value assigned to every pixel is the average brightness in
the pixel rounded to the nearest integer value. The process of representing
the amplitude of the 2D signal at a given coordinate as an integer value
with L different gray levels is usually referred to as amplitude
quantization or simply quantization.
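The two steps just described, averaging the brightness inside each pixel's cell and rounding the average to one of L levels, can be sketched like this. The assumptions are made only to keep the sketch short: the "analog" image is a fine-grained array standing in for a(x, y), its values lie in [0, 1], and its dimensions divide evenly by N and M.

```python
import numpy as np

def sample_and_quantize(analog, M, N, L):
    """Divide a fine-grained image into N rows x M columns, average
    the brightness inside each cell, and quantize each average to
    one of L integer gray levels 0 .. L-1.
    """
    H, W = analog.shape
    bh, bw = H // N, W // M          # size of one pixel's sampling cell
    # Average the brightness inside each cell (the sampling step).
    avg = analog[:N * bh, :M * bw].reshape(N, bh, M, bw).mean(axis=(1, 3))
    # Round each average to the nearest integer gray level
    # (the amplitude-quantization step).
    return np.rint(avg * (L - 1)).astype(int)
```

For the deck's Figure 1 case, N = M = 16, so a fine-grained input collapses to a 16 x 16 grid of integer brightness values.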
32. COMMON VALUES
There are standard values for the various
parameters encountered in digital image processing.
These values can be motivated by video standards, by
algorithmic requirements, or by the desire to keep digital
circuitry simple. Table 1 gives some commonly
encountered values.
33. COMMON VALUES
Quite frequently we see cases of M = N = 2^K where {K =
8, 9, 10}. This can be motivated by digital circuitry or by
the use of certain algorithms such as the (fast) Fourier
transform.
The number of distinct gray levels is usually a power of
2, that is, L = 2^B where B is the number of bits in the
binary representation of the brightness levels. When
B > 1 we speak of a gray-level image; when B = 1 we speak
of a binary image. In a binary image there are just two
gray levels, which can be referred to, for example, as
“black” and “white” or “0” and “1”.
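The relations above are simple enough to state directly in code. The threshold of 128 in the gray-to-binary conversion is an arbitrary midpoint chosen for 8-bit data, not something the text prescribes.

```python
def gray_levels(B):
    """Number of distinct gray levels for B bits per pixel: L = 2**B."""
    return 2 ** B

def to_binary(pixels, threshold=128):
    """Reduce a gray-level image (B > 1) to a binary image (B = 1).

    Pixels at or above `threshold` become 1 ("white"), the rest 0
    ("black"). The default threshold is an assumption for 8-bit data.
    """
    return [[1 if p >= threshold else 0 for p in row] for row in pixels]
```

So an 8-bit image has 256 gray levels, while thresholding collapses it to the two-level binary case the slide describes.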
34. CHARACTERISTICS OF IMAGE OPERATIONS
There is a variety of ways to classify and characterize image
operations. The reason for doing so is to understand what type of results we
might expect to achieve with a given type of operation or what might be the
computational burden associated with a given operation.
35. ADVANTAGES OF IMAGE PROCESSING
Medical: sharpening X-ray results, analysis of MRI scans, etc.
Technology and Communications: reducing noise in satellite images, video streaming
Games: shadow effects on water surfaces, lighting effects, etc.
Photography and Film: contrast, brightness, illegal photo manipulations, etc.