IMAGE PROCESSING IN MECHATRONICS
Machine Vision
1
HANOI UNIVERSITY OF SCIENCE AND TECHNOLOGY
Lecturer: Dr. Nguyễn Thành Hùng
Department of Mechatronics, School of Mechanical Engineering
Hà Nội, 2021
2
Chapter 1. Introduction to machine vision
1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System
3
1. Introduction
❖Definition
➢Machine vision (MV) is the technology and methods used to provide imaging-
based automatic inspection and analysis for such applications as automatic
inspection, process control, and robot guidance, usually in industry.
➢Machine vision is a term encompassing a large number of technologies,
software and hardware products, integrated systems, actions, methods and
expertise.
➢Machine vision as a systems engineering discipline can be considered distinct
from computer vision, a form of computer science.
➢It attempts to integrate existing technologies in new ways and apply them to
solve real world problems.
4
1. Introduction
❖Definition
➢The overall machine vision process includes planning the details of the
requirements and project, and then creating a solution. During run-time, the
process starts with imaging, followed by automated analysis of the image and
extraction of the required information.
5
1. Introduction
❖Definition
https://en.wikipedia.org/wiki/Glossary_of_machine_vision
6
1. Introduction
❖Application: Locate
➢To find the object and report its position and orientation.
7
1. Introduction
❖Application: Measure
➢To measure physical dimensions of the object
8
1. Introduction
❖Application: Inspect
➢To validate certain features
9
1. Introduction
❖Application: Inspect
10
1. Introduction
❖Application: Identify
11
Chapter 1. Introduction to machine vision
1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System
12
2. Basic elements of machine vision system
2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis
13
2. Basic elements of machine vision system
❑ 2.1. Overview
http://www.digikey.com/en/articles/techzone/2012/jan/versatile-leds-drive-machine-vision-in-automated-manufacture
14
2. Basic elements of machine vision system
❑ 2.1. Overview
OK NG
How to automatically detect the defect?
15
2. Basic elements of machine vision system
❑ 2.1. Overview
❖Illumination
➢Illumination is the way an object is lit up; lighting is the actual lamp that
generates the illumination.
❖Imaging (Camera and lens)
➢The term imaging defines the act of creating an image.
16
2. Basic elements of machine vision system
❑ 2.1. Overview
❖Image processing and analysis
➢This is where the desired features are extracted automatically by algorithms and
conclusions are drawn.
➢A feature is the general term for information in an image, for example a
dimension or a pattern.
➢Algorithms are also referred to as tools or functions.
17
2. Basic elements of machine vision system
2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis
18
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖The goal of lighting in machine vision is to obtain a robust application by:
➢Enhancing the features to be inspected.
➢Assuring high repeatability in image quality.
SICK IVP, “Machine Vision Introduction,” 2006.
19
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Illumination Principles
Light can be described as waves with three properties:
➢Wavelength or color, measured in nm (nanometers)
➢Intensity
➢Polarization.
SICK IVP, “Machine Vision Introduction,” 2006.
20
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Illumination Principles
➢ The spectral response of a sensor is the sensitivity curve for different
wavelengths.
Spectral response of a
gray scale CCD sensor.
Maximum sensitivity
is for green (500 nm).
SICK IVP, “Machine Vision Introduction,” 2006.
21
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Illumination Principles
➢ The optical axis is an imaginary line through the center of the lens, i.e. the direction
the camera is looking.
SICK IVP, “Machine Vision Introduction,” 2006.
22
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Illumination Principles
SICK IVP, “Machine Vision Introduction,” 2006.
23
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Ring Light
➢ A ring light is mounted around the optical axis of the lens, either on the camera or
somewhere in between the camera and the object.
SICK IVP, “Machine Vision Introduction,” 2006.
24
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Ring Light
SICK IVP, “Machine Vision Introduction,” 2006.
Pros
• Easy to use
• High intensity and short exposure time possible
Cons
• Direct reflections, called hot spots, on reflective surfaces
Ambient light.
Ring light. The printed matte surface is evenly illuminated. Hot spots appear on shiny
surfaces (center), one for each of the 12 LEDs of the ring light.
25
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Spot Light
➢ A spot light has all the light emanating from one direction that is different from the
optical axis. For flat objects, only diffuse reflections reach the camera.
SICK IVP, “Machine Vision Introduction,” 2006.
Mainly diffuse reflections
reach the camera
Object
Spot light
26
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Spot Light
SICK IVP, “Machine Vision Introduction,” 2006.
Pros
• No hot spots
Cons
• Uneven illumination
• Requires intense light since it is
dependent on diffuse reflections
27
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Backlight
➢ The backlight principle has the object being illuminated from behind to produce a
contour or silhouette.
SICK IVP, “Machine Vision Introduction,” 2006.
28
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Backlight
SICK IVP, “Machine Vision Introduction,” 2006.
Pros
• Very good contrast
• Robust to texture, color, and ambient light
Cons
• Dimension must be
larger than object
Ambient light.
Backlight: Enhances contours
by creating a silhouette
29
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Darkfield
➢ Darkfield means that the object is illuminated at a large angle of incidence.
SICK IVP, “Machine Vision Introduction,” 2006.
30
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Darkfield
SICK IVP, “Machine Vision Introduction,” 2006.
Pros
• Good enhancement of scratches, protruding
edges, and dirt on surfaces
Cons
• Mainly works on flat surfaces with small features
• Requires small distance to object
• The object needs to be somewhat reflective
Ambient light. Darkfield: Enhances relief contours, i.e., lights up edges.
31
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> On-Axis Light
➢ When an object needs to be illuminated parallel to the optical axis, a semi-
transparent mirror is used to create an on-axis light source.
SICK IVP, “Machine Vision Introduction,” 2006.
32
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> On-Axis Light
SICK IVP, “Machine Vision Introduction,” 2006.
Pros
• Very even illumination, no hot spots
• High contrast on materials with different
reflectivity
Cons
• Low intensity requires long exposure times
• Cleaning of the semi-transparent mirror (beam-splitter) is often needed
Inside of a can
as seen with
ambient light
Inside of the same can
as seen with a coaxial
(on-axis) light
33
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Dome Light
➢ The dome light produces the needed uniform light intensity inside of the dome
walls.
SICK IVP, “Machine Vision Introduction,” 2006.
34
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Dome Light
SICK IVP, “Machine Vision Introduction,” 2006.
Pros
• Works well on highly reflective
materials
• Uniform illumination, except for the
darker middle of the image. No hot
spots
Cons
• Low intensity requires
long exposure times
• Dimensions must be
larger than object
• Dark area in the middle of
the image
35
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Dome Light
SICK IVP, “Machine Vision Introduction,” 2006.
Ambient light. On top of the key numbers is a curved,
transparent material causing direct reflections.
The direct reflections are eliminated by
the dome light’s even illumination.
36
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Laser Light
➢ A 2D camera with a laser line can provide a cost efficient solution for low-contrast
and 3D inspections.
SICK IVP, “Machine Vision Introduction,” 2006.
Pros
• Robust against ambient light
• Allows height measurements (z parallel
to the optical axis).
• Low-cost 3D for simpler applications
Cons
• Laser safety issues
• Data along y is lost in favor of z (height) data
• Lower accuracy than 3D cameras
37
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Types >> Laser Light
SICK IVP, “Machine Vision Introduction,” 2006.
Ambient light. Contact lens containers: the left is facing up (5 mm high at the cross)
and the right is facing down (1 mm high at the minus sign).
The laser line clearly shows the
height difference.
38
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Strobe or Constant light
➢ A strobe light is a flashing light.
➢ Strobing allows the LED to emit a higher light intensity than is achieved with
constant light, by briefly overdriving (turbo charging) the LED.
SICK IVP, “Machine Vision Introduction,” 2006.
39
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Diffusor Plate
➢ The diffusor plate converts direct light into diffuse light.
➢ The purpose of a diffusor plate is to avoid bright spots in the image, caused by
the direct light's reflections in glossy surfaces.
SICK IVP, “Machine Vision Introduction,” 2006.
Two identical white bar lights, with diffusor
plate (top) and without (bottom).
40
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> LED Color
➢ LED lighting comes in several colors. Most common are red and green, but there
are also LEDs in blue, white, UV, and IR.
➢ Different objects reflect different colors. A blue object appears blue because it
reflects the color blue.
➢ Therefore, if blue light is used to illuminate a blue object, it will appear bright in a
gray scale image.
SICK IVP, “Machine Vision Introduction,” 2006.
41
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> LED Color
SICK IVP, “Machine Vision Introduction,” 2006.
42
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Optical Filters
➢ An optical filter is a layer in front of the sensor or lens that absorbs certain
wavelengths (colors) or polarizations.
➢ Two main optical filter types are used for machine vision:
SICK IVP, “Machine Vision Introduction,” 2006.
1. Polarization filter: Only transmits light with a certain polarization. Light changes its
polarization when it is reflected, which allows us to filter out unwanted reflections.
2. Band-pass filter: Only transmits light of a certain color, i.e. within a certain
wavelength interval. For example, a red filter only lets red light through.
43
2. Basic elements of machine vision system
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Optical Filters
SICK IVP, “Machine Vision Introduction,” 2006.
Original image Image seen by gray
scale camera with
ambient light and
without filter
Red light and a
red band-pass filter
Green light and a
green band-pass filter
44
2. Basic elements of machine vision system
2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis
45
2. Basic elements of machine vision system
❑ 2.3. Imaging
➢ The term imaging defines the act of creating an image.
➢ Imaging has several technical names: Acquiring, capturing, or grabbing
➢ Grabbing a high-quality image is the number one goal for a successful vision
application.
SICK IVP, “Machine Vision Introduction,” 2006.
46
2. Basic elements of machine vision system
❑ 2.3. Imaging
❖Basic Camera Concepts
➢ A simplified camera setup consists of camera, lens, lighting, and object.
SICK IVP, “Machine Vision Introduction,” 2006.
47
2. Basic elements of machine vision system
❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
➢ A sensor chip is used to grab a digital image.
➢ On the sensor there is an array of light-sensitive pixels.
SICK IVP, “Machine Vision Introduction,” 2006.
Sensor chip with an array
of light-sensitive pixels.
48
2. Basic elements of machine vision system
❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
There are two technologies used for digital image sensors:
➢ CCD (Charge-Coupled Device)
➢ CMOS (Complementary Metal Oxide Semiconductor).
SICK IVP, “Machine Vision Introduction,” 2006.
49
2. Basic elements of machine vision system
❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
http://www.f4news.com/2016/05/09/ccd-vs-cmos-infographic/
50
2. Basic elements of machine vision system
❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The lens (Objective) focuses the light that enters the camera in a way that
creates a sharp image.
SICK IVP, “Machine Vision Introduction,” 2006.
Focused or sharp image. Unfocused or blurred image.
51
2. Basic elements of machine vision system
❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The angle of view determines how much of the visual scene the camera sees.
SICK IVP, “Machine Vision Introduction,” 2006.
52
2. Basic elements of machine vision system
❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The focal length is the distance between the lens and the focal point.
➢ When the focal point is on the sensor, the image is in focus.
SICK IVP, “Machine Vision Introduction,” 2006.
Lenses and Focal Length
▪ Focal length is related to angle of view in that a long focal length corresponds to a
small angle of view, and vice versa.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Field of View in 2D
▪ The FOV (Field of View) in 2D systems is the full area that a camera sees. The FOV
is specified by its width and height.
▪ The object distance is the distance between the lens and the object.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Aperture and F-stop
▪ The aperture is the opening in the lens that controls the amount of light that is let
onto the sensor. In quality lenses, the aperture is adjustable.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Aperture and F-stop
▪ The size of the aperture is measured by its F-stop value. A large F-stop value means
a small aperture opening, and vice versa.
▪ For standard CCTV lenses, the F-stop value is adjustable in the range between F1.4
and F16.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Depth of Field
▪ The minimum object distance (sometimes abbreviated MOD) is the closest
distance at which the camera lens can focus, and the maximum object distance is
the farthest such distance.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Depth of Field
▪ The focal plane is found at the distance where the focus is as sharp as possible.
▪ Objects closer or farther away than the focal plane can also be considered to be in
focus. This distance interval where good-enough focus is obtained is called depth of
field (DOF).
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Depth of Field
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Depth of Field
▪ The depth of field depends on both the focal length and the aperture adjustment.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Depth of Field
▪ By adding a distance ring between the camera and the lens, the focal plane (and
thus the MOD) can be moved closer to the camera. A distance ring is also referred to
as shim, spacer, or extension ring.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Depth of Field
▪ A side-effect of using a distance ring is that a maximum object distance is
introduced and that the depth of field range decreases.
Basic Camera Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Pixels and Resolution
▪ A pixel is the smallest element in a digital image. Normally, the
pixel in the image corresponds directly to the physical pixel on
the sensor.
▪ To the right is an example of a very small image with dimension
8x8 pixels. The dimensions are called x and y, where x
corresponds to the image columns and y to the rows.
Basic Image Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Pixels and Resolution
▪ Typical values of sensor resolution in 2D machine
vision are:
➢ VGA (Video Graphics Array): 640x480 pixels
➢ XGA (Extended Graphics Array): 1024x768 pixels
➢ SXGA (Super Extended Graphics Array):
1280x1024 pixels
Basic Image Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Pixels and Resolution
▪ The object resolution is the physical dimension on the object that corresponds to
one pixel on the sensor. Common units for object resolution are μm (microns) per
pixel and mm per pixel.
▪ Example: Object Resolution Calculation: FOV width = 50 mm, Sensor resolution =
640x480 pixels. Object resolution in x: 50 mm / 640 pixels ≈ 0.078 mm/pixel.
Basic Image Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
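As a quick check of the calculation above, here is a minimal Python sketch (the 50 mm FOV and 640x480 sensor are the values from the example; the function name is only illustrative):

```python
def object_resolution(fov_mm: float, pixels: int) -> float:
    """Physical size on the object covered by one pixel, in mm/pixel."""
    return fov_mm / pixels

# Values from the example above: FOV width = 50 mm, sensor = 640x480 pixels.
res_x = object_resolution(50.0, 640)
print(f"Object resolution in x: {res_x:.3f} mm/pixel")  # about 0.078 mm/pixel
```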
2. Basic elements of machine vision system
❑ 2.3. Imaging
Intensity
▪ The brightness of a pixel is called intensity. The intensity information is stored for
each pixel in the image and can be of different types. Examples:
➢ Binary: One bit per pixel.
➢ Gray scale: Typically one byte per pixel.
Basic Image Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Intensity
➢ Color: Typically one byte per pixel and color. Three bytes are needed to obtain full
color information. One pixel thus contains three components (R, G, B).
Basic Image Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Intensity
▪ When the intensity of a pixel is digitized and described by a byte, the information is
quantized into discrete levels. The number of bits used per pixel is called the bit-depth.
SICK IVP, “Machine Vision Introduction,” 2006.
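To make the intensity types and bit-depth concrete, a small NumPy sketch (array sizes and values are illustrative, not from the slides):

```python
import numpy as np

binary = np.zeros((8, 8), dtype=bool)         # binary: one bit of information per pixel
gray   = np.zeros((8, 8), dtype=np.uint8)     # gray scale: one byte per pixel
color  = np.zeros((8, 8, 3), dtype=np.uint8)  # color: three bytes per pixel (R, G, B)

bit_depth = 8
levels = 2 ** bit_depth                       # an 8-bit pixel has 256 discrete levels
print(levels)
```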
2. Basic elements of machine vision system
❑ 2.3. Imaging
Basic Image Concepts
Exposure
▪ Exposure is how much light is detected by the photographic film or sensor. The
exposure amount is determined by two factors:
➢ Exposure time: Duration of the exposure, measured in milliseconds (ms). Also
called shutter time from traditional photography.
➢ Aperture size: Controls the amount of light that passes through the lens.
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Basic Image Concepts
Exposure
▪ If the exposure time is too short for the sensor to capture enough light, the image is
said to be underexposed. If there is too much light and the sensor is saturated, the
image is said to be overexposed.
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Basic Image Concepts
Gain
▪ Gain amplifies the intensity values after the sensor has already been exposed, very
much like the volume control of a radio (which doesn’t actually make the artist sing
louder). The tradeoff of compensating insufficient exposure with a high gain is
amplified noise.
SICK IVP, “Machine Vision Introduction,” 2006.
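A minimal sketch of what gain does numerically, assuming NumPy is available; the image is synthetic and the gain value is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(loc=40, scale=5, size=(480, 640))       # dark, slightly noisy image

gain = 3.0
amplified = np.clip(image * gain, 0, 255).astype(np.uint8)

# The noise (standard deviation) is amplified together with the signal,
# which is why a high gain cannot fully replace proper exposure.
print(round(image.std(), 1), round(amplified.astype(float).std(), 1))
```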
2. Basic elements of machine vision system
❑ 2.3. Imaging
Basic Image Concepts
Contrast and Histogram
▪ Contrast is the relative difference between bright and dark areas in an image.
Contrast is necessary to see anything at all in an image.
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
Basic Image Concepts
Contrast and Histogram
▪ A histogram is a diagram where the pixels are sorted in order of increasing intensity
values.
SICK IVP, “Machine Vision Introduction,” 2006.
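A minimal sketch of computing such a histogram with NumPy (the image is a random stand-in; a real application would use the captured gray scale image):

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)  # stand-in image

hist, _ = np.histogram(image, bins=256, range=(0, 256))        # pixel count per intensity level

# A widely spread histogram indicates good contrast; a narrow one indicates poor contrast.
print("darkest:", image.min(), "brightest:", image.max(),
      "most common level:", int(hist.argmax()))
```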
2. Basic elements of machine vision system
❑ 2.3. Imaging
Basic Image Concepts
Contrast and Histogram
Basic Image Concepts
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.3. Imaging
75
2. Basic elements of machine vision system
2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
➢ The basic stages in image processing include: preprocessing, image
segmentation, feature extraction, and recognition and analysis.
▪ The captured image may have low contrast or noise, or contain unnecessary
information.
▪ The main function of preprocessing is to filter noise and increase contrast to make
images clearer and sharper.
▪ Other functions include converting color images to grayscale and extracting
regions of interest (ROI - Region of Interest).
Preprocessing
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
Preprocessing
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
ROI Extraction
One ROI is created to verify the logotype (blue) and
another is created for barcode reading (green).
A ROI is placed around each pill in the blister pack
and the pass/fail analysis is performed once per ROI.
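In code, a rectangular ROI is simply a sub-array of the image. A minimal NumPy sketch, with illustrative coordinates (not those of the blister-pack example):

```python
import numpy as np

image = np.zeros((480, 640), dtype=np.uint8)   # stand-in gray scale image

x, y, w, h = 200, 150, 120, 80                 # top-left corner and size of the ROI
roi = image[y:y + h, x:x + w]                  # NumPy indexes rows (y) first

# Any analysis (pixel counting, matching, ...) is then run on `roi` only.
print(roi.shape)                               # (80, 120)
```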
Preprocessing
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
Pixel Counting
Automotive part with crack. The crack is found using a darkfield illumination
and by counting the dark pixels inside the ROI.
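A minimal sketch of the pass/fail decision by pixel counting, assuming NumPy; the ROI, threshold, and acceptance limit are synthetic, illustrative values:

```python
import numpy as np

rng = np.random.default_rng(1)
roi = rng.integers(100, 200, size=(80, 120), dtype=np.uint8)  # stand-in bright surface
roi[40:42, 10:90] = 20                                        # simulated dark crack

threshold = 50                        # pixels darker than this count as defect pixels
dark_pixels = int(np.count_nonzero(roi < threshold))

max_allowed = 50                      # acceptance limit for this ROI
print("FAIL" if dark_pixels > max_allowed else "PASS", dark_pixels)
```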
Preprocessing
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
Digital Filters
Noisy version of original image. Image (left) after noise reduction.
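A minimal noise-reduction sketch, assuming OpenCV (cv2) and NumPy are available; the noisy image is generated synthetically:

```python
import numpy as np
import cv2

rng = np.random.default_rng(2)
image = np.full((200, 200), 128, dtype=np.uint8)            # stand-in image
noise_mask = rng.random(image.shape) < 0.05                 # corrupt 5 % of the pixels
image[noise_mask] = rng.choice([0, 255], size=int(noise_mask.sum()))

denoised = cv2.medianBlur(image, 3)                         # 3x3 median filter removes the outliers
```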
Image segmentation
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
▪ Image segmentation splits an input image into component regions for analysis
and image recognition.
▪ This is the most difficult part of image processing and is also error-prone;
segmentation errors reduce the accuracy of the whole processing chain. The result
of object identification depends very much on this stage.
Image segmentation
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
Original intensity-coded 3D image. Image after a binarization operation. Image after edge enhancement.
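A minimal sketch of the two operations named in the caption above (binarization and edge enhancement), assuming OpenCV (cv2); the input is a synthetic gray scale image rather than the 3D image shown on the slide:

```python
import numpy as np
import cv2

image = np.zeros((200, 200), dtype=np.uint8)
image[60:140, 60:140] = 200                     # bright square on a dark background

# Binarization: every pixel above the threshold becomes foreground (255).
_, binary = cv2.threshold(image, 127, 255, cv2.THRESH_BINARY)

# Edge enhancement: the Canny detector keeps only the contours of the object.
edges = cv2.Canny(image, 50, 150)
```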
Image segmentation
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
Feature Extraction
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
▪ The output of segmentation contains the pixels of each image region (the
segmented image) together with codes describing their neighborhood relations.
▪ Feature extraction (also called feature selection) converts these regions into
quantitative information describing the image properties.
Feature Extraction
R. C. Gonzalez and R. E. Woods, “Digital Image Processing,” 4th edition, Prentice Hall, 2018.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
Digital boundary with resampling
grid superimposed.
Result of resampling. 8-directional chain-coded boundary.
Feature Extraction
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
The smallest axis-parallel enclosing
rectangle of a region.
The smallest enclosing rectangle
of arbitrary orientation. The smallest enclosing circle.
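A minimal sketch of computing the three enclosing shapes above for one segmented region, assuming OpenCV 4.x (cv2); the blob is synthetic:

```python
import numpy as np
import cv2

image = np.zeros((200, 200), dtype=np.uint8)
cv2.ellipse(image, (100, 100), (60, 30), 25, 0, 360, 255, -1)   # filled elliptical blob

contours, _ = cv2.findContours(image, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
region = contours[0]

x, y, w, h = cv2.boundingRect(region)               # smallest axis-parallel rectangle
rect = cv2.minAreaRect(region)                      # smallest rectangle of arbitrary orientation
(cx, cy), radius = cv2.minEnclosingCircle(region)   # smallest enclosing circle
```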
Image recognition and analysis
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
▪ Image recognition is the process of identifying objects in images.
▪ It is usually performed by comparing against a reference (standard) sample that
has been learned (or saved) before.
▪ Interpretation is a judgment based on the meaning of the identification:
➢identification of parameters
➢identification of structure
Image recognition and analysis
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
Reference image for teaching. Matching in new image.
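A minimal sketch of this teach-and-match idea using template matching, assuming OpenCV (cv2); both images are synthetic stand-ins:

```python
import numpy as np
import cv2

template = np.zeros((40, 40), dtype=np.uint8)
cv2.circle(template, (20, 20), 12, 255, -1)        # taught reference pattern

scene = np.zeros((240, 320), dtype=np.uint8)
scene[100:140, 150:190] = template                 # the same pattern appears in the new image

scores = cv2.matchTemplate(scene, template, cv2.TM_SQDIFF)   # lower score = better match
_, _, min_loc, _ = cv2.minMaxLoc(scores)
print("best match at", min_loc)                    # (150, 100): x, y of the top-left corner
```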
Image recognition and analysis
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system
❑ 2.4. Image processing and analysis
90
Chapter 1. Introduction to machine vision
1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System
91
3. Classification
3.1. 1D vision systems
3.2. 2D vision systems
3.3. 3D vision systems
92
3. Classification
❑ 3.1. 1D vision systems
➢ 1D vision analyzes a digital signal one line at a time instead of looking at a whole
picture at once.
➢ This technique commonly detects and classifies defects on materials
manufactured in a continuous process, such as paper, metals, plastics, and other
non-woven sheet or roll goods.
COGNEX, “Introduction to Machine Vision,” 2016.
93
3. Classification
❑ 3.1. 1D vision systems
COGNEX, “Introduction to Machine Vision,” 2016.
1D vision systems scan one
line at a time while the
process moves. In the
above example, a defect in
the sheet is detected.
94
3. Classification
❑ 3.2. 2D vision systems
➢ Most common inspection cameras perform area scans that involve capturing 2D
snapshots in various resolutions.
COGNEX, “Introduction to Machine Vision,” 2016.
2D vision systems can
produce images with
different resolutions
95
3. Classification
❑ 3.2. 2D vision systems
➢ Another type of 2D machine vision, line scan, builds a 2D image line by line.
COGNEX, “Introduction to Machine Vision,” 2016.
Line scan techniques build the 2D image one line at a time.
96
3. Classification
❑ 3.3. 3D vision systems
➢ 3D machine vision systems typically comprise multiple cameras or one or more
laser displacement sensors.
➢ Multi-camera 3D vision in robotic guidance applications provides the robot with
part orientation information.
➢ These systems involve multiple cameras mounted at different locations and use
triangulation to determine the object's position in 3D space.
COGNEX, “Introduction to Machine Vision,” 2016.
97
3. Classification
❑ 3.3. 3D vision systems
COGNEX, “Introduction to Machine Vision,” 2016.
3D vision systems typically employ
multiple cameras
3D inspection system using
a single camera
98
3. Classification
❑ 3.3. 3D vision systems
➢ 3D laser-displacement sensor applications typically include surface inspection
and volume measurement, producing 3D results with as few as a single camera.
➢ A height map is generated from the displacement of the reflected lasers’ location
on an object.
➢ The object or camera must be moved to scan the entire product similar to line
scanning.
COGNEX, “Introduction to Machine Vision,” 2016.
99
Chapter 1. Introduction to machine vision
1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System
100
4. Technical Specifications
4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment
101
4. Technical Specifications
❑ 4.1. Parts
➢ Discrete parts or endless material (e.g., paper or woven goods); minimum and
maximum dimensions
➢ Changes in shape
➢ Description of the features that have to be extracted
➢ Changes of these features concerning error parts and common product variation
➢ Surface finish
➢ Color
➢ Corrosion, oil films, or adhesives
➢ Changes due to part handling, e.g., labels, fingerprints
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
102
4. Technical Specifications
4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment
103
4. Technical Specifications
❑ 4.2. Parts Presentation
➢ Regarding part motion, the following options are possible:
▪ indexed positioning
▪ continuous movement
➢ If there is more than one part in view, the following topics are important:
▪ number of parts in view
▪ overlapping parts
▪ touching parts
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
104
4. Technical Specifications
4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment
105
4. Technical Specifications
❑ 4.3. Performance Requirements
➢ The performance requirements can be seen in the aspects of:
▪ accuracy and
▪ time performance
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
106
4. Technical Specifications
❑ 4.3. Performance Requirements
➢ Time performance:
▪ cycle time
▪ start of acquisition
▪ maximum processing time
▪ number of production cycles from inspection to result use (for result
buffering)
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
107
4. Technical Specifications
4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment
108
4. Technical Specifications
❑ 4.4. Information Interfaces
➢ User interface for handling and visualizing results
➢ Declaration of the current part type
➢ Start of the inspection
➢ Setting results
➢ Storage of results or inspection data in log files or databases
➢ Generation of inspection protocols for storage or printout
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
109
4. Technical Specifications
4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment
110
4. Technical Specifications
❑ 4.5. Installation Space
➢ The possibility of aligning the illumination and the camera
➢ Is a clear view into the inspection scene possible?
➢ What variations are possible for minimum and maximum distances between the
part and the camera?
➢ The distance between the camera and the processing unit
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
111
4. Technical Specifications
4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment
112
4. Technical Specifications
❑ 4.6. Environment
➢ Ambient light
➢ Dirt or dust that the equipment needs to be protected from
➢ Shock or vibration that affects parts of the equipment
➢ Heat or cold
➢ Necessity of a certain protection class
➢ Availability of power supply
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
113
Chapter 1. Introduction to machine vision
1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System
114
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
115
5. Designing a Machine Vision System
❑ 5.1. Camera Type
➢ Line scan camera
➢ Area scan camera
➢ 3D camera
Directions for a line scan camera.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
116
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
117
5. Designing a Machine Vision System
❑ 5.2. Field of View
The field of view is determined by the following factors:
➢ maximum part size
➢ maximum variation of part presentation in
translation and orientation
➢ margin as an offset to part size
➢ aspect ratio of the camera sensor
Field of view.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
118
5. Designing a Machine Vision System
❑ 5.2. Field of View
Example:
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
Horizontal Vertical
maximum part size 10 mm 6 mm
tolerance in positioning 1 mm
margin 2 mm
aspect ratio 4:3
FOV_hor_cal = 10 mm + 1 mm + 2 mm = 13 mm → FOV_ver_est = (3/4) × FOV_hor_cal = 9.75 mm
FOV_ver_cal = 6 mm + 1 mm + 2 mm = 9 mm < FOV_ver_est
→ FOV_hor = FOV_hor_cal and FOV_ver = FOV_ver_est → FOV = 13 mm × 9.75 mm
119
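The same field-of-view calculation as a small Python sketch (numbers taken from the example above; the helper function is only for illustration):

```python
def fov_side(max_part_mm, positioning_tolerance_mm, margin_mm):
    return max_part_mm + positioning_tolerance_mm + margin_mm

fov_hor = fov_side(10.0, 1.0, 2.0)           # 13 mm
fov_ver_est = 3.0 / 4.0 * fov_hor            # 9.75 mm, forced by the 4:3 sensor aspect ratio
fov_ver_cal = fov_side(6.0, 1.0, 2.0)        # 9 mm, smaller than the estimate

fov_ver = max(fov_ver_cal, fov_ver_est)      # keep the larger value
print(f"FOV = {fov_hor} mm x {fov_ver} mm")  # FOV = 13.0 mm x 9.75 mm
```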
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
120
5. Designing a Machine Vision System
❑ 5.3. Resolution
➢ camera sensor resolution
➢ spatial resolution
➢ measurement accuracy
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
121
5. Designing a Machine Vision System
❑ 5.3. Resolution
➢ Calculation of Resolution
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
122
5. Designing a Machine Vision System
❑ 5.3. Resolution
➢ Example: Measure the dimensions of the 10 mm × 6 mm object above with an
accuracy of 0.01 mm.
▪ Using edge detection for dimension measurement → Nf = 1/3 pixel. But because
there is a tolerance in positioning, the number of pixels for the smallest feature is
set to 1 pixel (Nf = 1 pixel).
▪ Size of the smallest feature Sf = 0.01 mm
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
123
5. Designing a Machine Vision System
❑ 5.3. Resolution
▪ Camera resolution:
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
R_C_hor = FOV_hor × N_f / S_f = 13 mm × 1 pixel / 0.01 mm = 1300 pixels
R_C_ver = FOV_ver × N_f / S_f = 9.75 mm × 1 pixel / 0.01 mm = 975 pixels
124
5. Designing a Machine Vision System
❑ 5.3. Resolution
▪ Object resolution (spatial resolution): assuming a lens giving a field of view of
14 mm (> 13 mm) and a camera with a resolution of 1440x1080 pixels were chosen:
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
R_s = FOV / R_C = 14 mm / 1440 pixels ≈ 0.01 mm/pixel
125
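The resolution calculation of this example as a Python sketch (numbers from the example above; the function is illustrative only):

```python
def required_camera_resolution(fov_mm, n_f_pixels, s_f_mm):
    """Sensor pixels needed so that the smallest feature S_f covers N_f pixels."""
    return fov_mm * n_f_pixels / s_f_mm

rc_hor = required_camera_resolution(13.0, 1, 0.01)    # 1300 pixels
rc_ver = required_camera_resolution(9.75, 1, 0.01)    # 975 pixels

# Chosen camera: 1440x1080 pixels behind a lens giving a 14 mm horizontal FOV.
spatial_resolution = 14.0 / 1440                      # ~0.0097 mm/pixel, about 0.01 mm/pixel
print(rc_hor, rc_ver, round(spatial_resolution, 4))
```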
5. Designing a Machine Vision System
❑ 5.3. Resolution
➢ Calculation of Resolution: Resolution for a Line Scan Camera
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
126
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
127
5. Designing a Machine Vision System
❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform
➢ Camera Model
▪ color sensor
▪ interface technology
▪ progressive scan for area cameras
▪ packaging size
▪ price and availability
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
128
5. Designing a Machine Vision System
❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform
➢ Frame Grabber
▪ compatibility with the pixel rate
▪ compatibility with the software library
▪ number of cameras that can be addressed
▪ utilities to control the camera via the frame grabber
▪ timing and triggering of the camera
▪ availability of on-board processing
▪ availability of general purpose I/O
▪ price and availability
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
129
5. Designing a Machine Vision System
❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform
➢ Pixel Rate
▪ This is the speed of imaging in terms of pixels per second.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
130
5. Designing a Machine Vision System
❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform
➢ Pixel Rate
▪ For an area camera:
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
An overhead of 10% to 20% should be considered
due to additional bus transfer.
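A minimal sketch of the pixel-rate estimate for an area camera, including the 10-20 % transfer overhead mentioned above; the resolution and frame rate are assumed example values:

```python
width, height = 1440, 1080           # sensor resolution in pixels (assumed example)
frame_rate = 30                      # frames per second (assumed example)

pixel_rate = width * height * frame_rate   # pixels per second delivered by the camera
with_overhead = pixel_rate * 1.2           # add 20 % for bus-transfer overhead

print(f"{pixel_rate / 1e6:.1f} Mpixel/s, {with_overhead / 1e6:.1f} Mpixel/s with overhead")
```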
131
5. Designing a Machine Vision System
❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform
➢ Pixel Rate
▪ For a line scan camera:
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
132
5. Designing a Machine Vision System
❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform
➢ Hardware Platform
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
• Compatibility with frame grabber
• Operating system
• Development process
• means for a user-friendly human machine
interface
• Processing load
• Miscellaneous points: available interfaces,
memory, packaging size, price, and availability
▪ smart cameras
▪ compact vision systems
▪ PC-based systems
133
5. Designing a Machine Vision System
❑ Example about a camera
https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
134
5. Designing a Machine Vision System
❑ Example about a camera
https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
135
5. Designing a Machine Vision System
❑ Example about a camera
https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
136
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
137
5. Designing a Machine Vision System
❑ 5.5. Lens Design
➢ Focal Length
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
standoff distance
focal length
Magnification
lens extension
focus distance
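A rough thin-lens sketch relating the quantities listed above (standoff distance, magnification, focal length, lens extension, focus distance); the sensor width and standoff distance are assumed values, and a real design should use the lens manufacturer's data:

```python
sensor_width = 7.2        # mm, assumed sensor width
fov_width = 14.0          # mm, field of view from the earlier example
standoff = 100.0          # mm, assumed lens-to-object (standoff) distance

m = sensor_width / fov_width          # magnification (image size / object size)
f = m * standoff / (1.0 + m)          # thin-lens focal length: 1/f = 1/a + 1/b with b = m*a
focus_distance = m * standoff         # lens-to-sensor distance b
lens_extension = focus_distance - f   # extension beyond the focal length (equals m*f)

print(f"focal length ~ {f:.1f} mm, lens extension ~ {lens_extension:.1f} mm")
```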
138
5. Designing a Machine Vision System
❑ 5.5. Lens Design
➢ Focal Length
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
139
5. Designing a Machine Vision System
❑ 5.5. Lens Design
➢ Lens Flange Focal Distance
▪ This is the distance between the lens mount face and the image plane.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
140
5. Designing a Machine Vision System
❑ 5.5. Lens Design
➢ Extension Tubes
▪ The lens extension l can be increased using the focus adjustment of the lens.
▪ If the distance cannot be increased further, extension tubes can be used to focus on
close objects. As a result, the depth of field is decreased.
▪ For higher magnifications, such as from 0.4 to 4, macro lenses offer better image
quality.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
141
5. Designing a Machine Vision System
❑ 5.5. Lens Design
➢ Lens Diameter and Sensor Size
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
Areas illuminated by the lens and camera;
the left side displays an appropriate choice.
142
5. Designing a Machine Vision System
❑ 5.5. Lens Design
➢ Sensor Resolution and Lens Quality
▪ For high-resolution cameras, the requirements on the lens are higher than
those for standard cameras.
▪ Using a low-budget lens might lead to poor image quality for high resolution
sensors, whereas the quality is acceptable for lower resolutions.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
143
5. Designing a Machine Vision System
❑ Example about lens
https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
144
5. Designing a Machine Vision System
❑ Example about lens
https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c23-1620-5m-p-f16mm/
145
5. Designing a Machine Vision System
❑ Example about lens
https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c23-1620-5m-p-f16mm/
146
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
147
5. Designing a Machine Vision System
❑ 5.6. Choice of Illumination
➢ Concept: Maximize Contrast
▪ Direction of light: diffuse from all directions or directed from a range of angles
▪ Light spectrum
▪ Polarization: effect on surfaces, such as metal or glass
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
148
5. Designing a Machine Vision System
❑ 5.6. Choice of Illumination
➢ Illumination Setups
▪ Backlight and
▪ Frontlight
• Diffused light
• Directed light
• Confocal frontlight
• Bright field
• Dark field
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
149
5. Designing a Machine Vision System
❑ 5.6. Choice of Illumination
➢ Light Sources
▪ Fluorescent tubes
▪ Halogen and xenon lamps
▪ LED
▪ Laser
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
150
5. Designing a Machine Vision System
❑ 5.6. Choice of Illumination
➢ Approach to the Optimum Setup
▪ A confirmation of the setup based on experiments with sample parts is
mandatory.
▪ The alignment of light, the part and the camera needs to be documented.
▪ To decide between similar setups, images have to be captured and compared
for maximum contrast.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
151
5. Designing a Machine Vision System
❑ 5.6. Choice of Illumination
➢ Interfering Lighting
▪ The influences of different lamps on the images have to be checked.
▪ To avoid interference, a spatial separation can be achieved by using different
camera stations.
▪ The part is imaged with different sets of cameras and illuminations.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
152
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
153
5. Designing a Machine Vision System
❑ 5.7. Mechanical Design
➢ Once the cameras, lenses, standoff distances, and
illumination devices are determined, the mechanical
conditions can be defined.
➢ For the mounting of cameras and lights, adjustability
is important for installation, operation, and
maintenance.
➢ The devices have to be protected against vibration
or shock.
➢ The position of cameras and lights should be
easy to change.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
154
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
155
5. Designing a Machine Vision System
❑ 5.8. Electrical Design
➢ The power supply
➢ The housing of cameras and illumination
➢ The length of cables as well as their laying
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
156
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
157
5. Designing a Machine Vision System
❑ 5.9. Software
➢ selection of a software library
➢ design and implementation of the application-specific software
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
158
5. Designing a Machine Vision System
❑ 5.9. Software
➢ Software Library
➢ Software Structure
▪ Image acquisition
▪ Preprocessing
▪ Feature localization
▪ Feature extraction
▪ Feature interpretation
▪ Generation of results
▪ Handling interfaces
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
159
5. Designing a Machine Vision System
❑ 5.9. Software
➢ General Topics
▪ Visualization of live images for all cameras
▪ Possibility of image saving
▪ Maintenance mode
▪ Log files for the system state
▪ Detailed visualization of the image processing
▪ Crucial processing parameters
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
160
5. Designing a Machine Vision System
5.1. Camera Type
5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
161
5. Designing a Machine Vision System
❑ 5.10. Costs
➢ The development costs
▪ project management
▪ base design
▪ hardware components
▪ software licenses
▪ software development
▪ installation
▪ test runs, feasibility tests, and acceptance test
▪ training
▪ documentation
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
162
5. Designing a Machine Vision System
❑ 5.10. Costs
➢ The operating costs
▪ maintenance, such as cleaning of the optical equipment
▪ change of equipment, such as lamps
▪ utility, for instance electrical power or compressed air if needed
▪ costs for system modification due to product changes
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
Quiz 1
Quiz Number: 1 | Quiz Type: Select
Question: Choose the lighting for measuring the radii R and r of the following object:
A. Dome Light  B. On-Axis Light
C. Darkfield  D. Backlight
Answer:
Feedback:
(Figure: object with radii R and r)
Quiz 2
Quiz Number: 2 | Quiz Type: Select
Question: Assume that the object size = 10 cm x 20 cm and the sensor resolution =
640x480 pixels. Calculate the object resolution.
A. 0.31 mm/pixel  B. 0.21 mm/pixel  C. 0.17 mm/pixel  D. 0.42 mm/pixel
Answer:
Feedback:
Quiz 3
Quiz Number: 3 | Quiz Type: Select
Question: Splitting an input image into component areas is called:
A. Image preprocessing  B. Image segmentation
C. Image recognition  D. Image representation
Answer:
Feedback:
Quiz 4
Quiz Number: 4 | Quiz Type: Select
Question: The performance requirements of a machine vision system are:
A. Accuracy  B. Time performance
C. Both A and B  D. None of the above
Answer:
Feedback:
Quiz 5
Quiz Number: 5 | Quiz Type: Select
Question: Diameter inspection of rivets:
+ The nominal size of the rivets lies in a range of 3 mm to 4 mm
+ The required accuracy is 0.1 mm
+ The tolerance of part positioning is less than ±1 mm across the
optical axis and ±0.1 mm in the direction of the optical axis. The
belt stops for 1.5 s.
+ The maximum processing time is 2 s; the cycle time is 2.5 s.
+ The maximum space for installing equipment is 500 mm.
(Figure: Bearing with rivet and disk)
Chapter 1. Introduction to machine vision.pdf
  • 19. 19 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Illumination Principles Light can be described as waves with three properties: ➢Wavelength or color, measured in nm (nanometers) ➢Intensity ➢Polarization. SICK IVP, “Machine Vision Introduction,” 2006.
  • 20. 20 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Illumination Principles ➢ The spectral response of a sensor is the sensitivity curve for different wavelengths. Spectral response of a gray scale CCD sensor. Maximum sensitivity is for green (500 nm). SICK IVP, “Machine Vision Introduction,” 2006.
  • 21. 21 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Illumination Principles ➢ The optical axis is an imaginary line through the center of the lens, i.e. the direction the camera is looking. SICK IVP, “Machine Vision Introduction,” 2006.
  • 22. 22 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Illumination Principles SICK IVP, “Machine Vision Introduction,” 2006.
  • 23. 23 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Ring Light ➢ A ring light is mounted around the optical axis of the lens, either on the camera or somewhere in between the camera and the object. SICK IVP, “Machine Vision Introduction,” 2006.
  • 24. 24 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Ring Light SICK IVP, “Machine Vision Introduction,” 2006. Pros • Easy to use • High intensity and short exposure time possible Ambient light. Cons • Direct reflections, called hot spots, on reflective surfaces Ring light. The printed matte surface is evenly illuminated. Hot spots appear on shiny surfaces (center), one for each of the 12 LEDs of the ring light.
  • 25. 25 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Spot Light ➢ A spot light has all the light emanating from one direction that is different from the optical axis. For flat objects, only diffuse reflections reach the camera. SICK IVP, “Machine Vision Introduction,” 2006. Mainly diffuse reflections reach the camera Object Spot light
  • 26. 26 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Spot Light SICK IVP, “Machine Vision Introduction,” 2006. Pros • No hot spots Cons • Uneven illumination • Requires intense light since it is dependent on diffuse reflections
  • 27. 27 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Backlight ➢ The backlight principle has the object being illuminated from behind to produce a contour or silhouette. SICK IVP, “Machine Vision Introduction,” 2006.
  • 28. 28 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Backlight SICK IVP, “Machine Vision Introduction,” 2006. Pros • Very good contrast • Robust to texture, color, and ambient light Cons • Dimension must be larger than object Ambient light. Backlight: Enhances contours by creating a silhouette
  • 29. 29 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Darkfield ➢ Darkfield means that the object is illuminated at a large angle of incidence. SICK IVP, “Machine Vision Introduction,” 2006.
  • 30. 30 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Darkfield SICK IVP, “Machine Vision Introduction,” 2006. Pros • Good enhancement of scratches, protruding edges, and dirt on surfaces Cons • Mainly works on flat surfaces with small features • Requires small distance to object • The object needs to be somewhat reflective Ambient light. Darkfield: Enhances relief contours, i.e., lights up edges
  • 31. 31 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> On-Axis Light ➢ When an object needs to be illuminated parallel to the optical axis, a semi- transparent mirror is used to create an on- axial light source. SICK IVP, “Machine Vision Introduction,” 2006.
  • 32. 32 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> On-Axis Light SICK IVP, “Machine Vision Introduction,” 2006. Pros • Very even illumination, no hot spots • High contrast on materials with different reflectivity Cons • Low intensity requires long exposure times • Cleaning of the semi-transparent mirror (beam-splitter) is often needed Inside of a can as seen with ambient light. Inside of the same can as seen with a coaxial (on-axis) light.
  • 33. 33 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Dome Light ➢ The dome light produces the needed uniform light intensity inside of the dome walls. SICK IVP, “Machine Vision Introduction,” 2006.
  • 34. 34 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Dome Light SICK IVP, “Machine Vision Introduction,” 2006. Pros • Works well on highly reflective materials • Uniform illumination, except for the darker middle of the image. No hot spots Cons • Low intensity requires long exposure times • Dimensions must be larger than object • Dark area in the middle of the image
  • 35. 35 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Dome Light SICK IVP, “Machine Vision Introduction,” 2006. Ambient light. On top of the key numbers is a curved, transparent material causing direct reflections. The direct reflections are eliminated by the dome light’s even illumination.
  • 36. 36 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Laser Light ➢ A 2D camera with a laser line can provide a cost efficient solution for low-contrast and 3D inspections. SICK IVP, “Machine Vision Introduction,” 2006. Pros • Robust against ambient light • Allows height measurements (z parallel to the optical axis). • Low-cost 3D for simpler applications Cons • Laser safety issues • Data along y is lost in favor of z (height) data • Lower accuracy than 3D cameras
  • 37. 37 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Types >> Laser Light SICK IVP, “Machine Vision Introduction,” 2006. Ambient light. Contact lens containers: the left is facing up (5 mm high at the cross) and the right is facing down (1 mm high at the minus sign). The laser line clearly shows the height difference.
  • 38. 38 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Variants and Accessories >> Strobe or Constant light ➢ A strobe light is a flashing light. ➢ Strobing (“turbo charging”) allows the LED to emit a higher light intensity than what is achieved with a constant light. SICK IVP, “Machine Vision Introduction,” 2006.
  • 39. 39 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Variants and Accessories >> Diffusor Plate ➢ The diffusor plate converts direct light into diffuse. ➢ The purpose of a diffusor plate is to avoid bright spots in the image, caused by the direct light's reflections in glossy surfaces. SICK IVP, “Machine Vision Introduction,” 2006. Two identical white bar lights, with diffusor plate (top) and without (bottom).
  • 40. 40 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Variants and Accessories >> LED Color ➢ LED lightings come in several colors. Most common are red and green. There are also LEDs in blue, white, UV, and IR. ➢ Different objects reflect different colors. A blue object appears blue because it reflects the color blue. ➢ Therefore, if blue light is used to illuminate a blue object, it will appear bright in a gray scale image. SICK IVP, “Machine Vision Introduction,” 2006.
  • 41. 41 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Variants and Accessories >> LED Color SICK IVP, “Machine Vision Introduction,” 2006.
  • 42. 42 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Variants and Accessories >> Optical Filters ➢ An optical filter is a layer in front of the sensor or lens that absorbs certain wavelengths (colors) or polarizations. ➢ Two main optical filter types are used for machine vision: SICK IVP, “Machine Vision Introduction,” 2006. 1. Polarization filter: Only transmits light with a certain polarization. Light changes its polarization when it is reflected, which allows us to filter out unwanted reflections. 2. Band-pass filter: Only transmits light of a certain color, i.e. within a certain wavelength interval; for example, a red filter only lets red light through.
  • 43. 43 2. Basic elements of machine vision system ❑ 2.2. Illumination ❖Lighting Variants and Accessories >> Optical Filters SICK IVP, “Machine Vision Introduction,” 2006. Original image Image seen by gray scale camera with ambient light and without filter Red light and a red band-pass filter Green light and a green band-pass filter
  • 44. 44 2. Basic elements of machine vision system 2.1. Overview 2.2. Illumination 2.3. Imaging 2.4. Image processing and analysis
  • 45. 45 2. Basic elements of machine vision system ❑ 2.3. Imaging ➢ The term imaging defines the act of creating an image. ➢ Imaging has several technical names: acquiring, capturing, or grabbing. ➢ Grabbing a high-quality image is the number one goal for a successful vision application. SICK IVP, “Machine Vision Introduction,” 2006.
  • 46. 46 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts ➢ A simplified camera setup consists of camera, lens, lighting, and object. SICK IVP, “Machine Vision Introduction,” 2006.
  • 47. 47 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Digital Imaging ➢ A sensor chip is used to grab a digital image. ➢ On the sensor there is an array of light-sensitive pixels. SICK IVP, “Machine Vision Introduction,” 2006. Sensor chip with an array of light-sensitive pixels.
  • 48. 48 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Digital Imaging There are two technologies used for digital image sensors: ➢ CCD (Charge-Coupled Device) ➢ CMOS (Complementary Metal Oxide Semiconductor). SICK IVP, “Machine Vision Introduction,” 2006.
  • 49. 49 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Digital Imaging http://www.f4news.com/2016/05/09/ccd-vs-cmos-infographic/
  • 50. 50 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Lenses and Focal Length ➢ The lens (Objective) focuses the light that enters the camera in a way that creates a sharp image. SICK IVP, “Machine Vision Introduction,” 2006. Focused or sharp image. Unfocused or blurred image.
  • 51. 51 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Lenses and Focal Length ➢ The angle of view determines how much of the visual scene the camera sees. SICK IVP, “Machine Vision Introduction,” 2006.
  • 52. 52 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Lenses and Focal Length ➢ The focal length is the distance between the lens and the focal point. ➢ When the focal point is on the sensor, the image is in focus. SICK IVP, “Machine Vision Introduction,” 2006.
  • 53. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Lenses and Focal Length ▪ Focal length is related to the angle of view in that a long focal length corresponds to a small angle of view, and vice versa. SICK IVP, “Machine Vision Introduction,” 2006.
  • 54. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Field of View in 2D ▪ The FOV (Field of View) in 2D systems is the full area that the camera sees. The FOV is specified by its width and height. ▪ The object distance is the distance between the lens and the object. SICK IVP, “Machine Vision Introduction,” 2006.
  • 55. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Aperture and F-stop ▪ The aperture is the opening in the lens that controls the amount of light that is let onto the sensor. In quality lenses, the aperture is adjustable. SICK IVP, “Machine Vision Introduction,” 2006.
  • 56. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Aperture and F-stop ▪ The size of the aperture is measured by its F-stop value. A large F-stop value means a small aperture opening, and vice versa. ▪ For standard CCTV lenses, the F-stop value is adjustable in the range between F1.4 and F16. SICK IVP, “Machine Vision Introduction,” 2006.
  • 57. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Depth of Field ▪ The minimum object distance (sometimes abbreviated MOD) is the closest distance at which the camera lens can focus, and the maximum object distance is the farthest distance. SICK IVP, “Machine Vision Introduction,” 2006.
  • 58. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Depth of Field ▪ The focal plane is found at the distance where the focus is as sharp as possible. ▪ Objects closer or farther away than the focal plane can also be considered to be in focus. This distance interval where good-enough focus is obtained is called depth of field (DOF). SICK IVP, “Machine Vision Introduction,” 2006.
  • 59. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Depth of Field SICK IVP, “Machine Vision Introduction,” 2006.
  • 60. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Depth of Field ▪ The depth of field depends on both the focal length and the aperture adjustment. SICK IVP, “Machine Vision Introduction,” 2006.
  • 61. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Depth of Field ▪ By adding a distance ring between the camera and the lens, the focal plane (and thus the MOD) can be moved closer to the camera. A distance ring is also referred to as a shim, spacer, or extension ring. SICK IVP, “Machine Vision Introduction,” 2006.
  • 62. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Camera Concepts: Depth of Field ▪ A side effect of using a distance ring is that a maximum object distance is introduced and that the depth of field range decreases. SICK IVP, “Machine Vision Introduction,” 2006.
  • 63. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Pixels and Resolution ▪ A pixel is the smallest element in a digital image. Normally, the pixel in the image corresponds directly to the physical pixel on the sensor. ▪ To the right is an example of a very small image with dimensions of 8x8 pixels. The dimensions are called x and y, where x corresponds to the image columns and y to the rows. SICK IVP, “Machine Vision Introduction,” 2006.
  • 64. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Pixels and Resolution ▪ Typical values of sensor resolution in 2D machine vision are: ➢ VGA (Video Graphics Array): 640x480 pixels ➢ XGA (Extended Graphics Array): 1024x768 pixels ➢ SXGA (Super Extended Graphics Array): 1280x1024 pixels. SICK IVP, “Machine Vision Introduction,” 2006.
  • 65. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Pixels and Resolution ▪ The object resolution is the physical dimension on the object that corresponds to one pixel on the sensor. Common units for object resolution are μm (microns) per pixel and mm per pixel. ▪ Example: Object Resolution Calculation: FOV width = 50 mm, Sensor resolution = 640x480 pixels; the calculation of the object resolution in x is shown in the sketch below. SICK IVP, “Machine Vision Introduction,” 2006.
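The numeric result of this example was lost in the slide export. A minimal sketch of the calculation, using the object-resolution relation stated later in the deck (object resolution = FOV / sensor resolution); the variable names are mine:

```python
# Object resolution: physical size on the object covered by one pixel.
fov_width_mm = 50.0      # FOV width from the example
sensor_pixels_x = 640    # VGA sensor, columns (x)

object_resolution_x = fov_width_mm / sensor_pixels_x
print(f"Object resolution in x: {object_resolution_x:.3f} mm/pixel")
# -> 0.078 mm/pixel
```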
  • 66. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Intensity ▪ The brightness of a pixel is called intensity. The intensity information is stored for each pixel in the image and can be of different types. Examples: ➢ Binary: One bit per pixel. ➢ Gray scale: Typically one byte per pixel. SICK IVP, “Machine Vision Introduction,” 2006.
  • 67. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Intensity ➢ Color: Typically one byte per pixel and color. Three bytes are needed to obtain full color information. One pixel thus contains three components (R, G, B). SICK IVP, “Machine Vision Introduction,” 2006.
  • 68. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Intensity ▪ When the intensity of a pixel is digitized and described by a byte, the information is quantized into discrete levels. The number of bits used per pixel is called the bit depth. SICK IVP, “Machine Vision Introduction,” 2006.
  • 69. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Exposure ▪ Exposure is how much light is detected by the photographic film or sensor. The exposure amount is determined by two factors: ➢ Exposure time: Duration of the exposure, measured in milliseconds (ms). Also called shutter time, from traditional photography. ➢ Aperture size: Controls the amount of light that passes through the lens. SICK IVP, “Machine Vision Introduction,” 2006.
  • 70. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Exposure ▪ If the exposure time is too short for the sensor to capture enough light, the image is said to be underexposed. If there is too much light and the sensor is saturated, the image is said to be overexposed. SICK IVP, “Machine Vision Introduction,” 2006.
  • 71. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Gain ▪ Gain amplifies the intensity values after the sensor has already been exposed, very much like the volume control of a radio (which doesn’t actually make the artist sing louder). The tradeoff of compensating for insufficient exposure with a high gain is amplified noise. SICK IVP, “Machine Vision Introduction,” 2006.
  • 72. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Contrast and Histogram ▪ Contrast is the relative difference between bright and dark areas in an image. Contrast is necessary to see anything at all in an image. SICK IVP, “Machine Vision Introduction,” 2006.
  • 73. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Contrast and Histogram ▪ A histogram is a diagram where the pixels are sorted in order of increasing intensity values. SICK IVP, “Machine Vision Introduction,” 2006.
  • 74. 2. Basic elements of machine vision system ❑ 2.3. Imaging ❖Basic Image Concepts: Contrast and Histogram SICK IVP, “Machine Vision Introduction,” 2006.
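To make the histogram and contrast ideas concrete, here is a small sketch using NumPy (the library choice and the synthetic test image are mine, not part of the slides): it builds a 256-bin intensity histogram of an 8-bit gray scale image and a simple min/max contrast measure.

```python
import numpy as np

def intensity_histogram(gray):
    """256-bin histogram of an 8-bit gray scale image (2D uint8 array)."""
    return np.bincount(gray.ravel(), minlength=256)

def contrast(gray):
    """Simple contrast measure: spread between the darkest and brightest pixels."""
    return int(gray.max()) - int(gray.min())

# Synthetic random image stands in for a camera frame.
gray = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
hist = intensity_histogram(gray)
print("Most frequent intensity:", int(hist.argmax()), "Contrast:", contrast(gray))
```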
  • 75. 75 2. Basic elements of machine vision system 2.1. Overview 2.2. Illumination 2.3. Imaging 2.4. Image processing and analysis
  • 76. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ➢ The basic stages in image processing include: preprocessing, image segmentation, feature extraction, and recognition and analysis.
  • 77. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Preprocessing ▪ The captured image may have low contrast, noise, or unnecessary information. ▪ The main functions of preprocessing are to filter out noise and to increase contrast, making the image clearer and sharper. ▪ Other functions include converting color images to grayscale and extracting regions of interest (ROI - Region of Interest). SICK IVP, “Machine Vision Introduction,” 2006.
  • 78. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Preprocessing >> ROI Extraction: One ROI is created to verify the logotype (blue) and another is created for barcode reading (green). A ROI is placed around each pill in the blister pack and the pass/fail analysis is performed once per ROI. SICK IVP, “Machine Vision Introduction,” 2006.
  • 79. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Preprocessing >> Pixel Counting: Automotive part with a crack. The crack is found using darkfield illumination and by counting the dark pixels inside the ROI. SICK IVP, “Machine Vision Introduction,” 2006.
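A hedged sketch of the pixel-counting idea just described: count the dark pixels inside a rectangular ROI and flag the part if the count exceeds a limit. The ROI coordinates, intensity threshold, and pass/fail limit are illustrative assumptions, not values from the slides.

```python
import numpy as np

def count_dark_pixels(gray, roi, threshold=50):
    """Count pixels darker than `threshold` inside roi = (x, y, width, height)."""
    x, y, w, h = roi
    patch = gray[y:y + h, x:x + w]
    return int(np.count_nonzero(patch < threshold))

# Illustrative use: fail the part if too many dark (crack) pixels are found.
gray = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)  # stand-in image
dark = count_dark_pixels(gray, roi=(200, 150, 100, 80), threshold=50)
print("FAIL" if dark > 20 else "PASS", "- dark pixels in ROI:", dark)
```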
  • 80. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Preprocessing >> Digital Filters: Noisy version of original image. Image (left) after noise reduction. SICK IVP, “Machine Vision Introduction,” 2006.
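As an illustration of digital filtering for noise reduction, a short sketch with OpenCV (a generic example library chosen here; the slides do not state which filter or toolkit is used):

```python
import numpy as np
import cv2  # OpenCV, used only as an example library

# Stand-in noisy gray scale image.
noisy = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)

median = cv2.medianBlur(noisy, 3)               # effective against salt-and-pepper noise
gaussian = cv2.GaussianBlur(noisy, (5, 5), 0)   # smooths Gaussian-like sensor noise

# Both filtered images show a lower intensity spread than the noisy input.
print(noisy.std(), median.std(), gaussian.std())
```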
  • 81. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Image segmentation ▪ Image segmentation splits an input image into component areas for analysis and image recognition. ▪ This is the most difficult part of image processing and is also error-prone: errors here reduce the accuracy of the whole processing chain, and the result of object identification depends very much on this stage. SICK IVP, “Machine Vision Introduction,” 2006.
  • 82. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Image segmentation: Original intensity-coded 3D image. Image after a binarization operation. Image after edge enhancement. SICK IVP, “Machine Vision Introduction,” 2006.
  • 83. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Image segmentation SICK IVP, “Machine Vision Introduction,” 2006.
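A minimal sketch of binarization as a segmentation step, assuming a gray scale input. Otsu's method is used here only as one common way to pick the threshold automatically; the slides do not prescribe it, and the input file name is hypothetical.

```python
import numpy as np
import cv2

gray = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file
if gray is None:                                     # fall back to synthetic data
    gray = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)

# Binarization: pixels above the threshold become foreground (255), the rest background (0).
thresh_value, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print("Chosen threshold:", thresh_value, "Foreground pixels:", int(np.count_nonzero(binary)))
```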
  • 84. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Feature Extraction ▪ The output of segmentation contains the pixels of the image area (the segmented image) plus the code associated with the neighborhood. ▪ Selecting the features used to describe the image (feature selection) means expressing separable image properties in the form of quantitative information. SICK IVP, “Machine Vision Introduction,” 2006.
  • 85. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Feature Extraction: Digital boundary with resampling grid superimposed. Result of resampling. 8-directional chain-coded boundary. R. C. Gonzalez and R. E. Woods, “Digital Image Processing,” 4th edition, Prentice Hall, 2018.
  • 86. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Feature Extraction: The smallest axis-parallel enclosing rectangle of a region. The smallest enclosing rectangle of arbitrary orientation. The smallest enclosing circle. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 87. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Image recognition and analysis ▪ Image recognition is the process of identifying the content of an image. ▪ This is usually done by comparing against a standard sample that has been learned (or saved) beforehand. ▪ Interpretation is a judgment based on the meaning of the identification: ➢identification of parameters ➢identification of structure. SICK IVP, “Machine Vision Introduction,” 2006.
  • 88. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Image recognition and analysis: Reference image for teaching. Matching in new image. SICK IVP, “Machine Vision Introduction,” 2006.
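The teach-and-match idea above can be sketched with normalized cross-correlation template matching, one standard technique; the slides do not state which matching method the SICK tools actually use, and the scene and template here are synthetic stand-ins.

```python
import numpy as np
import cv2

image = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)   # stand-in scene
template = image[100:140, 200:260].copy()                            # "taught" reference patch

# Normalized cross-correlation: a score close to 1.0 means a strong match.
scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)
print("Best match at", best_loc, "score", round(float(best_score), 3))
```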
  • 89. 2. Basic elements of machine vision system ❑ 2.4. Image processing and analysis ❖Image recognition and analysis SICK IVP, “Machine Vision Introduction,” 2006.
  • 90. 90 Chapter 1. Introduction to machine vision 1. Introduction 2. Basic elements of machine vision system 3. Classification 4. Technical specifications 5. Designing a Machine Vision System
  • 91. 91 3. Classification 3.1. 1D vision systems 3.2. 2D vision systems 3.3. 3D vision systems
  • 92. 92 3. Classification ❑ 3.1. 1D vision systems ➢ 1D vision analyzes a digital signal one line at a time instead of looking at a whole picture at once. ➢ This technique commonly detects and classifies defects on materials manufactured in a continuous process, such as paper, metals, plastics, and other non-woven sheet or roll goods. COGNEX, “Introduction to Machine Vision,” 2016.
  • 93. 93 3. Classification ❑ 3.1. 1D vision systems COGNEX, “Introduction to Machine Vision,” 2016. 1D vision systems scan one line at a time while the process moves. In the above example, a defect in the sheet is detected.
  • 94. 94 3. Classification ❑ 3.2. 2D vision systems ➢ Most common inspection cameras perform area scans that involve capturing 2D snapshots in various resolutions. COGNEX, “Introduction to Machine Vision,” 2016. 2D vision systems can produce images with different resolutions
  • 95. 95 3. Classification ❑ 3.2. 2D vision systems ➢ Another type of 2D machine vision–line scan–builds a 2D image line by line. COGNEX, “Introduction to Machine Vision,” 2016. Line scan techniques build the 2D image one line at a time.
  • 96. 96 3. Classification ❑ 3.3. 3D vision systems ➢ 3D machine vision systems typically comprise multiple cameras or one or more laser displacement sensors. ➢ Multi-camera 3D vision in robotic guidance applications provides the robot with part orientation information. ➢ These systems involve multiple cameras mounted at different locations and “triangulation” on an objective position in 3-D space. COGNEX, “Introduction to Machine Vision,” 2016.
  • 97. 97 3. Classification ❑ 3.3. 3D vision systems COGNEX, “Introduction to Machine Vision,” 2016. 3D vision systems typically employ multiple cameras. 3D inspection system using a single camera.
  • 98. 98 3. Classification ❑ 3.3. 3D vision systems ➢ 3D laser-displacement sensor applications typically include surface inspection and volume measurement, producing 3D results with as few as a single camera. ➢ A height map is generated from the displacement of the reflected lasers’ location on an object. ➢ The object or camera must be moved to scan the entire product similar to line scanning. COGNEX, “Introduction to Machine Vision,” 2016.
  • 99. 99 Chapter 1. Introduction to machine vision 1. Introduction 2. Basic elements of machine vision system 3. Classification 4. Technical specifications 5. Designing a Machine Vision System
  • 100. 100 4. Technical Specifications 4.1. Parts 4.2. Part Presentation 4.3. Performance Requirements 4.4. Information Interfaces 4.5. Installation space 4.6. Environment
  • 101. 101 4. Technical Specifications ❑ 4.1. Parts ➢ Discrete parts or endless material (i.e., paper or woven goods) minimum and maximum dimensions ➢ Changes in shape ➢ Description of the features that have to be extracted ➢ Changes of these features concerning error parts and common product variation ➢ Surface finish ➢ Color ➢ Corrosion, oil films, or adhesives ➢ Changes due to part handling, i.e., labels, fingerprints Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 102. 102 4. Technical Specifications 4.1. Parts 4.2. Part Presentation 4.3. Performance Requirements 4.4. Information Interfaces 4.5. Installation space 4.6. Environment
  • 103. 103 4. Technical Specifications ❑ 4.2. Parts Presentation ➢ Regarding part motion, the following options are possible: ▪ indexed positioning ▪ continuous movement ➢ If there is more than one part in view, the following topics are important: ▪ number of parts in view ▪ overlapping parts ▪ touching parts Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 104. 104 4. Technical Specifications 4.1. Parts 4.2. Part Presentation 4.3. Performance Requirements 4.4. Information Interfaces 4.5. Installation space 4.6. Environment
  • 105. 105 4. Technical Specifications ❑ 4.3. Performance Requirements ➢ The performance requirements can be seen in the aspects of: ▪ accuracy and ▪ time performance Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 106. 106 4. Technical Specifications ❑ 4.3. Performance Requirements ➢ Time performance: ▪ cycle time ▪ start of acquisition ▪ maximum processing time ▪ number of production cycles from inspection to result use (for result buffering) Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 107. 107 4. Technical Specifications 4.1. Parts 4.2. Part Presentation 4.3. Performance Requirements 4.4. Information Interfaces 4.5. Installation space 4.6. Environment
  • 108. 108 4. Technical Specifications ❑ 4.4. Information Interfaces ➢ User interface for handling and visualizing results ➢ Declaration of the current part type ➢ Start of the inspection ➢ Setting results ➢ Storage of results or inspection data in log files or databases ➢ Generation of inspection protocols for storage or printout Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 109. 109 4. Technical Specifications 4.1. Parts 4.2. Part Presentation 4.3. Performance Requirements 4.4. Information Interfaces 4.5. Installation space 4.6. Environment
  • 110. 110 4. Technical Specifications ❑ 4.5. Installation Space ➢ The possibility of aligning the illumination and the camera ➢ Is an insight into the inspection scene possible? ➢ What variations are possible for minimum and maximum distances between the part and the camera? ➢ The distance between the camera and the processing unit Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 111. 111 4. Technical Specifications 4.1. Parts 4.2. Part Presentation 4.3. Performance Requirements 4.4. Information Interfaces 4.5. Installation space 4.6. Environment
  • 112. 112 4. Technical Specifications ❑ 4.6. Environment ➢ Ambient light ➢ Dirt or dust that the equipment needs to be protected from ➢ Shock or vibration that affects the part or the equipment ➢ Heat or cold ➢ Necessity of a certain protection class ➢ Availability of power supply Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 113. 113 Chapter 1. Introduction to machine vision 1. Introduction 2. Basic elements of machine vision system 3. Classification 4. Technical specifications 5. Designing a Machine Vision System
  • 114. 114 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 115. 115 5. Designing a Machine Vision System ❑ 5.1. Camera Type ➢ Line scan camera ➢ Area scan camera ➢ 3D camera Directions for a line scan camera. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 116. 116 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 117. 117 5. Designing a Machine Vision System ❑ 5.2. Field of View The field of view is determined by the following factors: ➢ maximum part size ➢ maximum variation of part presentation in translation and orientation ➢ margin as an offset to part size ➢ aspect ratio of the camera sensor Field of view. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 118. 118 5. Designing a Machine Vision System ❑ 5.2. Field of View Example: maximum part size 10 mm (horizontal) × 6 mm (vertical), tolerance in positioning 1 mm, margin 2 mm, sensor aspect ratio 4:3. FOV_hor_cal = 10 mm + 1 mm + 2 mm = 13 mm → FOV_ver_est = (3/4)·FOV_hor_cal = 9.75 mm; FOV_ver_cal = 6 mm + 1 mm + 2 mm = 9 mm < FOV_ver_est, so FOV_hor = FOV_hor_cal and FOV_ver = FOV_ver_est → FOV = 13 mm × 9.75 mm. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
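The example above can be reproduced with a small helper, a sketch of the rule the slide applies: add the positioning tolerance and the margin to the part size, then widen one side so the FOV matches the sensor aspect ratio. Function and variable names are mine.

```python
def field_of_view(part_h, part_v, tolerance, margin, aspect=(4, 3)):
    """Return (FOV_hor, FOV_ver) in mm for an area camera with the given aspect ratio."""
    fov_h = part_h + tolerance + margin          # calculated horizontal FOV
    fov_v = part_v + tolerance + margin          # calculated vertical FOV
    fov_v_est = fov_h * aspect[1] / aspect[0]    # vertical FOV implied by the sensor aspect
    if fov_v_est >= fov_v:
        return fov_h, fov_v_est                  # horizontal dimension dominates
    return fov_v * aspect[0] / aspect[1], fov_v  # otherwise the vertical dimension dominates

print(field_of_view(10, 6, 1, 2))  # -> (13.0, 9.75), as in the example
```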
  • 119. 119 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 120. 120 5. Designing a Machine Vision System ❑ 5.3. Resolution ➢ camera sensor resolution ➢ spatial resolution ➢ measurement accuracy Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 121. 121 5. Designing a Machine Vision System ❑ 5.3. Resolution ➢ Calculation of Resolution Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 122. 122 5. Designing a Machine Vision System ❑ 5.3. Resolution ➢ Example: Measure the dimensions of the 10 mm × 6 mm object above with an accuracy of 0.01 mm. ▪ Using edge detection for dimension measurement → Nf = 1/3 pixel; but because there is a tolerance in positioning, the number of pixels for the smallest feature is set to 1 pixel (Nf = 1 pixel). ▪ Size of the smallest feature: Sf = 0.01 mm. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 123. 123 5. Designing a Machine Vision System ❑ 5.3. Resolution ▪ Camera resolution: RC_hor = FOV_hor · Nf / Sf = 13 mm · 1 pixel / 0.01 mm = 1300 pixels; RC_ver = FOV_ver · Nf / Sf = 9.75 mm · 1 pixel / 0.01 mm = 975 pixels. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 124. 124 5. Designing a Machine Vision System ❑ 5.3. Resolution ▪ Object resolution (spatial resolution): assuming that a lens giving a field of view of 14 mm (> 13 mm) and a camera with a resolution of 1440×1080 pixels were chosen: Rs = FOV / RC = 14 mm / 1440 pixels ≈ 0.01 mm/pixel. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
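A short sketch that reproduces the two calculations above (required camera resolution from FOV, smallest feature size, and pixels per feature; object resolution from the chosen camera). Names are illustrative only.

```python
def camera_resolution(fov_mm, n_f_pixels, s_f_mm):
    """Required sensor resolution in pixels: RC = FOV * Nf / Sf."""
    return fov_mm * n_f_pixels / s_f_mm

def object_resolution(fov_mm, sensor_pixels):
    """Spatial resolution (mm per pixel) of the chosen camera: Rs = FOV / RC."""
    return fov_mm / sensor_pixels

print(camera_resolution(13.0, 1, 0.01))   # -> 1300 pixels (horizontal)
print(camera_resolution(9.75, 1, 0.01))   # -> 975 pixels (vertical)
print(object_resolution(14.0, 1440))      # -> ~0.0097 mm/pixel, rounded to 0.01 in the slide
```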
  • 125. 125 5. Designing a Machine Vision System ❑ 5.3. Resolution ➢ Calculation of Resolution: Resolution for a Line Scan Camera Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 126. 126 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 127. 127 5. Designing a Machine Vision System ❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform ➢ Camera Model ▪ color sensor ▪ interface technology ▪ progressive scan for area cameras ▪ packaging size ▪ price and availability Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 128. 128 5. Designing a Machine Vision System ❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform ➢ Frame Grabber ▪ compatibility with the pixel rate ▪ compatibility with the software library ▪ number of cameras that can be addressed ▪ utilities to control the camera via the frame grabber ▪ timing and triggering of the camera ▪ availability of on-board processing ▪ availability of general purpose I/O ▪ price and availability Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 129. 129 5. Designing a Machine Vision System ❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform ➢ Pixel Rate ▪ This is the speed of imaging in terms of pixels per second. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 130. 130 5. Designing a Machine Vision System ❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform ➢ Pixel Rate ▪ For an area camera: Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. An overhead of 10% to 20% should be considered due to additional bus transfer.
  • 131. 131 5. Designing a Machine Vision System ❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform ➢ Pixel Rate ▪ For a line scan camera: Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
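The pixel-rate formulas on the two slides above appear in the handbook only as figures. As a hedged illustration, the usual rule of thumb is pixels per frame (or per line) multiplied by the frame rate (or line rate), plus the 10% to 20% bus overhead mentioned above. The example values and the 15% overhead below are assumptions.

```python
# Approximate pixel rate for area and line scan cameras,
# including an assumed bus-transfer overhead of 15%.

def area_camera_pixel_rate(width_px, height_px, frame_rate_hz, overhead=0.15):
    """Pixels per second for an area camera."""
    return width_px * height_px * frame_rate_hz * (1.0 + overhead)

def line_scan_pixel_rate(line_length_px, line_rate_hz, overhead=0.15):
    """Pixels per second for a line scan camera."""
    return line_length_px * line_rate_hz * (1.0 + overhead)

# Example: 1440 x 1080 area sensor at 30 frames per second
print(f"{area_camera_pixel_rate(1440, 1080, 30):.3e} pixels/s")

# Example: 2048-pixel line scan camera at a 20 kHz line rate
print(f"{line_scan_pixel_rate(2048, 20_000):.3e} pixels/s")
```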
  • 132. 132 5. Designing a Machine Vision System ❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform ➢ Hardware Platform • Compatibility with the frame grabber • Operating system • Development process • Means for a user-friendly human-machine interface • Processing load • Miscellaneous points: available interfaces, memory, packaging size, price, and availability ▪ smart cameras ▪ compact vision systems ▪ PC-based systems Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 133. 133 5. Designing a Machine Vision System ❑ Example about a camera https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
  • 134. 134 5. Designing a Machine Vision System ❑ Example about a camera https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
  • 135. 135 5. Designing a Machine Vision System ❑ Example about a camera https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
  • 136. 136 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 137. 137 5. Designing a Machine Vision System ❑ 5.5. Lens Design ➢ Focal Length (figure: standoff distance, focal length, magnification, lens extension, focus distance) Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 138. 138 5. Designing a Machine Vision System ❑ 5.5. Lens Design ➢ Focal Length Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
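The focal-length relations on the two slides above are likewise shown only as figures. Under a simple thin-lens assumption, magnification, focal length, and lens extension can be estimated as in the sketch below; the sensor size, field of view, and standoff distance used here are illustrative values, not taken from the handbook.

```python
# Thin-lens estimate of magnification, focal length, and lens extension.
# Assumed example values: 2/3" sensor (8.8 mm horizontal), FOV = 14 mm,
# standoff (object) distance = 200 mm.

def magnification(sensor_size_mm, fov_mm):
    """Optical magnification beta = image size / object size."""
    return sensor_size_mm / fov_mm

def focal_length(standoff_mm, beta):
    """Thin-lens focal length f = a * beta / (beta + 1) for object distance a."""
    return standoff_mm * beta / (beta + 1.0)

def lens_extension(f_mm, beta):
    """Extra extension beyond infinity focus: delta = f * beta."""
    return f_mm * beta

beta = magnification(8.8, 14.0)      # ~0.63
f = focal_length(200.0, beta)        # ~77 mm -> pick the nearest standard focal length
print(beta, f, lens_extension(f, beta))
```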
  • 139. 139 5. Designing a Machine Vision System ❑ 5.5. Lens Design ➢ Lens Flange Focal Distance ▪ This is the distance between the lens mount face and the image plane. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 140. 140 5. Designing a Machine Vision System ❑ 5.5. Lens Design ➢ Extension Tubes ▪ The lens extension l can be increased using the focus adjustment of the lens. ▪ If the distance cannot be increased further, extension tubes can be used to focus on close objects. As a result, the depth of field is decreased. ▪ For higher magnifications, such as from 0.4 to 4, macro lenses offer better image quality. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 141. 141 5. Designing a Machine Vision System ❑ 5.5. Lens Design ➢ Lens Diameter and Sensor Size Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. Areas illuminated by the lens and camera; the left side displays an appropriate choice.
  • 142. 142 5. Designing a Machine Vision System ❑ 5.5. Lens Design ➢ Sensor Resolution and Lens Quality ▪ For high-resolution cameras, the requirements on the lens are higher than for standard cameras. ▪ A low-budget lens may lead to poor image quality on a high-resolution sensor, whereas the quality may be acceptable at lower resolutions. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 143. 143 5. Designing a Machine Vision System ❑ Example about lens https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
  • 144. 144 5. Designing a Machine Vision System ❑ Example about lens https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c23-1620-5m-p-f16mm/
  • 145. 145 5. Designing a Machine Vision System ❑ Example about lens https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c23-1620-5m-p-f16mm/
  • 146. 146 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 147. 147 5. Designing a Machine Vision System ❑ 5.6. Choice of Illumination ➢ Concept: Maximize Contrast ▪ Direction of light: diffuse from all directions or directed from a range of angles ▪ Light spectrum ▪ Polarization: effect on surfaces, such as metal or glass Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 148. 148 5. Designing a Machine Vision System ❑ 5.6. Choice of Illumination ➢ Illumination Setups ▪ Backlight ▪ Frontlight: • Diffused light • Directed light • Confocal frontlight • Bright field • Dark field Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 149. 149 5. Designing a Machine Vision System ❑ 5.6. Choice of Illumination ➢ Light Sources ▪ Fluorescent tubes ▪ Halogen and xenon lamps ▪ LED ▪ Laser Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 150. 150 5. Designing a Machine Vision System ❑ 5.6. Choice of Illumination ➢ Approach to the Optimum Setup ▪ A confirmation of the setup based on experiments with sample parts is mandatory. ▪ The alignment of the light, the part, and the camera needs to be documented. ▪ To decide between similar setups, images have to be captured and compared for maximum contrast (see the sketch below). Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
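Comparing candidate setups "for maximum contrast" can be made objective with a simple metric. The snippet below is a minimal sketch: RMS contrast (the standard deviation of grey values) is only one possible measure, and the image file names are placeholders.

```python
# Rank candidate illumination setups by the RMS contrast of their sample images.
import cv2
import numpy as np

def rms_contrast(image_path):
    """Standard deviation of grey values as a simple contrast measure."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    return float(np.std(gray))

# Placeholder file names for images captured with different setups
candidates = ["setup_backlight.png", "setup_darkfield.png", "setup_diffuse.png"]
ranked = sorted(candidates, key=rms_contrast, reverse=True)
print("Best contrast:", ranked[0])
```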
  • 151. 151 5. Designing a Machine Vision System ❑ 5.6. Choice of Illumination ➢ Interfering Lighting ▪ The influence of different lamps on the images has to be checked. ▪ To avoid interference, a spatial separation can be achieved by using different camera stations. ▪ The part is then imaged with different sets of cameras and illuminations. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 152. 152 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 153. 153 5. Designing a Machine Vision System ❑ 5.7. Mechanical Design ➢ Once the cameras, lenses, standoff distances, and illumination devices are determined, the mechanical conditions can be defined. ➢ For mounting cameras and lights, ease of adjustment is important for installation, operation, and maintenance. ➢ The devices have to be protected against vibration and shock. ➢ It should be possible to adjust the position of cameras and lights easily. Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 154. 154 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 155. 155 5. Designing a Machine Vision System ❑ 5.8. Electrical Design ➢ The power supply ➢ The housing of cameras and illumination ➢ The length of cables as well as their routing Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 156. 156 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 157. 157 5. Designing a Machine Vision System ❑ 5.9. Software ➢ selection of a software library ➢ design and implementation of the application-specific software Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 158. 158 5. Designing a Machine Vision System ❑ 5.9. Software ➢ Software Library ➢ Software Structure ▪ Image acquisition ▪ Preprocessing ▪ Feature localization ▪ Feature extraction ▪ Feature interpretation ▪ Generation of results ▪ Handling interfaces Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
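The software structure above maps naturally onto a processing pipeline. The following is a minimal sketch assuming an OpenCV-based PC system; all function names, file names, and tolerance values are illustrative and not part of the handbook.

```python
# Skeleton of an application-specific inspection program following the structure:
# acquisition -> preprocessing -> localization -> extraction -> interpretation -> results.
import cv2

def acquire(path="part.png"):
    return cv2.imread(path, cv2.IMREAD_GRAYSCALE)        # image acquisition (placeholder: from file)

def preprocess(img):
    return cv2.GaussianBlur(img, (5, 5), 0)               # noise reduction

def localize(img):
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)              # feature localization: largest blob

def extract(contour, mm_per_pixel=0.01):
    x, y, w, h = cv2.boundingRect(contour)                 # feature extraction: bounding box in mm
    return w * mm_per_pixel, h * mm_per_pixel

def interpret(width_mm, height_mm, nominal=(10.0, 6.0), tol=0.1):
    ok = abs(width_mm - nominal[0]) <= tol and abs(height_mm - nominal[1]) <= tol
    return "OK" if ok else "NG"                            # feature interpretation

if __name__ == "__main__":
    image = acquire()
    if image is not None:
        w, h = extract(localize(preprocess(image)))
        print(interpret(w, h))                             # generation of results / interface handling
```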
  • 159. 159 5. Designing a Machine Vision System ❑ 5.9. Software ➢ General Topics ▪ Visualization of live images for all cameras ▪ Possibility of image saving ▪ Maintenance mode ▪ Log files for the system state ▪ Detailed visualization of the image processing ▪ Crucial processing parameters Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 160. 160 5. Designing a Machine Vision System 5.1. Camera Type 5.2. Field of View 5.3. Resolution 5.4. Choice of Camera, Frame Grabber, and Hardware Platform 5.5. Lens Design 5.6. Choice of Illumination 5.7. Mechanical Design 5.8. Electrical Design 5.9. Software 5.10. Costs Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 161. 161 5. Designing a Machine Vision System ❑ 5.10. Costs ➢ The development costs ▪ project management ▪ base design ▪ hardware components ▪ software licenses ▪ software development ▪ installation ▪ test runs, feasibility tests, and acceptance test ▪ training ▪ documentation Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 162. 162 5. Designing a Machine Vision System ❑ 5.10. Costs ➢ The operating costs ▪ maintenance, such as cleaning of the optical equipment ▪ change of equipment, such as lamps ▪ utility, for instance electrical power or compressed air if needed ▪ costs for system modification due to product changes Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
  • 163. Quiz 1 Quiz Number 1 Quiz Type OX Example Select Question Choose the lighting for measuring the radii R and r of the following object (figure: radii R and r): Example A. Dome Light B. On-Axis Light C. Darkfield D. Backlight Answer Feedback
  • 164. Quiz 2 Quiz Number 2 Quiz Type OX Example Select Question Assume that the object size = 10 cm x 20 cm and the sensor resolution = 640x480 pixels. Calculate the object resolution. Example A. 0.31 mm/pixel B. 0.21 mm/pixel C. 0.17 mm/pixel D. 0.42 mm/pixel Answer Feedback
  • 165. Quiz 3 Quiz Number 3 Quiz Type OX Example Select Question Splitting an input image into component areas is called: Example A. Image preprocessing B. Image segmentation C. Image recognition D. Image representation Answer Feedback
  • 166. Quiz 4 Quiz Number 4 Quiz Type OX Example Select Question The performance requirements of a machine vision system are: Example A. Accuracy B. Time performance C. Both A and B D. None of the above Answer Feedback
  • 167. Quiz 5 Quiz Number 5 Quiz Type OX Example Select Question Diameter Inspection of Rivets: + The nominal size of the rivets lies in a range of 3 mm to 4 mm. + The required accuracy is 0.1 mm. + The tolerance of part positioning is less than ±1 mm across the optical axis and ±0.1 mm in the direction of the optical axis. The belt stops for 1.5 s. + The maximum processing time is 2 s; the cycle time is 2.5 s. + The maximum space for installing equipment is 500 mm.
  • 168. Quiz 5 Quiz Number 5 Quiz Type OX Example Select Question Bearing with rivet and disk