2.3     Feature extraction
Many computer vision systems rely first on the detection of some features in the images.
Image features may be global (e.g. the average grey level) or local (e.g. a straight line).
They may or may not be associated with scene elements. Different features demand
different detection methods. In general, feature extraction produces feature descriptors
specifying the locations and properties of the features found in the image.

2.3.1 Edge detection
Edges are pixels at or around which the image values undergo a sharp variation, e.g. the
borders of objects. Edge detection is often the first operation in finding interesting scene
elements such as lines and contours. Notice that noise can also cause intensity variations,
so a good edge detector should find genuine edges generated by scene elements, not by
noise. In the detection process, noise is suppressed without destroying true edges; the
edges are then enhanced and located.

Types of edge
A step edge occurs where the image intensity abruptly changes from one value on one
side of the discontinuity to a different value on the opposite side. If the intensity change
occurs over several pixels, it is a ramp edge.




A line edge occurs where the image intensity abruptly changes value but then returns to
the starting value within some short distance. If each of the two changes occurs over a
finite distance, it is a roof edge.




Edge descriptor
Edge normal: the direction of the maximum intensity variation at the edge point which is
perpendicular to the edge
Edge direction: the direction tangent to the edge which is perpendicular to the edge
normal
Edge position: the image position at which the edge is located, along the edge normal
Edge strength: a measure of the intensity variation across the edge




[Figure: an edge point, showing the edge direction and the edge normal]

First derivative operator – Sobel edge detector
Edge detection is essentially the operation of detecting significant local intensity
changes in an image. Take the step edge as an example: it is associated with a local peak
in the first derivative of the intensity profile.

[Figure: intensity profile I across a step edge, and its first derivative I′ peaking at the edge]

The first derivative can be measured by the gradient.
G(I(x, y)) = (Gx, Gy)ᵀ = (∂I/∂x, ∂I/∂y)ᵀ

The magnitude of the gradient is |G(I(x, y))| = √(Gx² + Gy²). It can also be
approximated by |Gx| + |Gy| or max(|Gx|, |Gy|). The direction of the gradient with
respect to the x axis is ∠G(I(x, y)) = tan⁻¹(Gy / Gx).

For a digital image, the gradient is measured by discrete approximations.
G x ≅ I[i, j + 1] − I[i, j]
G y ≅ I[i, j] − I[i + 1, j]
The equations can be implemented with convolution masks

            Gx =   -1    1            Gy =    1    1
                   -1    1                   -1   -1



which estimate the gradient at the interpolated coordinates (i + ½, j + ½). To detect
edges, you can perform noise filtering before the gradient computation. Genuine edges
can then be located with sub-pixel resolution simply by thresholding the gradient
magnitudes.
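As a minimal sketch of these masks in NumPy (the function name and the toy step image below are made up for illustration, not part of the notes):

```python
import numpy as np

def gradient_2x2(I):
    """Apply the 2x2 difference masks for Gx and Gy.

    The estimates refer to the interpolated point (i + 1/2, j + 1/2),
    so the outputs are one row and one column smaller than I.
    """
    I = I.astype(float)
    # Gx mask [[-1, 1], [-1, 1]]: horizontal differences, summed over both rows
    Gx = I[:-1, 1:] - I[:-1, :-1] + I[1:, 1:] - I[1:, :-1]
    # Gy mask [[1, 1], [-1, -1]]: vertical differences, summed over both columns
    Gy = I[:-1, :-1] + I[:-1, 1:] - I[1:, :-1] - I[1:, 1:]
    M = np.hypot(Gx, Gy)          # gradient magnitude sqrt(Gx^2 + Gy^2)
    return Gx, Gy, M

# toy image with a vertical step edge between columns 1 and 2
I = np.array([[0, 0, 9, 9],
              [0, 0, 9, 9],
              [0, 0, 9, 9]])
Gx, Gy, M = gradient_2x2(I)   # M peaks along the step edge
```

Thresholding `M` then keeps only the positions where the intensity change is significant.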

To avoid having the gradient calculated about an interpolated point between pixels, you
can use a 3 × 3 neighborhood. The Sobel edge detector calculates the gradient magnitude
by S(I(x, y)) = √(Sx² + Sy²), where Sx and Sy can be implemented using the convolution
masks


            Sx =   -1    0    1        Sy =    1    2    1
                   -2    0    2                0    0    0
                   -1    0    1               -1   -2   -1
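A sketch of the Sobel detector under the same assumptions (plain NumPy, hypothetical helper names). Sliding the masks without flipping them is strictly a correlation, which can change the sign of the responses but leaves the magnitude unchanged:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]])
SOBEL_Y = np.array([[ 1,  2,  1],
                    [ 0,  0,  0],
                    [-1, -2, -1]])

def apply_mask3(I, mask):
    """Slide a 3x3 mask over the interior of I (no border padding)."""
    I = I.astype(float)
    out = np.zeros((I.shape[0] - 2, I.shape[1] - 2))
    for di in range(3):
        for dj in range(3):
            out += mask[di, dj] * I[di:di + out.shape[0], dj:dj + out.shape[1]]
    return out

def sobel_magnitude(I):
    """Gradient magnitude sqrt(Sx^2 + Sy^2) from the Sobel masks."""
    Sx = apply_mask3(I, SOBEL_X)
    Sy = apply_mask3(I, SOBEL_Y)
    return np.hypot(Sx, Sy)

I = np.zeros((5, 5))
I[:, 3:] = 9             # vertical step edge
M = sobel_magnitude(I)   # strong response along the step, zero elsewhere
```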

Second derivative operator – Laplacian of Gaussian
Another approach is to locate the zero crossing in the second derivative.


[Figure: intensity profile I across a step edge, its first derivative I′ peaking at the edge,
and its second derivative I″ crossing zero at the edge]

The zero crossings can be located by the Laplacian operator.
∇²I(x, y) = ∂²I/∂x² + ∂²I/∂y²

Equations of discrete approximation are

∂²I/∂x² = I[i, j + 1] − 2I[i, j] + I[i, j − 1]
∂²I/∂y² = I[i + 1, j] − 2I[i, j] + I[i − 1, j]
The Laplacian operator can be implemented using the convolution mask




            ∇² ≈    0    1    0
                    1   -4    1
                    0    1    0

In many cases, the actual edge location must be determined by interpolation. An edge
operator involving two derivatives is affected by noise more than an operator involving a
single derivative.
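The Laplacian mask and a simple zero-crossing test can be sketched as follows (a minimal version; the neighbour-sign test is one of several possible zero-crossing criteria):

```python
import numpy as np

def laplacian(I):
    """5-point Laplacian mask [[0,1,0],[1,-4,1],[0,1,0]] over the interior of I."""
    I = I.astype(float)
    return (I[:-2, 1:-1] + I[2:, 1:-1] + I[1:-1, :-2] + I[1:-1, 2:]
            - 4.0 * I[1:-1, 1:-1])

def zero_crossings(L, eps=1e-9):
    """Mark pixels whose Laplacian changes sign against the right or lower neighbour."""
    zc = np.zeros(L.shape, dtype=bool)
    zc[:, :-1] |= L[:, :-1] * L[:, 1:] < -eps   # horizontal sign change
    zc[:-1, :] |= L[:-1, :] * L[1:, :] < -eps   # vertical sign change
    return zc

I = np.tile([0, 0, 0, 9, 9, 9], (5, 1))   # vertical step edge
L = laplacian(I)         # positive on one side of the edge, negative on the other
zc = zero_crossings(L)   # True where the second derivative crosses zero
```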

A better approach is to combine Gaussian filtering with the second derivative – the
Laplacian of Gaussian (LoG). Steps in edge detection:
• filter out the image noise using a Gaussian filter
• enhance the edge pixels using the 2D Laplacian operator
• an edge is detected when there is a zero crossing in the second derivative with a
    corresponding large peak in the first derivative
• estimate the edge location with sub-pixel resolution using linear interpolation

Some methods apply filtering masks of multiple sizes and locate the edge pixels by
analyzing the behavior of edges at different scales of filtering.
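One way to realize the first two steps in a single pass is to convolve with a sampled LoG mask. The sketch below samples the standard LoG expression up to a constant factor; the function name and the 3σ support size are illustrative choices, not from the notes:

```python
import numpy as np

def log_kernel(sigma, size=None):
    """Sample (x^2 + y^2 - 2*sigma^2) * exp(-(x^2 + y^2) / (2*sigma^2)) / sigma^4."""
    if size is None:
        size = int(2 * np.ceil(3 * sigma) + 1)   # cover about 3 sigma each way
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    s2 = sigma ** 2
    k = ((x ** 2 + y ** 2 - 2 * s2) / s2 ** 2) * np.exp(-(x ** 2 + y ** 2) / (2 * s2))
    # force the mask to sum to zero so flat image regions give zero response
    return k - k.mean()

k = log_kernel(1.0)   # 7x7 mask, negative at the centre, positive ring around it
```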

Canny edge detector
There is a trade-off between noise suppression and edge localization. An edge detector
can reduce noise by smoothing the image, but this spreads the edges and adds
uncertainty to their locations. An edge detector can have greater sensitivity to the
presence of edges, but this also increases its sensitivity to noise.
The type of linear operator that provides the best compromise between noise immunity
and edge localization, while retaining the advantages of Gaussian filtering, is the first
derivative of a Gaussian. This operator corresponds to smoothing an image with a
Gaussian function and then computing the gradient. The operator is not rotationally
symmetric – it is symmetric along the edge direction and anti-symmetric along the edge
normal.

The Canny edge detector is the first derivative of a Gaussian and closely approximates the
operator that optimizes the product of SNR and edge localization. Steps in edge detection:
• edge enhancement
• non-maximum suppression
• hysteresis thresholding

edge enhancement:
Apply Gaussian smoothing to the image. Compute the gradient of the smoothed image
and estimate the magnitude and orientation of the gradient.
S[i, j] = G[i, j; σ] ∗ I[i, j]



where G is a Gaussian with zero mean and standard deviation σ. The gradient
components are computed by

P[i, j] = (S[i, j + 1] − S[i, j] + S[i + 1, j + 1] − S[i + 1, j]) / 2
Q[i, j] = (S[i, j] − S[i + 1, j] + S[i, j + 1] − S[i + 1, j + 1]) / 2

The magnitude of the gradient is

M[i, j] = √(P[i, j]² + Q[i, j]²)

The orientation of the gradient is

θ[i, j] = tan⁻¹(Q[i, j] / P[i, j])
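After the Gaussian smoothing, the four quantities above are direct array expressions. A sketch (using `np.arctan2` rather than a plain arctangent so all four quadrants of the orientation are recovered):

```python
import numpy as np

def canny_gradient(S):
    """P, Q, M, theta from an already-smoothed image S, using 2x2 first differences."""
    S = S.astype(float)
    P = (S[:-1, 1:] - S[:-1, :-1] + S[1:, 1:] - S[1:, :-1]) / 2.0
    Q = (S[:-1, :-1] - S[1:, :-1] + S[:-1, 1:] - S[1:, 1:]) / 2.0
    M = np.hypot(P, Q)            # gradient magnitude
    theta = np.arctan2(Q, P)      # gradient orientation
    return P, Q, M, theta

S = np.tile([0., 0., 6., 6.], (3, 1))   # smoothed vertical step edge
P, Q, M, theta = canny_gradient(S)
```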

non-maximum suppression:
M[i, j] may contain wide ridges around the local maxima. This step thins such ridges to
produce edges that are one pixel wide: values of M[i, j] that are not a peak along the
edge normal are suppressed. Let the possible edge normal orientations be quantized into
four, e.g. 0°, 45°, 90°, and 135° with respect to the horizontal axis. For each pixel (i, j),
find the quantized orientation that best approximates θ[i, j]. If M[i, j] is smaller than at
least one of its two neighbors along that orientation, change M[i, j] to zero.
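A direct (unoptimized) sketch of this step, with the edge-normal offsets for the four quantized orientations hard-coded:

```python
import numpy as np

# pixel offsets along the edge normal for orientations 0, 45, 90, 135 degrees
OFFSETS = {0: (0, 1), 45: (1, 1), 90: (1, 0), 135: (1, -1)}

def nonmax_suppress(M, theta):
    """Zero out M[i, j] if a neighbour along the quantized normal is larger."""
    out = M.copy()
    ang = (np.rad2deg(theta) + 180.0) % 180.0        # fold orientation into [0, 180)
    q = (np.round(ang / 45.0).astype(int) % 4) * 45  # quantize to 0/45/90/135
    rows, cols = M.shape
    for i in range(rows):
        for j in range(cols):
            di, dj = OFFSETS[int(q[i, j])]
            for sign in (1, -1):                     # both neighbours along the normal
                ni, nj = i + sign * di, j + sign * dj
                if 0 <= ni < rows and 0 <= nj < cols and M[ni, nj] > M[i, j]:
                    out[i, j] = 0.0
                    break
    return out

M = np.array([[0., 1., 3., 1., 0.]])         # a wide ridge around the maximum
out = nonmax_suppress(M, np.zeros_like(M))   # only the peak survives
```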

hysteresis thresholding:
M[i, j], after the non-maximum suppression step, may still contain local maxima created
by noise. Removing false edges with a single threshold is difficult: some false edges
remain if the threshold is set too low, while some true edges are deleted if it is set too
high. An effective scheme is to use two thresholds τl and τh, e.g. τh = 2τl. Scan the
non-zero points of M[i, j] in a fixed order. If M[i, j] is larger than τh, mark it as an edge
pixel. Then mark as edge pixels any 8-neighbors of an edge pixel whose gradient
magnitude is larger than τl, and keep extending the contour in this way; the scan then
continues until the next edge pixel above τh is found. Therefore, the Canny edge
detector performs edge linking as a by-product.
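The two-threshold scheme amounts to seeding edges above τh and growing them through 8-connected neighbours above τl; a minimal stack-based sketch:

```python
import numpy as np

def hysteresis(M, tau_low, tau_high):
    """Seed edge pixels above tau_high, then grow through 8-neighbours above tau_low."""
    rows, cols = M.shape
    edges = np.zeros((rows, cols), dtype=bool)
    stack = [(i, j) for i in range(rows) for j in range(cols)
             if M[i, j] > tau_high]          # strong edge pixels seed the contours
    for i, j in stack:
        edges[i, j] = True
    while stack:
        i, j = stack.pop()
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (0 <= ni < rows and 0 <= nj < cols
                        and not edges[ni, nj] and M[ni, nj] > tau_low):
                    edges[ni, nj] = True     # weak pixel linked to a contour
                    stack.append((ni, nj))
    return edges

M = np.array([[0., 2., 5., 2., 0.],
              [0., 0., 0., 0., 2.]])
edges = hysteresis(M, tau_low=1.0, tau_high=4.0)
```

In this toy example the weak pixel at (1, 4) is accepted only because it is 8-connected to the chain grown from the strong pixel at (0, 2); an isolated weak pixel would be discarded.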

Final words
Edge detection has been one of the most popular research areas since the early days of
computer vision, and there is a large literature on it. You should be aware that a specific
edge detection method can be very useful for a particular computer vision application.
However, a universal edge detector remains to be seen.

2.3.2 Corner detection
Corners are quite stable across sequences of images, which makes them interesting
features for tracking objects. Consider an image point p, a neighborhood Q of p, and a
matrix C defined as




C = [ Σ_Q Gx²     Σ_Q Gx Gy ]
    [ Σ_Q Gx Gy   Σ_Q Gy²   ]

where Gx and Gy are the components of the image gradient and each sum is taken over
the neighborhood Q. In the coordinate frame aligned with its eigenvectors, C is the
diagonal matrix

C = [ λ1   0  ]
    [ 0    λ2 ]
The two eigenvalues λ1 and λ2 are non-negative. A corner is located in Q where λ1 ≥ λ2 >
0 and λ2 is large enough. Steps in corner detection:
• compute the image gradient
• for each image point p, form the matrix C over a neighborhood Q of p, compute the
    smaller eigenvalue λ2 of C, and if λ2 > a threshold τ, save the coordinates of p into a
    list L
• sort L in decreasing order of λ2
• scan L from top to bottom; for each corner point p, delete the neighboring corner
    points further down in L
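Assuming the gradient components Gx and Gy have already been computed (e.g. with the Sobel masks), the smaller-eigenvalue test over a square neighborhood Q can be sketched as follows; the names and the neighborhood size are illustrative:

```python
import numpy as np

def corner_response(Gx, Gy, half=1):
    """Smaller eigenvalue of C over a (2*half+1)^2 neighbourhood of each interior pixel."""
    rows, cols = Gx.shape
    lam2 = np.zeros((rows, cols))
    for i in range(half, rows - half):
        for j in range(half, cols - half):
            gx = Gx[i - half:i + half + 1, j - half:j + half + 1].ravel()
            gy = Gy[i - half:i + half + 1, j - half:j + half + 1].ravel()
            C = np.array([[np.dot(gx, gx), np.dot(gx, gy)],
                          [np.dot(gx, gy), np.dot(gy, gy)]])
            lam2[i, j] = np.linalg.eigvalsh(C)[0]   # eigenvalues come back ascending
    return lam2

# toy gradients: both Gx and Gy vary near (2, 2), as at a corner
Gx = np.zeros((5, 5))
Gy = np.zeros((5, 5))
Gx[2, 2] = 2.0
Gy[2, 3] = 2.0
lam2 = corner_response(Gx, Gy)   # large only where both gradient directions are present
```

Thresholding `lam2` and keeping only local maxima then realizes the last two steps of the list above.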

References
E. Trucco & A. Verri, Introductory Techniques for 3-D Computer Vision, Prentice Hall,
1998, Chapter 4.
R. Jain, R. Kasturi & B. G. Schunck, Machine Vision, McGraw-Hill, 1995, Chapter 5.

Summary
♦ first derivative edge detector
♦ second derivative edge detector
♦ Canny edge detector
♦ corner detection





More Related Content

What's hot

Lesson18 Double Integrals Over Rectangles Slides
Lesson18   Double Integrals Over Rectangles SlidesLesson18   Double Integrals Over Rectangles Slides
Lesson18 Double Integrals Over Rectangles SlidesMatthew Leingang
 
Adaptive Signal and Image Processing
Adaptive Signal and Image ProcessingAdaptive Signal and Image Processing
Adaptive Signal and Image ProcessingGabriel Peyré
 
Mesh Processing Course : Mesh Parameterization
Mesh Processing Course : Mesh ParameterizationMesh Processing Course : Mesh Parameterization
Mesh Processing Course : Mesh ParameterizationGabriel Peyré
 
Mesh Processing Course : Active Contours
Mesh Processing Course : Active ContoursMesh Processing Course : Active Contours
Mesh Processing Course : Active ContoursGabriel Peyré
 
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, Registration
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, RegistrationCVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, Registration
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, Registrationzukun
 
Kccsi 2012 a real-time robust object tracking-v2
Kccsi 2012   a real-time robust object tracking-v2Kccsi 2012   a real-time robust object tracking-v2
Kccsi 2012 a real-time robust object tracking-v2Prarinya Siritanawan
 
Learning Sparse Representation
Learning Sparse RepresentationLearning Sparse Representation
Learning Sparse RepresentationGabriel Peyré
 
Modern features-part-2-descriptors
Modern features-part-2-descriptorsModern features-part-2-descriptors
Modern features-part-2-descriptorszukun
 
State of art pde based ip to bt vijayakrishna rowthu
State of art pde based ip to bt  vijayakrishna rowthuState of art pde based ip to bt  vijayakrishna rowthu
State of art pde based ip to bt vijayakrishna rowthuvijayakrishna rowthu
 
Design Approach of Colour Image Denoising Using Adaptive Wavelet
Design Approach of Colour Image Denoising Using Adaptive WaveletDesign Approach of Colour Image Denoising Using Adaptive Wavelet
Design Approach of Colour Image Denoising Using Adaptive WaveletIJERD Editor
 
Proximal Splitting and Optimal Transport
Proximal Splitting and Optimal TransportProximal Splitting and Optimal Transport
Proximal Splitting and Optimal TransportGabriel Peyré
 
Mesh Processing Course : Differential Calculus
Mesh Processing Course : Differential CalculusMesh Processing Course : Differential Calculus
Mesh Processing Course : Differential CalculusGabriel Peyré
 
Mesh Processing Course : Multiresolution
Mesh Processing Course : MultiresolutionMesh Processing Course : Multiresolution
Mesh Processing Course : MultiresolutionGabriel Peyré
 
Modern features-part-1-detectors
Modern features-part-1-detectorsModern features-part-1-detectors
Modern features-part-1-detectorszukun
 
Centroids moments of inertia
Centroids moments of inertiaCentroids moments of inertia
Centroids moments of inertiacoolzero2012
 
Low Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsLow Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsGabriel Peyré
 
Verification of a brick wang tiling algorithm
Verification of a brick wang tiling algorithmVerification of a brick wang tiling algorithm
Verification of a brick wang tiling algorithmYoshihiro Mizoguchi
 
ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...
ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...
ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...grssieee
 

What's hot (20)

Lesson18 Double Integrals Over Rectangles Slides
Lesson18   Double Integrals Over Rectangles SlidesLesson18   Double Integrals Over Rectangles Slides
Lesson18 Double Integrals Over Rectangles Slides
 
Adaptive Signal and Image Processing
Adaptive Signal and Image ProcessingAdaptive Signal and Image Processing
Adaptive Signal and Image Processing
 
Mesh Processing Course : Mesh Parameterization
Mesh Processing Course : Mesh ParameterizationMesh Processing Course : Mesh Parameterization
Mesh Processing Course : Mesh Parameterization
 
Mesh Processing Course : Active Contours
Mesh Processing Course : Active ContoursMesh Processing Course : Active Contours
Mesh Processing Course : Active Contours
 
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, Registration
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, RegistrationCVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, Registration
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: Isocontours, Registration
 
Kccsi 2012 a real-time robust object tracking-v2
Kccsi 2012   a real-time robust object tracking-v2Kccsi 2012   a real-time robust object tracking-v2
Kccsi 2012 a real-time robust object tracking-v2
 
Learning Sparse Representation
Learning Sparse RepresentationLearning Sparse Representation
Learning Sparse Representation
 
Modern features-part-2-descriptors
Modern features-part-2-descriptorsModern features-part-2-descriptors
Modern features-part-2-descriptors
 
State of art pde based ip to bt vijayakrishna rowthu
State of art pde based ip to bt  vijayakrishna rowthuState of art pde based ip to bt  vijayakrishna rowthu
State of art pde based ip to bt vijayakrishna rowthu
 
Design Approach of Colour Image Denoising Using Adaptive Wavelet
Design Approach of Colour Image Denoising Using Adaptive WaveletDesign Approach of Colour Image Denoising Using Adaptive Wavelet
Design Approach of Colour Image Denoising Using Adaptive Wavelet
 
Lecture 06
Lecture 06Lecture 06
Lecture 06
 
Proximal Splitting and Optimal Transport
Proximal Splitting and Optimal TransportProximal Splitting and Optimal Transport
Proximal Splitting and Optimal Transport
 
Dip3
Dip3Dip3
Dip3
 
Mesh Processing Course : Differential Calculus
Mesh Processing Course : Differential CalculusMesh Processing Course : Differential Calculus
Mesh Processing Course : Differential Calculus
 
Mesh Processing Course : Multiresolution
Mesh Processing Course : MultiresolutionMesh Processing Course : Multiresolution
Mesh Processing Course : Multiresolution
 
Modern features-part-1-detectors
Modern features-part-1-detectorsModern features-part-1-detectors
Modern features-part-1-detectors
 
Centroids moments of inertia
Centroids moments of inertiaCentroids moments of inertia
Centroids moments of inertia
 
Low Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsLow Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse Problems
 
Verification of a brick wang tiling algorithm
Verification of a brick wang tiling algorithmVerification of a brick wang tiling algorithm
Verification of a brick wang tiling algorithm
 
ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...
ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...
ANISOTROPIC SURFACES DETECTION USING INTENSITY MAPS ACQUIRED BY AN AIRBORNE L...
 

Viewers also liked

GSR Edge Detection Vision and Routing Process
GSR  Edge Detection Vision and Routing ProcessGSR  Edge Detection Vision and Routing Process
GSR Edge Detection Vision and Routing ProcessGetech Automation
 
Scale Invariant Feature Transform
Scale Invariant Feature TransformScale Invariant Feature Transform
Scale Invariant Feature Transformkislayabhi
 
Computer Vision: Pattern Recognition
Computer Vision: Pattern RecognitionComputer Vision: Pattern Recognition
Computer Vision: Pattern Recognitionedsfocci
 
A FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISION
A FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISIONA FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISION
A FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISIONMarcos Nieto
 
Feature Matching using SIFT algorithm
Feature Matching using SIFT algorithmFeature Matching using SIFT algorithm
Feature Matching using SIFT algorithmSajid Pareeth
 
Edge detection of video using matlab code
Edge detection of video using matlab codeEdge detection of video using matlab code
Edge detection of video using matlab codeBhushan Deore
 
An Introduction to Computer Vision
An Introduction to Computer VisionAn Introduction to Computer Vision
An Introduction to Computer Visionguestd1b1b5
 
Basics of edge detection and forier transform
Basics of edge detection and forier transformBasics of edge detection and forier transform
Basics of edge detection and forier transformSimranjit Singh
 
Edge Detection algorithm and code
Edge Detection algorithm and codeEdge Detection algorithm and code
Edge Detection algorithm and codeVaddi Manikanta
 
Matlab Feature Extraction Using Segmentation And Edge Detection
Matlab Feature Extraction Using Segmentation And Edge DetectionMatlab Feature Extraction Using Segmentation And Edge Detection
Matlab Feature Extraction Using Segmentation And Edge DetectionDataminingTools Inc
 
General introduction to computer vision
General introduction to computer visionGeneral introduction to computer vision
General introduction to computer visionbutest
 

Viewers also liked (15)

GSR Edge Detection Vision and Routing Process
GSR  Edge Detection Vision and Routing ProcessGSR  Edge Detection Vision and Routing Process
GSR Edge Detection Vision and Routing Process
 
Scale Invariant Feature Transform
Scale Invariant Feature TransformScale Invariant Feature Transform
Scale Invariant Feature Transform
 
Computer Vision: Pattern Recognition
Computer Vision: Pattern RecognitionComputer Vision: Pattern Recognition
Computer Vision: Pattern Recognition
 
Computer Vision
Computer VisionComputer Vision
Computer Vision
 
A FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISION
A FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISIONA FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISION
A FRIENDLY APPROACH TO PARTICLE FILTERS IN COMPUTER VISION
 
Feature Matching using SIFT algorithm
Feature Matching using SIFT algorithmFeature Matching using SIFT algorithm
Feature Matching using SIFT algorithm
 
Edge detection of video using matlab code
Edge detection of video using matlab codeEdge detection of video using matlab code
Edge detection of video using matlab code
 
Feature Extraction
Feature ExtractionFeature Extraction
Feature Extraction
 
An Introduction to Computer Vision
An Introduction to Computer VisionAn Introduction to Computer Vision
An Introduction to Computer Vision
 
Basics of edge detection and forier transform
Basics of edge detection and forier transformBasics of edge detection and forier transform
Basics of edge detection and forier transform
 
Computer Vision
Computer VisionComputer Vision
Computer Vision
 
Edge Detection algorithm and code
Edge Detection algorithm and codeEdge Detection algorithm and code
Edge Detection algorithm and code
 
Edge detection
Edge detectionEdge detection
Edge detection
 
Matlab Feature Extraction Using Segmentation And Edge Detection
Matlab Feature Extraction Using Segmentation And Edge DetectionMatlab Feature Extraction Using Segmentation And Edge Detection
Matlab Feature Extraction Using Segmentation And Edge Detection
 
General introduction to computer vision
General introduction to computer visionGeneral introduction to computer vision
General introduction to computer vision
 

Similar to Test

image segmentation image segmentation.pptx
image segmentation image segmentation.pptximage segmentation image segmentation.pptx
image segmentation image segmentation.pptxNaveenKumar5162
 
Math behind the kernels
Math behind the kernelsMath behind the kernels
Math behind the kernelsRevanth Kumar
 
Chapter10 image segmentation
Chapter10 image segmentationChapter10 image segmentation
Chapter10 image segmentationasodariyabhavesh
 
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTIONEDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTIONcscpconf
 
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTIONEDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTIONcsitconf
 
Module-5-1_230523_171754 (1).pdf
Module-5-1_230523_171754 (1).pdfModule-5-1_230523_171754 (1).pdf
Module-5-1_230523_171754 (1).pdfvikasmittal92
 
Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251Editor IJARCET
 
Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251Editor IJARCET
 
Introduction to Digital image processing
Introduction to Digital image processing Introduction to Digital image processing
Introduction to Digital image processing Sairam Geethanath
 
Notes on image processing
Notes on image processingNotes on image processing
Notes on image processingMohammed Kamel
 
Scale Invariant Feature Tranform
Scale Invariant Feature TranformScale Invariant Feature Tranform
Scale Invariant Feature TranformShanker Naik
 
ImageSegmentation (1).ppt
ImageSegmentation (1).pptImageSegmentation (1).ppt
ImageSegmentation (1).pptNoorUlHaq47
 
ImageSegmentation.ppt
ImageSegmentation.pptImageSegmentation.ppt
ImageSegmentation.pptAVUDAI1
 
ImageSegmentation.ppt
ImageSegmentation.pptImageSegmentation.ppt
ImageSegmentation.pptDEEPUKUMARR
 
Image segmentation
Image segmentation Image segmentation
Image segmentation Amnaakhaan
 
Spatial Filtering in intro image processingr
Spatial Filtering in intro image processingrSpatial Filtering in intro image processingr
Spatial Filtering in intro image processingrkumarankit06875
 

Similar to Test (20)

Test
TestTest
Test
 
Edge detection
Edge detectionEdge detection
Edge detection
 
image segmentation image segmentation.pptx
image segmentation image segmentation.pptximage segmentation image segmentation.pptx
image segmentation image segmentation.pptx
 
Math behind the kernels
Math behind the kernelsMath behind the kernels
Math behind the kernels
 
Image segmentation
Image segmentationImage segmentation
Image segmentation
 
Chapter10 image segmentation
Chapter10 image segmentationChapter10 image segmentation
Chapter10 image segmentation
 
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTIONEDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
 
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTIONEDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
EDGE DETECTION IN RADAR IMAGES USING WEIBULL DISTRIBUTION
 
Module-5-1_230523_171754 (1).pdf
Module-5-1_230523_171754 (1).pdfModule-5-1_230523_171754 (1).pdf
Module-5-1_230523_171754 (1).pdf
 
Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251
 
Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251Ijarcet vol-2-issue-7-2246-2251
Ijarcet vol-2-issue-7-2246-2251
 
Introduction to Digital image processing
Introduction to Digital image processing Introduction to Digital image processing
Introduction to Digital image processing
 
Notes on image processing
Notes on image processingNotes on image processing
Notes on image processing
 
Scale Invariant Feature Tranform
Scale Invariant Feature TranformScale Invariant Feature Tranform
Scale Invariant Feature Tranform
 
ImageSegmentation (1).ppt
ImageSegmentation (1).pptImageSegmentation (1).ppt
ImageSegmentation (1).ppt
 
ImageSegmentation.ppt
ImageSegmentation.pptImageSegmentation.ppt
ImageSegmentation.ppt
 
ImageSegmentation.ppt
ImageSegmentation.pptImageSegmentation.ppt
ImageSegmentation.ppt
 
Image segmentation
Image segmentation Image segmentation
Image segmentation
 
Spatial Filtering in intro image processingr
Spatial Filtering in intro image processingrSpatial Filtering in intro image processingr
Spatial Filtering in intro image processingr
 
Lecture 8
Lecture 8Lecture 8
Lecture 8
 

More from Kinni MEW (16)

Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Test
TestTest
Test
 
Lec1
Lec1Lec1
Lec1
 

Test

  • 1. 2.3 Feature extraction Many computer vision systems rely on first the detection of some features in the images. Image features may be global (e.g. average grey level) or local (e.g. a straight line). They may or may not be associated to scene elements. Different features demand different detection methods. In general, feature extraction results in feature descriptors specifying the locations and properties of the features found in the image. 2.3.1 Edge detection Edges are pixels at or around which the image values undergo a sharp variation, e.g. borders of object. Edge detection is often the first operation in finding interesting scene elements such as lines, contours. Notice that noise can also cause intensity variations. So a good edge detector should find genuine edges generated by scene elements, not by noise. In the detection process, noise will be suppressed without destroying true edges, then edges are enhanced and located. Types of edge Step edge occurs where the image intensity abruptly changes from one value on one side of the discontinuity to a different value on the opposite side. If the intensity change occurs over some pixels, it is a ramp edge. Line edge occurs where the image intensity abruptly changes value but then returns to the starting value within some short distance. If the two changes occur over a finite distance, it is a roof edge. Edge descriptor Edge normal: the direction of the maximum intensity variation at the edge point which is perpendicular to the edge Edge direction: the direction tangent to the edge which is perpendicular to the edge normal Edge position: the image position at which the edge is located, along the edge normal Edge strength: a measure of the intensity variation across the edge 1
  • 2. edge direction edge normal First derivative operator – Sobel edge detector Edge detection is the essentially the operation of detecting significant local intensity changes in an image. Take step edge as an example, it is associated with a local peak in the first derivative. I I’ The first derivative can be measured by the gradient.  ∂I   G x   ∂x  G (I( x , y)) =   =  ∂I  G y     ∂y    The magnitude of the gradient is G (I( x , y)) = G 2 + G 2 . The calculation can also be x y approximated by G (I( x , y)) = G x + G y or G (I( x , y)) = max( G x , G y ) . The direction  Gy  of the gradient with respect to the x axis is ∠G (I( x , y)) = tan −1  G .   x  For digital image, the gradient is measured by discrete approximation. G x ≅ I[i, j + 1] − I[i, j] G y ≅ I[i, j] − I[i + 1, j] The equations can be implemented with convolution masks Gx = -1 1 Gy = 1 1 -1 1 -1 -1 2
  • 3.  1 1 which estimate the gradient at the coordinates i + , j +  . To detect edge, you can  2 2 perform noise filtering before the gradient computation. Genuine edges can be located with sub-pixel resolution simply by thresholding the gradient magnitudes. To avoid having the gradient calculated about an interpolated point between pixels, you can use a 3 x 3 neighborhood. The Sobel edge detector calculates the gradient magnitude by S(I( x , y)) = S 2 + S 2 where Sx and Sy can be implemented using convolution masks x y Sx = -1 0 1 Sy = 1 2 1 -2 0 2 0 0 0 -1 0 1 -1 -2 -1 Second derivative operator – Laplacian of Gaussian Another approach is to locate the zero crossing in the second derivative. I I’ I” The zero crossings can be located by the Laplacian operator. ∂ 2I ∂ 2I ∇ 2 I( x , y ) = 2 + 2 ∂x ∂y Equations of discrete approximation are ∂ 2I = I[i, j + 1] − 2I[i, j] + I[i, j − 1] ∂x 2 ∂2I = I[i + 1, j] − 2I[i, j] + I[i − 1, j] ∂y 2 The Laplacian operator can be implemented using the convolution mask 3
  • 4. ∇2 ≈ 0 1 0 1 -4 1 0 1 0 In many cases, the actual edge location must be determined by interpolation. Edge operator involving two derivatives is affected by noise more than an operator involving a single derivative. A better approach is to combine Gaussian filtering with the second derivative – Laplacian of Gaussian (LoG). Steps in edge detection: • filter out the image noise using Gaussian filter • enhance the edge pixels using 2D Laplacian operator • edge is detected when there is a zero crossing in the second derivative with a corresponding large peak in the first derivative • estimate the edge location with sub-pixel resolution using linear interpolation Some methods apply filtering masks of multiple sizes and locate the edge pixels by analyzing the behavior of edges at different scales of filtering. Canny edge detector There is a trade-off between noise suppression and edge localization. An edge detector can reduce noise by smoothing the image, but this will result in spreading of edges and add uncertainty to the location of edge. An edge detector can have greater sensitivity to the presence of edges, but this will also increase the sensitivity of the detector to noise. The type of linear operator that provides the best compromise between noise immunity and edge localization, while retaining the advantages of Gaussian filtering, is the first derivative of a Gaussian. This operator corresponds to smoothing an image with a Gaussian function and then computing the gradient. The operator is not rotationally symmetric – it is symmetric along the edge direction and anti-symmetric along the edge normal. The Canny edge detector is the first derivative of a Gaussian and closely approximates the operator that optimizes the product of SNR and edge localization. Steps in edge detection: • edge enhancement • non-maximum suppression • hysteresis thresholding edge enhancement: Apply Gaussian smoothing to the image. 
Compute the gradient of the smoothed image and estimate the magnitude and orientation of the gradient.

S[i, j] = G[i, j; σ] ∗ I[i, j]
where G is a Gaussian with zero mean and standard deviation σ. The gradient components are computed by

P[i, j] = (S[i, j + 1] − S[i, j] + S[i + 1, j + 1] − S[i + 1, j]) / 2
Q[i, j] = (S[i, j] − S[i + 1, j] + S[i, j + 1] − S[i + 1, j + 1]) / 2

The magnitude of the gradient is

M[i, j] = √(P[i, j]² + Q[i, j]²)

and the orientation of the gradient is

θ[i, j] = tan⁻¹(Q[i, j] / P[i, j])

non-maximum suppression:
M[i, j] may contain wide ridges around the local maxima. This step thins such ridges to produce 1-pixel wide edges: values of M[i, j] along the edge normal that are not peaks are suppressed. Let the possible edge normal orientations be quantized into four, e.g. 0°, 45°, 90° and 135° with respect to the horizontal axis. For each pixel (i, j), find the quantized orientation that best approximates θ[i, j]. If M[i, j] is smaller than at least one of its two neighbors along that orientation, set M[i, j] to zero.

hysteresis thresholding:
Even after non-maximum suppression, M[i, j] may still contain local maxima created by noise. With a single threshold, some false edges remain if the threshold is set too low, while some true edges are deleted if it is set too high. An effective scheme is to use two thresholds τl and τh, e.g. τh = 2τl. Scan the non-zero points of M[i, j] in a fixed order. If M[i, j] is larger than τh, mark (i, j) as an edge pixel; then mark as edge pixels any 8-neighbors of (i, j) whose gradient magnitude is larger than τl, and continue following the chain until the magnitude drops below τl. The Canny edge detector therefore performs edge linking as a by-product.

Final words
Edge detection has been one of the most popular research areas since the early days of computer vision, and there is a large literature on it. A specific edge detection method can be very useful for a particular computer vision application; however, a universal edge detector remains to be seen.
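The three Canny stages described above can be sketched in code. The following is a minimal illustration, assuming Python with NumPy (the notes do not prescribe any implementation language); the smoothing scale σ and the thresholds τl, τh are arbitrary example values, and the gradient uses the 2 × 2 differences P and Q given above.

```python
import numpy as np

def gaussian_kernel(sigma):
    """Discrete zero-mean Gaussian mask, truncated at 3 sigma and normalised."""
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1)
    g = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()

def canny(image, sigma, tau_low, tau_high):
    """Sketch of the three Canny stages: enhancement, NMS, hysteresis."""
    g = gaussian_kernel(sigma)
    # edge enhancement: S = G * I via separable Gaussian smoothing ...
    s = np.apply_along_axis(np.convolve, 1, image.astype(float), g, mode="same")
    s = np.apply_along_axis(np.convolve, 0, s, g, mode="same")
    # ... then the 2x2 finite-difference gradient components P and Q
    p = (s[:-1, 1:] - s[:-1, :-1] + s[1:, 1:] - s[1:, :-1]) / 2.0
    q = (s[:-1, :-1] - s[1:, :-1] + s[:-1, 1:] - s[1:, 1:]) / 2.0
    m = np.hypot(p, q)                            # gradient magnitude M
    theta = np.degrees(np.arctan2(q, p)) % 180.0  # orientation folded to [0, 180)
    # non-maximum suppression: keep M only where it peaks along the edge
    # normal, with the normal quantised to 0, 45, 90 or 135 degrees
    offsets = {0: (0, 1), 45: (1, 1), 90: (1, 0), 135: (1, -1)}
    h, w = m.shape
    nms = np.zeros_like(m)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            a = min(offsets, key=lambda k: min(abs(theta[i, j] - k),
                                               180 - abs(theta[i, j] - k)))
            di, dj = offsets[a]
            if m[i, j] >= m[i + di, j + dj] and m[i, j] >= m[i - di, j - dj]:
                nms[i, j] = m[i, j]
    # hysteresis thresholding: seed chains above tau_high, then grow them
    # through 8-neighbours still above tau_low (edge linking as a by-product)
    edges = np.zeros(m.shape, dtype=bool)
    stack = [(i, j) for i, j in zip(*np.nonzero(nms > tau_high))]
    while stack:
        i, j = stack.pop()
        if edges[i, j]:
            continue
        edges[i, j] = True
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and nms[ni, nj] > tau_low:
                    stack.append((ni, nj))
    return edges
```

On a synthetic vertical intensity step, this sketch yields a single 1-pixel-wide vertical line of edge pixels, illustrating the thinning effect of non-maximum suppression.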
2.3.2 Corner detection
Corners are quite stable across sequences of images. They are interesting features that can be employed for tracking objects across image sequences. Consider an image point p, a neighborhood Q of p, and a matrix C defined as
C =  ΣQ Gx²     ΣQ Gx Gy
     ΣQ Gx Gy   ΣQ Gy²

where the sums are taken over the neighborhood Q. Since C is symmetric, it can be diagonalized as

C =  λ1   0
      0  λ2

The two eigenvalues λ1 and λ2 are non-negative. A corner is located in Q where λ1 ≥ λ2 > 0 and λ2 is large enough.

Steps in corner detection:
• compute the image gradient
• for each image point p, form the matrix C over a neighborhood Q of p, compute the smaller eigenvalue λ2 of C, and if λ2 > threshold τ, save the coordinates of p into a list L
• sort L in decreasing order of λ2
• scan L from top to bottom; for each corner point p, delete the neighboring corner points further down in L

References
E. Trucco & A. Verri, Introductory Techniques for 3-D Computer Vision, Prentice Hall, 1998, Chapter 4.
R. Jain, R. Kasturi & B. G. Schunck, Machine Vision, McGraw-Hill, 1995, Chapter 5.

Summary
♦ first derivative edge detector
♦ second derivative edge detector
♦ Canny edge detector
♦ corner detection
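To close, the corner detection steps of Section 2.3.2 can be sketched in code. This is a minimal illustration assuming Python with NumPy (the notes give no code); the neighborhood half-width `window`, the threshold τ, and the minimum separation `sep` used when deleting neighboring corners are illustrative choices, not values from the notes.

```python
import numpy as np

def corners(image, window=2, tau=1.0, sep=3):
    """Smaller-eigenvalue corner detector following the four steps above."""
    img = image.astype(float)
    # step 1: compute the image gradient (central differences)
    gy, gx = np.gradient(img)
    h, w = img.shape
    pts = []
    # step 2: for each point p, form C over the neighborhood Q and keep p
    # in a list L if the smaller eigenvalue lambda_2 exceeds tau
    for i in range(window, h - window):
        for j in range(window, w - window):
            qx = gx[i - window:i + window + 1, j - window:j + window + 1]
            qy = gy[i - window:i + window + 1, j - window:j + window + 1]
            c = np.array([[np.sum(qx * qx), np.sum(qx * qy)],
                          [np.sum(qx * qy), np.sum(qy * qy)]])
            lam2 = np.linalg.eigvalsh(c)[0]  # eigenvalues in ascending order
            if lam2 > tau:
                pts.append((lam2, i, j))
    # step 3: sort L in decreasing order of lambda_2
    pts.sort(reverse=True)
    # step 4: scan L top to bottom, deleting weaker neighboring corners
    kept = []
    for lam2, i, j in pts:
        if all(abs(i - ki) > sep or abs(j - kj) > sep for _, ki, kj in kept):
            kept.append((lam2, i, j))
    return [(i, j) for _, i, j in kept]
```

On a bright square against a dark background, the sketch returns one point near each of the four corners of the square, with the edges and the flat interior rejected because their smaller eigenvalue is close to zero.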