The document summarizes concepts related to intensity transformations and spatial filtering of digital images. It discusses two categories of spatial processing: intensity transformations that operate on single pixels, and spatial filtering that operates on pixel neighborhoods. Intensity transformations covered include contrast manipulation, thresholding, negatives, logarithmic and power-law transforms. Spatial filtering examples given are image smoothing and sharpening. The document also explains histogram processing and equalization.
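The negatives, log, and power-law transforms mentioned above can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the summarized document; the function names and scaling choices are my own:

```python
import numpy as np

def negative(img, L=256):
    """Image negative: s = (L - 1) - r."""
    return (L - 1) - img

def log_transform(img, L=256):
    """Log transform s = c * log(1 + r), with c chosen so the
    maximum level L - 1 maps back to L - 1."""
    c = (L - 1) / np.log(L)
    return c * np.log1p(img.astype(float))

def gamma_transform(img, gamma, L=256):
    """Power-law (gamma) transform: s = (L - 1) * (r / (L - 1)) ** gamma."""
    r = img.astype(float) / (L - 1)
    return (L - 1) * r ** gamma
```

Each function implements one of the point transforms named in the summary; a gamma below 1 brightens an image, a gamma above 1 darkens it.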
The document discusses key concepts in digital image processing including array vs matrix operations, linear vs nonlinear operations, and arithmetic and logical operations. Array operations are performed on a pixel-by-pixel basis while matrix operations use matrix theory. A linear operator satisfies additivity and homogeneity, processing the sum and scaling of inputs the same as individual inputs summed or scaled. Logical operations like AND, OR, and NOT are applied to binary images.
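The additivity-and-homogeneity test for linearity described above can be checked numerically. The sketch below (my own illustration, assuming NumPy) shows that summation passes the test while the max operator fails it:

```python
import numpy as np

def is_linear(H, f, g, a=2.0, b=3.0):
    """Check H(a*f + b*g) == a*H(f) + b*H(g) on sample inputs f and g."""
    return np.allclose(H(a * f + b * g), a * H(f) + b * H(g))

f = np.array([[1.0, 4.0], [2.0, 3.0]])
g = np.array([[5.0, 0.0], [1.0, 2.0]])

print(is_linear(np.sum, f, g))   # summation is linear
print(is_linear(np.max, f, g))   # max is nonlinear
```

A single pair of sample inputs cannot prove linearity in general, but one failing pair is enough to prove nonlinearity.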
Unit 3 discusses image segmentation techniques. Similarity based techniques group similar image components, like pixels or frames, for compact representation. Common applications include medical imaging, satellite images, and surveillance. Methods include thresholding and k-means clustering. Segmentation of grayscale images is based on discontinuities in pixel values, detecting edges, or similarities using thresholding, region growing, and splitting/merging. Region growing starts with seed pixels and groups neighboring pixels with similar properties. Region splitting starts with the full image and divides non-homogeneous regions, while region merging combines small similar regions.
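The region-growing idea described above (start from seed pixels, absorb similar neighbors) can be sketched as a breadth-first search. This is a minimal illustration under my own assumptions (4-connectivity, similarity measured against the seed intensity with a fixed tolerance), not the document's algorithm:

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10):
    """Grow a region from `seed`, adding 4-connected neighbours whose
    intensity is within `tol` of the seed value."""
    h, w = img.shape
    seed_val = int(img[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(int(img[ny, nx]) - seed_val) <= tol):
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask
```

On an image with two clearly separated intensity regions, a seed in one region yields a mask covering only that region.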
This document summarizes techniques for least mean square filtering and geometric transformations. It discusses minimum mean square error (Wiener) filtering, constrained least squares filtering, and geometric mean filtering for noise removal. It also covers spatial transformations, nearest neighbor gray level interpolation, and bilinear interpolation for geometric correction of distorted images. Examples are provided to demonstrate geometric distortion, nearest neighbor interpolation, and bilinear transformation.
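The bilinear interpolation mentioned above weights the four surrounding pixels by their distance to the query point. A minimal sketch (my own, assuming NumPy and a single-channel image):

```python
import numpy as np

def bilinear(img, y, x):
    """Bilinearly interpolate the intensity at non-integer location (y, x)."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1 = min(y0 + 1, img.shape[0] - 1)
    x1 = min(x0 + 1, img.shape[1] - 1)
    dy, dx = y - y0, x - x0
    top = (1 - dx) * img[y0, x0] + dx * img[y0, x1]
    bot = (1 - dx) * img[y1, x0] + dx * img[y1, x1]
    return (1 - dy) * top + dy * bot
```

Nearest-neighbor interpolation would instead simply round (y, x) to the closest pixel; bilinear gives smoother results at slightly higher cost.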
Here the presentation gives a detailed description of image enhancement techniques, including basic gray-level transformations, histogram processing, enhancement using arithmetic/logic operations, image averaging methods, and piecewise-linear transformation functions.
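Contrast stretching, the most common piecewise-linear transformation, maps intensities through line segments joined at control points. A hedged NumPy sketch (the function name and control-point convention are my own):

```python
import numpy as np

def contrast_stretch(img, r1, s1, r2, s2, L=256):
    """Piecewise-linear contrast stretching through control points
    (r1, s1) and (r2, s2), with (0, 0) and (L-1, L-1) as endpoints."""
    r = img.astype(float)
    out = np.empty_like(r)
    lo = r < r1
    hi = r > r2
    mid = ~lo & ~hi
    out[lo] = (s1 / r1) * r[lo] if r1 > 0 else s1
    out[mid] = s1 + (s2 - s1) / (r2 - r1) * (r[mid] - r1)
    out[hi] = s2 + (L - 1 - s2) / (L - 1 - r2) * (r[hi] - r2)
    return out
```

Choosing a steep middle segment (s2 - s1 much larger than r2 - r1) expands the contrast of the mid-range intensities at the expense of the extremes.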
This document discusses various mathematical tools used in digital image processing (DIP), including array versus matrix operations, linear versus nonlinear operations, arithmetic operations, set and logical operations, spatial operations, vector and matrix operations, and image transforms. Key points include:
- Array operations are performed on a pixel-by-pixel basis, while matrix operations consider relationships between pixels.
- Linear operators preserve scaling and addition properties, while nonlinear operators like max do not.
- Spatial operations include single-pixel, neighborhood, and geometric transformations of pixel locations and intensities.
- Images can be represented as vectors and transformed using matrix operations.
- Common transforms like Fourier use separable, symmetric kernels to decompose images into frequency domains.
Lecture 13: Usage of Fourier transform in image processing (Varun Kumar)
This document discusses the Fourier transform and its applications in image processing. It begins by explaining the Fourier transform for 1D and 2D continuous and discrete signals. The Fourier transform converts a signal from the time or space domain to the frequency domain. It then covers properties of the Fourier transform such as separability and translation. The document concludes by mentioning references for further reading on image processing and computer vision topics.
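The separability property mentioned above can be verified directly: the 2D DFT equals a 1D DFT applied along one axis followed by a 1D DFT along the other. A short NumPy check (my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((4, 4))

# Separability: fft2 equals row-wise then column-wise 1D FFTs.
F2 = np.fft.fft2(img)
F_sep = np.fft.fft(np.fft.fft(img, axis=0), axis=1)
print(np.allclose(F2, F_sep))

# The DC coefficient F[0, 0] is the sum of all pixel values.
print(np.isclose(F2[0, 0], img.sum()))
```

Separability is what makes the 2D transform affordable: two passes of 1D FFTs instead of a direct 2D computation.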
Spatial domain filtering modifies an image by applying a filter, or kernel, to the pixels within a neighborhood region. There are two main types of spatial filters: smoothing (low-pass) filters, which blur an image, and sharpening (high-pass) filters, which enhance edges and details. Smoothing filters replace each pixel value with an average of its neighbors, reducing noise. Sharpening filters use derivative-based kernels (such as derivatives of a Gaussian) to highlight areas of rapid intensity change, increasing contrast along edges. The effect of filtering depends on the size and shape of the kernel, with larger kernels producing stronger blurring or sharpening.
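The smoothing and sharpening filters described above can be illustrated with a naive neighborhood filter. This is a sketch under my own assumptions (valid-region output, symmetric kernels, for which correlation and convolution coincide):

```python
import numpy as np

def convolve2d(img, kernel):
    """Minimal 'valid'-mode neighborhood filtering: slide the kernel
    over the image and take the weighted sum at each position."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(img[y:y + kh, x:x + kw] * kernel)
    return out

box = np.full((3, 3), 1 / 9)            # smoothing (low-pass) kernel
laplacian = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]])      # sharpening (high-pass) kernel
```

On a perfectly flat region the box filter returns the same value and the Laplacian returns zero, which is exactly the low-pass/high-pass distinction the summary draws.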
1. Image restoration aims to reconstruct or recover an image that has been distorted by known degradation processes.
2. Degradation can occur during image acquisition, display, or processing due to factors like sensor noise, blurring, motion, or atmospheric effects.
3. Restoration techniques model the degradation process and apply the inverse to estimate the original undistorted image. The accuracy of the estimate depends on how well the degradation is modeled.
The document discusses digital image processing and provides an overview of key concepts. It defines digital and analog images and explains how digital images are represented by pixels. It outlines fundamental steps in digital image processing like image acquisition, enhancement, restoration, morphological processing, segmentation, representation, compression and object recognition. It also discusses applications in areas like remote sensing, medical imaging, film and video effects.
The document discusses the fundamental steps in digital image processing. It describes 7 key steps: (1) image acquisition, (2) image enhancement, (3) image restoration, (4) color image processing, (5) wavelets and multiresolution processing, (6) image compression, and (7) morphological processing. For each step, it provides brief explanations of the techniques and purposes involved in digital image processing.
The document discusses image restoration techniques. It introduces common image degradation models and noise models encountered in imaging. Spatial and frequency domain filtering methods are described for restoration when the degradation is additive noise. Adaptive median filtering and frequency domain filtering techniques like bandreject, bandpass and notch filters are explained for periodic noise removal. Optimal filtering methods like Wiener filtering that minimize mean square error are also covered. The document provides an overview of key concepts and methods in image restoration.
Digital Image Processing covers intensity transformations that can be performed on images. These include basic transformations like negatives, log transformations, and power-law transformations. It also discusses image histograms, which measure the frequency of each intensity level in an image. Histogram equalization aims to improve contrast by mapping intensities to produce a uniform histogram. It works by spreading out the most frequent intensity values.
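The equalization procedure sketched above (map each intensity through the scaled cumulative histogram) fits in a few lines. An illustrative NumPy sketch, not the document's code:

```python
import numpy as np

def equalize(img, L=256):
    """Histogram equalization: map each level through the scaled CDF,
    s_k = (L - 1) * sum_{j <= k} p(r_j)."""
    hist = np.bincount(img.ravel(), minlength=L)
    cdf = np.cumsum(hist) / img.size
    lut = np.round((L - 1) * cdf).astype(np.uint8)
    return lut[img]
```

Frequent intensity values accumulate probability quickly, so the lookup table spreads them over a wider output range, which is where the contrast improvement comes from.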
This document discusses image enhancement techniques in the spatial domain. It begins by introducing intensity transformations and spatial filtering as the two principal categories of spatial domain processing. It then describes the basics of intensity transformations, including how they directly manipulate pixel values in an image. The document focuses on different types of basic intensity transformation functions such as image negation, log transformations, power law transformations, and piecewise linear transformations. It provides examples of how these transformations can be used to enhance images. Finally, it discusses histogram processing and how the histogram of an image provides information about the distribution of pixel intensities.
This document summarizes a lecture on image enhancement through histogram specification. The lecture discusses performing histogram equalization on an input image to match the histogram of a target image by mapping the pixel values. Questions about histogram specification or equalization are welcomed at the end.
This document provides an overview of mathematical morphology and its applications to image processing. Some key points:
- Mathematical morphology uses concepts from set theory and uses structuring elements to probe and extract image properties. It provides tools for tasks like noise removal, thinning, and shape analysis.
- Basic operations include erosion, dilation, opening, and closing. Erosion shrinks objects while dilation expands them. Opening and closing combine these to smooth contours or fill gaps.
- Hit-or-miss transforms allow detecting specific shapes. Skeletonization reduces objects to 1-pixel wide representations.
- Morphological operations can be applied to binary or grayscale images. Structuring elements specify the neighborhood of pixels each operation examines.
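The basic operations listed above can be sketched for binary images with a naive sliding-window implementation (my own illustration; real libraries use much faster formulations, and dilation with a symmetric structuring element coincides with the reflected-element definition used here):

```python
import numpy as np

def erode(img, se):
    """Binary erosion: 1 where the structuring element, centred on the
    pixel, fits entirely inside the foreground."""
    ph, pw = se.shape[0] // 2, se.shape[1] // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            window = padded[y:y + se.shape[0], x:x + se.shape[1]]
            out[y, x] = np.all(window[se == 1] == 1)
    return out

def dilate(img, se):
    """Binary dilation: 1 where the structuring element hits at least
    one foreground pixel."""
    ph, pw = se.shape[0] // 2, se.shape[1] // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            window = padded[y:y + se.shape[0], x:x + se.shape[1]]
            out[y, x] = np.any(window[se == 1] == 1)
    return out

def opening(img, se):
    return dilate(erode(img, se), se)

def closing(img, se):
    return erode(dilate(img, se), se)
```

Eroding a 3x3 block with a 3x3 structuring element leaves only its centre pixel, and opening then restores the block, which matches the shrink/expand/smooth behaviour described in the bullet points.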
1. The document discusses image processing in the frequency domain, which involves transforming an image into its frequency distribution using mathematical operators called transformations like the Fourier transform.
2. The Fourier transform decomposes an image into its frequency components, which can be divided into high frequency components corresponding to edges and low frequency components corresponding to smooth regions.
3. An example computes the 2D discrete Fourier transform of a toy image, using zero-padding to sample the spectrum more finely, and uses the fftshift function to center the DC coefficient when visualizing the transformed image.
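The zero-padding and fftshift steps named in point 3 look like this in NumPy (my own toy reproduction, not the document's example; the padded size is arbitrary):

```python
import numpy as np

img = np.array([[1.0, 1.0], [1.0, 1.0]])   # tiny "toy" image

# Zero-pad to 8x8 before the DFT: this samples the same underlying
# spectrum more finely (it does not add new information).
F = np.fft.fft2(img, s=(8, 8))

# fftshift moves the DC coefficient from the corner to the centre,
# which is the conventional layout for visualizing spectra.
Fs = np.fft.fftshift(F)
print(np.abs(F[0, 0]))    # DC term (sum of pixels) at the corner
print(np.abs(Fs[4, 4]))   # after fftshift, DC sits at the centre
```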
This document discusses various techniques for image enhancement in the frequency domain. It describes three types of low-pass filters for smoothing images: ideal low-pass filters, Butterworth low-pass filters, and Gaussian low-pass filters. It also discusses three corresponding types of high-pass filters for sharpening images: ideal high-pass filters, Butterworth high-pass filters, and Gaussian high-pass filters. The key steps in frequency domain filtering are also summarized.
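The Gaussian low-pass filter and the frequency-domain filtering steps summarized above can be sketched as follows (an illustration under my own assumptions: centred transfer function, even image dimensions):

```python
import numpy as np

def gaussian_lowpass(shape, d0):
    """Gaussian low-pass transfer function H(u,v) = exp(-D^2 / (2*D0^2)),
    with D the distance from the (centred) DC coefficient."""
    h, w = shape
    u = np.arange(h) - h // 2
    v = np.arange(w) - w // 2
    D2 = u[:, None] ** 2 + v[None, :] ** 2
    return np.exp(-D2 / (2 * d0 ** 2))

def filter_freq(img, H):
    """Frequency-domain filtering: FFT, centre, multiply by H,
    un-centre, inverse FFT, keep the real part."""
    F = np.fft.fftshift(np.fft.fft2(img))
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
```

A high-pass counterpart is simply `1 - gaussian_lowpass(shape, d0)`; the ideal and Butterworth variants differ only in the shape of H.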
This presentation contains concepts of different image restoration and reconstruction techniques used nowadays in the field of digital image processing. Slides are prepared from Gonzalez book and Pratt book.
This document discusses image enhancement techniques in the spatial domain. It describes two categories of spatial domain operations: point processing and neighborhood processing. Point processing involves direct manipulation of pixel values through techniques like contrast stretching and thresholding. Neighborhood processing considers pixels in a local region and applies techniques like averaging filters. The document outlines several gray level transformations for enhancement, including logarithmic, power-law, piecewise linear, and bit-plane slicing transformations. It also discusses arithmetic and logic operations on images.
Image enhancement techniques can be used to improve image visual appearance and analysis by accentuating features like edges and boundaries. There are several techniques including:
1. Point operations like contrast stretching and thresholding that modify pixel values.
2. Spatial operations like noise smoothing and sharpening that apply neighborhood pixel averaging or differencing.
3. Transform domain techniques like filtering in the frequency domain to accelerate operations like noise removal.
4. Edge enhancement methods like the pyramid approach, which detects edges across multiple image scales to isolate significant edges.
Linear filters like averaging and Gaussian filters can remove grain noise by averaging pixel values in a neighborhood. Median filters are better at removing outliers without reducing sharpness, because they set each pixel to the median value in its neighborhood. The document demonstrates applying averaging and median filters in Matlab to remove noise, and using morphological opening to estimate and subtract a nonuniform background illumination.
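The document's demonstration is in Matlab; the same median-filtering idea can be sketched in NumPy (my own translation, not the original code). An isolated outlier vanishes because it never reaches the middle of a sorted 3x3 window:

```python
import numpy as np

def median_filter(img, k=3):
    """k x k median filter with edge-replication padding."""
    p = k // 2
    padded = np.pad(img, p, mode='edge')
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(padded[y:y + k, x:x + k])
    return out
```

An averaging filter applied to the same outlier would smear it across the neighborhood instead of removing it, which is the contrast the summary draws.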
Morphological image processing uses mathematical morphology tools to extract image components and describe shapes. Some key tools include binary erosion and dilation, which thin and thicken objects. Erosion shrinks objects while dilation grows them. Opening and closing are combinations of erosion and dilation that smooth contours or fill gaps. The hit-or-miss transform detects shapes by requiring matches of foreground and background pixels. Other algorithms include boundary extraction, hole filling, and thinning to find skeletons, which are medial axes of object shapes.
This document discusses digital image processing and degradation models. It covers several key points:
1. Degradation models can be represented by a linear operator H that acts on an input image f(x,y) to produce a degraded output g(x,y).
2. Common noise models include those with different spatial and intensity characteristics like Gaussian, Rayleigh, and impulse noise.
3. Noise can be removed through spatial or frequency domain filtering methods like mean filters and Fourier transforms.
4. The quality of de-noising can be evaluated using metrics like peak signal-to-noise ratio (PSNR) or visual perception.
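The PSNR metric from point 4 is a one-liner once the mean squared error is computed. A minimal sketch (assuming 8-bit images with peak value 255):

```python
import numpy as np

def psnr(original, restored, max_val=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE)."""
    mse = np.mean((original.astype(float) - restored.astype(float)) ** 2)
    if mse == 0:
        return float('inf')          # identical images
    return 10 * np.log10(max_val ** 2 / mse)
```

Higher values indicate better de-noising; identical images give infinite PSNR, and a worst-case full-range error gives 0 dB.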
Lec 07: Image enhancement in frequency domain I (Ali Hassan)
The document discusses digital image processing and image enhancement in the frequency domain. It provides background on Fourier series and Fourier transforms, explaining that Fourier transforms allow representing even non-periodic functions as integrals of sines and cosines. The Fourier transform converts a signal from the time domain to the frequency domain. Two-dimensional Fourier transforms are used in image processing for applications like image enhancement, restoration, and encoding/decoding. The document also outlines the formulas for one-dimensional and two-dimensional discrete Fourier transforms and their inverses.
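For reference, the two-dimensional discrete Fourier transform pair that the document outlines is conventionally written as follows (standard textbook form, not copied from the summarized slides):

```latex
% 2D DFT of an M x N image f(x, y)
F(u,v) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x,y)\, e^{-j 2\pi \left( ux/M + vy/N \right)}

% inverse 2D DFT
f(x,y) = \frac{1}{MN} \sum_{u=0}^{M-1} \sum_{v=0}^{N-1} F(u,v)\, e^{j 2\pi \left( ux/M + vy/N \right)}
```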
The document discusses image restoration techniques. It describes how images can become degraded through phenomena like motion, improper camera focusing, and noise. The goal of image restoration is to recover the original high quality image from its degraded version using knowledge about the degradation process and types of noise. Common noise models include Gaussian, Rayleigh, Erlang, exponential, and impulse noise. Filtering techniques like mean, order statistics, and adaptive filters can be used for restoration by smoothing the image while preserving edges. The adaptive filters change based on local image statistics to better reduce noise with less blurring than regular filters.
Chapter 2. Digital Image Fundamentals.pdf (DngThanh44)
This document discusses digital image fundamentals including image sensing and acquisition, sampling and quantization, relationships between pixels, and basic mathematical tools used in digital image processing. Specifically, it covers how images are captured using single sensing elements, line sensors and sensor arrays. It also describes how continuous images are converted to digital images through sampling and quantization of coordinate and intensity values. Key concepts covered include spatial and intensity resolution, image interpolation methods, and definitions of pixel neighbors, regions, boundaries and connectivity in digital images.
Unveiling the Details: Image Enhancement in Spatial and Frequency Domain
Digital images are an integral part of our modern world, used in everything from capturing memories to scientific analysis. However, these images often require some adjustments to improve their visual quality and information content. Image enhancement techniques address this need, employing various methods to manipulate the pixels within an image for better visualization and analysis. This explanation delves into two fundamental approaches for image enhancement: spatial domain processing and frequency domain processing.
Spatial Domain Processing: Direct Pixel Manipulation
Spatial domain processing deals with directly modifying the pixel values in an image based on their location (x, y coordinates) and intensity level. These techniques operate directly on the image itself, making them relatively simple to implement and understand. Common spatial domain methods include:
Histogram Processing:
The histogram of an image depicts the distribution of pixel intensities across the grayscale range (0-255 for 8-bit images). Techniques like histogram equalization manipulate this distribution to improve contrast by stretching or compressing the intensity values.
Point Operations:
These operations modify the intensity value of each pixel based on a mathematical formula. Examples include:
Negation: Produces the image negative by inverting each intensity value (255 - original value).
Logarithmic transformation: Emphasizes low-intensity regions by compressing the dynamic range.
Power-law (gamma) transformation: Adjusts overall contrast; gamma < 1 brightens the image and expands its dark range, while gamma > 1 darkens it and expands its bright range.
Neighbourhood Operations:
These techniques modify the pixel value based on its surrounding pixels (neighborhood). Examples include:
Smoothing: Filters like averaging and Gaussian blur replace a pixel's value with the average of its neighbors, reducing noise and blurring sharp edges.
Sharpening: Filters like Laplacian and Sobel highlight image edges by accentuating intensity differences between neighboring pixels. This can amplify noise as well.
Thresholding:
Converts a grayscale image into a binary image (black and white) by selecting a threshold intensity value. Pixels above the threshold become white, and those below become black. This is useful for object segmentation in an image.
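Global thresholding as described above is a single vectorized comparison. A minimal sketch (the function name and threshold value are my own):

```python
import numpy as np

def threshold(img, t):
    """Global thresholding: pixels above t become white (255),
    all others become black (0)."""
    return np.where(img > t, 255, 0).astype(np.uint8)
```

The hard part in practice is choosing t; techniques such as Otsu's method pick it automatically from the histogram.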
Morphological Operations:
These techniques involve manipulating the shapes of objects in an image by applying structuring elements like squares or lines. Common operations include erosion (shrinking objects) and dilation (expanding objects).
Spatial domain techniques offer efficient and intuitive image enhancement. However, they may struggle with complex noise patterns or require extensive manual parameter tuning to achieve optimal results.
Negation: Inverts the image by reversing the intensity values (255 - original value).
Logarithmic transformation: Emphasizes low-intensity regions by compressing the dynamic range.
Power-law transformation: Adjusts the image's overall contrast by increasing (gamma > 1) or decreasing (gamma < 1) the dynamic range.
Neighbourhood Operations:
These techniques modify the pixel value based on its surrounding pixels (neighborhood). Examples include:
Smoothing: Filters like averaging and Gaussian blur replace a pixel's value with the average of its neighbors, reducing noise and blurring sharp edges.
Sharpening: Filters like Laplacian and Sobel highlight image edges by accentuating intensity differences between neighboring pixels. This can amplify noise as well.
Thresholding:
Converts a grayscale image into a binary image (black and white) by selecting a threshold intensity value. Pixels above the threshold become white, and those below become black. This is useful for object segmentation in an image.
Morphological Operations:
These techniques involve manipulating the shapes of objects in an image by applying structuring elements like squares or lines. Common operations include erosion (shrinking objects) and dilation (expanding objects).
Spatial domain techniques offer efficient and intuitive image enhancement. However, they may struggle with complex noise patterns or require extensive manual parameter tuning to achieve optimal results.
This document discusses single object tracking and velocity determination. It begins with an introduction and objectives of the project which is to develop an algorithm for tracking a single object and determining its velocity in a sequence of video frames. It then provides details on preprocessing techniques like mean filtering, Gaussian smoothing and median filtering to reduce noise. It describes segmentation methods including histogram-based, single Gaussian background and frame difference approaches. Feature extraction methods like edges, bounding boxes and color are explained. Object detection using optical flow and block matching is covered. Finally, it discusses tracking and calculating velocity of the moving object. MATLAB is introduced as a technical computing language for solving these types of problems.
Digital images are formed through a process of image acquisition, sampling and quantization. An image is represented as a 2D array of pixels, with each pixel having an intensity or color value. The spatial resolution and pixel depth determine how much detail can be preserved from the original scene. A higher resolution captures more details but results in larger file sizes. The human eye can distinguish between intensities that differ by around 5% and can detect spatial frequencies up to about 60 cycles/degree. Digital image processing techniques are needed to enhance, analyze and compress digital images for different applications.
AN EFFICIENT FEATURE EXTRACTION AND CLASSIFICATION OF HANDWRITTEN DIGITS USIN... (IJCSEA Journal)
The wide range of shape variations in handwritten digits requires an adequate representation of the discriminating features for classification. Recognition of characters or numerals requires the pixel values of a normalized raster image and proper features to reach a very good classification rate. This paper primarily concerns the problem of isolated handwritten numeral recognition for English scripts. A Multilayer Perceptron (MLP) classifier is used for classification. The principal contributions presented here are preprocessing, feature extraction, and the MLP classifier. The strength of the approach is efficient feature extraction and a comprehensive classification scheme, with which a recognition rate of 95.6%, better than previous approaches, is achieved.
This document provides an agenda and overview of topics related to intensity transformations and spatial filtering for image enhancement. It discusses piecewise-linear transformation functions including contrast stretching, intensity-level slicing, and bit-plane slicing. It also covers histogram processing techniques such as histogram equalization, histogram matching, and using histogram statistics. Finally, it outlines fundamentals of spatial filtering including the mechanics of spatial filtering, spatial correlation and convolution, and generating smoothing and sharpening spatial filters.
3 intensity transformations and spatial filtering slides (BHAGYAPRASADBUGGE)
This document discusses basics of intensity transformations and spatial filtering of digital images. It covers the following key points:
- Intensity transformations map input pixel intensities to output intensities using an operator T. Common transformations include log, power-law, and piecewise-linear functions.
- Spatial filters operate on neighborhoods of pixels. Linear filters perform averaging or correlation while non-linear filters use ordering like median.
- Basic filters include smoothing to reduce noise, sharpening to enhance edges using Laplacian or unsharp masking, and gradient for edge detection.
- Fuzzy set theory can be applied to intensity transformations by defining membership functions for concepts like dark/bright. It can also be used for spatial filtering by defining
Digital images can be represented as multidimensional arrays of pixel values. Each pixel is associated with an intensity value or color vector. Digital images are formed through discrete sampling and quantization of a continuous scene. Key aspects of digital image representation include spatial resolution, gray level resolution, and image type (e.g. binary, RGB, intensity). The connectivity and adjacency of pixels are important concepts for analyzing image properties and structures.
This document compares the performance of image restoration techniques in the time and frequency domains. It proposes a new algorithm to denoise images corrupted by salt and pepper noise. The algorithm replaces noisy pixel values within a 3x3 window with a weighted median based on neighboring pixels. It applies filters like CLAHE, average, Wiener and median filtering before the proposed algorithm to further remove noise. Experimental results on test images show the proposed method achieves better noise removal compared to other techniques, with around a 60% increase in PSNR and 90% reduction in MSE. In conclusion, the proposed algorithm is effective at restoring images with high density salt and pepper noise.
This document provides information about a course on biomedical image processing. It lists the lecturer, Dr. Haris Masood, and contact information. It outlines the course assessment criteria including quizzes, assignments, a midterm, and final test. It introduces key concepts in digital image processing like image enhancement, restoration, compression, and segmentation. It also provides an overview of the historical development and applications of digital image processing.
A new gridding technique for high density microarray (Alexander Decker)
This document describes a new gridding technique for high density microarray images. [1] The technique uses the intensity projection profile of the most suitable subimage to locate subarrays and individual spots without any user input parameters. [2] It is capable of processing images with irregular spots, varying surface intensity, and over 50% contamination. [3] The key steps are preprocessing the image, then using horizontal and vertical intensity projection profiles of the preprocessed image to estimate global parameters for locating subarrays, and local parameters for locating individual spots within each subarray.
Digital Ortho Image Creation of Hall County Aerial Photos (mpadams77)
Powerpoint Presentation that I presented at the Florida Academy of Science and Georgia Academy of Science Joint Conference held in Jacksonville, FL March 14th and 15th of 2008
PDE BASED FEATURES FOR TEXTURE ANALYSIS USING WAVELET TRANSFORM (IJCI JOURNAL)
This document summarizes a research paper that proposes a novel method for texture analysis using wavelet transforms and partial differential equations (PDEs). The method involves applying wavelet transforms to images to obtain directional information. Anisotropic diffusion, a PDE technique, is then used on the directional information to compute a texture approximation. Various statistical features are extracted from the texture approximation. Linear discriminant analysis enhances class separability of the features before classification using k-nearest neighbors. The method is evaluated on the Brodatz texture dataset and results show it achieves better classification accuracy than other methods while having lower computational cost.
Content-Based Image Retrieval (CBIR) is one of the prominent areas of multimedia systems research, concerned with retrieving images from a large database collection. The feature vectors of the query image are compared with the feature vectors of the database images to find matching images. It is widely observed that no single algorithm is effective at extracting features from all kinds of natural images, so an intensive analysis of particular color, texture, and shape extraction techniques is carried out to identify an efficient CBIR technique suited to a specific type of image. Image extraction includes feature description and feature extraction. This paper proposes a Color Layout Descriptor (CLD), Gray Level Co-Occurrence Matrix (GLCM), and Marker-Controlled Watershed Segmentation feature extraction technique that retrieves matching images based on the similarity of color, texture, and shape within the database. For performance analysis, the image retrieval timing of the proposed technique is measured and compared with each of the individual features.
This document discusses various intensity transformation and spatial filtering techniques for digital image enhancement. It covers single pixel operations like negative image and contrast stretching. It also discusses neighborhood operations such as averaging and median filters. Finally, it discusses geometric spatial transformations like scaling, rotation and translation. The document provides details on basic intensity transformation functions including log, power law, and piecewise linear transformations. It also covers histogram processing techniques like histogram equalization, matching and local histogram processing. Spatial filtering and its mechanics are explained.
An Improved TRICKS Method for Dynamic Contrast-Enhanced Tumor Imaging (Mike Aref)
3D dynamic contrast-enhanced (DCE) imaging is a widely used technique for tumor diagnosis. A major challenge in this
application is to fast sample the dynamic signal variations while keeping the spatial resolution. TRICKS is a reduced-encoding imaging method
where k-space data are temporally subsampled during data acquisition and then recovered by linear interpolation before image reconstruction.
Significant errors can be introduced by the linear interpolation when underlying signal variations show nonlinear behavior, for example, in DCE
experiments where the signals show a rapid enhancement phase followed by a slow decay phase. This abstract presents a new image reconstruction
scheme to improve the TRICKS method. Using a nonlinear interpolation scheme, the new method can capture dynamic signal variations more accurately. Experimental results from a DCE mice tumor study are presented to demonstrate the effectiveness of the proposed method.
EFFICIENT IMAGE RETRIEVAL USING REGION BASED IMAGE RETRIEVAL (sipij)
1) The document describes an efficient region-based image retrieval system that uses discrete wavelet transform and k-means clustering. It segments images into regions, each characterized by features like size, mean, and covariance.
2) The system pre-processes images by resizing, converting to HSV color space, performing DWT, and using k-means clustering on DWT coefficients to generate regions. It extracts features for each region and stores them in a database.
3) For retrieval, it pre-processes the query image similarly and calculates similarities between the query regions and database regions based on their features, returning similar images.
Spatial domain filtering and intensity transformations are techniques used in image processing. Spatial domain refers to the pixels that make up an image. Spatial domain techniques operate directly on pixels by applying operators to pixels and their neighbors. Common operators include averaging, median filtering, and contrast adjustments. Spatial filtering techniques include smoothing to reduce noise and sharpening to enhance edges through differentiation. Intensity transformations map input pixel values to output values using functions like logarithms, power laws, and piecewise linear approximations to modify image contrast and highlight certain intensity ranges.
Image compression using Hybrid wavelet Transform and their Performance Compa... (IJMER)
Images may be worth a thousand words, but they generally occupy much more space in hard disk, or
bandwidth in a transmission system, than their proverbial counterpart. Compressing an image is significantly
different than compressing raw binary data. Of course, general purpose compression programs can be used to
compress images, but the result is less than optimal. This is because images have certain statistical properties
which can be exploited by encoders specifically designed for them. Also, some of the finer details in the image
can be sacrificed for the sake of saving a little more bandwidth or storage space. Compression is the process of
representing information in a compact form. Compression is a necessary and essential method for creating
image files with manageable and transmittable sizes. The data compression schemes can be divided into
lossless and lossy compression. In lossless compression, reconstructed image is exactly same as compressed
image. In lossy image compression, high compression ratio is achieved at the cost of some error in reconstructed
image. Lossy compression generally provides much higher compression than lossless compression.
Chapter 3. Intensity Transformations and Spatial Filtering.pdf
1. IMAGE PROCESSING IN MECHATRONICS
Machine Vision
1
HANOI UNIVERSITY OF SCIENCE AND TECHNOLOGY
Lecturer: Dr. Nguyễn Thành Hùng
Department: Division of Mechatronics, School of Mechanical Engineering
Hanoi, 2021
2. 2
Chapter 3. Intensity Transformations and Spatial Filtering
Rafael C. Gonzalez, Richard E. Woods, “Digital image processing,” Pearson (2018).
❖Two principal categories of spatial processing are intensity transformations and
spatial filtering.
➢ Intensity transformations operate on single pixels of an image for tasks such
as contrast manipulation and image thresholding.
➢ Spatial filtering performs operations on the neighborhood of every pixel in an
image.
➢ Examples of spatial filtering include image smoothing and sharpening.
3. 3
Chapter 3. Intensity Transformations and Spatial Filtering
1. Background
2. Some Basic Intensity Transformation Functions
3. Histogram Processing
4. Fundamentals of Spatial Filtering
5. Smoothing (Lowpass) Spatial Filters
6. Sharpening (Highpass) Spatial Filters
7. Highpass, Bandreject, and Bandpass Filters from Lowpass Filters
8. Combining Spatial Enhancement Methods
4. 4
1. Background
❖The Basics of Intensity Transformations and Spatial Filtering
➢The spatial domain processes are based on the expression
g(x, y) = T[f(x, y)]
where f(x, y) is an input image, g(x, y) is the
output image, and T is an operator on f defined
over a neighborhood of point (x, y).
A 3x3 neighborhood about a point (x0, y0) in an image. The neighborhood
is moved from pixel to pixel in the image to generate an output image.
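As an aside not in the original slides, the neighborhood process g(x, y) = T[f] can be sketched in a few lines of NumPy. Here T is any function of the neighborhood array; the 3x3 mean used below is only an illustrative choice:

```python
import numpy as np

def apply_neighborhood_op(f, T, size=3):
    """Slide a size x size window over f and apply T to each neighborhood.

    Illustrative sketch: T maps a neighborhood array to a scalar.
    """
    pad = size // 2
    fp = np.pad(f, pad, mode="edge")          # replicate border pixels
    g = np.empty_like(f, dtype=float)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = T(fp[x:x + size, y:y + size])
    return g

f = np.array([[0, 0, 0],
              [0, 9, 0],
              [0, 0, 0]], dtype=float)
g = apply_neighborhood_op(f, np.mean)         # T = 3x3 average
print(g[1, 1])  # 1.0: the center pixel becomes the mean of its 3x3 neighborhood
```

Real implementations vectorize this loop, but the sliding-window structure is exactly what the figure above depicts.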
5. 5
1. Background
❖The Basics of Intensity Transformations and Spatial Filtering
➢intensity (also called a gray-level, or mapping) transformation function, s = T(r), where r and s denote the input and output pixel intensities
Intensity transformation functions.
(a) Contrast stretching function.
(b) Thresholding function.
6. 6
Chapter 3. Intensity Transformations and Spatial Filtering
1. Background
2. Some Basic Intensity Transformation Functions
3. Histogram Processing
4. Fundamentals of Spatial Filtering
5. Smoothing (Lowpass) Spatial Filters
6. Sharpening (Highpass) Spatial Filters
7. Highpass, Bandreject, and Bandpass Filters from Lowpass Filters
8. Combining Spatial Enhancement Methods
7. 7
❖Three basic types of functions
➢ linear (negative and identity transformations)
➢ logarithmic (log and inverse-log transformations)
➢ power-law (nth power and nth root
transformations)
Some basic intensity transformation functions.
2. Some Basic Intensity Transformation Functions
8. 8
2. Some Basic Intensity Transformation Functions
❖Image Negatives
(a) A digital mammogram. (b) Negative image obtained using Eq. (3-3).
(Image (a) Courtesy of General Electric Medical Systems.)
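The negative transformation of Eq. (3-3) is s = L − 1 − r. A minimal NumPy sketch (illustrative, not from the slides):

```python
import numpy as np

def negative(r, L=256):
    # Eq. (3-3): s = L - 1 - r, for intensities r in [0, L - 1]
    return (L - 1) - r

r = np.array([0, 100, 255], dtype=np.uint8)
print(negative(r))  # [255 155   0]
```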
9. 9
2. Some Basic Intensity Transformation Functions
❖Log Transformations
s = c log(1 + r)
where c is a constant and it is assumed that r ≥ 0
(a) Fourier spectrum displayed as a grayscale image. (b) Result of applying the log
transformation in Eq. (3-4) with c = 1. Both images are scaled to the range [0, 255].
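A sketch of the log transformation applied to a large dynamic range, with the output rescaled to [0, 255] for display as in the figure (illustrative code, not from the slides):

```python
import numpy as np

def log_transform(r, c=1.0):
    # Eq. (3-4): s = c * log(1 + r), defined for r >= 0
    return c * np.log1p(np.asarray(r, dtype=float))

# Compress a huge dynamic range (e.g. a Fourier spectrum) into a displayable one
r = np.array([0.0, 10.0, 1.0e6])
s = log_transform(r)
s_scaled = 255.0 * s / s.max()   # rescale to [0, 255] for display
print(np.round(s_scaled).astype(int))  # [  0  44 255]
```

Without the log, the two small values would be crushed to black next to the 10^6 peak.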
10. 10
2. Some Basic Intensity Transformation Functions
❖Power-Law (Gamma) Transformations
s = c r^γ
where c and γ are positive constants
Plots of the gamma equation s = c r^γ for various values
of γ (c = 1 in all cases).
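A sketch of the power-law (gamma) transformation, normalizing intensities to [0, 1] before applying the exponent (illustrative code, not from the slides):

```python
import numpy as np

def gamma_transform(r, gamma, c=1.0, L=256):
    # s = c * r**gamma, with r first normalized to [0, 1]
    r_norm = np.asarray(r, dtype=float) / (L - 1)
    return (L - 1) * c * r_norm ** gamma

r = np.array([0, 64, 128, 255], dtype=np.uint8)
dark  = gamma_transform(r, gamma=2.5)   # gamma > 1 pushes midtones toward black
light = gamma_transform(r, gamma=0.4)   # gamma < 1 pushes midtones toward white
print(dark, light)
```

The gamma = 2.5 case mirrors the monitor example on the next slide: displaying an image on a gamma-2.5 monitor darkens it, and pre-correcting with gamma = 1/2.5 compensates.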
11. 11
2. Some Basic Intensity Transformation Functions
❖Power-Law (Gamma) Transformations
(a) Image of a human retina. (b) Image as
it appears on a monitor with a gamma
setting of 2.5 (note the darkness). (c)
Gamma-corrected image. (d) Corrected
image, as it appears on the same monitor
(compare with the original image). (Image
(a) courtesy of the National Eye Institute,
NIH)
12. 12
2. Some Basic Intensity Transformation Functions
❖Power-Law (Gamma) Transformations
➢Contrast enhancement using power-law intensity transformations.
(a) Magnetic resonance image (MRI) of a fractured human spine (the region of the fracture is enclosed by the circle). (b)–(d)
Results of applying the transformation in Eq. (3-5) with c = 1 and γ = 0.6, 0.4, and 0.3, respectively. (Original image courtesy of Dr. David R. Pickens,
Department of Radiology and Radiological Sciences, Vanderbilt University Medical Center.)
13. 13
2. Some Basic Intensity Transformation Functions
❖Power-Law (Gamma) Transformations
➢Another illustration of power-law transformations.
(a) Aerial image. (b)–(d) Results of applying the transformation in Eq. (3-5) with γ = 3.0, 4.0, and 5.0, respectively.
(c = 1 in all cases.) (Original image courtesy of NASA.)
14. 14
2. Some Basic Intensity Transformation Functions
❖Piecewise Linear Transformation Functions
➢Contrast Stretching
Setting (r1, s1) = (rmin, 0) and (r2, s2) = (rmax, L − 1), where rmin and rmax denote
the minimum and maximum intensity levels in the input image, respectively,
stretches the intensities linearly over the full range.
Setting (r1, s1) = (m, 0) and (r2, s2) = (m, L − 1), where m is the mean intensity
level in the image, yields a thresholding function as a limiting case.
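The two limiting cases of contrast stretching can be sketched as follows (illustrative NumPy code, not from the slides):

```python
import numpy as np

def contrast_stretch(r, L=256):
    # Map [r_min, r_max] linearly onto the full range [0, L-1].
    # Assumes the image is not constant (r_max > r_min).
    r = np.asarray(r, dtype=float)
    r_min, r_max = r.min(), r.max()
    return (L - 1) * (r - r_min) / (r_max - r_min)

def threshold(r, m=None):
    # Limiting case of contrast stretching: binarize about the mean m
    m = r.mean() if m is None else m
    return np.where(r > m, 255, 0)

r = np.array([[100, 120], [140, 160]], dtype=np.uint8)
print(contrast_stretch(r))   # spreads 100..160 over 0..255
print(threshold(r))          # pixels above the mean (130) become 255
```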
15. 15
2. Some Basic Intensity Transformation Functions
❖Piecewise Linear Transformation Functions
➢Intensity-Level Slicing
Figure 1: (a) This transformation function highlights range [A, B] and
reduces all other intensities to a lower level. (b) This function highlights
range [A, B] and leaves other intensities unchanged.
Figure 2: (a) Aortic angiogram. (b) Result of using a slicing transformation
of the type illustrated in Fig. 1(a), with the range of intensities of interest
selected in the upper end of the gray scale. (c) Result of using the
transformation in Fig. 1(b), with the selected range set near black, so that
the grays in the area of the blood vessels and kidneys were preserved.
(Original image courtesy of Dr. Thomas R. Gest, University of Michigan
Medical School.)
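Both slicing variants from Figure 1 can be sketched in one function (illustrative NumPy code, not from the slides):

```python
import numpy as np

def slice_highlight(r, A, B, high=255, low=0, preserve=False):
    """Highlight the intensity range [A, B].

    preserve=False: as in Fig. 1(a), intensities outside [A, B] are forced to low.
    preserve=True:  as in Fig. 1(b), intensities outside [A, B] are left unchanged.
    """
    in_range = (r >= A) & (r <= B)
    if preserve:
        return np.where(in_range, high, r)
    return np.where(in_range, high, low)

r = np.array([10, 120, 200], dtype=np.uint8)
print(slice_highlight(r, A=100, B=150))                 # [  0 255   0]
print(slice_highlight(r, A=100, B=150, preserve=True))  # [ 10 255 200]
```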
16. 16
2. Some Basic Intensity Transformation Functions
❖Piecewise Linear Transformation Functions
➢Bit-Plane Slicing
Bit-planes of an 8-bit image.
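Extracting plane k of an 8-bit image is a shift and a mask (illustrative NumPy code, not from the slides; plane 1 is the least significant bit, matching the numbering used below):

```python
import numpy as np

def bit_plane(r, k):
    # Plane k of an 8-bit image, k = 1 (least significant) .. 8 (most significant)
    return (r >> (k - 1)) & 1

r = np.array([0b10110010], dtype=np.uint8)  # the value 178
planes = [int(bit_plane(r, k)[0]) for k in range(8, 0, -1)]
print(planes)  # [1, 0, 1, 1, 0, 0, 1, 0] - bits 8 down to 1 of 178
```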
17. 17
2. Some Basic Intensity Transformation Functions
❖Piecewise Linear Transformation Functions
➢Bit-Plane Slicing
(a) An 8-bit gray-scale
image. (b)
through (i) Bit planes 8
through 1, respectively,
where plane 1 contains
the least significant bit.
Each bit plane is a binary
image. Figure (a) is an
SEM image of a
trophozoite that causes a
disease called giardiasis.
(Courtesy of Dr. Stan
Erlandsen, U.S. Center
for Disease Control and
Prevention.)
Image reconstructed from bit planes: (a) 8 and 7; (b) 8, 7, and 6; (c) 8, 7, 6, and 5.
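Extracting bit planes and rebuilding an image from the highest planes can be sketched as follows (illustrative, not textbook code):

```python
# Bit-plane slicing on a list of 8-bit values.
def bit_plane(pixels, k):
    """Binary plane k (1 = least significant ... 8 = most significant)."""
    return [(r >> (k - 1)) & 1 for r in pixels]

def reconstruct(pixels, planes):
    """Rebuild from the listed planes; omitted planes contribute zero."""
    return [sum(((r >> (k - 1)) & 1) << (k - 1) for k in planes) for r in pixels]

row = [200, 100, 50]
print(bit_plane(row, 8))         # [1, 0, 0] -- most significant bit
print(reconstruct(row, [8, 7]))  # [192, 64, 0] -- planes 8 and 7 only
```

Reconstructing from only the top planes keeps most of the visual content, which is why the figures above look close to the original.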
Chapter 3. Intensity Transformations and Spatial Filtering
1. Background
2. Some Basic Intensity Transformation Functions
3. Histogram Processing
4. Fundamentals of Spatial Filtering
5. Smoothing (Lowpass) Spatial Filters
6. Sharpening (Highpass) Spatial Filters
7. Highpass, Bandreject, and Bandpass Filters from Lowpass Filters
8. Combining Spatial Enhancement Methods
3. Histogram Processing
❖Histogram
➢The unnormalized histogram:
h(rk) = nk, for k = 0, 1, …, L − 1, where rk is the k-th intensity level of an L-level digital image f(x, y) and nk is the number of pixels in f with intensity rk. The subdivisions of the intensity scale are called histogram bins.
➢The normalized histogram:
p(rk) = h(rk)/MN = nk/MN, where M and N are the number of image rows and columns, respectively. The normalized histogram sums to one: ∑_{k=0}^{L−1} p(rk) = 1.
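Computing both histograms and checking the unit sum, as a minimal sketch (names are mine):

```python
# Unnormalized and normalized histograms of a tiny 3-bit "image" (L = 8).
from collections import Counter

def histograms(image, L):
    pixels = [r for row in image for r in row]
    counts = Counter(pixels)
    h = [counts.get(k, 0) for k in range(L)]   # h(rk) = nk
    p = [n / len(pixels) for n in h]           # p(rk) = nk / MN
    return h, p

img = [[0, 1, 1, 2],
       [1, 7, 7, 7]]                           # MN = 8 pixels
h, p = histograms(img, L=8)
print(h)        # [1, 3, 1, 0, 0, 0, 0, 3]
print(sum(p))   # 1.0 -- the normalized histogram sums to 1
```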
Four image types and their corresponding histograms. (a) dark; (b) light; (c) low contrast; (d) high contrast.
The horizontal axes of the histograms are values of rk and the vertical axes are values of p(rk).
❖Histogram Equalization
➢The probability of occurrence of intensity level rk in a digital image is approximated by p(rk) = nk/MN, where MN is the total number of pixels in the image and nk denotes the number of pixels that have intensity rk.
➢The discrete form of the histogram equalization (histogram linearization) transformation function is sk = T(rk) = (L − 1) ∑_{j=0}^{k} p(rj), for k = 0, 1, …, L − 1.
➢Example: Illustration of the mechanics of histogram equalization.
• Suppose that a 3-bit image (L = 8) of size 64x64 pixels (MN = 4096) has the intensity distribution shown in the accompanying table.
We round them to their nearest integer values in the range [0, 7]:
The values of the equalized histogram.
Histogram equalization. (a) Original histogram. (b) Transformation function. (c) Equalized histogram.
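The mechanics can be reproduced in a few lines. The counts nk below are those commonly quoted for this 3-bit, 64x64 example in Gonzalez & Woods; treat them as assumed if the table in the slides differs.

```python
# sk = (L-1) * sum_{j<=k} p(rj), rounded to the nearest integer in [0, L-1].
def equalize(counts, L):
    MN = sum(counts)
    s, cum = [], 0
    for nk in counts:
        cum += nk
        s.append(round((L - 1) * cum / MN))
    return s

nk = [790, 1023, 850, 656, 329, 245, 122, 81]   # MN = 4096
print(equalize(nk, L=8))   # [1, 3, 5, 6, 6, 7, 7, 7]
```

Note that several input levels map to the same output level, which is why the equalized histogram has fewer occupied bins than intensity levels.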
➢Algorithm for Histogram Equalization
Figure: source images, their histograms, and the corresponding histogram-equalized images.
(a) Image from Phoenix Lander. (b) Result of
histogram equalization. (c) Histogram of image
(a). (d) Histogram of image (b). (Original image
courtesy of NASA.)
4. Fundamentals of Spatial Filtering
❖The Mechanics of Linear Spatial Filtering
➢Spatial filter kernel: also called a filter kernel, kernel, mask, template, or window
➢Linear spatial filtering
❖Spatial Correlation and Convolution
➢1-D illustration
➢2-D illustration
➢Correlation: (w ☆ f)(x, y) = ∑_{s=−a}^{a} ∑_{t=−b}^{b} w(s, t) f(x + s, y + t)
➢Convolution: (w ∗ f)(x, y) = ∑_{s=−a}^{a} ∑_{t=−b}^{b} w(s, t) f(x − s, y − t)
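The difference between correlation and convolution is easiest to see in 1-D with a unit impulse (a minimal sketch; zero padding assumed, names are mine):

```python
# Correlation slides the kernel as-is; convolution flips it first.
def correlate1d(f, w):
    a = len(w) // 2
    padded = [0] * a + f + [0] * a                       # zero padding
    return [sum(w[s] * padded[x + s] for s in range(len(w)))
            for x in range(len(f))]

def convolve1d(f, w):
    return correlate1d(f, w[::-1])                       # flip, then correlate

f = [0, 0, 1, 0, 0]          # unit impulse
w = [1, 2, 3]
print(correlate1d(f, w))     # [0, 3, 2, 1, 0] -- kernel appears reversed
print(convolve1d(f, w))      # [0, 1, 2, 3, 0] -- convolution copies the kernel
```

This is the classic sifting property: convolving a kernel with a unit impulse reproduces the kernel at the impulse location, while correlation reproduces it reversed.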
5. Smoothing (Lowpass) Spatial Filters
➢Smoothing (also called averaging) spatial filters are used to reduce sharp transitions in intensity.
➢Applications: noise reduction, aliasing reduction, removal of irrelevant image detail, and smoothing of false contours.
➢Linear smoothing filters
➢Nonlinear smoothing filters
❖Box Filter Kernels
➢Example: Lowpass filtering with a box kernel
(a) Test pattern of size 1024x1024 pixels. (b)–(d) Results of lowpass filtering with box kernels of sizes 3x3, 11x11, and 21x21, respectively.
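A 3x3 box filter can be sketched directly from its definition (illustrative code; zero padding assumed at the borders):

```python
# 3x3 box (averaging) filtering with zero padding on a tiny image.
def box_filter_3x3(img):
    rows, cols = len(img), len(img[0])
    def px(i, j):                      # zero padding outside the image
        return img[i][j] if 0 <= i < rows and 0 <= j < cols else 0
    return [[sum(px(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9
             for j in range(cols)] for i in range(rows)]

img = [[9, 9, 9],
       [9, 9, 9],
       [9, 9, 9]]
out = box_filter_3x3(img)
print(out[1][1])   # 9.0 -- interior average unchanged on a constant image
print(out[0][0])   # 4.0 -- corners darkened by the zero padding
```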
❖Lowpass Gaussian Filter Kernels
➢Gaussian kernels have the form w(s, t) = K e^{−(s² + t²)/2σ²}
Distances from the center for various sizes of square kernels.
(a) Sampling a Gaussian function to obtain a discrete Gaussian kernel. The values shown are for K = 1 and σ = 1. (b) Resulting kernel.
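Sampling and normalizing a Gaussian kernel, as a minimal sketch of the procedure in the figure (function name is mine):

```python
# Sample w(s,t) = K * exp(-(s^2 + t^2) / (2*sigma^2)) on a square grid and
# normalize so the coefficients sum to 1, as for any smoothing kernel.
import math

def gaussian_kernel(size, sigma, K=1.0):
    a = size // 2
    w = [[K * math.exp(-(s * s + t * t) / (2 * sigma * sigma))
          for t in range(-a, a + 1)] for s in range(-a, a + 1)]
    total = sum(sum(row) for row in w)
    return [[v / total for v in row] for row in w]

k = gaussian_kernel(3, sigma=1.0)
print(round(k[1][1], 4))            # center coefficient is the largest
print(sum(sum(row) for row in k))   # coefficients sum to ~1
```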
➢Example: Lowpass filtering with a Gaussian kernel
(a) A test pattern of size 1024x1024. (b) Result of lowpass filtering the pattern with a Gaussian kernel of size 21x21, with standard deviation σ = 3.5. (c) Result of using a kernel of size 43x43, with σ = 7. We used K = 1 in all cases.
(a) Result of filtering using a Gaussian kernel of size 43x43, with σ = 7. (b) Result of using a kernel of size 85x85, with the same value of σ. (c) Difference image.
➢Example: Comparison of Gaussian and box filter smoothing characteristics.
➢Example: Using lowpass filtering and thresholding for region extraction.
❖Order-Statistic (Nonlinear) Filters
➢Median filter: replaces the value of the center pixel by the median of the intensity values in the neighborhood of that pixel
→Effective in the presence of impulse noise (salt-and-pepper noise)
→The 50th percentile of a ranked set of numbers
➢Max filter: replaces the center pixel value by the maximum value in the neighborhood
→Useful for finding the brightest points in an image or for eroding dark areas adjacent to light regions
→The 100th percentile filter
➢Min filter: replaces the center pixel value by the minimum value in the neighborhood
→Useful for finding the darkest points in an image or for eroding light areas adjacent to dark regions
→The 0th percentile filter
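All three order-statistic filters differ only in which ranked sample they keep; a 1-D sketch (window size 3, interior samples only; names are mine):

```python
# Order-statistic filtering: output a chosen rank from each sorted 3-sample window.
def order_filter(f, pick):
    out = []
    for x in range(1, len(f) - 1):           # interior samples only
        out.append(pick(sorted(f[x - 1:x + 2])))
    return out

noisy = [10, 10, 255, 10, 10, 0, 10]         # salt (255) and pepper (0) impulses
print(order_filter(noisy, lambda w: w[1]))   # median: [10, 10, 10, 10, 10]
print(order_filter(noisy, lambda w: w[2]))   # max (100th percentile)
print(order_filter(noisy, lambda w: w[0]))   # min (0th percentile)
```

The median output removes both impulses while leaving the background untouched, which a linear averaging filter cannot do.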
➢Example: Median filtering
6. Sharpening (Highpass) Spatial Filters
❖Foundation
➢First-order derivative: ∂f/∂x = f(x + 1) − f(x)
➢Second-order derivative: ∂²f/∂x² = f(x + 1) + f(x − 1) − 2f(x)
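The discrete derivatives above, applied to a short profile containing a ramp and a step (illustrative sketch):

```python
# Discrete first and second derivatives along a 1-D intensity profile.
def first_derivative(f):
    return [f[x + 1] - f[x] for x in range(len(f) - 1)]

def second_derivative(f):
    return [f[x + 1] + f[x - 1] - 2 * f[x] for x in range(1, len(f) - 1)]

ramp_then_step = [0, 1, 2, 3, 3, 3, 9, 9]
print(first_derivative(ramp_then_step))    # [1, 1, 1, 0, 0, 6, 0]
print(second_derivative(ramp_then_step))   # [0, 0, -1, 0, 6, -6]
```

Note the behavior that motivates sharpening: the first derivative is constant along the ramp and spikes at the step, while the second derivative is zero along the ramp, responds at the ramp's end, and produces a double (sign-changing) response at the step.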
❖Image Sharpening—the Laplacian
➢Laplacian: ∇²f = ∂²f/∂x² + ∂²f/∂y² = f(x + 1, y) + f(x − 1, y) + f(x, y + 1) + f(x, y − 1) − 4f(x, y)
➢Laplacian kernel
➢The basic way in which the Laplacian is used for image sharpening: g(x, y) = f(x, y) + c∇²f(x, y), where
▪ c = 1 if the center element of the Laplacian kernel is positive
▪ c = −1 if the center element of the Laplacian kernel is negative
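A minimal sketch of this sharpening rule with the 4-neighbor Laplacian (center coefficient −4, so c = −1); interior pixels only, and names are mine:

```python
# Laplacian sharpening: g = f + c * laplacian(f), with c = -1 for a -4 center.
def laplacian(img, i, j):
    return (img[i + 1][j] + img[i - 1][j] + img[i][j + 1] + img[i][j - 1]
            - 4 * img[i][j])

def sharpen(img, c=-1):
    out = [row[:] for row in img]
    for i in range(1, len(img) - 1):
        for j in range(1, len(img[0]) - 1):
            out[i][j] = img[i][j] + c * laplacian(img, i, j)
    return out

f = [[10, 10, 10],
     [10, 20, 10],
     [10, 10, 10]]
print(sharpen(f)[1][1])   # 60 -- the bright detail is boosted: 20 - (-40)
```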
➢Example: Image sharpening using the Laplacian
(a) Blurred image of the North Pole of the moon. (b) Laplacian image obtained using the kernel in Fig. 3.51(a). (c) Image sharpened using Eq. (3-63) with c = −1. (d) Image sharpened using the same procedure, but with the kernel in Fig. 3.51(b). (Original image courtesy of NASA.)
❖Unsharp Masking and Highboost Filtering
➢Unsharp masking
▪ Blur the original image: f̄(x, y)
▪ Subtract the blurred image from the original (the resulting difference is called the mask): g_mask(x, y) = f(x, y) − f̄(x, y)
▪ Add a weighted portion of the mask to the original: g(x, y) = f(x, y) + k g_mask(x, y)
▪ When k = 1 → unsharp masking
▪ When k > 1 → highboost filtering
▪ When 0 ≤ k < 1 → reduces the contribution of the unsharp mask
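The three steps can be sketched on a 1-D signal (illustrative code; a 3-sample mean stands in for the blur, names are mine):

```python
# Unsharp masking: blur, form the mask, add k times the mask back.
def blur3(f):
    return [f[x] if x in (0, len(f) - 1) else (f[x - 1] + f[x] + f[x + 1]) / 3
            for x in range(len(f))]

def unsharp(f, k=1.0):
    fb = blur3(f)
    mask = [a - b for a, b in zip(f, fb)]     # g_mask = f - blurred(f)
    return [a + k * m for a, m in zip(f, mask)]

step = [10, 10, 10, 40, 40, 40]
print(unsharp(step, k=1))   # unsharp masking: overshoot on both sides of the step
print(unsharp(step, k=2))   # highboost filtering: stronger overshoot
```

The overshoot on either side of the step is exactly the mechanism shown in the 1-D illustration that follows: it is what makes the edge look sharper.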
1-D illustration of the mechanics of unsharp masking. (a) Original signal. (b) Blurred signal with original
shown dashed for reference. (c) Unsharp mask. (d) Sharpened signal, obtained by adding (c) to (a).
(a) Unretouched “soft-tone” digital image of size 469x600 pixels. (b) Image blurred using a 31x31 Gaussian lowpass filter with σ = 5. (c) Mask. (d) Result of unsharp masking using Eq. (3-65) with k = 1. (e) and (f) Results of highboost filtering with k = 2 and k = 3, respectively.
❖Image Sharpening—the Gradient
➢The gradient of an image f at coordinates (x, y): ∇f = [gx, gy]ᵀ = [∂f/∂x, ∂f/∂y]ᵀ
➢The magnitude (length) of vector ∇f: M(x, y) = ‖∇f‖ = √(gx² + gy²)
➢Roberts cross-gradient operators
➢Sobel operators
➢Filter masks
(a) A 3x3 region of an image, where the zs are intensity values. (b)–(c) Roberts cross-gradient operators.
(d)–(e) Sobel operators. All the kernel coefficients sum to zero, as expected of a derivative operator.
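The Sobel responses at the center of a 3x3 region can be computed directly from the kernels; the magnitude is often approximated by |gx| + |gy| in practice (a sketch; names are mine):

```python
# Sobel partial derivatives at the center of a 3x3 region z of intensities.
def sobel_at(z):
    gx = (z[2][0] + 2 * z[2][1] + z[2][2]) - (z[0][0] + 2 * z[0][1] + z[0][2])
    gy = (z[0][2] + 2 * z[1][2] + z[2][2]) - (z[0][0] + 2 * z[1][0] + z[2][0])
    return gx, gy

region = [[10, 10, 10],
          [10, 10, 10],
          [90, 90, 90]]               # horizontal edge below the center
gx, gy = sobel_at(region)
print(gx, gy, abs(gx) + abs(gy))     # 320 0 320 -- strong gx, zero gy
```

On a constant region both responses are zero, consistent with the kernel coefficients summing to zero.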
➢Example: Using the gradient for edge enhancement.
(a) Image of a contact lens (note defects on the boundary at 4 and 5 o’clock).
(b) Sobel gradient. (Original image courtesy of Perceptics Corporation.)
7. Highpass, Bandreject, and Bandpass Filters from Lowpass Filters
❖Transfer functions of ideal 1-D filters
Transfer functions of ideal 1-D
filters in the frequency domain
(u denotes frequency). (a)
Lowpass filter. (b) Highpass
filter. (c) Bandreject filter. (d)
Bandpass filter. (As before, we
show only positive frequencies
for simplicity.)
➢A highpass transfer function is obtained from a lowpass one as H_HP(u) = 1 − H_LP(u). A bandreject function is the sum of a lowpass and a highpass function with different cutoff frequencies, and a bandpass function follows as H_BP(u) = 1 − H_BR(u).
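The same construction works for spatial kernels: a highpass kernel is a unit impulse minus a lowpass kernel, the spatial counterpart of 1 − H_LP (a sketch; the function name is mine):

```python
# hp = delta - lp, where delta is a unit impulse at the kernel center.
def highpass_from_lowpass(lp):
    n = len(lp)
    c = n // 2
    return [[(1 if (i, j) == (c, c) else 0) - lp[i][j]
             for j in range(n)] for i in range(n)]

box = [[1 / 9] * 3 for _ in range(3)]      # 3x3 lowpass (box) kernel
hp = highpass_from_lowpass(box)
print(hp[1][1])                                       # center: 1 - 1/9
print(abs(sum(v for row in hp for v in row)) < 1e-9)  # True: coefficients sum to ~0
```

A zero coefficient sum is the expected signature of a highpass kernel: it gives zero response on constant regions.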
A zone plate image of size 597x597 pixels
(a) A 1-D spatial lowpass filter function. (b) 2-D kernel
obtained by rotating the 1-D profile about its center.
(a) Zone plate image filtered with a separable lowpass kernel. (b) Image
filtered with the isotropic lowpass kernel in Fig. 3.60(b).
Spatial filtering of the zone
plate image. (a) Lowpass
result; this is the same as Fig.
3.61(b) . (b) Highpass result.
(c) Image (b) with intensities
scaled. (d) Bandreject result.
(e) Bandpass result. (f) Image
(e) with intensities scaled.
8. Combining Spatial Enhancement Methods
➢ The Laplacian is superior for enhancing fine detail.
➢ The gradient has a stronger response in areas of significant intensity transitions (ramps and steps).
(a) Image of whole body bone scan. (b) Laplacian of (a). (c) Sharpened image obtained by adding (a) and (b).
(d) Sobel gradient of image (a). (Original image courtesy of G.E. Medical Systems.)
(e) Sobel image smoothed with a 3x3 box filter. (f) Mask image formed by the product of (b) and (e). (g) Sharpened image obtained by adding images (a) and (f). (h) Final result obtained by applying a power-law transformation to (g). Compare images (g) and (h) with (a). (Original image courtesy of G.E. Medical Systems.)