Report medical image processing image slice interpolation and noise removal in spatial and frequency domains
Report of my engineering project on noise removal in medical images


  • Title: Medical Image Processing: Image Slice Interpolation and Noise Removal in Spatial and Frequency Domains.
    Shashank Singh
    Enrollment Number: 20000048, Semester 7
    Indian Institute of Information Technology, ALLAHABAD
    Email: shashank@iiita.ac.in, shashank_iiita@yahoo.com
  • Contents: 1. Certificate 2. Abstract 3. Introduction 4. Interpolation 5. Noise Removal 6. Results 7. Further Improvements 8. Acknowledgement 9. References
  • Certificate: This is to certify that Mr. Shashank Singh has successfully developed the ‘Interpolation and Noise Removal’ modules for the Ministry of Human Resource Development funded Medical Image Processing Project. The work, carried out under my guidance, is his original work and the results are satisfactory.
    Dr. Uma Shankar Tiwari
    Date: December 8, 2003
    Place: Indian Institute of Information Technology, Jhalwa, ALLAHABAD
  • Abstract: This project is a part of the project titled ‘Fusion of Multimodal Medical Imagery Data’, funded by the MHRD. I have developed the modules for noise removal and for interpolation of slices from multiple input image slices.
    Introduction: Medical imagery, and the subsequent processing of the image data, has come as a boon to medical diagnosis and treatment. Different modalities exist for imagery, ranging from MRI, CT, Spiral CT scans and X-ray to PET, SPECT, fMRI and ultrasound. Medical imagery data often consist of discrete slices representing the internals of the body. There are applications that demand in-between-slice views, and in different plane orientations. Further, the imaging techniques can introduce certain noise into the captured data, which needs to be removed without affecting the true data. The two modules developed in this project address these two issues to some extent.
  • Image Modalities:
    1. Endoscope: visible light
    2. Retina imaging: visible coherent light (laser)
    3. Microscopes: laser (confocal imaging), incoherent light (standard)
    4. X-ray: electromagnetic waves
    5. NMR: magnetic field
    6. PET: radio-nuclear waves
    7. US: sound waves
    Computed Tomography (CT):
    1. 1917: Inverse Radon Transform
    2. 1971: Godfrey Hounsfield and Allan Cormack built the first computed tomography system
    3. Dec. 10, 1979: Godfrey Hounsfield and Allan Cormack received the Nobel Prize in Medicine
    4. 1990: Introduction of Spiral CT by W. A. Kalender, W. Seissler, E. Klotz, and P. Vock
    5. 2002: 16-slice CT and 4D reconstruction of the beating heart
    Magnetic Resonance Imaging (MRI):
    1. 1970: Lauterbur and Mansfield discovered how to use magnetic fields to scan human bodies
    2. 1973: discovery of magnetic resonance imaging by Paul C. Lauterbur and Sir Peter Mansfield
    3. Dec. 10, 2003: Nobel Prize in Medicine awarded to Paul C. Lauterbur and Peter Mansfield
    4. For more details see http://www.nobel.se
  • Interpolation: Medical imaging systems typically collect data in a slice-by-slice manner. Usually, the pixel size p of the scene within a slice is different from the spacing between adjacent slices. In addition, the spacing between slices may not be the same for all slices. For visualization, manipulation and analysis of such anisotropic data, the data often need to be converted to an isotropic discretization, or to a desired level of discretization, in any of the three (or higher) dimensions.
    Interpolation techniques can be divided into two categories: scene-based and object-based. In scene-based methods, interpolated scene intensity values are determined directly from the intensity values of the given scene. In object-based methods, some object information extracted from the given scene is used to guide the interpolation process.
    I have implemented interpolation of slices from a given set of images. The interpolated slice can either be at a fixed depth, or it can consist of patches selected from multiple slices, interpolated with increasing depth along the x or y axis.
    Interpolation is implemented in the Matlab file ‘interpolation.m’, which in turn uses two other files, ‘funct1.m’ and ‘funct2.m’. On running ‘interpolation.m’, the user is required to:
    - enter information about the input image files,
    - choose the type of interpolation,
    - select the depths for the slices.
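The project's own implementation lives in the MATLAB files named above. As an illustration only, the scene-based case, linearly interpolating a slice at a fractional depth between two adjacent slices, can be sketched in Python/NumPy as follows (the function name, array shapes and depth convention are assumptions for this example, not the project's code):

```python
import numpy as np

def interpolate_slice(slice_a, slice_b, depth):
    """Scene-based linear interpolation of a slice at fractional
    depth between two adjacent slices: depth 0.0 returns slice_a,
    depth 1.0 returns slice_b."""
    if not 0.0 <= depth <= 1.0:
        raise ValueError("depth must lie between the two slices")
    return (1.0 - depth) * slice_a + depth * slice_b

# Example: interpolate a slice "5.6" between slices 5 and 6
s5 = np.full((4, 4), 100.0)   # stand-in for slice 5
s6 = np.full((4, 4), 200.0)   # stand-in for slice 6
s56 = interpolate_slice(s5, s6, 0.6)
```

Object-based methods would instead extract structure (e.g. boundaries) from the slices and guide the interpolation with it; the sketch above shows only the simplest scene-based approach.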
  • Noise Removal: Noise removal is implemented in the file ‘final.m’. The other file, ‘final.fig’, contains the information describing the GUI. The GUI lets the user read in images, add noise, apply filters in the spatial domain, compute and visualize the FFT, and apply high-pass, low-pass and band-pass filters in the frequency domain using the FFT. The user can view the original image, the noisy image, the filtered images and the FFTs of the different images together in the four windows of the GUI.
    Difficulties in Medical Image Processing: One has to be careful during processing operations such as filtering and noise removal: the apparent noise or irregularity may be the signal of an abnormality in the body. Secondly, image data of the same region acquired after a period of time may differ in orientation and shape due to patient movement and changes in the region over time, which makes comparison and processing of the images difficult.
    Filters:
    Median Filter
    Common Names: Median filtering, Rank filtering
    The median filter is normally used to reduce noise in an image, somewhat like the mean filter. However, it often does a better job than the mean filter of preserving useful detail in the image.
  • How It Works
    Like the mean filter, the median filter considers each pixel in the image in turn and looks at its nearby neighbors to decide whether or not it is representative of its surroundings. Instead of simply replacing the pixel value with the mean of neighboring pixel values, it replaces it with the median of those values. The median is calculated by first sorting all the pixel values from the surrounding neighborhood into numerical order and then replacing the pixel being considered with the middle pixel value. (If the neighborhood under consideration contains an even number of pixels, the average of the two middle pixel values is used.) Figure 1 illustrates an example calculation.
    Figure 1: Calculating the median value of a pixel neighborhood. As can be seen, the central pixel value of 150 is rather unrepresentative of the surrounding pixels and is replaced with the median value: 124. A 3×3 square neighborhood is used here; larger neighborhoods will produce more severe smoothing.
    Mean Filter
    Common Names: Mean filtering, Smoothing, Averaging, Box filtering
    Brief Description
    Mean filtering is a simple, intuitive and easy-to-implement method of smoothing images, i.e. reducing the amount of intensity variation between one pixel and the next. It is often used to reduce noise in images.
  • How It Works
    The idea of mean filtering is simply to replace each pixel value in an image with the mean (average) value of its neighbours, including itself. This has the effect of eliminating pixel values which are unrepresentative of their surroundings. Mean filtering is usually thought of as a convolution filter. Like other convolutions it is based around a kernel, which represents the shape and size of the neighbourhood to be sampled when calculating the mean. Often a 3×3 square kernel is used, as shown in Figure 1, although larger kernels (e.g. 5×5 squares) can be used for more severe smoothing. (Note that a small kernel can be applied more than once in order to produce a similar, but not identical, effect to a single pass with a large kernel.)
    Figure 1: The 3×3 averaging kernel often used in mean filtering.
    Computing the straightforward convolution of an image with this kernel carries out the mean filtering process.
    Fourier Transform
    Common Names: Fourier Transform, Spectral Analysis, Frequency Analysis
    Brief Description
    The Fourier Transform is an important image processing tool which is used to decompose an image into its sine and cosine components. The output of the transformation represents the image in the Fourier or frequency space, while the input image is the real space equivalent. In the Fourier space image, each point represents a particular frequency contained in the real domain image.
    The Fourier Transform is used in a wide range of applications, such as image analysis, image filtering, image reconstruction and image compression.
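The neighborhood operations described for both spatial-domain filters can be sketched in a few lines of Python/NumPy (the function names and the reflect-at-border handling are choices made for this example; the project's MATLAB code may differ):

```python
import numpy as np

def _neighborhoods(image, size):
    """Yield the size x size neighborhood around every pixel,
    reflecting the image at the borders."""
    pad = size // 2
    padded = np.pad(image, pad, mode="reflect")
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            yield r, c, padded[r:r + size, c:c + size]

def median_filter(image, size=3):
    """Replace each pixel with the median of its neighborhood."""
    out = np.empty(image.shape, dtype=float)
    for r, c, nb in _neighborhoods(image, size):
        out[r, c] = np.median(nb)
    return out

def mean_filter(image, size=3):
    """Replace each pixel with the mean of its neighborhood,
    equivalent to convolving with a size x size averaging kernel."""
    out = np.empty(image.shape, dtype=float)
    for r, c, nb in _neighborhoods(image, size):
        out[r, c] = nb.mean()
    return out

# The Figure 1 scenario: a single unrepresentative pixel of 150
# surrounded by pixels of value 124 (i.e. impulse noise).
img = np.full((5, 5), 124.0)
img[2, 2] = 150.0
med = median_filter(img)   # the outlier is replaced by the median, 124
avg = mean_filter(img)     # the outlier is only smeared, not removed
```

Running this reproduces the claim in the text: the median filter removes the impulse outright, while the mean filter merely spreads it over the neighborhood.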
  • How It Works
    As we are only concerned with digital images, we will restrict this discussion to the Discrete Fourier Transform (DFT).
    The DFT is the sampled Fourier Transform and therefore does not contain all frequencies forming an image, but only a set of samples large enough to fully describe the real domain image. The number of frequencies corresponds to the number of pixels in the real domain image, i.e. the images in the real and Fourier spaces are of the same size.
    For a square image of size N×N, the two-dimensional DFT is given by:
        F(k,l) = (1/N^2) \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} f(i,j) e^{-i 2\pi (ki/N + lj/N)}
    where f(i,j) is the image in the real space and the exponential term is the basis function corresponding to each point F(k,l) in the Fourier space. The equation can be interpreted as: the value of each point F(k,l) is obtained by multiplying the real image with the corresponding basis function and summing the result.
    The basis functions are sine and cosine waves with increasing frequencies, i.e. F(0,0) represents the DC component of the image, which corresponds to the average brightness, and F(N-1,N-1) represents the highest frequency.
    In a similar way, the Fourier image can be re-transformed to the real domain. The inverse Fourier transform is given by:
        f(i,j) = \sum_{k=0}^{N-1} \sum_{l=0}^{N-1} F(k,l) e^{i 2\pi (ki/N + lj/N)}
    To obtain the result for the above equations, a double sum has to be calculated for each image point. However, because the Fourier Transform is separable, it can be written as
        F(k,l) = (1/N) \sum_{j=0}^{N-1} P(k,j) e^{-i 2\pi lj/N}
    where
        P(k,j) = (1/N) \sum_{i=0}^{N-1} f(i,j) e^{-i 2\pi ki/N}
    Using these two formulas, the real domain image is first transformed into an intermediate image using N one-dimensional Fourier Transforms. This intermediate image is then transformed into the final image, again using N one-dimensional Fourier Transforms.
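The separability property can be checked numerically. The sketch below builds the 2D transform from two passes of one-dimensional FFTs and compares it with a direct 2D FFT (note that NumPy's convention places the normalization factor in the inverse transform rather than the forward one, so F(0,0) here is the unnormalized sum rather than the average brightness):

```python
import numpy as np

# Build the 2D DFT from N one-dimensional DFTs down the columns,
# followed by N one-dimensional DFTs along the rows of the
# intermediate image -- exactly the two-pass scheme described above.
rng = np.random.default_rng(0)
f = rng.random((8, 8))                          # small test image

intermediate = np.fft.fft(f, axis=0)            # transform each column
F_separable = np.fft.fft(intermediate, axis=1)  # then each row

F_direct = np.fft.fft2(f)                       # direct 2D DFT

agree = np.allclose(F_separable, F_direct)      # two-pass == direct
dc_is_sum = np.isclose(F_direct[0, 0].real, f.sum())  # DC component
```

Both checks pass: the two-pass construction matches the direct 2D DFT, and the F(0,0) bin carries the (unnormalized) average brightness.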
  • Expressing the two-dimensional Fourier Transform in terms of a series of 2N one-dimensional transforms decreases the number of required computations.
    Even with these computational savings, the ordinary one-dimensional DFT has O(N^2) complexity. This can be reduced to O(N log N) if we employ the Fast Fourier Transform (FFT) to compute the one-dimensional DFTs. This is a significant improvement, in particular for large images. There are various forms of the FFT, and most of them restrict the size of the input image that may be transformed, often to powers of two (N = 2^n). The mathematical details are well described in the literature.
    The Fourier Transform produces a complex-valued output image which can be displayed as two real images, either with the real and imaginary parts or with the magnitude and phase. In image processing, often only the magnitude of the Fourier Transform is displayed, as it contains most of the information about the geometric structure of the real space image. However, if we want to re-transform the Fourier image into the correct real space after some processing in the frequency domain, we must make sure to preserve both the magnitude and the phase of the Fourier image.
    The Fourier domain image has a much greater range than the image in the real space. Hence, to be sufficiently accurate, its values are usually calculated and stored as float values.
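As an illustration of the kind of frequency-domain filtering the GUI offers, here is a minimal ideal low-pass filter in Python/NumPy (the cutoff radius and test image are arbitrary choices for the example; note that the full complex spectrum, i.e. both magnitude and phase, is carried through to the inverse transform, as the text requires):

```python
import numpy as np

def lowpass_filter(image, cutoff):
    """Ideal low-pass filter in the frequency domain: zero out all
    frequencies farther than `cutoff` from the DC component, then
    transform back. The complex spectrum (magnitude AND phase) is
    preserved for the frequencies that are kept."""
    F = np.fft.fftshift(np.fft.fft2(image))   # DC moved to the centre
    rows, cols = image.shape
    y, x = np.ogrid[:rows, :cols]
    dist = np.hypot(y - rows // 2, x - cols // 2)
    F[dist > cutoff] = 0                      # circular low-pass mask
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

img = np.outer(np.hanning(16), np.hanning(16))  # smooth test image
smooth = lowpass_filter(img, cutoff=4)
```

Because the DC bin is kept, the average brightness of the image is unchanged, while removing high frequencies can only reduce the intensity variation. High-pass and band-pass filters follow the same pattern with the mask inverted or restricted to an annulus.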
  • Results:
    Interpolation: [figures] slice 5, slice 5.6 (interpolated), slice 6
    Interpolation for varying depth along the x axis: [figures] (a) for N = 10, (b) for N = 5
    Interpolation for varying depth along the y axis: [figures] (a) for N = 10, (b) for N = 5
  • Filtering GUI:
  • Further Improvements:
    1. Methods can be developed to generate a 3D surface plot of the image data.
    2. Another feature could be the generation of an image slice along any plane cutting through the 3D model.
    3. Registration of images from different modalities, followed by useful operations such as interpolation, change detection and segmentation.
  • Acknowledgement: I express my gratitude to Dr. Uma Shankar Tiwari, under whose guidance I carried out the project development. I am thankful for his support and the valuable ideas he shared with me.
  • References:
    Books:
    - 3D Imaging in Medicine, edited by Jayaram K. Udupa, Ph.D., and Gabor T. Herman, Ph.D., University of Pennsylvania, Philadelphia, PA.
    - Digital Image Processing (2nd Edition), Rafael C. Gonzalez and Richard E. Woods.
    Web Resources:
    - GE Medical Systems: http://www.gemedicalsystems.com
    - Image Processing Fundamentals: http://www.ph.tn.tudelft.nl/Courses/FIP/noframes/fip.html
    - Brain Imaging: http://faculty.washington.edu/chudler/image.html
    - The Whole Brain Atlas: http://www.med.harvard.edu/AANLIB/home.html
    Papers:
    - Thomas M. Lehmann, Claudia Gönner, and Klaus Spitzer, “Survey: Interpolation Methods in Medical Image Processing,” IEEE.