International OPEN ACCESS Journal of Modern Engineering Research (IJMER)
ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 11 | Nov. 2014
Multiexposure Image Fusion

Suhaina J.1, Bindu K. R.2
1,2 Department of Electronics and Communication Engineering, FISAT / Mahatma Gandhi University, India
Abstract: Many applications, such as robot navigation, defense, medical imaging and remote sensing, perform processing tasks that become easier when all objects in different images of the same scene are combined into a single fused image. In this paper, we propose a fast and effective method for image fusion. The proposed method derives the intensity-based variations, i.e., the large-scale and small-scale (base and detail) layers, from the source images; guided filtering is employed for this extraction. A Gaussian and Laplacian pyramidal approach is then used to fuse the layers obtained. Experimental results demonstrate that the proposed method achieves good performance for the fusion of all tested sets of images, and they indicate the feasibility of the proposed approach.

Keywords: Gaussian Pyramid, Guided Filter, Image Fusion, Laplacian Pyramid, Multi-exposure images

I. INTRODUCTION
Often a single sensor cannot produce a complete representation of a scene. Visible images provide spectral and spatial detail, but if a target has the same color and spatial characteristics as its background, it cannot be distinguished from the background. Image fusion is the process of combining information from two or more images of a scene into a single composite image that is more informative and more suitable for visual perception or computer processing. The objective of image fusion is to reduce uncertainty and minimize redundancy in the output while maximizing the information relevant to a particular application or task. Given the same set of input images, different fused images may be created depending on the specific application and what is considered relevant information. Image fusion offers several benefits: wider spatial and temporal coverage, decreased uncertainty, improved reliability, and increased robustness of system performance.

A large number of image fusion methods [1]–[4] have been proposed in the literature. Among these, multiscale image fusion [2] and data-driven image fusion [3] are particularly successful. They focus on different data representations, e.g., multi-scale coefficients [5], [6] or data-driven decomposition coefficients [3], [7], and on different fusion rules to guide the combination of coefficients. The major advantage of these methods is that they preserve the details of the different source images well. However, they may produce brightness and color distortions because spatial consistency is not considered in the fusion process. Spatial consistency means that if two adjacent pixels have similar brightness or color, they should tend to have similar weights. A popular spatial-consistency-based fusion approach formulates an energy function in which the pixel saliencies are encoded and edge-aligned weights are enforced by regularization terms, e.g., a smoothness term. This energy function can then be minimized globally to obtain the desired weight maps.

To make full use of spatial context, optimization-based image fusion approaches such as generalized random walks [8] and Markov random fields [9] have been proposed. These methods estimate spatially smooth and edge-aligned weights by solving an energy function and then fuse the source images by a weighted average of pixel values. However, optimization-based methods share a common limitation, namely inefficiency, since they require multiple iterations to find the globally optimal solution. Moreover, global optimization may over-smooth the resulting weights, which is undesirable for fusion. An interesting alternative to optimization-based methods is guided image filtering [10]. The proposed method employs guided filtering for layer extraction; the extracted layers are then fused separately.

The remainder of this paper is organized as follows. Section II reviews the guided image filtering algorithm. Section III describes the proposed image fusion algorithm. Experimental results and discussion are presented in Section IV. Finally, Section V concludes the paper.
II. GUIDED IMAGE FILTERING 
The guided filter is an image filter derived from a local linear model. It computes the filtering output by considering the content of a guidance image, which can be the input image itself or a different image. The guided filter can be used as an edge-preserving smoothing operator like the popular bilateral filter, but it behaves better near edges. It is also a more generic concept than smoothing: it can transfer the structures of the guidance image to the filtering output, enabling new filtering applications such as dehazing and guided feathering. Moreover, the guided filter naturally has a fast, non-approximate linear-time algorithm, regardless of the kernel size and the intensity range, and is currently one of the fastest edge-preserving filters. It is both effective and efficient in a great variety of computer vision and computer graphics applications, including edge-aware smoothing, detail enhancement, HDR compression, image matting/feathering, dehazing, and joint upsampling.

The filtering output is locally a linear transform of the guidance image. On one hand, the guided filter has good edge-preserving smoothing properties like the bilateral filter, but it does not suffer from gradient-reversal artifacts. On the other hand, the guided filter can be used beyond smoothing: with the help of the guidance image, it can make the filtering output more structured and less smoothed than the input. Moreover, the guided filter naturally has an O(N) time (in the number of pixels N) non-approximate algorithm for both gray-scale and high-dimensional images, regardless of the kernel size and the intensity range. Typically, the CPU implementation achieves about 40 ms per megapixel for gray-scale filtering. Given its simplicity, efficiency, and high quality, it has great potential in computer vision and graphics.

Fig. 2.1. Illustration of the bilateral filtering process (left) and the guided filtering process (right).

2.1 Guided filter
A general linear translation-variant filtering process involves a guidance image $I$, a filtering input image $p$, and an output image $q$. Both $I$ and $p$ are given beforehand according to the application, and they can be identical. The filtering output at a pixel $i$ is expressed as a weighted average:

$q_i = \sum_{j} W_{ij}(I)\, p_j$  (1)

where $i$ and $j$ are pixel indexes. The filter kernel $W_{ij}$ is a function of the guidance image $I$ and independent of $p$. This filter is linear with respect to $p$. An example of such a filter is the joint bilateral filter (Fig. 2.1, left), whose kernel $W^{bf}$ is given by

$W^{bf}_{ij}(I) = \dfrac{1}{K_i} \exp\!\left(-\dfrac{\lVert x_i - x_j\rVert^2}{\sigma_s^2}\right) \exp\!\left(-\dfrac{\lVert I_i - I_j\rVert^2}{\sigma_r^2}\right)$  (2)

where $x$ is the pixel coordinate and $K_i$ is a normalizing parameter ensuring $\sum_j W^{bf}_{ij} = 1$. The parameters $\sigma_s$ and $\sigma_r$ adjust the sensitivity to spatial similarity and to range (intensity/color) similarity, respectively. The joint bilateral filter degrades to the original bilateral filter when $I$ and $p$ are identical.

Implicit weighted-average filters optimize a quadratic function and solve a linear system of the form

$A q = p$  (3)

where $q$ and $p$ are $N \times 1$ vectors concatenating $\{q_i\}$ and $\{p_i\}$, respectively, and $A$ is an $N \times N$ matrix that depends only on $I$. The solution to (3), i.e., $q = A^{-1} p$, has the same form as (1), with $W_{ij} = (A^{-1})_{ij}$.
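To make the weighted-average structure of Eqs. (1)–(2) concrete, the following is a minimal, illustrative Python (NumPy) sketch of a joint bilateral filter. The function name, the reflective padding, and the default parameter values are assumptions made for illustration; a direct double loop is used for clarity rather than speed.

```python
import numpy as np

def joint_bilateral_filter(p, I, radius=5, sigma_s=3.0, sigma_r=0.1):
    """Naive joint bilateral filter illustrating Eqs. (1)-(2).

    p : filtering input, 2-D float array in [0, 1]
    I : guidance image, same shape as p (use I = p for the plain bilateral filter)
    """
    p = p.astype(np.float64)
    I = I.astype(np.float64)
    H, W = p.shape
    q = np.zeros_like(p)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (sigma_s ** 2))      # |x_i - x_j|^2 term
    Ip = np.pad(I, radius, mode='reflect')
    pp = np.pad(p, radius, mode='reflect')
    for y in range(H):
        for x in range(W):
            Iw = Ip[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            pw = pp[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            rng = np.exp(-(Iw - I[y, x]) ** 2 / (sigma_r ** 2))  # |I_i - I_j|^2 term
            w = spatial * rng
            q[y, x] = np.sum(w * pw) / np.sum(w)                 # K_i normalization
    return q
```

Because the kernel depends on the guidance at every pixel, the cost of this explicit formulation grows with the window size, which is exactly the inefficiency that the guided filter's O(N) algorithm below avoids.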
The key assumption of the guided filter is a local linear model between the guidance $I$ and the filtering output $q$: we assume that $q$ is a linear transform of $I$ in a window $w_k$ centered at pixel $k$,

$q_i = a_k I_i + b_k, \quad \forall i \in w_k$  (4)

where $(a_k, b_k)$ are linear coefficients assumed to be constant in $w_k$. We use a square window of radius $r$. This local linear model ensures that $q$ has an edge only where $I$ has an edge, because $\nabla q = a \nabla I$. The model has proven useful in image super-resolution, image matting, and dehazing.

To determine the linear coefficients $\{a_k, b_k\}$, we need constraints from the filtering input $p$. We model the output $q$ as the input $p$ minus some unwanted components $n$, such as noise or textures:

$q_i = p_i - n_i$  (5)

A solution that minimizes the difference between $q$ and $p$ while maintaining the linear model (4) is sought. Specifically, the following cost function is minimized in the window $w_k$:

$E(a_k, b_k) = \sum_{i \in w_k} \left[ (a_k I_i + b_k - p_i)^2 + \epsilon a_k^2 \right]$  (6)

Here, $\epsilon$ is a regularization parameter penalizing large $a_k$. Equation (6) is the linear ridge regression model [11], and its solution is given by

$a_k = \dfrac{\frac{1}{|w|}\sum_{i \in w_k} I_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \epsilon}$  (7)

$b_k = \bar{p}_k - a_k \mu_k$  (8)

Here, $\mu_k$ and $\sigma_k^2$ are the mean and variance of $I$ in $w_k$, $|w|$ is the number of pixels in $w_k$, and $\bar{p}_k$ is the mean of $p$ in $w_k$. Having obtained the linear coefficients $\{a_k, b_k\}$, we can compute the filtering output $q_i$ by (4). Fig. 2.1 (right) illustrates the guided filtering process.

However, a pixel $i$ is involved in all the overlapping windows $w_k$ that cover $i$, so the value of $q_i$ in (4) differs when it is computed in different windows. A simple strategy is to average all the possible values of $q_i$. So, after computing $(a_k, b_k)$ for all windows $w_k$ in the image, we compute the filtering output by

$q_i = \dfrac{1}{|w|} \sum_{k : i \in w_k} (a_k I_i + b_k)$  (9)

Noticing that $\sum_{k : i \in w_k} a_k = \sum_{k \in w_i} a_k$ due to the symmetry of the box window, (9) can be rewritten as

$q_i = \bar{a}_i I_i + \bar{b}_i$  (10)

where $\bar{a}_i = \frac{1}{|w|}\sum_{k \in w_i} a_k$ and $\bar{b}_i = \frac{1}{|w|}\sum_{k \in w_i} b_k$ are the average coefficients of all windows overlapping $i$. This averaging strategy over overlapping windows is popular in image denoising. With the modification in (10), $q$ is no longer a scaling of $I$, because the linear coefficients $(\bar{a}_i, \bar{b}_i)$ vary spatially. But since $\bar{a}$ and $\bar{b}$ are the output of a mean filter, their gradients can be expected to be much smaller than those of $I$ near strong edges. In short, abrupt intensity changes in $I$ are mostly preserved in $q$.

Equations (7), (8), and (10) define the guided filter. Pseudocode is given in Algorithm 1, where $f_{mean}$ is a mean filter with window radius $r$, and the abbreviations correlation (corr), variance (var), and covariance (cov) indicate the intuitive meaning of these variables.

Algorithm 1. Guided Filter.
Input: filtering input image p, guidance image I, radius r, regularization ε
Output: filtering output q
1: mean_I = f_mean(I)
   mean_p = f_mean(p)
   corr_I = f_mean(I .* I)
   corr_Ip = f_mean(I .* p)
2: var_I = corr_I - mean_I .* mean_I
   cov_Ip = corr_Ip - mean_I .* mean_p
3: a = cov_Ip ./ (var_I + ε)
   b = mean_p - a .* mean_I
4: mean_a = f_mean(a)
   mean_b = f_mean(b)
5: q = mean_a .* I + mean_b
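A line-for-line translation of Algorithm 1 is sketched below for gray-scale images stored as floating-point NumPy arrays. Using scipy.ndimage.uniform_filter as the box mean f_mean and reflective boundary handling are illustrative choices, not statements from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r, eps):
    """Guided filter following Algorithm 1 (gray-scale guidance).

    I   : guidance image, 2-D float array
    p   : filtering input, 2-D float array (may equal I)
    r   : window radius of the mean filter f_mean
    eps : regularization parameter (epsilon in Eq. (6))
    """
    size = 2 * r + 1
    fmean = lambda x: uniform_filter(x, size=size, mode='reflect')

    mean_I = fmean(I)
    mean_p = fmean(p)
    corr_I = fmean(I * I)
    corr_Ip = fmean(I * p)

    var_I = corr_I - mean_I * mean_I      # variance of I in each window
    cov_Ip = corr_Ip - mean_I * mean_p    # covariance of (I, p) in each window

    a = cov_Ip / (var_I + eps)            # Eq. (7)
    b = mean_p - a * mean_I               # Eq. (8)

    mean_a = fmean(a)                     # averaged coefficients of Eq. (10)
    mean_b = fmean(b)
    return mean_a * I + mean_b            # output q
```

Because every step is a box mean or an element-wise operation, the running time is independent of the radius r, matching the O(N) claim above.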
III. OVERALL APPROACH 
The flowchart of the proposed image fusion method is shown in Fig. 3.1. We first employ guided filtering to extract base layers and detail layers from the input images. The output $q_i$ computed in (9) preserves the strongest edges in $I$ while smoothing small changes in intensity. Let $b_K$ be the base layer computed from (9) (i.e., $b_K = q_i$, with $1 \le K \le N$) for the $K$th input image, denoted $I_K$. The detail layer is defined as the difference between the input image and the guided filter output:

$d_K = I_K - b_K$  (11)
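As a small illustration of this decomposition step, the sketch below applies the guided_filter function from the previous section to each exposure, using the image itself as its own guidance. The radius and regularization values are hypothetical, since the paper does not state the settings used.

```python
def split_layers(images, r=8, eps=0.04):
    """Base/detail decomposition of Eq. (11).

    images : list of 2-D luminance arrays I_1..I_N scaled to [0, 1]
    Returns (base_layers, detail_layers), where b_K = GF(I_K, I_K)
    and d_K = I_K - b_K.
    """
    bases, details = [], []
    for I in images:
        b = guided_filter(I, I, r, eps)   # image acts as its own guidance
        bases.append(b)
        details.append(I - b)
    return bases, details
```

Each detail layer d_K then carries the texture that the guided filter smoothed away from I_K.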
Fig. 3.1. Flowchart of the proposed method.

3.1 Base Layer Fusion
The pyramid representation expresses an image as a sum of spatially band-passed images while retaining local spatial information in each band. A pyramid is created by low-pass filtering an image $G_0$ with a compact two-dimensional filter. The filtered image is then subsampled by removing every other pixel and every other row to obtain a reduced image $G_1$. This process is repeated to form a Gaussian pyramid $G_0, G_1, G_2, G_3, \dots, G_d$. Expanding $G_1$ to the same size as $G_0$ and subtracting yields the band-passed image $L_0$. In this way a Laplacian pyramid $L_0, L_1, L_2, \dots, L_{d-1}$ can be built, containing band-passed images of decreasing size and spatial frequency:

$L_l = G_l - \mathrm{expand}(G_{l+1})$  (12)

where $l$ refers to the pyramid level. With the top level defined as $L_d = G_d$, the original image can be reconstructed by expanding each band-passed image to the full resolution and summing:

$G_0 = \sum_{l=0}^{d} \mathrm{expand}_l(L_l)$  (13)

The Gaussian pyramid contains low-passed versions of the original $G_0$ at progressively lower spatial frequencies; this effect is clearly seen when the Gaussian pyramid levels are expanded to the same size as $G_0$. The Laplacian pyramid consists of band-passed copies of $G_0$; each Laplacian level contains the edges of a certain size and spans approximately an octave in spatial frequency.
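The following sketch builds Gaussian and Laplacian pyramids in the sense of Eqs. (12)–(13). The specific low-pass kernel (a Gaussian with sigma = 1.0) and the bilinear interpolation used for the expand step are assumptions; the paper only specifies the general reduce/expand structure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def reduce_level(G):
    """Low-pass filter and drop every other row/column (G_l -> G_{l+1})."""
    return gaussian_filter(G, sigma=1.0)[::2, ::2]

def expand_level(G, shape):
    """Upsample a level to the given shape (the 'expand' operation)."""
    factors = (shape[0] / G.shape[0], shape[1] / G.shape[1])
    return zoom(G, factors, order=1)

def gaussian_pyramid(G0, d):
    """Gaussian pyramid G_0 .. G_d."""
    pyr = [G0]
    for _ in range(d):
        pyr.append(reduce_level(pyr[-1]))
    return pyr

def laplacian_pyramid(G0, d):
    """L_l = G_l - expand(G_{l+1}) for l < d, and L_d = G_d (Eq. (12))."""
    gp = gaussian_pyramid(G0, d)
    lp = [gp[l] - expand_level(gp[l + 1], gp[l].shape) for l in range(d)]
    lp.append(gp[-1])
    return lp

def collapse(lp):
    """Reconstruct G_0 from a Laplacian pyramid (Eq. (13))."""
    img = lp[-1]
    for L in reversed(lp[:-1]):
        img = L + expand_level(img, L.shape)
    return img
```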
(a) Quality measures
Many images in the stack contain flat, colorless regions due to under- and overexposure. Such regions should receive less weight, while interesting areas containing bright colors and details should be preserved. The following measures are used to achieve this:

• Contrast: Contrast is created by the difference in luminance between adjacent surfaces; it is the difference in visual properties that makes an object distinguishable from other objects and from the background. In visual perception, contrast is determined by differences in the color and brightness of objects. If the difference between the darker and the lighter pixels of an image is large, the image has high contrast; otherwise it has low contrast. A Laplacian filter is applied to the grayscale version of each image, and the absolute value of the filter response is taken. This yields a simple contrast indicator $C$, which tends to assign high weight to important elements such as edges and texture.

• Saturation: As a photograph undergoes a longer exposure, the resulting colors become desaturated and eventually clipped. The saturation of a color is determined by a combination of light intensity and how it is distributed across the spectrum of different wavelengths. The purest (most saturated) color is achieved by using a single wavelength at high intensity, such as laser light; as the intensity drops, the saturation drops. Saturated colors are desirable and make the image look vivid. A saturation measure $S$ is computed as the standard deviation within the R, G, and B channels at each pixel.

• Exposure: Exposure refers to controlling the lightness and darkness of an image; in photography, it is the amount of light per unit area reaching the photographic film or sensor. A photograph is described as overexposed when it has a loss of highlight detail, i.e., when important bright parts are washed out or effectively all white (blown-out highlights or clipped whites). It is described as underexposed when it has a loss of shadow detail, i.e., when important dark areas are muddy or indistinguishable from black (blocked-up shadows). Looking at the raw intensities within a channel reveals how well a pixel is exposed. We want to keep intensities that are not near zero (underexposed) or one (overexposed), so each intensity $i$ is weighted according to its closeness to 0.5 using a Gauss curve:

$E = \exp\!\left(-\dfrac{(i - 0.5)^2}{2\sigma^2}\right)$  (14)

To account for multiple color channels, the Gauss curve is applied to each channel separately and the results are multiplied, yielding the measure $E$.
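A compact sketch of the three quality measures is given below for an H x W x 3 exposure with values in [0, 1]. The Gauss-curve width sigma = 0.2 is a typical choice from the exposure fusion literature and is an assumption here, since the paper does not state the value used.

```python
import numpy as np
from scipy.ndimage import laplace

def contrast_measure(rgb):
    """C: absolute Laplacian response of the grayscale image."""
    gray = rgb.mean(axis=2)
    return np.abs(laplace(gray))

def saturation_measure(rgb):
    """S: per-pixel standard deviation across the R, G, B channels."""
    return rgb.std(axis=2)

def exposure_measure(rgb, sigma=0.2):
    """E: product over channels of a Gauss curve centred at 0.5 (Eq. (14))."""
    well_exposed = np.exp(-((rgb - 0.5) ** 2) / (2.0 * sigma ** 2))
    return np.prod(well_exposed, axis=2)
```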
The fused base layer is computed as the weighted sum of the base layers $b_1, b_2, \dots, b_N$ obtained across the $N$ input exposures. A pyramidal approach is used to generate the Laplacian pyramid of the base layers, $L\{b_K\}_l$, and the Gaussian pyramid of the weight map functions, $G\{W_K\}_l$, estimated from the three quality measures (saturation $S_K$, contrast $C_K$, and exposure $E_K$). Here, $l$ ($0 \le l \le d$) refers to the pyramid level and $K$ ($1 \le K \le N$) indexes the input images. The weight map is computed as the product of the three quality measures, i.e., $W_K = S_K \cdot C_K \cdot E_K$. Multiplying $L\{b_K\}_l$ with the corresponding $G\{W_K(i, j)\}_l$ and summing over $K$ yields the modified Laplacian pyramid $L_l(i, j)$:

$L_l(i, j) = \sum_{K=1}^{N} G\{W_K(i, j)\}_l \, L\{b_K(i, j)\}_l$  (15)

The fused base layer $b_f$, which contains the well-exposed pixels, is reconstructed by expanding each level to the full resolution and summing all levels of the modified Laplacian pyramid:

$b_f = \sum_{l=0}^{d} \mathrm{expand}_l(L_l)$  (16)

3.2 Detail Layer Fusion
The detail layers computed in (11) across all input exposures are linearly combined to produce the fused detail layer $d_f$, which carries the combined texture information:

$d_f = \delta \sum_{K=1}^{N} d_K$  (17)

where $\delta$ is a user-defined parameter controlling the amplification of texture details (typically set to 5). Finally, the detail-enhanced fused image $g$ is computed by simply adding the fused base layer $b_f$ of (16) and the amplified fused detail layer of (17):

$g = b_f + d_f$  (18)
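Putting the pieces together, the sketch below combines the quality-measure and pyramid helpers defined earlier into the base/detail fusion of Eqs. (15)–(18), working on the luminance channel. Normalizing the weight maps so they sum to one at every pixel, the number of pyramid levels, and delta = 5 are assumptions made for illustration.

```python
import numpy as np

def fuse_exposures(rgb_stack, base_layers, detail_layers, d=4, delta=5.0, eps=1e-12):
    """Sketch of the fusion pipeline of Sections 3.1-3.2.

    rgb_stack     : list of N color exposures (H x W x 3, floats in [0, 1])
    base_layers   : list of N luminance base layers b_K from the guided filter
    detail_layers : list of N detail layers d_K = I_K - b_K
    d             : number of pyramid levels (illustrative choice)
    delta         : detail amplification factor of Eq. (17)
    """
    # Per-pixel weight maps W_K = C_K * S_K * E_K, normalised over K
    # (the normalisation step is an assumption).
    W = np.stack([contrast_measure(im) * saturation_measure(im) * exposure_measure(im)
                  for im in rgb_stack]) + eps
    W = W / W.sum(axis=0)

    # Modified Laplacian pyramid, Eq. (15).
    fused_pyr = None
    for K, b in enumerate(base_layers):
        lp = laplacian_pyramid(b, d)
        gw = gaussian_pyramid(W[K], d)
        term = [l * w for l, w in zip(lp, gw)]
        fused_pyr = term if fused_pyr is None else [f + t for f, t in zip(fused_pyr, term)]

    # Collapse the fused pyramid to obtain b_f, Eq. (16).
    b_f = collapse(fused_pyr)

    # Fused, amplified detail layer (Eq. (17)) and final image (Eq. (18)).
    d_f = delta * np.sum(np.stack(detail_layers), axis=0)
    return b_f + d_f
```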
3.3 Numerical Analysis
Numerical analysis is the process of evaluating a technique via objective metrics. For this purpose, two fusion quality metrics [12] are adopted: an information-theory-based metric (QMI) [13] and a structure-based metric (Qc) [14].

(a) Normalized mutual information (QMI)
Normalized mutual information, QMI, is an information-theory-based metric. Mutual information improves image fusion quality assessment, but one problem with the traditional mutual information metric is that it is unstable and may bias the measure towards the source image with the highest entropy. The size of the overlapping part of the images influences the mutual information measure in two ways. First, a decrease in overlap decreases the number of samples, which reduces the statistical power of the probability distribution estimation. Second, with increasing misregistration (which usually coincides with decreasing overlap) the mutual information measure may actually increase; this can occur when the relative areas of object and background even out and the sum of the marginal entropies increases faster than the joint entropy. A normalized measure of mutual information is less sensitive to changes in overlap. Hossny et al. modified mutual information to the normalized form [13], which is adopted here:

$Q_{MI} = 2\left[\dfrac{MI(A, F)}{H(A) + H(F)} + \dfrac{MI(B, F)}{H(B) + H(F)}\right]$  (19)

where $H(A)$, $H(B)$, and $H(F)$ are the marginal entropies of the source images $A$, $B$ and the fused image $F$, and $MI(A, F)$ is the mutual information between the source image $A$ and the fused image $F$:

$MI(A, F) = H(A) + H(F) - H(A, F)$  (20)

where $H(A, F)$ is the joint entropy of $A$ and $F$; $MI(B, F)$ is defined analogously. The quality metric QMI measures how well the original information from the source images is preserved in the fused image.

(b) Cvejic et al.'s metric (Qc)
Cvejic et al.'s metric, Qc, is a structure-based metric. It is calculated as follows:
$Q_C = \mu(A, B, F \mid w)\, \mathrm{UIQI}(A, F \mid w) + \bigl(1 - \mu(A, B, F \mid w)\bigr)\, \mathrm{UIQI}(B, F \mid w)$  (21)

where the weighting function $\mu(A, B, F \mid w)$ is calculated from the local covariances as

$\mu(A, B, F \mid w) = \begin{cases} 0, & \text{if } \sigma_{AF}/(\sigma_{AF} + \sigma_{BF}) < 0 \\ \sigma_{AF}/(\sigma_{AF} + \sigma_{BF}), & \text{if } 0 \le \sigma_{AF}/(\sigma_{AF} + \sigma_{BF}) \le 1 \\ 1, & \text{otherwise} \end{cases}$  (22)

Here, $\sigma_{AF}$ and $\sigma_{BF}$ are the covariances between the source images $A$, $B$ and the fused image $F$ within the local window $w$, and UIQI refers to the universal image quality index. The Qc quality metric estimates how well the important information in the source images is preserved in the fused image, while minimizing the amount of distortion that could interfere with interpretation.
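For the information-theory metric alone, a histogram-based estimate of Q_MI in Eqs. (19)–(20) can be sketched as follows; the bin count and the use of base-2 logarithms are implementation choices not specified in the paper.

```python
import numpy as np

def _entropy(hist):
    """Shannon entropy (in bits) of an unnormalised histogram."""
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def normalized_mutual_information(A, B, F, bins=256):
    """Q_MI of Eq. (19) from intensity arrays A, B (sources) and F (fused)."""
    def mi(X, Y):
        joint, _, _ = np.histogram2d(X.ravel(), Y.ravel(), bins=bins)
        hx = _entropy(joint.sum(axis=1))   # H(X)
        hy = _entropy(joint.sum(axis=0))   # H(Y)
        hxy = _entropy(joint)              # H(X, Y)
        return hx + hy - hxy, hx, hy       # Eq. (20)
    mi_af, h_a, h_f = mi(A, F)
    mi_bf, h_b, _ = mi(B, F)
    return 2.0 * (mi_af / (h_a + h_f) + mi_bf / (h_b + h_f))
```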
IV. RESULTS AND DISCUSSION
The system described above is implemented in MATLAB, and results were obtained successfully. This section presents the obtained results. Figure 4.2 shows the base and detail layers extracted by the guided filter.

Figure 4.2: Base layers and detail layers

A pyramidal approach is used to fuse the base layers. The quality measures of the input images are used to compute the weight map, which is the combination of contrast, saturation, and exposure. Figure 4.3 shows the Gaussian pyramid of the weight map function.

Figure 4.3: Gaussian pyramid

The Laplacian pyramid of the base layers is then generated; the resulting pyramid is shown in Figure 4.4.

Figure 4.4: Laplacian pyramid

The fused pyramid is obtained by combining the Gaussian pyramid of the weight map functions with the Laplacian pyramid of the base layers. Figure 4.5 shows the fused pyramid.
Figure 4.5: Fused pyramid

The fused base layer is the weighted sum of the base layers, and the detail layers are boosted and fused. Figure 4.6 shows the fused base and detail layers.
Figure 4.6: Fused base layer and detail layer 
Finally, the fused image is obtained by combining the fused base layer and the fused detail layer; it is shown in Figure 4.7. Numerical analysis is performed on the obtained results.

Figure 4.7: Fused image

The following figures show the results obtained for some of the other source image sets.
(a) (b) 
Figure 4.8: (a) Source Images (b) Fused Image
(a) (b) 
Figure 4.9: (a) Source Images (b) Fused Image 
(a) (b) 
Figure 4.10: (a) Source Images (b) Fused Image 
V. CONCLUSION
We proposed a technique for fusing multi-exposure input images. The proposed method constructs a detail-enhanced image from a set of multi-exposure images using a multiresolution decomposition technique. Compared with existing techniques that use multiresolution or single-resolution analysis for exposure fusion, the proposed method performs better in terms of enhancement of texture details in the fused image. The framework is inspired by the edge-preserving property of the guided filter, which has a better response near strong edges. Experiments show that the proposed method preserves the original and complementary information of the multiple input images well.

REFERENCES
[1] S. Li, J. Kwok, I. Tsang, and Y. Wang, "Fusing images with different focuses using support vector machines," IEEE Trans. Neural Netw., vol. 15, no. 6, pp. 1555–1561, Nov. 2004.
[2] G. Pajares and J. M. de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recognit., vol. 37, no. 9, pp. 1855–1872, Sep. 2004.
[3] D. Looney and D. Mandic, "Multiscale image fusion using complex extensions of EMD," IEEE Trans. Signal Process., vol. 57, no. 4, pp. 1626–1630, Apr. 2009.
[4] M. Kumar and S. Dass, "A total variation-based algorithm for pixel-level image fusion," IEEE Trans. Image Process., vol. 18, no. 9, pp. 2137–2143, Sep. 2009.
[5] P. Burt and E. Adelson, "The Laplacian pyramid as a compact image code," IEEE Trans. Commun., vol. 31, no. 4, pp. 532–540, Apr. 1983.
[6] O. Rockinger, "Image sequence fusion using a shift-invariant wavelet transform," in Proc. Int. Conf. Image Process., vol. 3, Washington, DC, USA, Oct. 1997, pp. 288–291.
[7] J. Liang, Y. He, D. Liu, and X. Zeng, "Image fusion using higher order singular value decomposition," IEEE Trans. Image Process., vol. 21, no. 5, pp. 2898–2909, May 2012.
[8] R. Shen, I. Cheng, J. Shi, and A. Basu, "Generalized random walks for fusion of multi-exposure images," IEEE Trans. Image Process., vol. 20, no. 12, pp. 3634–3646, Dec. 2011.
[9] M. Xu, H. Chen, and P. Varshney, "An image fusion approach based on Markov random fields," IEEE Trans. Geosci. Remote Sens., vol. 49, no. 12, pp. 5116–5127, Dec. 2011.
[10] K. He, J. Sun, and X. Tang, "Guided image filtering," in Proc. Eur. Conf. Comput. Vis., Heraklion, Greece, Sep. 2010, pp. 1–14.
[11] N. Draper and H. Smith, Applied Regression Analysis. New York, USA: Wiley, 1981.
[12] Z. Liu, E. Blasch, Z. Xue, J. Zhao, R. Laganiere, and W. Wu, "Objective assessment of multiresolution image fusion algorithms for context enhancement in night vision: A comparative study," IEEE Trans. Pattern Anal. Mach. Intell., vol. 34, no. 1, pp. 94–109, Jan. 2012.
[13] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electron. Lett., vol. 44, no. 18, pp. 1066–1067, Aug. 2008.
[14] N. Cvejic, A. Loza, D. Bull, and N. Canagarajah, "A similarity metric for assessment of image fusion algorithms," Int. J. Signal Process., vol. 2, no. 3, pp. 178–182, Apr. 2005.

 
Microcontroller Based Automatic Sprinkler Irrigation System
Microcontroller Based Automatic Sprinkler Irrigation SystemMicrocontroller Based Automatic Sprinkler Irrigation System
Microcontroller Based Automatic Sprinkler Irrigation System
 
On some locally closed sets and spaces in Ideal Topological Spaces
On some locally closed sets and spaces in Ideal Topological SpacesOn some locally closed sets and spaces in Ideal Topological Spaces
On some locally closed sets and spaces in Ideal Topological Spaces
 
Intrusion Detection and Forensics based on decision tree and Association rule...
Intrusion Detection and Forensics based on decision tree and Association rule...Intrusion Detection and Forensics based on decision tree and Association rule...
Intrusion Detection and Forensics based on decision tree and Association rule...
 
Natural Language Ambiguity and its Effect on Machine Learning
Natural Language Ambiguity and its Effect on Machine LearningNatural Language Ambiguity and its Effect on Machine Learning
Natural Language Ambiguity and its Effect on Machine Learning
 
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcessEvolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcess
 
Material Parameter and Effect of Thermal Load on Functionally Graded Cylinders
Material Parameter and Effect of Thermal Load on Functionally Graded CylindersMaterial Parameter and Effect of Thermal Load on Functionally Graded Cylinders
Material Parameter and Effect of Thermal Load on Functionally Graded Cylinders
 
Studies On Energy Conservation And Audit
Studies On Energy Conservation And AuditStudies On Energy Conservation And Audit
Studies On Energy Conservation And Audit
 
An Implementation of I2C Slave Interface using Verilog HDL
An Implementation of I2C Slave Interface using Verilog HDLAn Implementation of I2C Slave Interface using Verilog HDL
An Implementation of I2C Slave Interface using Verilog HDL
 
Discrete Model of Two Predators competing for One Prey
Discrete Model of Two Predators competing for One PreyDiscrete Model of Two Predators competing for One Prey
Discrete Model of Two Predators competing for One Prey
 

Recently uploaded

Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Dr.Costas Sachpazis
 
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...ranjana rawat
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxupamatechverse
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024hassan khalil
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINESIVASHANKAR N
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Dr.Costas Sachpazis
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSSIVASHANKAR N
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
the ladakh protest in leh ladakh 2024 sonam wangchuk.pptx
the ladakh protest in leh ladakh 2024 sonam wangchuk.pptxthe ladakh protest in leh ladakh 2024 sonam wangchuk.pptx
the ladakh protest in leh ladakh 2024 sonam wangchuk.pptxhumanexperienceaaa
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxpurnimasatapathy1234
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Christo Ananth
 
chaitra-1.pptx fake news detection using machine learning
chaitra-1.pptx  fake news detection using machine learningchaitra-1.pptx  fake news detection using machine learning
chaitra-1.pptx fake news detection using machine learningmisbanausheenparvam
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSRajkumarAkumalla
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...ranjana rawat
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidNikhilNagaraju
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130Suhani Kapoor
 

Recently uploaded (20)

Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
 
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptx
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
 
the ladakh protest in leh ladakh 2024 sonam wangchuk.pptx
the ladakh protest in leh ladakh 2024 sonam wangchuk.pptxthe ladakh protest in leh ladakh 2024 sonam wangchuk.pptx
the ladakh protest in leh ladakh 2024 sonam wangchuk.pptx
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptx
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
 
chaitra-1.pptx fake news detection using machine learning
chaitra-1.pptx  fake news detection using machine learningchaitra-1.pptx  fake news detection using machine learning
chaitra-1.pptx fake news detection using machine learning
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfid
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
 
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
 

Multiexposure Image Fusion

Abstract: Many applications such as robot navigation, defense, medical imaging, and remote sensing perform various processing tasks that can be carried out more easily when all objects in different images of the same scene are combined into a single fused image. In this paper, we propose a fast and effective method for image fusion. The proposed method derives the intensity-based variations, both large scale and small scale, from the source images; guided filtering is employed for this extraction. A Gaussian and Laplacian pyramidal approach is then used to fuse the resulting layers. Experimental results demonstrate that the proposed method obtains good performance for the fusion of all tested sets of images and clearly indicate the feasibility of the proposed approach.

Keywords: Gaussian Pyramid, Guided Filter, Image Fusion, Laplacian Pyramid, Multi-exposure images
II. GUIDED IMAGE FILTERING
The guided filter is an image filter derived from a local linear model. It computes the filtering output by considering the content of a guidance image, which can be the input image itself or a different image. The guided filter can be used as an edge-preserving smoothing operator like the popular bilateral filter, but it has better behavior near edges. The guided filter is also a more generic concept beyond smoothing: it can transfer the structures of the guidance image to the filtering output, enabling new filtering applications such as dehazing and guided feathering. Moreover, the guided filter has a fast and non-approximate linear-time algorithm, regardless of the kernel size and the intensity range, and it is currently one of the fastest edge-preserving filters. It is both effective and efficient in a great variety of computer vision and computer graphics applications, including edge-aware smoothing, detail enhancement, HDR compression, image matting/feathering, dehazing, and joint upsampling.

The filtering output is locally a linear transform of the guidance image. On one hand, the guided filter has good edge-preserving smoothing properties like the bilateral filter, but it does not suffer from gradient reversal artifacts. On the other hand, the guided filter can be used beyond smoothing: with the help of the guidance image, it can make the filtering output more structured and less smoothed than the input. Moreover, the guided filter has an O(N) time (in the number of pixels N) non-approximate algorithm for both gray-scale and high-dimensional images, regardless of the kernel size and the intensity range. Typically, a CPU implementation achieves about 40 ms per megapixel for gray-scale filtering. Given its simplicity, efficiency, and high quality, it has great potential in computer vision and graphics.

Fig. 2.1. Illustrations of the bilateral filtering process (left) and the guided filtering process (right)

2.1 Guided filter
A general linear translation-variant filtering process involves a guidance image I, a filtering input image p, and an output image q. Both I and p are given beforehand according to the application, and they can be identical. The filtering output at a pixel i is expressed as a weighted average

$q_i = \sum_{j} W_{ij}(I)\, p_j$    (1)

where i and j are pixel indexes. The filter kernel W_ij is a function of the guidance image I and independent of p, so this filter is linear with respect to p. An example of such a filter is the joint bilateral filter (Fig. 2.1, left), whose kernel W^bf is given by

$W^{bf}_{ij}(I) = \frac{1}{K_i}\exp\!\left(-\frac{\|x_i - x_j\|^2}{\sigma_s^2}\right)\exp\!\left(-\frac{\|I_i - I_j\|^2}{\sigma_r^2}\right)$    (2)

where x is the pixel coordinate and K_i is a normalizing parameter that ensures $\sum_j W^{bf}_{ij} = 1$. The parameters σ_s and σ_r adjust the sensitivity to spatial similarity and to range (intensity/color) similarity, respectively. The joint bilateral filter degrades to the original bilateral filter when I and p are identical. Implicit weighted-average filters optimize a quadratic function and solve a linear system of the form

$\mathbf{A}\,\mathbf{q} = \mathbf{p}$    (3)

where q and p are N-by-1 vectors concatenating {q_i} and {p_i}, respectively, and A is an N-by-N matrix that depends only on I. The solution to (3), $\mathbf{q} = \mathbf{A}^{-1}\mathbf{p}$, has the same form as (1), with $W_{ij} = (\mathbf{A}^{-1})_{ij}$.

The key assumption of the guided filter is a local linear model between the guidance I and the filtering output q. We assume that q is a linear transform of I in a window w_k centered at the pixel k:

$q_i = a_k I_i + b_k, \quad \forall i \in w_k$    (4)
where (a_k, b_k) are linear coefficients assumed to be constant in w_k, and w_k is a square window of radius r. This local linear model ensures that q has an edge only if I has an edge, because $\nabla q = a_k \nabla I$. The model has proven useful in image super-resolution, image matting, and dehazing.

To determine the linear coefficients (a_k, b_k), we need constraints from the filtering input p. We model the output q as the input p minus some unwanted components n, such as noise or textures:

$q_i = p_i - n_i$    (5)

A solution is sought that minimizes the difference between q and p while maintaining the linear model (4). Specifically, the following cost function in the window w_k is minimized:

$E(a_k, b_k) = \sum_{i \in w_k}\left( (a_k I_i + b_k - p_i)^2 + \epsilon\, a_k^2 \right)$    (6)

Here, ε is a regularization parameter penalizing large a_k. Equation (6) is the linear ridge regression model [11], and its solution is given by

$a_k = \dfrac{\frac{1}{|w|}\sum_{i \in w_k} I_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \epsilon}$    (7)

$b_k = \bar{p}_k - a_k \mu_k$    (8)

Here, μ_k and σ_k² are the mean and variance of I in w_k, |w| is the number of pixels in w_k, and p̄_k is the mean of p in w_k. Having obtained the linear coefficients (a_k, b_k), we can compute the filtering output q_i by (4). Fig. 2.1 (right) shows an illustration of the guided filtering process.

However, a pixel i is involved in all the overlapping windows w_k that cover i, so the value of q_i in (4) is not identical when it is computed in different windows. A simple strategy is to average all the possible values of q_i. So, after computing (a_k, b_k) for all windows w_k in the image, we compute the filtering output by

$q_i = \frac{1}{|w|}\sum_{k:\, i \in w_k} (a_k I_i + b_k)$    (9)

Noticing that $\sum_{k:\, i \in w_k} a_k = \sum_{k \in w_i} a_k$ due to the symmetry of the box window, (9) can be rewritten as

$q_i = \bar{a}_i I_i + \bar{b}_i$    (10)

where $\bar{a}_i = \frac{1}{|w|}\sum_{k \in w_i} a_k$ and $\bar{b}_i = \frac{1}{|w|}\sum_{k \in w_i} b_k$ are the average coefficients of all windows overlapping i. This averaging strategy of overlapping windows is popular in image denoising. With the modification in (10), q is no longer a scaling of I, because the linear coefficients vary spatially. But as ā_i and b̄_i are the output of a mean filter, their gradients can be expected to be much smaller than those of I near strong edges. In short, abrupt intensity changes in I can be mostly preserved in q.

Equations (7), (8), and (10) are the definition of the guided filter. A pseudocode is given in Algorithm 1, in which f_mean is a mean filter with a window radius r, and the abbreviations of correlation (corr), variance (var), and covariance (cov) indicate the intuitive meaning of these variables.

Algorithm 1. Guided Filter.
Input: filtering input image p, guidance image I, radius r, regularization ε
Output: filtering output q
1: mean_I = f_mean(I), mean_p = f_mean(p), corr_I = f_mean(I.*I), corr_Ip = f_mean(I.*p)
2: var_I = corr_I − mean_I .* mean_I, cov_Ip = corr_Ip − mean_I .* mean_p
3: a = cov_Ip ./ (var_I + ε), b = mean_p − a .* mean_I
4: mean_a = f_mean(a), mean_b = f_mean(b)
5: q = mean_a .* I + mean_b
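As a concrete illustration of Algorithm 1, the following Python/NumPy sketch implements the gray-scale guided filter; it is our own illustration rather than code from the paper, and the function name guided_filter and the default parameters are our choices. The box mean filter f_mean is realized with scipy.ndimage.uniform_filter.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter(I, p, r, eps):
        """Gray-scale guided filter (He et al. [10]), following Algorithm 1.

        I   : guidance image, float array in [0, 1]
        p   : filtering input image, same shape as I
        r   : window radius (the box mean filter has size 2*r + 1)
        eps : regularization penalizing large a
        """
        fmean = lambda x: uniform_filter(x, size=2 * r + 1)   # box mean filter

        mean_I  = fmean(I)
        mean_p  = fmean(p)
        corr_I  = fmean(I * I)
        corr_Ip = fmean(I * p)

        var_I  = corr_I - mean_I * mean_I       # variance of I in each window
        cov_Ip = corr_Ip - mean_I * mean_p      # covariance of (I, p) in each window

        a = cov_Ip / (var_I + eps)              # Eq. (7)
        b = mean_p - a * mean_I                 # Eq. (8)

        mean_a = fmean(a)                       # average the coefficients over
        mean_b = fmean(b)                       # all windows covering each pixel

        return mean_a * I + mean_b              # Eq. (10)

With I = p, the filter behaves as an edge-preserving smoother, which is how the base layers are extracted in Section III.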
III. OVERALL APPROACH
The flowchart of the proposed image fusion method is shown in Fig. 3.1. We first employ guided filtering to extract base layers and detail layers from the input images. The output q_i computed in (9) preserves the strongest edges in I while smoothing small changes in intensity. Let b_K be the base layer computed from (9) (i.e., b_K = q_i, with 1 ≤ K ≤ N) for the Kth input image, denoted by I_K. The detail layer is defined as the difference between the input image and the guided filter output:

$d_K = I_K - b_K$    (11)

Fig. 3.1. Flowchart of the proposed method

3.1 Base Layer Fusion
The pyramid representation expresses an image as a sum of spatially band-passed images while retaining local spatial information in each band. A pyramid is created by low-pass filtering an image G_0 with a compact two-dimensional filter. The filtered image is then subsampled by removing every other pixel and every other row to obtain a reduced image G_1. This process is repeated to form a Gaussian pyramid G_0, G_1, G_2, G_3, ..., G_d. Expanding G_1 to the same size as G_0 and subtracting yields the band-passed image L_0. A Laplacian pyramid L_0, L_1, L_2, ..., L_{d−1} can thus be built, containing band-passed images of decreasing size and spatial frequency:

$L_l = G_l - \mathrm{expand}(G_{l+1})$    (12)

where l indexes the levels of the pyramid. The original image can be reconstructed from the expanded band-pass images (together with the expanded top Gaussian level):

$G_0 = \sum_{l=0}^{d-1} \mathrm{expand}_l(L_l) + \mathrm{expand}_d(G_d)$    (13)

The Gaussian pyramid contains low-passed versions of the original G_0 at progressively lower spatial frequencies; this effect is clearly seen when the Gaussian pyramid levels are expanded to the same size as G_0. The Laplacian pyramid consists of band-passed copies of G_0; each Laplacian level contains the edges of a certain size and spans approximately an octave in spatial frequency.
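To make the reduce/expand operations behind (12) and (13) concrete, the following sketch builds and collapses Gaussian and Laplacian pyramids in NumPy. The 5-tap binomial kernel follows Burt and Adelson [5]; the helper names and the reflect boundary handling are our own choices, not taken from the paper.

    import numpy as np
    from scipy.ndimage import convolve

    # 5-tap binomial kernel [1, 4, 6, 4, 1]/16, made separable as a 2-D blur
    _k = np.array([1., 4., 6., 4., 1.]) / 16.0
    _K = np.outer(_k, _k)

    def _reduce(img):
        """Blur and discard every other row and column (G_{l+1} from G_l)."""
        return convolve(img, _K, mode='reflect')[::2, ::2]

    def _expand(img, shape):
        """Upsample to `shape` by zero insertion, then blur (gain 4 restores energy)."""
        up = np.zeros(shape, dtype=img.dtype)
        up[::2, ::2] = img
        return convolve(up, 4.0 * _K, mode='reflect')

    def gaussian_pyramid(img, levels):
        G = [img]
        for _ in range(levels):
            G.append(_reduce(G[-1]))
        return G                                  # [G_0, ..., G_d]

    def laplacian_pyramid(img, levels):
        G = gaussian_pyramid(img, levels)
        L = [G[l] - _expand(G[l + 1], G[l].shape) for l in range(levels)]   # Eq. (12)
        return L + [G[-1]]                        # keep G_d as the top level

    def reconstruct(L):
        """Invert Eq. (12) level by level, i.e., apply Eq. (13)."""
        img = L[-1]
        for lvl in reversed(L[:-1]):
            img = lvl + _expand(img, lvl.shape)
        return img

Because the same expand operator is used for analysis and synthesis, reconstruct(laplacian_pyramid(img, levels)) returns the original image up to floating-point error.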
(a) Quality measures
Many images in the stack contain flat, colorless regions due to under- and overexposure. Such regions should receive less weight, while interesting areas containing bright colors and details should be preserved. The following measures are used to achieve this:

• Contrast: Contrast is created by the difference in luminance reflected from two adjacent surfaces; it is the difference in visual properties that makes an object distinguishable from other objects and from the background. In visual perception, contrast is determined by the difference in color and brightness between an object and its surroundings. It is the difference between the darker and the lighter pixels of the image: if the difference is large, the image has high contrast; otherwise it has low contrast. A Laplacian filter is applied to the grayscale version of each image, and the absolute value of the filter response is taken. This yields a simple contrast indicator C, which tends to assign a high weight to important elements such as edges and texture.

• Saturation: As a photograph undergoes a longer exposure, the resulting colors become desaturated and eventually clipped. The saturation of a color is determined by a combination of light intensity and how it is distributed across the spectrum of different wavelengths. The purest (most saturated) color is achieved by using a single wavelength at high intensity, such as laser light; as the intensity drops, the saturation drops. Saturated colors are desirable and make the image look vivid. A saturation measure S is therefore included, computed at each pixel as the standard deviation across the R, G, and B channels.

• Exposure: Exposure refers to controlling the lightness and darkness of an image; in photography it is the amount of light per unit area reaching the photographic film or sensor. A photograph is described as overexposed when it has a loss of highlight detail, that is, when important bright parts are washed out or effectively all white (blown-out highlights or clipped whites). A photograph is described as underexposed when it has a loss of shadow detail, that is, when important dark areas are muddy or indistinguishable from black (blocked-up shadows). The raw intensities within a channel reveal how well a pixel is exposed: we want to keep intensities that are not near zero (underexposed) or one (overexposed). Each intensity i is therefore weighted by how close it is to 0.5 using a Gauss curve:

$E = \exp\!\left(-\frac{(i - 0.5)^2}{2\sigma^2}\right)$    (14)

To account for multiple color channels, the Gauss curve is applied to each channel separately and the results are multiplied, yielding the measure E.
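A possible NumPy implementation of the three quality measures and of the per-image weight map W_K = S_K · C_K · E_K described below is sketched here. The value σ = 0.2 and the normalization of the weights across exposures are our assumptions; the paper does not state them.

    import numpy as np
    from scipy.ndimage import laplace

    def quality_measures(img, sigma=0.2):
        """Contrast C, saturation S, and well-exposedness E for one RGB image.

        img   : float array of shape (H, W, 3) with values in [0, 1]
        sigma : width of the Gauss curve in Eq. (14); 0.2 is our assumption
        """
        gray = img.mean(axis=2)
        C = np.abs(laplace(gray))                    # absolute Laplacian response
        S = img.std(axis=2)                          # std. dev. across R, G, B
        E = np.prod(np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)), axis=2)  # Eq. (14)
        return C, S, E

    def weight_maps(images, eps=1e-12):
        """Per-image weight maps W_K = C_K * S_K * E_K, normalized to sum to 1
        across the exposures (normalization is a common choice, not from the paper)."""
        W = np.stack([np.prod(quality_measures(im), axis=0) for im in images])
        return W / (W.sum(axis=0, keepdims=True) + eps)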
The fused base layer is computed as the weighted sum of the base layers b_1, b_2, ..., b_N obtained across the N input exposures. The pyramidal approach is used to generate the Laplacian pyramid of the base layers, L{b_K}_l, and the Gaussian pyramid of the weight map functions, G{W_K}_l, estimated from the three quality measures (saturation S_K, contrast C_K, and exposure E_K). Here, l (0 ≤ l < d) indexes the pyramid level and K (1 ≤ K ≤ N) indexes the input images. The weight map is computed as the product of the three quality measures, i.e., W_K = S_K ⋅ C_K ⋅ E_K. Multiplying L{b_K}_l by the corresponding G{W_K(i′, j′)}_l and summing over K yields the modified Laplacian pyramid L_l(i′, j′):

$L_l(i', j') = \sum_{K=1}^{N} G\{W_K(i', j')\}_l \; L\{b_K(i', j')\}_l$    (15)

The fused base layer b_f, which contains well-exposed pixels, is reconstructed by expanding each level and then summing all the levels of the Laplacian pyramid:

$b_f = \sum_{l} \mathrm{expand}_l\big(L_l\big)$    (16)

3.2 Detail Layer Fusion
The detail layers computed in (11) across all the input exposures are linearly combined to produce the fused detail layer d_f, which carries the combined texture information:

$d_f = \beta \sum_{K=1}^{N} d_K$    (17)

where β is a user-defined parameter that controls the amplification of texture details (typically set to 5). Finally, the detail-enhanced fused image g is computed by simply adding the fused base layer b_f from (16) and the manipulated fused detail layer from (17):

$g = b_f + d_f$    (18)
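Putting the pieces together, the following sketch fuses the base layers by pyramid blending (Eqs. (15)–(16)) and adds the boosted detail layers (Eqs. (17)–(18)). It reuses the helpers sketched above (guided_filter, gaussian_pyramid, laplacian_pyramid, reconstruct, weight_maps), all of which are our own illustrative names; per-channel self-guided filtering for the base layers and the reading of (17) as β times the sum of the detail layers are assumptions on our part.

    import numpy as np

    def fuse_exposures(images, r=8, eps=0.04, levels=5, beta=5.0):
        """Detail-enhanced multi-exposure fusion (illustrative sketch).

        images : list of N float RGB arrays (H, W, 3) in [0, 1] of the same scene
        beta   : amplification of texture details, Eq. (17)
        """
        # Base layers by self-guided filtering; detail layers by Eq. (11)
        base = [np.dstack([guided_filter(im[..., c], im[..., c], r, eps)
                           for c in range(3)]) for im in images]
        detail = [im - b for im, b in zip(images, base)]

        W = weight_maps(images)                           # shape (N, H, W)
        Gw = [gaussian_pyramid(w, levels) for w in W]     # weight-map pyramids

        fused = np.zeros_like(images[0])
        for c in range(3):                                # fuse each color channel
            blended = None
            for bK, GwK in zip(base, Gw):
                Lb = laplacian_pyramid(bK[..., c], levels)
                contrib = [g * l for g, l in zip(GwK, Lb)]        # Eq. (15)
                blended = contrib if blended is None else [a + b for a, b in zip(blended, contrib)]
            b_f = reconstruct(blended)                    # Eq. (16)
            d_f = beta * sum(d[..., c] for d in detail)   # Eq. (17), our reading
            fused[..., c] = b_f + d_f                     # Eq. (18)
        return np.clip(fused, 0.0, 1.0)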
3.3 Numerical Analysis
Numerical analysis is the process of evaluating a technique via objective metrics. For this purpose, two fusion quality metrics [12] are adopted: an information theory based metric (QMI) [13] and a structure based metric (Qc) [14].

(a) Normalized mutual information (QMI)
Normalized mutual information, QMI, is an information theory based metric. Mutual information improves image fusion quality assessment, but the traditional mutual information metric is unstable and may bias the measure towards the source image with the highest entropy. The size of the overlapping part of the images influences the mutual information measure in two ways. First, a decrease in overlap decreases the number of samples, which reduces the statistical power of the probability distribution estimation. Second, with increasing misregistration (which usually coincides with decreasing overlap) the mutual information measure may actually increase; this can occur when the relative areas of object and background even out and the sum of the marginal entropies increases faster than the joint entropy. A normalized measure of mutual information is less sensitive to changes in overlap. Hossny et al. modified mutual information to the normalized form [13], and their definition is adopted here:

$Q_{MI} = 2\left[\frac{MI(A, F)}{H(A) + H(F)} + \frac{MI(B, F)}{H(B) + H(F)}\right]$    (19)

where H(A), H(B), and H(F) are the marginal entropies of the source images A and B and the fused image F, and MI(A, F) is the mutual information between the source image A and the fused image F:

$MI(A, F) = H(A) + H(F) - H(A, F)$    (20)

where H(A, F) is the joint entropy between A and F; MI(B, F) is defined analogously. The quality metric QMI measures how well the original information from the source images is preserved in the fused image.
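A histogram-based estimate of QMI can be written as follows; the use of 256-bin joint histograms is our choice, not specified in the paper.

    import numpy as np

    def _entropy(hist):
        p = hist / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def _mutual_information(x, y, bins=256):
        """MI(X, Y) = H(X) + H(Y) - H(X, Y), Eq. (20), from a joint histogram."""
        joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
        hx = _entropy(joint.sum(axis=1))
        hy = _entropy(joint.sum(axis=0))
        hxy = _entropy(joint)
        return hx + hy - hxy, hx, hy

    def q_mi(A, B, F, bins=256):
        """Normalized mutual information Q_MI of Eq. (19) for source images A, B
        and fused image F (gray-scale float arrays)."""
        mi_af, h_a, h_f = _mutual_information(A, F, bins)
        mi_bf, h_b, _   = _mutual_information(B, F, bins)
        return 2.0 * (mi_af / (h_a + h_f) + mi_bf / (h_b + h_f))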
(b) Cvejic et al.'s metric (Qc)
Cvejic et al.'s metric Qc is a structure based metric. It is calculated as

$Q_C = \mathrm{sim}(A, B, F \mid w)\,\mathrm{UIQI}(A, F \mid w) + \big(1 - \mathrm{sim}(A, B, F \mid w)\big)\,\mathrm{UIQI}(B, F \mid w)$    (21)

where the similarity weight sim is calculated as

$\mathrm{sim}(A, B, F \mid w) = \begin{cases} 0 & \text{if } \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} < 0 \\ \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} & \text{if } 0 \le \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} \le 1 \\ 1 & \text{if } \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} > 1 \end{cases}$    (22)

Here σ_AF and σ_BF are the covariances between the source images A, B and the fused image F within the local window w, and UIQI refers to the universal image quality index. The Qc quality metric estimates how well the important information in the source images is preserved in the fused image, while minimizing the amount of distortion that could interfere with interpretation.
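A rough sketch of Qc following the description above is given below. The 8×8 sliding window, the small eps guards, and the final averaging over all windows are our assumptions; the exact windowing used in [14] is not restated in the paper.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def _local_stats(x, y, size=8):
        mx, my = uniform_filter(x, size), uniform_filter(y, size)
        sxx = uniform_filter(x * x, size) - mx * mx
        syy = uniform_filter(y * y, size) - my * my
        sxy = uniform_filter(x * y, size) - mx * my
        return mx, my, sxx, syy, sxy

    def _uiqi(x, y, size=8, eps=1e-12):
        """Universal image quality index over local windows."""
        mx, my, sxx, syy, sxy = _local_stats(x, y, size)
        return (4 * sxy * mx * my) / ((sxx + syy) * (mx ** 2 + my ** 2) + eps)

    def q_c(A, B, F, size=8, eps=1e-12):
        """Cvejic et al.'s metric, Eqs. (21)-(22): a covariance-weighted mix of
        UIQI(A, F) and UIQI(B, F), averaged over all local windows."""
        _, _, _, _, s_af = _local_stats(A, F, size)
        _, _, _, _, s_bf = _local_stats(B, F, size)
        sim = np.clip(s_af / (s_af + s_bf + eps), 0.0, 1.0)           # Eq. (22)
        q = sim * _uiqi(A, F, size) + (1 - sim) * _uiqi(B, F, size)   # Eq. (21)
        return float(q.mean())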
IV. Results And Discussion
The system described above was implemented in Matlab and the results were obtained successfully. This section presents the obtained results. Figure 4.1 shows the base and detail layers.

Figure 4.2: Base layers and detail layers

The pyramidal approach is used for fusing the base layers. The quality measures of the images are used to compute the weight map, which combines contrast, saturation, and exposure. Figure 4.3 shows the Gaussian pyramid of the weight map function.

Figure 4.3: Gaussian pyramid

The Laplacian pyramids of the base layers are then generated; the resulting Laplacian pyramid is shown in Figure 4.4.

Figure 4.4: Laplacian pyramid

The fused pyramid is obtained by combining the Gaussian pyramid of the weight map functions with the Laplacian pyramid of the base layers. Figure 4.5 shows the fused pyramid.

Figure 4.5: Fused pyramid

The fused base layer is the weighted sum of the base layers. The detail layers obtained are boosted and fused. Figure 4.6 shows the fused base and detail layers.

Figure 4.6: Fused base layer and detail layer

Finally, the fused image is obtained by combining the fused base and fused detail layers; it is shown in Figure 4.7. Numerical analysis is performed on the obtained results.

Figure 4.7: Fused image

The following figures show the results obtained for some of the other source image sets.

Figure 4.8: (a) Source images (b) Fused image
Figure 4.9: (a) Source images (b) Fused image

Figure 4.10: (a) Source images (b) Fused image

V. Conclusion
We proposed a technique for fusing multi-exposure input images. The proposed method constructs a detail-enhanced image from a set of multi-exposure images using a multiresolution decomposition technique. Compared with existing techniques that use multiresolution or single-resolution analysis for exposure fusion, the proposed method performs better in terms of enhancement of texture details in the fused image. The framework is inspired by the edge-preserving property of the guided filter, which has better response near strong edges. Experiments show that the proposed method preserves well the original and complementary information of the multiple input images.

REFERENCES
[1] S. Li, J. Kwok, I. Tsang, and Y. Wang, "Fusing images with different focuses using support vector machines," IEEE Trans. Neural Netw., vol. 15, no. 6, pp. 1555–1561, Nov. 2004.
[2] G. Pajares and J. M. de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recognit., vol. 37, no. 9, pp. 1855–1872, Sep. 2004.
[3] D. Looney and D. Mandic, "Multiscale image fusion using complex extensions of EMD," IEEE Trans. Signal Process., vol. 57, no. 4, pp. 1626–1630, Apr. 2009.
[4] M. Kumar and S. Dass, "A total variation-based algorithm for pixel-level image fusion," IEEE Trans. Image Process., vol. 18, no. 9, pp. 2137–2143, Sep. 2009.
[5] P. Burt and E. Adelson, "The Laplacian pyramid as a compact image code," IEEE Trans. Commun., vol. 31, no. 4, pp. 532–540, Apr. 1983.
[6] O. Rockinger, "Image sequence fusion using a shift-invariant wavelet transform," in Proc. Int. Conf. Image Process., vol. 3, Washington, DC, USA, Oct. 1997, pp. 288–291.
[7] J. Liang, Y. He, D. Liu, and X. Zeng, "Image fusion using higher order singular value decomposition," IEEE Trans. Image Process., vol. 21, no. 5, pp. 2898–2909, May 2012.
[8] R. Shen, I. Cheng, J. Shi, and A. Basu, "Generalized random walks for fusion of multi-exposure images," IEEE Trans. Image Process., vol. 20, no. 12, pp. 3634–3646, Dec. 2011.
[9] M. Xu, H. Chen, and P. Varshney, "An image fusion approach based on Markov random fields," IEEE Trans. Geosci. Remote Sens., vol. 49, no. 12, pp. 5116–5127, Dec. 2011.
[10] K. He, J. Sun, and X. Tang, "Guided image filtering," in Proc. Eur. Conf. Comput. Vis., Heraklion, Greece, Sep. 2010, pp. 1–14.
[11] N. Draper and H. Smith, Applied Regression Analysis. New York, NY, USA: Wiley, 1981.
[12] Z. Liu, E. Blasch, Z. Xue, J. Zhao, R. Laganiere, and W. Wu, "Objective assessment of multiresolution image fusion algorithms for context enhancement in night vision: A comparative study," IEEE Trans. Pattern Anal. Mach. Intell., vol. 34, no. 1, pp. 94–109, Jan. 2012.
[13] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electron. Lett., vol. 44, no. 18, pp. 1066–1067, Aug. 2008.
[14] N. Cvejic, A. Loza, D. Bull, and N. Canagarajah, "A similarity metric for assessment of image fusion algorithms," Int. J. Signal Process., vol. 2, no. 3, pp. 178–182, Apr. 2005.