International OPEN ACCESS Journal of Modern Engineering Research (IJMER)
| IJMER | ISSN: 2249-6645 | www.ijmer.com | Vol. 4 | Iss. 11 | Nov. 2014 |

Multiexposure Image Fusion

Suhaina J.¹, Bindu K. R.²
¹,² Department of Electronics and Communication Engineering, FISAT / Mahatma Gandhi University, India
Abstract: Many applications such as robot navigation, defense, medical imaging, and remote sensing involve
processing tasks that can be performed more easily when all objects in different images of the same scene are
combined into a single fused image. In this paper, we propose a fast and effective method for image fusion. The
proposed method derives the intensity-based variations, that is, the large-scale and small-scale variations, from
the source images. In this approach, guided filtering is employed for this extraction. A Gaussian and Laplacian
pyramidal approach is then used to fuse the different layers obtained. Experimental results demonstrate that the
proposed method achieves good fusion performance on all tested sets of images. The results clearly indicate the
feasibility of the proposed approach.
Keywords: Gaussian Pyramid, Guided Filter, Image Fusion, Laplacian Pyramid, Multi-exposure images

I. INTRODUCTION
Often a single sensor cannot produce a complete representation of a scene. Visible images provide
spectral and spatial details, but if a target has the same color and spatial characteristics as its background, it
cannot be distinguished from the background. Image fusion is the process of combining information from two
or more images of a scene into a single composite image that is more informative and more suitable for
visual perception or computer processing. The objective in image fusion is to reduce uncertainty and minimize
redundancy in the output while maximizing relevant information particular to an application or task. Given the
same set of input images, different fused images may be created depending on the specific application and
what is considered relevant information. There are several benefits in using image fusion: wider spatial and
temporal coverage, decreased uncertainty, improved reliability, and increased robustness of system
performance.
A large number of image fusion methods [1]-[4] have been proposed in the literature. Among these
methods, multiscale image fusion [2] and data-driven image fusion [3] are very successful. They focus
on different data representations, e.g., multi-scale coefficients [5], [6], or data-driven decomposition
coefficients [3], [7], and on different image fusion rules to guide the fusion of coefficients. The major advantage of
these methods is that they can well preserve the details of the different source images. However, such
methods may produce brightness and color distortions, since spatial consistency is not well considered in the
fusion process. Spatial consistency means that if two adjacent pixels have similar brightness or color, they will
tend to have similar weights. A popular spatial-consistency-based fusion approach is to formulate an energy
function in which the pixel saliencies are encoded and edge-aligned weights are enforced by
regularization terms, e.g., a smoothness term. This energy function can then be minimized globally to obtain
the desired weight maps. To make full use of spatial context, optimization-based image fusion approaches, e.g.,
generalized random walks [8] and Markov random fields [9] based methods, have been proposed. These
methods focus on estimating spatially smooth and edge-aligned weights by solving an energy function and then
fusing the source images by a weighted average of pixel values. However, optimization-based methods have a
common limitation, namely inefficiency, since they require multiple iterations to find the globally optimal solution.
Moreover, another drawback is that global-optimization-based methods may over-smooth the resulting weights,
which is not good for fusion. An interesting alternative to optimization-based methods is guided image filtering
[10]. The proposed method employs guided filtering for layer extraction. The extracted layers are then fused
separately.
The remainder of this paper is organized as follows. In Section II, the guided image filtering
algorithm is reviewed. Section III describes the proposed image fusion algorithm. The experimental results and
discussions are presented in Section IV. Finally, Section V concludes the paper.
II. GUIDED IMAGE FILTERING
The guided filter is an image filter derived from a local linear model. It computes the filtering output by
considering the content of a guidance image, which can be the input image itself or another different image.
The guided filter can be used as an edge-preserving smoothing operator like the popular bilateral filter, but it
has better behaviors near edges. The guided filter is also a more generic concept beyond smoothing: It can
transfer the structures of the guidance image to the filtering output, enabling new filtering applications like
dehazing and guided feathering. Moreover, the guided filter naturally has a fast and nonapproximate linear
time algorithm, regardless of the kernel size and the intensity range. Currently, it is one of the fastest edge-
preserving filters. The guided filter is both effective and efficient in a great variety of computer vision and
computer graphics applications, including edge-aware smoothing, detail enhancement, HDR compression,
image matting/feathering, dehazing, joint upsampling, etc.
The filtering output is locally a linear transform of the guidance image. On one hand, the guided filter
has good edge-preserving smoothing properties like the bilateral filter, but it does not suffer from the gradient
reversal artifacts. On the other hand, the guided filter can be used beyond smoothing: With the help of the
guidance image, it can make the filtering output more structured and less smoothed than the input. Moreover,
the guided filter naturally has an O(N) time (in the number of pixels N) nonapproximate algorithm for both
gray-scale and high-dimensional images, regardless of the kernel size and the intensity range. Typically, the
CPU implementation achieves 40 ms per mega-pixel performing gray-scale filtering. It has great potential in
computer vision and graphics, given its simplicity, efficiency, and high-quality.
Fig. 2.1. Illustrations of the bilateral filtering process (left) and the guided filtering process (right)
2.1 Guided filter
A general linear translation-variant filtering process is defined, which involves a guidance image I, a
filtering input image p, and an output image q. Both I and p are given beforehand according to the application,
and they can be identical. The filtering output at a pixel i is expressed as a weighted average:
q_i = \sum_{j} W_{ij}(I)\, p_j   (1)
where i and j are pixel indexes. The filter kernel W_{ij} is a function of the guidance image I and independent of
p. This filter is linear with respect to p. An example of such a filter is the joint bilateral filter (Fig. 2.1 (left)).
The bilateral filtering kernel W^{bf} is given by:
W^{bf}_{ij}(I) = \frac{1}{K_i} \exp\left(-\frac{\lVert x_i - x_j \rVert^2}{\sigma_s^2}\right) \exp\left(-\frac{\lVert I_i - I_j \rVert^2}{\sigma_r^2}\right)   (2)
where x is the pixel coordinate and K_i is a normalizing parameter that ensures \sum_j W^{bf}_{ij} = 1. The parameters \sigma_s
and \sigma_r adjust the sensitivity of the spatial similarity and the range (intensity/color) similarity, respectively. The
joint bilateral filter degrades to the original bilateral filter when I and p are identical. The implicit weighted-
average filters optimize a quadratic function and solve a linear system of the form:
A q = p   (3)
where q and p are N-by-1 vectors concatenating {q_i} and {p_i}, respectively, and A is an N-by-N matrix that only
depends on I. The solution to (3), i.e., q = A^{-1} p, has the same form as (1), with W_{ij} = (A^{-1})_{ij}.
The key assumption of the guided filter is a local linear model between the guidance I and the filtering
output q. We assume that q is a linear transform of I in a window w_k centered at the pixel k:
q_i = a_k I_i + b_k, \quad \forall i \in w_k   (4)
where (a_k, b_k) are some linear coefficients assumed to be constant in w_k. We use a square window of radius r.
This local linear model ensures that q has an edge only if I has an edge, because \nabla q = a \nabla I. This model has
been proven useful in image super-resolution, image matting and dehazing.
To determine the linear coefficients {a_k, b_k}, we need constraints from the filtering input p. We model
the output q as the input p subtracting some unwanted components n, such as noise or textures:
q_i = p_i - n_i   (5)
A solution that minimizes the difference between q and p while maintaining the linear model (4) is suggested.
Specifically, the following cost function in the window w_k is minimized:
E(a_k, b_k) = \sum_{i \in w_k} \left[ (a_k I_i + b_k - p_i)^2 + \epsilon a_k^2 \right]   (6)
Here, \epsilon is a regularization parameter penalizing large a_k.
Equation (6) is the linear ridge regression model [11], and its solution is given by:
a_k = \frac{\frac{1}{|w|}\sum_{i \in w_k} I_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \epsilon}   (7)
b_k = \bar{p}_k - a_k \mu_k   (8)
Here, \mu_k and \sigma_k^2 are the mean and variance of I in w_k, |w| is the number of pixels in w_k, and
\bar{p}_k = \frac{1}{|w|}\sum_{i \in w_k} p_i is the mean of p in w_k. Having obtained the linear coefficients {a_k, b_k}, we can compute the filtering output q_i by
(4). Fig.2.1 (right) shows an illustration of the guided filtering process.
However, a pixel i is involved in all the overlapping windows w_k that cover i, so the value of q_i in (4)
is not identical when it is computed in different windows. A simple strategy is to average all the possible
values of q_i. So, after computing (a_k, b_k) for all windows w_k in the image, we compute the filtering output by:
q_i = \frac{1}{|w|} \sum_{k : i \in w_k} (a_k I_i + b_k)   (9)
Noticing that \sum_{k : i \in w_k} a_k = \sum_{k \in w_i} a_k due to the symmetry of the box window, (9) is rewritten as:
q_i = \bar{a}_i I_i + \bar{b}_i   (10)
where \bar{a}_i = \frac{1}{|w|}\sum_{k \in w_i} a_k and \bar{b}_i = \frac{1}{|w|}\sum_{k \in w_i} b_k are the average coefficients of all windows overlapping i. The
averaging strategy of overlapping windows is popular in image denoising.
With the modification in (10), q is no longer a scaling of I because the linear coefficients (\bar{a}_i, \bar{b}_i)
vary spatially. But as (\bar{a}_i, \bar{b}_i) are the output of a mean filter, their gradients can be expected to be much
smaller than those of I near strong edges. In short, abrupt intensity changes in I can be mostly preserved in q.
Equations (7), (8), and (10) are the definition of the guided filter. A pseudocode is in Algorithm 1. In
this algorithm, fmean is a mean filter with a window radius r. The abbreviations of correlation (corr), variance
(var), and covariance (cov) indicate the intuitive meaning of these variables.
Algorithm 1. Guided Filter.
Input: filtering input image p, guidance image I, radius r, regularization ε
Output: filtering output q.
1: meanI = fmean(I)
meanp = fmean(p)
corrI = fmean(I.*I)
corrIp = fmean(I.*p)
2: varI = corrI - meanI .* meanI
covIp = corrIp - meanI .* meanp
3: a = covIp ./ (varI + ε)
b = meanp - a .* meanI
4: meana = fmean(a)
meanb = fmean(b)
5: q = meana .* I + meanb
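As a concrete reference for Algorithm 1, the following is a minimal Python/NumPy sketch of the gray-scale guided filter; the function name guided_filter and the use of scipy.ndimage.uniform_filter as the box mean filter fmean are illustrative choices, not details taken from the paper.

import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r, eps):
    """Gray-scale guided filter following Algorithm 1: I is the guidance image,
    p the filtering input, r the window radius, and eps the regularization."""
    I = np.asarray(I, dtype=float)
    p = np.asarray(p, dtype=float)
    fmean = lambda x: uniform_filter(x, size=2 * r + 1)   # box mean filter f_mean

    mean_I, mean_p = fmean(I), fmean(p)
    corr_I, corr_Ip = fmean(I * I), fmean(I * p)

    var_I = corr_I - mean_I * mean_I        # variance of I in each window
    cov_Ip = corr_Ip - mean_I * mean_p      # covariance of (I, p) in each window

    a = cov_Ip / (var_I + eps)              # Eq. (7)
    b = mean_p - a * mean_I                 # Eq. (8)

    mean_a, mean_b = fmean(a), fmean(b)     # averaged coefficients of Eq. (10)
    return mean_a * I + mean_b              # filtering output q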
III. OVERALL APPROACH
The flowchart of the proposed image fusion method is shown in Fig. 3.1. We first employ guided
filtering for the extraction of base layers and detail layers from the input images. The output q_i computed in (9)
preserves the strongest edges in I while smoothing small changes in intensity. Let b^K be the base layer computed
from (9) (i.e., b^K = q_i, with 1 ≤ K ≤ N) for the K-th input image, denoted by I^K. The detail layer is then
defined as the difference between the input image and its guided filter output:
d^K = I^K - b^K, \quad 1 \le K \le N   (11)
Fig 3.1. Flowchart of the proposed method
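The layer-extraction step of the flowchart can be summarised by the sketch below, which reuses the guided_filter function from the previous snippet; the single-channel assumption and the values r = 45 and eps = 0.3 are illustrative only, since the paper does not state its parameter settings.

def decompose(images, r=45, eps=0.3):
    """Split each exposure I^K into a base layer b^K (guided filter output) and a
    detail layer d^K = I^K - b^K, Eq. (11). Single-channel float images in [0, 1]
    are assumed; colour inputs would be processed per channel."""
    bases, details = [], []
    for I in images:
        b = guided_filter(I, I, r, eps)   # guidance image = the input image itself
        bases.append(b)
        details.append(I - b)             # Eq. (11)
    return bases, details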
3.1 Base Layer Fusion
The pyramid representation expresses an image as a sum of spatially band-passed images while
retaining local spatial information in each band. A pyramid is created by lowpass filtering an image G_0 with a
compact two-dimensional filter. The filtered image is then subsampled by removing every other pixel and
every other row to obtain a reduced image G_1. This process is repeated to form a Gaussian pyramid G_0, G_1, G_2,
G_3, . . . , G_d. Expanding G_1 to the same size as G_0 and subtracting yields the band-passed image L_0. A
Laplacian pyramid L_0, L_1, L_2, . . . , L_{d-1} can be built containing band-passed images of decreasing size and
spatial frequency.
L_l = G_l - \mathrm{expand}(G_{l+1})   (12)
where l denotes the pyramid level and expand(·) upsamples G_{l+1} to the size of G_l.
The original image can be reconstructed from the expanded band-passed images:
G_0 = \sum_{l=0}^{d} \mathrm{expand}^{\,l}(L_l), \quad \text{with } L_d = G_d   (13)
where expand^l denotes l successive applications of the expand operator.
The Gaussian pyramid contains low-passed versions of the original G_0 at progressively lower spatial
frequencies. This effect is clearly seen when the Gaussian pyramid levels are expanded to the same size as G_0.
The Laplacian pyramid consists of band-passed copies of G_0. Each Laplacian level contains the edges of a
certain size and spans approximately an octave in spatial frequency.
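A minimal sketch of the pyramid construction and reconstruction described above is given next, assuming single-channel float images; the Gaussian blur and bilinear zoom are stand-ins for the compact two-dimensional filter and the expand operator, and the helper names are hypothetical.

import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def expand(img, shape):
    """Upsample img to the given shape; bilinear zoom stands in for expand(.)."""
    return zoom(img, (shape[0] / img.shape[0], shape[1] / img.shape[1]), order=1)

def gaussian_pyramid(img, depth):
    """G_0, ..., G_d: repeatedly blur with a small Gaussian and subsample by two."""
    G = [np.asarray(img, dtype=float)]
    for _ in range(depth):
        G.append(gaussian_filter(G[-1], sigma=1.0)[::2, ::2])
    return G

def laplacian_pyramid(img, depth):
    """L_l = G_l - expand(G_{l+1}), Eq. (12); the coarsest level stores G_d itself."""
    G = gaussian_pyramid(img, depth)
    L = [G[l] - expand(G[l + 1], G[l].shape) for l in range(depth)]
    L.append(G[-1])
    return L

def reconstruct(L):
    """Collapse a Laplacian pyramid back to G_0 by expanding and summing, Eq. (13)."""
    img = L[-1]
    for l in range(len(L) - 2, -1, -1):
        img = L[l] + expand(img, L[l].shape)
    return img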
(a) Quality measures
Many images in the stack contain flat, colorless regions due to under- and overexposure. Such regions
should receive less weight, while interesting areas containing bright colors and details should be preserved.
The following measures are used to achieve this:
• Contrast: Contrast is created by the difference in luminance reflected from two adjacent surfaces. In other
words, contrast is the difference in visual properties that makes an object distinguishable from other objects and
from the background. In visual perception, contrast is determined by the difference in color and brightness of the
object relative to other objects. It is the difference between the darker and the lighter pixels of the image: if this
difference is large, the image has high contrast; otherwise, it has low contrast.
A Laplacian filter is applied to the grayscale version of each image, and the absolute value of the
filter response is taken. This yields a simple indicator C for contrast. It tends to assign a high weight to important
elements such as edges and texture.
• Saturation: As a photograph undergoes a longer exposure, the resulting colors become desaturated and
eventually clipped. The saturation of a color is determined by a combination of light intensity and how much it
is distributed across the spectrum of different wavelengths. The purest (most saturated) color is achieved by
using just one wavelength at a high intensity, such as in laser light. If the intensity drops, then as a result the
saturation drops. Saturated colors are desirable and make the image look vivid. A saturation measure S is
included, which is computed as the standard deviation within the R, G, and B channels at each pixel.
• Exposure: Exposure refers to how the lightness and the darkness of the image are controlled. In
photography, exposure is the amount of light per unit area reaching
a photographic film. A photograph may be described as overexposed when it has a loss of highlight detail, that
is, when important bright parts of an image are washed out or effectively all white, known as blown out
highlights or clipped whites. A photograph may be described as underexposed when it has a loss of shadow
detail, that is, when important dark areas are muddy or indistinguishable from black, known as blocked up
shadows. Looking at just the raw intensities within a channel reveals how well a pixel is exposed. We want to
keep intensities that are not near zero (underexposed) or one (overexposed). We weight each intensity i based
on how close it is to 0.5 using a Gauss curve:
\exp\left(-\frac{(i - 0.5)^2}{2\sigma^2}\right)   (14)
To account for multiple color channels, we apply the Gauss curve to each channel separately, and multiply the
results, yielding the measure E.
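The three measures can be combined into a per-pixel weight map as in the following sketch; the value sigma = 0.2 and the small offset added to the weights are assumptions, since the paper does not specify them.

import numpy as np
from scipy.ndimage import laplace

def quality_weights(img, sigma=0.2):
    """Per-pixel weight map W^K = C^K * S^K * E^K for one RGB exposure in [0, 1].
    sigma in the exposure term is an assumed value, not specified in the paper."""
    gray = img.mean(axis=2)
    C = np.abs(laplace(gray))                    # contrast: |Laplacian filter response|
    S = img.std(axis=2)                          # saturation: std-dev over R, G, B
    E = np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)).prod(axis=2)   # exposure, Eq. (14)
    return C * S * E + 1e-12                     # small offset keeps weights non-zero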
The fused base layer b_f is computed as the weighted sum of the base layers
b^1, b^2, . . . , b^N obtained across the N input exposures. The pyramidal approach is used to generate the
Laplacian pyramid of the base layers, L{b^K}_l, and the Gaussian pyramid of the weight map functions,
G{W^K}_l, estimated from the three quality measures (i.e., saturation S^K, contrast C^K, and
exposure E^K). Here, l (0 ≤ l ≤ d) refers to the level of the pyramid and K (1 ≤ K ≤ N) refers
to the index of the input image. The weight map is computed as the product of these three quality metrics (i.e.,
W^K = S^K · C^K · E^K). Multiplying L{b^K}_l with the corresponding G{W^K}_l and
summing over K yields the modified Laplacian pyramid L_l as follows:
L_l(i, j) = \sum_{K=1}^{N} L\{b^K\}_l(i, j)\, G\{W^K\}_l(i, j)   (15)
The fused base layer b_f that contains well-exposed pixels is reconstructed by expanding each level and then
summing all the levels of the Laplacian pyramid:
b_f = \sum_{l=0}^{d} \mathrm{expand}^{\,l}(L_l)   (16)
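A possible realisation of (15) and (16), reusing the pyramid helpers and weight maps sketched earlier, is shown below; the per-pixel normalisation of the weight maps is standard practice in exposure fusion but is not stated explicitly in (15).

import numpy as np

def fuse_base_layers(bases, weight_maps, depth=5):
    """Eqs. (15)-(16): blend the Laplacian pyramids of the base layers with the
    Gaussian pyramids of the weight maps and collapse the result."""
    total = np.sum(weight_maps, axis=0)
    weight_maps = [w / total for w in weight_maps]        # normalise weights per pixel

    fused = None
    for b, w in zip(bases, weight_maps):
        Lb = laplacian_pyramid(b, depth)                  # L{b^K}_l
        Gw = gaussian_pyramid(w, depth)                   # G{W^K}_l
        contrib = [lb * gw for lb, gw in zip(Lb, Gw)]     # summand of Eq. (15)
        fused = contrib if fused is None else [f + c for f, c in zip(fused, contrib)]
    return reconstruct(fused)                             # Eq. (16)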
3.2 Detail Layer Fusion
The detail layers computed in (11) across all the input exposures are linearly combined to produce the
fused detail layer d_f that carries the combined texture information as follows:
d_f = \frac{\beta}{N} \sum_{K=1}^{N} d^K   (17)
where β is a user-defined parameter that controls the amplification of texture details (typically set to 5).
Finally, the detail-enhanced fused image g is computed by simply adding the fused
base layer b_f computed in (16) and the manipulated fused detail layer d_f computed in (17) as follows:
g = b_f + d_f   (18)
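The detail fusion and the final combination can then be sketched as follows; averaging the detail layers over the N exposures in (17) is an assumed reading of "linearly combined", and the helper names are illustrative.

import numpy as np

def fuse_details(details, beta=5.0):
    """Eq. (17): linear combination of the detail layers amplified by beta."""
    return (beta / len(details)) * np.sum(details, axis=0)

def detail_enhanced_fusion(bases, details, weight_maps, depth=5):
    """Eq. (18): fused base layer plus the manipulated fused detail layer."""
    b_f = fuse_base_layers(bases, weight_maps, depth)   # Eqs. (15)-(16)
    d_f = fuse_details(details)                         # Eq. (17)
    return b_f + d_f                                    # g = b_f + d_f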
3.3 Numerical Analysis
Numerical analysis is the process of evaluating a technique via objective metrics. For this
purpose, two fusion quality metrics [12] are adopted: an information-theory-based metric (QMI) [13] and a
structure-based metric (Qc) [14]. These metrics are used to assess the fusion performance.
(a) Normalized mutual information (QMI)
Normalized mutual information, QMI, is an information-theory-based metric. Mutual information
is widely used for image fusion quality assessment. One problem with the traditional mutual information metric is that it
is unstable and may bias the measure towards the source image with the highest entropy.
The size of the overlapping part of the images influences the mutual information measure in two
ways. First of all, a decrease in overlap decreases the number of samples, which reduces the statistical power of
the probability distribution estimation. Secondly, with increasing misregistration (which usually coincides with
decreasing overlap) the mutual information measure may actually increase. This can occur when the relative
areas of object and background even out and the sum of the marginal entropies increases faster than the joint
entropy. A normalized measure of mutual information is less sensitive to changes in overlap. Hossny et al.
modified it to the normalized mutual information [13]. Here, Hossny et al.'s definition is adopted:
Q_{MI} = 2\left[\frac{MI(A, F)}{H(A) + H(F)} + \frac{MI(B, F)}{H(B) + H(F)}\right]   (19)
where H(A), H(B), and H(F) are the marginal entropies of A, B, and F, and MI(A, F) is the mutual information
between the source image A and the fused image F.
MI(A, F) = H(A) + H(F) - H(A, F)   (20)
where H(A, F) is the joint entropy between A and F, H(A) and H(F) are the marginal entropies of A and F,
respectively, and MI(B,F) is similar to MI(A, F). The quality metric QMI measures how well the original
information from source images is preserved in the fused image.
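For reference, a small NumPy sketch of the QMI computation is given below; the 256-bin histograms used to estimate the entropies are an implementation assumption.

import numpy as np

def entropy(hist):
    """Shannon entropy (in bits) of a normalised histogram."""
    p = hist[hist > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(A, F, bins=256):
    """MI(A, F) = H(A) + H(F) - H(A, F), Eq. (20), via a joint histogram."""
    joint, _, _ = np.histogram2d(A.ravel(), F.ravel(), bins=bins)
    joint = joint / joint.sum()
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint)

def q_mi(A, B, F, bins=256):
    """Normalised mutual-information metric Q_MI, Eq. (19)."""
    def marginal_entropy(x):
        hist, _ = np.histogram(x.ravel(), bins=bins)
        return entropy(hist / hist.sum())
    hA, hB, hF = marginal_entropy(A), marginal_entropy(B), marginal_entropy(F)
    return 2 * (mutual_information(A, F, bins) / (hA + hF) +
                mutual_information(B, F, bins) / (hB + hF))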
(b) Cvejic et al.'s metric (Qc)
Cvejic et al.'s metric, Qc, is a structure-based metric. It is calculated as follows:
Q_C = \mathrm{sim}(A, B, F \mid w)\,\mathrm{UIQI}(A, F \mid w) + \big(1 - \mathrm{sim}(A, B, F \mid w)\big)\,\mathrm{UIQI}(B, F \mid w)   (21)
where sim(A, B, F | w) is calculated as follows:
\mathrm{sim}(A, B, F \mid w) = \begin{cases} 0, & \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} < 0 \\ \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}}, & 0 \le \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} \le 1 \\ 1, & \frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}} > 1 \end{cases}   (22)
Here, \sigma_{AF} and \sigma_{BF} are the covariances between A and F and between B and F within the window w, and UIQI refers to
the universal image quality index. The Qc quality metric estimates how well the important information in the
source images is preserved in the fused image, while minimizing the amount of distortion that could interfere
with interpretation.
IV. RESULTS AND DISCUSSION
The system described above was implemented in Matlab and the results were obtained successfully. In
this section, the obtained results are presented. Figure 4.2 shows the base and detail layers.
Figure 4.2: Base layers and Detail layers
The pyramidal approach is used for fusing the base layers. Quality measures of the images are considered to
compute the weight map, which is the combination of contrast, saturation, and exposure. Figure 4.3
shows the Gaussian pyramid of the weight map function.
Figure 4.3: Gaussian pyramid
The Laplacian pyramids of the base layers are generated. The obtained Laplacian pyramid is shown in Figure 4.4.
Figure 4.4: Laplacian pyramid
The fused pyramid is obtained by combining the Gaussian pyramid of the weight map functions and the
Laplacian pyramid of the base layers. Figure 4.5 shows the fused pyramid.
Figure 4.5: Fused pyramid
The fused base layer is the weighted sum of the base layers. The detail layers obtained are boosted and fused.
Figure 4.6 shows the fused base and detail layers.
Figure 4.6: Fused base layer and detail layer
Finally, the fused image is obtained by combining the fused base and fused detail layers. The
fused image is shown in Figure 4.7. Numerical analysis is performed on the obtained results.
Figure 4.7: Fused image
The following figures show the results obtained for some of the other source image sets.
Figure 4.8: (a) Source Images (b) Fused Image
Figure 4.9: (a) Source Images (b) Fused Image
Figure 4.10: (a) Source Images (b) Fused Image
V. CONCLUSION
We proposed a technique for fusing multiexposure input images. The proposed method constructs a
detail enhanced image from a set of multiexposure images by using a multiresolution decomposition technique.
Compared with existing techniques that use multiresolution or single-resolution analysis for
exposure fusion, the proposed method performs better in terms of enhancing texture details in the
fused image. The framework is inspired by the edge-preserving property of the guided filter, which has better
response near strong edges. Experiments show that the proposed method can well preserve the original and
complementary information of multiple input images.
REFERENCES
[1] S. Li, J. Kwok, I. Tsang, and Y. Wang, "Fusing images with different focuses using support vector machines,"
IEEE Trans. Neural Netw., vol. 15, no. 6, pp. 1555-1561, Nov. 2004.
[2] G. Pajares and J. M. de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recognit., vol. 37, no. 9, pp.
1855-1872, Sep. 2004.
[3] D. Looney and D. Mandic, "Multiscale image fusion using complex extensions of EMD," IEEE Trans. Signal
Process., vol. 57, no. 4, pp. 1626-1630, Apr. 2009.
[4] M. Kumar and S. Dass, "A total variation-based algorithm for pixel-level image fusion," IEEE Trans. Image
Process., vol. 18, no. 9, pp. 2137-2143, Sep. 2009.
[5] P. Burt and E. Adelson, "The Laplacian pyramid as a compact image code," IEEE Trans. Commun., vol. 31, no. 4,
pp. 532-540, Apr. 1983.
[6] O. Rockinger, "Image sequence fusion using a shift-invariant wavelet transform," in Proc. Int. Conf. Image
Process., vol. 3, Washington, DC, USA, Oct. 1997, pp. 288-291.
[7] J. Liang, Y. He, D. Liu, and X. Zeng, "Image fusion using higher order singular value decomposition," IEEE Trans.
Image Process., vol. 21, no. 5, pp. 2898-2909, May 2012.
[8] R. Shen, I. Cheng, J. Shi, and A. Basu, "Generalized random walks for fusion of multi-exposure images," IEEE
Trans. Image Process., vol. 20, no. 12, pp. 3634-3646, Dec. 2011.
[9] M. Xu, H. Chen, and P. Varshney, "An image fusion approach based on Markov random fields," IEEE Trans.
Geosci. Remote Sens., vol. 49, no. 12, pp. 5116-5127, Dec. 2011.
[10] K. He, J. Sun, and X. Tang, "Guided image filtering," in Proc. Eur. Conf. Comput. Vis., Heraklion, Greece, Sep.
2010, pp. 1-14.
[11] N. Draper and H. Smith, Applied Regression Analysis. New York, USA: Wiley, 1981.
[12] Z. Liu, E. Blasch, Z. Xue, J. Zhao, R. Laganiere, and W. Wu, "Objective assessment of multiresolution image
fusion algorithms for context enhancement in night vision: A comparative study," IEEE Trans. Pattern Anal. Mach.
Intell., vol. 34, no. 1, pp. 94-109, Jan. 2012.
[13] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image
fusion'," Electron. Lett., vol. 44, no. 18, pp. 1066-1067, Aug. 2008.
[14] N. Cvejic, A. Loza, D. Bull, and N. Canagarajah, "A similarity metric for assessment of image fusion algorithms,"
Int. J. Signal Process., vol. 2, no. 3, pp. 178-182, Apr. 2005.
ย 
Intrusion Detection and Forensics based on decision tree and Association rule...
Intrusion Detection and Forensics based on decision tree and Association rule...Intrusion Detection and Forensics based on decision tree and Association rule...
Intrusion Detection and Forensics based on decision tree and Association rule...
ย 
Natural Language Ambiguity and its Effect on Machine Learning
Natural Language Ambiguity and its Effect on Machine LearningNatural Language Ambiguity and its Effect on Machine Learning
Natural Language Ambiguity and its Effect on Machine Learning
ย 
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcessEvolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcess
ย 
Material Parameter and Effect of Thermal Load on Functionally Graded Cylinders
Material Parameter and Effect of Thermal Load on Functionally Graded CylindersMaterial Parameter and Effect of Thermal Load on Functionally Graded Cylinders
Material Parameter and Effect of Thermal Load on Functionally Graded Cylinders
ย 
Studies On Energy Conservation And Audit
Studies On Energy Conservation And AuditStudies On Energy Conservation And Audit
Studies On Energy Conservation And Audit
ย 
An Implementation of I2C Slave Interface using Verilog HDL
An Implementation of I2C Slave Interface using Verilog HDLAn Implementation of I2C Slave Interface using Verilog HDL
An Implementation of I2C Slave Interface using Verilog HDL
ย 
Discrete Model of Two Predators competing for One Prey
Discrete Model of Two Predators competing for One PreyDiscrete Model of Two Predators competing for One Prey
Discrete Model of Two Predators competing for One Prey
ย 

Recently uploaded

TechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor CatchersTechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catcherssdickerson1
ย 
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsyncWhy does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsyncssuser2ae721
ย 
Gurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort service
Gurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort serviceGurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort service
Gurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort servicejennyeacort
ย 
UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)
UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)
UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)Dr SOUNDIRARAJ N
ย 
home automation using Arduino by Aditya Prasad
home automation using Arduino by Aditya Prasadhome automation using Arduino by Aditya Prasad
home automation using Arduino by Aditya Prasadaditya806802
ย 
Research Methodology for Engineering pdf
Research Methodology for Engineering pdfResearch Methodology for Engineering pdf
Research Methodology for Engineering pdfCaalaaAbdulkerim
ย 
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfgUnit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfgsaravananr517913
ย 
Sachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
Sachpazis Costas: Geotechnical Engineering: A student's Perspective IntroductionSachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
Sachpazis Costas: Geotechnical Engineering: A student's Perspective IntroductionDr.Costas Sachpazis
ย 
Solving The Right Triangles PowerPoint 2.ppt
Solving The Right Triangles PowerPoint 2.pptSolving The Right Triangles PowerPoint 2.ppt
Solving The Right Triangles PowerPoint 2.pptJasonTagapanGulla
ย 
Mine Environment II Lab_MI10448MI__________.pptx
Mine Environment II Lab_MI10448MI__________.pptxMine Environment II Lab_MI10448MI__________.pptx
Mine Environment II Lab_MI10448MI__________.pptxRomil Mishra
ย 
Instrumentation, measurement and control of bio process parameters ( Temperat...
Instrumentation, measurement and control of bio process parameters ( Temperat...Instrumentation, measurement and control of bio process parameters ( Temperat...
Instrumentation, measurement and control of bio process parameters ( Temperat...121011101441
ย 
POWER SYSTEMS-1 Complete notes examples
POWER SYSTEMS-1 Complete notes  examplesPOWER SYSTEMS-1 Complete notes  examples
POWER SYSTEMS-1 Complete notes examplesDr. Gudipudi Nageswara Rao
ย 
Energy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptxEnergy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptxsiddharthjain2303
ย 
Internet of things -Arshdeep Bahga .pptx
Internet of things -Arshdeep Bahga .pptxInternet of things -Arshdeep Bahga .pptx
Internet of things -Arshdeep Bahga .pptxVelmuruganTECE
ย 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girlsssuser7cb4ff
ย 
Transport layer issues and challenges - Guide
Transport layer issues and challenges - GuideTransport layer issues and challenges - Guide
Transport layer issues and challenges - GuideGOPINATHS437943
ย 
๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...9953056974 Low Rate Call Girls In Saket, Delhi NCR
ย 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
ย 

Recently uploaded (20)

TechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor CatchersTechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTACยฎ CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
ย 
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsyncWhy does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
ย 
Gurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort service
Gurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort serviceGurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort service
Gurgaon โœก๏ธ9711147426โœจCall In girls Gurgaon Sector 51 escort service
ย 
UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)
UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)
UNIT III ANALOG ELECTRONICS (BASIC ELECTRONICS)
ย 
young call girls in Green Park๐Ÿ” 9953056974 ๐Ÿ” escort Service
young call girls in Green Park๐Ÿ” 9953056974 ๐Ÿ” escort Serviceyoung call girls in Green Park๐Ÿ” 9953056974 ๐Ÿ” escort Service
young call girls in Green Park๐Ÿ” 9953056974 ๐Ÿ” escort Service
ย 
home automation using Arduino by Aditya Prasad
home automation using Arduino by Aditya Prasadhome automation using Arduino by Aditya Prasad
home automation using Arduino by Aditya Prasad
ย 
Research Methodology for Engineering pdf
Research Methodology for Engineering pdfResearch Methodology for Engineering pdf
Research Methodology for Engineering pdf
ย 
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfgUnit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
Unit7-DC_Motors nkkjnsdkfnfcdfknfdgfggfg
ย 
Sachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
Sachpazis Costas: Geotechnical Engineering: A student's Perspective IntroductionSachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
Sachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
ย 
Solving The Right Triangles PowerPoint 2.ppt
Solving The Right Triangles PowerPoint 2.pptSolving The Right Triangles PowerPoint 2.ppt
Solving The Right Triangles PowerPoint 2.ppt
ย 
Mine Environment II Lab_MI10448MI__________.pptx
Mine Environment II Lab_MI10448MI__________.pptxMine Environment II Lab_MI10448MI__________.pptx
Mine Environment II Lab_MI10448MI__________.pptx
ย 
Instrumentation, measurement and control of bio process parameters ( Temperat...
Instrumentation, measurement and control of bio process parameters ( Temperat...Instrumentation, measurement and control of bio process parameters ( Temperat...
Instrumentation, measurement and control of bio process parameters ( Temperat...
ย 
POWER SYSTEMS-1 Complete notes examples
POWER SYSTEMS-1 Complete notes  examplesPOWER SYSTEMS-1 Complete notes  examples
POWER SYSTEMS-1 Complete notes examples
ย 
Energy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptxEnergy Awareness training ppt for manufacturing process.pptx
Energy Awareness training ppt for manufacturing process.pptx
ย 
Internet of things -Arshdeep Bahga .pptx
Internet of things -Arshdeep Bahga .pptxInternet of things -Arshdeep Bahga .pptx
Internet of things -Arshdeep Bahga .pptx
ย 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girls
ย 
Transport layer issues and challenges - Guide
Transport layer issues and challenges - GuideTransport layer issues and challenges - Guide
Transport layer issues and challenges - Guide
ย 
๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
๐Ÿ”9953056974๐Ÿ”!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
ย 
Design and analysis of solar grass cutter.pdf
Design and analysis of solar grass cutter.pdfDesign and analysis of solar grass cutter.pdf
Design and analysis of solar grass cutter.pdf
ย 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
ย 

Multiexposure Image Fusion

  • 1. International OPEN ACCESS Journal Of Modern Engineering Research (IJMER) | IJMER | ISSN: 2249โ€“6645 | www.ijmer.com | Vol. 4 | Iss.11| Nov. 2014 | 17| Multiexposure Image Fusion Suhaina J1 , Bindu K. R2 1 Department of Electronics and Communication Engineering, FISAT/ Mahatma Gandhi University, India 2 Department of Electronics and Communication Engineering, FISAT/ Mahatma Gandhi University, India I. INTRODUCTION Often a single sensor cannot produce a complete representation of a scene. Visible images provide spectral and spatial details, and if a target has the same color and spatial characteristics as its background, it cannot be distinguished from the background. Image fusion is the process of combining information from two or more images of a scene into a single composite image that is more informative and is more suitable for visual perception or computer processing. The objective in image fusion is to reduce uncertainty and minimize redundancy in the output while maximizing relevant information particular to an application or task. Given the same set of input images, different fused images may be created depending on the specific application and what is considered relevant information. There are several benefits in using image fusion: wider spatial and temporal coverage, decreased uncertainty, improved reliability, and increased robustness of system performance. A large number of image fusion methods [1]โ€“[4] have been proposed in literature. Among these methods, multiscale image fusion [2] and data-driven image fusion [3] are very successful methods. They focus on different data representations, e.g., multi-scale coefficients [5], [6], or data driven decomposition coefficients [3], [7] and different image fusion rules to guide the fusion of coefficients. The major advantage of these methods is that they can well preserve the details of different source images. However, these kinds of methods may produce brightness and color distortions since spatial consistency is not well considered in the fusion process. Spatial consistency means that if two adjacent pixels have similar brightness or color, they will tend to have similar weights. A popular spatial consistency based fusion approach is formulating an energy function, where the pixel saliencies are encoded in the function and edge aligned weights are enforced by regularization terms, e.g., a smoothness term. This energy function can be then minimized globally to obtain the desired weight maps. To make full use of spatial context, optimization based image fusion approaches, e.g., generalized random walks [8], and Markov random fields [9] based methods have been proposed. These methods focus on estimating spatially smooth and edge aligned weights by solving an energy function and then fusing the source images by weighted average of pixel values. However, optimization based methods have a common limitation, i.e., inefficiency, since they require multiple iterations to find the global optimal solution. Moreover, another drawback is that global optimization based methods may over-smooth the resulting weights, which is not good for fusion. An interesting alternative to optimization based method is guided image filtering [10]. The proposed method employs guided filtering for layer extraction. The extracted layers are then fused separately. The remainder of this paper is organized as follows. In Section II, the guided image filtering algorithm is reviewed. Section III describes the proposed image fusion algorithm. 
discussions are presented in Section IV. Finally, Section V concludes the paper.

Abstract: Many applications, such as robot navigation, defense, medical imaging, and remote sensing, involve processing tasks that become easier when all the objects captured in different images of the same scene are combined into a single fused image. In this paper, we propose a fast and effective method for image fusion. The proposed method derives the intensity-based variations, i.e., the large-scale and small-scale layers, from the source images; guided filtering is employed for this extraction. A Gaussian and Laplacian pyramidal approach is then used to fuse the layers obtained. Experimental results demonstrate that the proposed method achieves good performance on all tested sets of images and indicate the feasibility of the proposed approach.

Keywords: Gaussian Pyramid, Guided Filter, Image Fusion, Laplacian Pyramid, Multi-exposure Images
II. GUIDED IMAGE FILTERING

The guided filter is an image filter derived from a local linear model. It computes the filtering output by considering the content of a guidance image, which can be the input image itself or a different image. The guided filter can be used as an edge-preserving smoothing operator like the popular bilateral filter, but it behaves better near edges. It is also a more generic concept than smoothing: it can transfer the structures of the guidance image to the filtering output, enabling new applications such as dehazing and guided feathering. Moreover, the guided filter has a fast, non-approximate linear-time algorithm, regardless of the kernel size and the intensity range, and is currently one of the fastest edge-preserving filters. It is both effective and efficient in a wide variety of computer vision and computer graphics applications, including edge-aware smoothing, detail enhancement, HDR compression, image matting/feathering, dehazing, and joint upsampling.

The filtering output is locally a linear transform of the guidance image. On one hand, the guided filter has good edge-preserving smoothing properties like the bilateral filter, but it does not suffer from gradient reversal artifacts. On the other hand, it can be used beyond smoothing: with the help of the guidance image, it can make the filtering output more structured and less smoothed than the input. Moreover, the guided filter naturally has an O(N) time (in the number of pixels N) non-approximate algorithm for both gray-scale and high-dimensional images, regardless of the kernel size and the intensity range. Typically, a CPU implementation achieves about 40 ms per megapixel for gray-scale filtering. Given its simplicity, efficiency, and quality, it has great potential in computer vision and graphics.

Fig. 2.1. Illustrations of the bilateral filtering process (left) and the guided filtering process (right).

2.1 Guided filter

A general linear translation-variant filtering process involves a guidance image I, a filtering input image p, and an output image q. Both I and p are given beforehand according to the application, and they can be identical. The filtering output at a pixel i is expressed as a weighted average

q_i = \sum_j W_{ij}(I)\, p_j,    (1)

where i and j are pixel indexes. The filter kernel W_{ij} is a function of the guidance image I and independent of p, so the filter is linear with respect to p. An example of such a filter is the joint bilateral filter (Fig. 2.1, left), whose kernel W^{bf} is given by

W^{bf}_{ij}(I) = \frac{1}{K_i} \exp\!\left(-\frac{\lVert x_i - x_j \rVert^2}{\sigma_s^2}\right) \exp\!\left(-\frac{\lVert I_i - I_j \rVert^2}{\sigma_r^2}\right),    (2)

where x is the pixel coordinate and K_i is a normalizing parameter ensuring \sum_j W^{bf}_{ij} = 1. The parameters \sigma_s and \sigma_r adjust the sensitivity to spatial similarity and to range (intensity/color) similarity, respectively. The joint bilateral filter degrades to the original bilateral filter when I and p are identical. Implicit weighted-average filters optimize a quadratic function and solve a linear system of the form

A q = p,    (3)

where q and p are N-by-1 vectors concatenating {q_i} and {p_i}, respectively, and A is an N-by-N matrix that depends only on I. The solution to (3), q = A^{-1} p, has the same form as (1), with W_{ij} = (A^{-1})_{ij}.

The key assumption of the guided filter is a local linear model between the guidance I and the filtering output q. We assume that q is a linear transform of I in a window w_k centered at the pixel k:

q_i = a_k I_i + b_k, \quad \forall i \in w_k,    (4)
where (a_k, b_k) are linear coefficients assumed to be constant in w_k. We use a square window of radius r. This local linear model ensures that q has an edge only where I has an edge, because \nabla q = a \nabla I. The model has proven useful in image super-resolution, image matting, and dehazing.

To determine the linear coefficients (a_k, b_k), we need constraints from the filtering input p. We model the output q as the input p minus some unwanted components n, such as noise or textures:

q_i = p_i - n_i.    (5)

A solution that minimizes the difference between q and p while maintaining the linear model (4) is sought. Specifically, the following cost function in the window w_k is minimized:

E(a_k, b_k) = \sum_{i \in w_k} \left[ (a_k I_i + b_k - p_i)^2 + \epsilon a_k^2 \right].    (6)

Here, \epsilon is a regularization parameter penalizing large a_k. Equation (6) is the linear ridge regression model [11], and its solution is given by

a_k = \frac{\frac{1}{|w|}\sum_{i \in w_k} I_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \epsilon},    (7)

b_k = \bar{p}_k - a_k \mu_k.    (8)

Here, \mu_k and \sigma_k^2 are the mean and variance of I in w_k, |w| is the number of pixels in w_k, and \bar{p}_k is the mean of p in w_k. Having obtained the linear coefficients (a_k, b_k), we can compute the filtering output q_i by (4). Fig. 2.1 (right) illustrates the guided filtering process.

However, a pixel i is involved in all the overlapping windows w_k that cover i, so the value of q_i in (4) differs when it is computed in different windows. A simple strategy is to average all the possible values of q_i. After computing (a_k, b_k) for all windows w_k in the image, the filtering output is computed as

q_i = \frac{1}{|w|} \sum_{k:\, i \in w_k} (a_k I_i + b_k).    (9)

Noticing that \sum_{k:\, i \in w_k} a_k = \sum_{k \in w_i} a_k due to the symmetry of the box window, (9) can be rewritten as

q_i = \bar{a}_i I_i + \bar{b}_i,    (10)

where \bar{a}_i = \frac{1}{|w|}\sum_{k \in w_i} a_k and \bar{b}_i = \frac{1}{|w|}\sum_{k \in w_i} b_k are the average coefficients of all windows overlapping i. This averaging strategy of overlapping windows is popular in image denoising. With the modification in (10), q is no longer a scaling of I because the linear coefficients vary spatially. But since \bar{a}_i and \bar{b}_i are the output of a mean filter, their gradients can be expected to be much smaller than those of I near strong edges; in short, abrupt intensity changes in I are mostly preserved in q.

Equations (7), (8), and (10) define the guided filter. Pseudocode is given in Algorithm 1, where fmean denotes a mean filter with window radius r, and the abbreviations corr, var, and cov indicate correlation, variance, and covariance, respectively.

Algorithm 1. Guided Filter.
Input: filtering input image p, guidance image I, radius r, regularization \epsilon
Output: filtering output q
1: meanI = fmean(I); meanp = fmean(p); corrI = fmean(I.*I); corrIp = fmean(I.*p)
2: varI = corrI - meanI.*meanI; covIp = corrIp - meanI.*meanp
3: a = covIp ./ (varI + \epsilon); b = meanp - a.*meanI
4: meana = fmean(a); meanb = fmean(b)
5: q = meana.*I + meanb
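For reference, a compact MATLAB sketch of Algorithm 1 is given below. It is an illustrative implementation, not the authors' code: it assumes gray-scale double images in [0, 1], and the box mean fmean is realized with conv2 so that border windows are normalized by the number of contributing pixels.

function q = guidedfilter(I, p, r, epsr)
% Sketch of Algorithm 1: I is the guidance image, p the filtering input,
% r the window radius, and epsr the regularization parameter.
fmean  = @(x) conv2(x, ones(2*r+1), 'same') ./ ...
              conv2(ones(size(x)), ones(2*r+1), 'same');   % border-normalized box mean
meanI  = fmean(I);       meanp  = fmean(p);
corrI  = fmean(I.*I);    corrIp = fmean(I.*p);
varI   = corrI  - meanI.*meanI;
covIp  = corrIp - meanI.*meanp;
a = covIp ./ (varI + epsr);         % Eq. (7)
b = meanp - a.*meanI;               % Eq. (8)
q = fmean(a).*I + fmean(b);         % Eq. (10)
end

For example, q = guidedfilter(gray, gray, 8, 0.1^2) smooths an image using itself as guidance; the radius and regularization values here are illustrative choices rather than values reported in the paper.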
III. OVERALL APPROACH

The flowchart of the proposed image fusion method is shown in Fig. 3.1. We first employ guided filtering to extract base layers and detail layers from the input images. The output q_i computed in (9) preserves the strongest edges in I while smoothing small changes in intensity. Let b_K be the base layer computed from (9) (i.e., b_K = q_i, with 1 ≤ K ≤ N) for the K-th input image, denoted I_K. The detail layer is defined as the difference between the input image and the guided filter output:

d_K = I_K - b_K.    (11)

Fig. 3.1. Flowchart of the proposed method.

3.1 Base Layer Fusion

The pyramid representation expresses an image as a sum of spatially band-passed images while retaining local spatial information in each band. A pyramid is created by low-pass filtering an image G_0 with a compact two-dimensional filter. The filtered image is then subsampled by removing every other pixel and every other row to obtain a reduced image G_1. This process is repeated to form a Gaussian pyramid G_0, G_1, G_2, ..., G_d. Expanding G_1 to the same size as G_0 and subtracting yields the band-passed image L_0. In this way a Laplacian pyramid L_0, L_1, ..., L_{d-1} can be built, containing band-passed images of decreasing size and spatial frequency:

L_l = G_l - \mathrm{expand}(G_{l+1}),    (12)

where l refers to the level of the pyramid. The original image can be reconstructed by expanding the band-pass images back to the base resolution and summing them:

G_0 = \sum_{l=0}^{d} \mathrm{expand}^{(l)}(L_l),    (13)

where \mathrm{expand}^{(l)} denotes l successive expansions and L_d = G_d is the coarsest level. The Gaussian pyramid contains low-passed versions of the original G_0 at progressively lower spatial frequencies; this effect is clearly seen when the Gaussian pyramid levels are expanded to the same size as G_0. The Laplacian pyramid consists of band-passed copies of G_0: each level contains edges of a certain size and spans approximately an octave in spatial frequency.
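The following MATLAB sketch builds the Gaussian and Laplacian pyramids and reconstructs an image from a Laplacian pyramid. It is illustrative only: imresize (Image Processing Toolbox) stands in for the classical REDUCE/EXPAND operators, and the function names build_pyramids and collapse_pyramid are our own.

function [L, G] = build_pyramids(img, nlev)
% Gaussian pyramid G{1..nlev} and Laplacian pyramid L{1..nlev} of img.
G = cell(nlev, 1);  L = cell(nlev, 1);
G{1} = img;
for l = 1:nlev-1
    G{l+1} = imresize(G{l}, 0.5);                             % REDUCE: low-pass and subsample
    up     = imresize(G{l+1}, [size(G{l},1), size(G{l},2)]);  % EXPAND back to the current size
    L{l}   = G{l} - up;                                       % band-pass residual, Eq. (12)
end
L{nlev} = G{nlev};                                            % the coarsest level is kept as-is
end

function img = collapse_pyramid(L)
% Reconstruction by expanding and summing the levels, cf. Eq. (13).
img = L{end};
for l = numel(L)-1:-1:1
    img = L{l} + imresize(img, [size(L{l},1), size(L{l},2)]);
end
end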
(a) Quality measures

Many images in the exposure stack contain flat, colorless regions due to under- and overexposure. Such regions should receive less weight, while interesting areas containing bright colors and details should be preserved. The following measures are used to achieve this; a short sketch of their computation follows the list.

• Contrast: Contrast is created by the difference in luminance between adjacent surfaces; it is the difference in visual properties that makes an object distinguishable from other objects and from the background. In visual perception, contrast is determined by the difference in color and brightness between an object and its surroundings. If the difference between the darker and lighter pixels of an image is large, the image has high contrast; otherwise it has low contrast. A Laplacian filter is applied to the grayscale version of each image, and the absolute value of the filter response is taken. This yields a simple contrast indicator C, which tends to assign high weight to important elements such as edges and texture.

• Saturation: As a photograph undergoes a longer exposure, the resulting colors become desaturated and eventually clipped. The saturation of a color is determined by a combination of light intensity and how it is distributed across the spectrum of wavelengths: the purest (most saturated) color is achieved by a single wavelength at high intensity, as in laser light, and saturation drops as the intensity drops. Saturated colors are desirable and make the image look vivid. A saturation measure S is computed as the standard deviation across the R, G, and B channels at each pixel.

• Exposure: Exposure refers to controlling the lightness and darkness of an image; in photography it is the amount of light per unit area reaching the photographic film. A photograph is described as overexposed when it loses highlight detail, that is, when important bright parts are washed out or effectively all white (blown-out highlights or clipped whites). A photograph is described as underexposed when it loses shadow detail, that is, when important dark areas are muddy or indistinguishable from black (blocked-up shadows). Looking at the raw intensities within a channel reveals how well a pixel is exposed. We want to keep intensities that are not near zero (underexposed) or one (overexposed), so each intensity i is weighted according to how close it is to 0.5 using a Gauss curve:

E_c(i) = \exp\!\left(-\frac{(i - 0.5)^2}{2\sigma^2}\right).    (14)

To account for multiple color channels, the Gauss curve is applied to each channel separately and the results are multiplied, yielding the exposedness measure E.
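A minimal MATLAB sketch of the three measures and of the resulting weight map is given below. It assumes an RGB image in double format with values in [0, 1]; the Gauss-curve width sigma = 0.2 is an assumed, commonly used value, since the paper does not state one. rgb2gray, imfilter, and fspecial are Image Processing Toolbox functions.

function W = weight_map(img)
% Per-pixel weight map W = C .* S .* E for one exposure.
gray  = rgb2gray(img);
C = abs(imfilter(gray, fspecial('laplacian'), 'replicate'));   % contrast
S = std(img, 0, 3);                                            % saturation across R, G, B
sigma = 0.2;                                                   % assumed Gauss-curve width
E = prod(exp(-((img - 0.5).^2) ./ (2*sigma^2)), 3);            % well-exposedness, Eq. (14)
W = C .* S .* E + 1e-12;                     % small offset avoids all-zero weight maps
end

Before the pyramid blend described next, the weight maps of the N exposures are normalized so that they sum to one at every pixel.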
The fused base layer is computed as the weighted sum of the base layers b_1, b_2, ..., b_N obtained across the N input exposures. A pyramidal approach is used to generate the Laplacian pyramid of the base layers, L{b_K}_l, and the Gaussian pyramid of the weight maps, G{W_K}_l, where the weight maps are estimated from the three quality measures (saturation S_K, contrast C_K, and exposure E_K). Here, l (0 ≤ l ≤ d) refers to the pyramid level and K (1 ≤ K ≤ N) indexes the input images. The weight map is computed as the product of the three quality metrics, i.e., W_K = S_K ⋅ C_K ⋅ E_K. Multiplying L{b_K}_l with the corresponding G{W_K(i', j')}_l and summing over K yields the modified Laplacian pyramid L_l(i', j'):

L_l(i', j') = \sum_{K=1}^{N} L\{b_K(i', j')\}_l \, G\{W_K(i', j')\}_l.    (15)

The fused base layer b_f, which contains the well-exposed pixels, is reconstructed by expanding each level and summing all levels of this Laplacian pyramid:

b_f = \sum_{l=0}^{d} \mathrm{expand}^{(l)}(L_l).    (16)

3.2 Detail Layer Fusion

The detail layers computed in (11) across all the input exposures are linearly combined to produce the fused detail layer d_f, which carries the combined texture information:

d_f = \frac{\delta}{N} \sum_{K=1}^{N} d_K,    (17)

where δ is a user-defined parameter that controls the amplification of texture details (typically set to 5). Finally, the detail-enhanced fused image g is computed by adding the fused base layer b_f from (16) and the manipulated fused detail layer from (17):

g = b_f + d_f.    (18)
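The base and detail fusion described above can be composed from the earlier sketches as follows. This is an illustrative MATLAB sketch, not the authors' implementation: the level count, window radius, and regularization are assumed defaults, and Eq. (17) is implemented as an amplified average of the detail layers.

function g = fuse_exposures(imgs, delta)
% imgs: cell array of N aligned RGB exposures (double, in [0,1]); delta: detail gain.
N = numel(imgs);  nlev = 5;  r = 8;  epsr = 0.1^2;     % assumed defaults
W = cellfun(@weight_map, imgs, 'UniformOutput', false);
Wsum = W{1};  for K = 2:N, Wsum = Wsum + W{K}; end     % for per-pixel normalization
Lf = cell(nlev, 1);  for l = 1:nlev, Lf{l} = 0; end    % fused Laplacian pyramid, Eq. (15)
df = 0;                                                % accumulated detail layers
for K = 1:N
    gray = rgb2gray(imgs{K});
    bK = zeros(size(imgs{K}));
    for c = 1:3                                        % base layer via guided filtering
        bK(:,:,c) = guidedfilter(gray, imgs{K}(:,:,c), r, epsr);
    end
    dK = imgs{K} - bK;                                 % detail layer, Eq. (11)
    df = df + dK;
    LbK      = build_pyramids(bK, nlev);               % Laplacian pyramid of the base layer
    [~, GwK] = build_pyramids(W{K} ./ Wsum, nlev);     % Gaussian pyramid of the weight map
    for l = 1:nlev
        Lf{l} = Lf{l} + LbK{l} .* repmat(GwK{l}, [1 1 3]);   % Eq. (15)
    end
end
bf = collapse_pyramid(Lf);                             % Eq. (16)
g  = bf + (delta / N) * df;                            % Eqs. (17) and (18)
end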
3.3 Numerical Analysis

Numerical analysis is the process of evaluating a technique with objective metrics. For this purpose, two fusion quality metrics [12] are adopted: an information-theory-based metric (QMI) [13] and a structure-based metric (Qc) [14].

(a) Normalized mutual information (QMI)

Normalized mutual information, QMI, is an information-theory-based metric. Mutual information improves image fusion quality assessment, but the traditional mutual information metric is unstable and may bias the measure towards the source image with the highest entropy. The size of the overlapping part of the images influences the mutual information measure in two ways. First, a decrease in overlap decreases the number of samples, which reduces the statistical power of the probability distribution estimation. Second, with increasing misregistration (which usually coincides with decreasing overlap) the mutual information measure may actually increase; this can occur when the relative areas of object and background even out and the sum of the marginal entropies increases faster than the joint entropy. A normalized measure of mutual information is less sensitive to changes in overlap. Hossny et al. modified mutual information to the normalized form [13], and their definition is adopted here:

Q_{MI} = 2\left[\frac{MI(A, F)}{H(A) + H(F)} + \frac{MI(B, F)}{H(B) + H(F)}\right],    (19)

where H(A), H(B), and H(F) are the marginal entropies of A, B, and F, and MI(A, F) is the mutual information between the source image A and the fused image F:

MI(A, F) = H(A) + H(F) - H(A, F),    (20)

where H(A, F) is the joint entropy between A and F; MI(B, F) is defined analogously. The quality metric QMI measures how well the original information from the source images is preserved in the fused image.
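A MATLAB sketch of QMI for two gray-scale source images A, B and a fused image F (all in [0, 1]) is given below. Entropies are estimated from 256-bin histograms; the binning and the helper names are our own choices.

function q = qmi_metric(A, B, F)
% Normalized mutual information, Eqs. (19) and (20).
q = 2 * ( mi(A, F) / (entr(A) + entr(F)) + mi(B, F) / (entr(B) + entr(F)) );
end

function h = entr(X)
p = sum(joint_hist(X, X), 2);                  % marginal histogram of X
h = -sum(p(p > 0) .* log2(p(p > 0)));
end

function m = mi(X, Y)
p  = joint_hist(X, Y);
px = sum(p, 2);  py = sum(p, 1);
hx  = -sum(px(px > 0) .* log2(px(px > 0)));
hy  = -sum(py(py > 0) .* log2(py(py > 0)));
hxy = -sum(p(p > 0)  .* log2(p(p > 0)));
m = hx + hy - hxy;                             % Eq. (20)
end

function p = joint_hist(X, Y)
bx = min(floor(X(:) * 256), 255) + 1;          % quantize to 256 bins
by = min(floor(Y(:) * 256), 255) + 1;
p  = accumarray([bx, by], 1, [256, 256]);
p  = p / sum(p(:));
end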
(b) Cvejic et al.'s metric (Qc)

Cvejic et al.'s metric, Qc, is a structure-based metric. It is calculated as

Q_C = \mathrm{sim}(A, B, F \mid w)\, \mathrm{UIQI}(A, F \mid w) + \left(1 - \mathrm{sim}(A, B, F \mid w)\right) \mathrm{UIQI}(B, F \mid w),    (21)

averaged over local windows w, where sim is calculated as

\mathrm{sim}(A, B, F \mid w) = \min\!\left(\max\!\left(\frac{\sigma_{AF}}{\sigma_{AF} + \sigma_{BF}},\, 0\right),\, 1\right).    (22)

Here, σ_AF and σ_BF are the covariances between A and F and between B and F within the window, and UIQI refers to the universal image quality index. The Qc quality metric estimates how well the important information in the source images is preserved in the fused image, while minimizing the amount of distortion that could interfere with interpretation.
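A MATLAB sketch of Qc, following the definition cited above, is given below; it should be read as an approximation rather than a reference implementation. Local means, variances, and covariances are computed over sliding windows with the same border-normalized box mean used earlier; the window radius r and the small constant guarding against division by zero are assumed values.

function q = qc_metric(A, B, F, r)
% Cvejic et al.'s structure-based metric, Eqs. (21) and (22), for gray-scale A, B, F.
fmean = @(x) conv2(x, ones(2*r+1), 'same') ./ ...
             conv2(ones(size(x)), ones(2*r+1), 'same');
tiny = 1e-12;
[sAF, qAF] = local_stats(A, F, fmean, tiny);
[sBF, qBF] = local_stats(B, F, fmean, tiny);
sim = min(max(sAF ./ (sAF + sBF + tiny), 0), 1);       % Eq. (22), clamped to [0, 1]
q = mean(sim(:) .* qAF(:) + (1 - sim(:)) .* qBF(:));   % Eq. (21), averaged over windows
end

function [sxy, uiqi] = local_stats(X, Y, fmean, tiny)
mx = fmean(X);  my = fmean(Y);
vx = fmean(X.^2) - mx.^2;   vy = fmean(Y.^2) - my.^2;
sxy  = fmean(X.*Y) - mx.*my;                                             % local covariance
uiqi = (4 .* sxy .* mx .* my) ./ ((vx + vy) .* (mx.^2 + my.^2) + tiny);  % UIQI map
end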
IV. RESULTS AND DISCUSSION

The system described above was implemented in MATLAB, and results were obtained successfully; this section presents them. Figure 4.2 shows the base and detail layers extracted from the input exposures.

Figure 4.2: Base layers and detail layers.

The pyramidal approach is used for fusing the base layers. The quality measures of the images are used to compute the weight map, which is the combination of contrast, saturation, and exposure. Figure 4.3 shows the Gaussian pyramid of the weight map function.

Figure 4.3: Gaussian pyramid.

The Laplacian pyramids of the base layers are then generated; the resulting Laplacian pyramid is shown in Figure 4.4.

Figure 4.4: Laplacian pyramid.

The fused pyramid is obtained by combining the Gaussian pyramid of the weight map functions with the Laplacian pyramid of the base layers, as shown in Figure 4.5.

Figure 4.5: Fused pyramid.

The fused base layer is the weighted sum of the base layers, and the detail layers obtained are boosted and fused. Figure 4.6 shows the fused base and detail layers.

Figure 4.6: Fused base layer and detail layer.

Finally, the fused image is obtained by combining the fused base and fused detail layers; it is shown in Figure 4.7. Numerical analysis is performed on the obtained results.

Figure 4.7: Fused image.

The following figures show the results obtained for some of the other source image sets.

Figure 4.8: (a) Source images; (b) fused image.
Figure 4.9: (a) Source images; (b) fused image.

Figure 4.10: (a) Source images; (b) fused image.

V. CONCLUSION

We proposed a technique for fusing multiexposure input images. The proposed method constructs a detail-enhanced image from a set of multiexposure images using a multiresolution decomposition technique. Compared with existing techniques that use multiresolution or single-resolution analysis for exposure fusion, the proposed method performs better in terms of enhancing texture details in the fused image. The framework is inspired by the edge-preserving property of the guided filter, which has better response near strong edges. Experiments show that the proposed method preserves well the original and complementary information of the multiple input images.

REFERENCES
[1] S. Li, J. Kwok, I. Tsang, and Y. Wang, "Fusing images with different focuses using support vector machines," IEEE Trans. Neural Netw., vol. 15, no. 6, pp. 1555–1561, Nov. 2004.
[2] G. Pajares and J. M. de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recognit., vol. 37, no. 9, pp. 1855–1872, Sep. 2004.
[3] D. Looney and D. Mandic, "Multiscale image fusion using complex extensions of EMD," IEEE Trans. Signal Process., vol. 57, no. 4, pp. 1626–1630, Apr. 2009.
[4] M. Kumar and S. Dass, "A total variation-based algorithm for pixel-level image fusion," IEEE Trans. Image Process., vol. 18, no. 9, pp. 2137–2143, Sep. 2009.
[5] P. Burt and E. Adelson, "The Laplacian pyramid as a compact image code," IEEE Trans. Commun., vol. 31, no. 4, pp. 532–540, Apr. 1983.
[6] O. Rockinger, "Image sequence fusion using a shift-invariant wavelet transform," in Proc. Int. Conf. Image Process., vol. 3, Washington, DC, USA, Oct. 1997, pp. 288–291.
[7] J. Liang, Y. He, D. Liu, and X. Zeng, "Image fusion using higher order singular value decomposition," IEEE Trans. Image Process., vol. 21, no. 5, pp. 2898–2909, May 2012.
[8] R. Shen, I. Cheng, J. Shi, and A. Basu, "Generalized random walks for fusion of multi-exposure images," IEEE Trans. Image Process., vol. 20, no. 12, pp. 3634–3646, Dec. 2011.
[9] M. Xu, H. Chen, and P. Varshney, "An image fusion approach based on Markov random fields," IEEE Trans. Geosci. Remote Sens., vol. 49, no. 12, pp. 5116–5127, Dec. 2011.
[10] K. He, J. Sun, and X. Tang, "Guided image filtering," in Proc. Eur. Conf. Comput. Vis., Heraklion, Greece, Sep. 2010, pp. 1–14.
[11] N. Draper and H. Smith, Applied Regression Analysis. New York, NY, USA: Wiley, 1981.
[12] Z. Liu, E. Blasch, Z. Xue, J. Zhao, R. Laganiere, and W. Wu, "Objective assessment of multiresolution image fusion algorithms for context enhancement in night vision: A comparative study," IEEE Trans. Pattern Anal. Mach. Intell., vol. 34, no. 1, pp. 94–109, Jan. 2012.
[13] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electron. Lett., vol. 44, no. 18, pp. 1066–1067, Aug. 2008.
[14] N. Cvejic, A. Loza, D. Bull, and N. Canagarajah, "A similarity metric for assessment of image fusion algorithms," Int. J. Signal Process., vol. 2, no. 3, pp. 178–182, Apr. 2005.