Natarajan Meghanathan, et al. (Eds): SIPM, FCST, ITCA, WSE, ACSIT, CS & IT 06, pp. 479–493, 2012.
© CS & IT-CSCP 2012 DOI : 10.5121/csit.2012.2347
A NOVEL METRIC APPROACH EVALUATION FOR THE SPATIAL ENHANCEMENT OF PAN-SHARPENED IMAGES
Firouz Abdullah Al-Wassai¹ and Dr. N.V. Kalyankar²
¹Department of Computer Science, (SRTMU), Nanded, India
fairozwaseai@yahoo.com
²Principal, Yeshwant Mahavidyala College, Nanded, India
drkalyankarnv@yahoo.com
ABSTRACT
Various methods can be used to produce high-resolution multispectral images from a high-resolution panchromatic image (PAN) and low-resolution multispectral images (MS), mostly at the pixel level. The quality of image fusion is an essential determinant of the value of fused imagery for many applications, and spatial and spectral quality are the two important indexes used to evaluate any fused image. However, the jury is still out on the benefits of a fused image compared with its original images, and there is a lack of measures for objectively assessing the spatial quality of fusion methods; an objective assessment of the spatial resolution of fused images is therefore required. This paper describes a new approach, the High Pass Deviation Index (HPDI), proposed to estimate the spatial-resolution improvement by computing the spatial frequency of the edge regions of the image. It also compares various analytical techniques for evaluating spatial quality and for estimating the colour distortion added by image fusion, including MG, SG, FCC, SD, En, SNR, CC and NRMSE. In addition, the paper compares various image fusion techniques based on pixel-level and feature-level fusion.
KEYWORDS
Image quality; spectral metrics; spatial metrics; HPDI; image fusion.
1. INTRODUCTION
The quality of image fusion is an essential determinant of the value of fused imagery for many applications. Spatial and spectral quality are the two important indexes used to evaluate the quality of any fused image. Generally, one aims to preserve as much source information as possible in the fused image, with the expectation that performance with the fused image will be better than, or at least as good as, performance with the source images [1]. Several authors describe different spatial and spectral quality analysis techniques for fused images.
Some of them enable a subjective, and others an objective, numerical definition of the spatial or spectral quality of the fused data [2-5]. The evaluation of the spatial quality of pan-sharpened images is equally important, since the goal is to retain the high spatial resolution of the PAN image. A survey of the pan-sharpening literature revealed very few papers that evaluated the spatial quality of pan-sharpened imagery [6]. However, the jury is still out on the benefits of a fused image compared to its original images. There is also a lack of measures for assessing the objective quality of the spatial resolution of fusion methods. As a result, an objective assessment of the spatial resolution of fused images is required.
This study presents a new approach to assessing the spatial quality of a fused image, based on the HPDI, which depends on the spatial frequency of the edge regions of the image, and compares it with other methods [27, 28, 33]. In addition, many spectral quality metrics are used to compare the properties of the fused images and their ability to preserve similarity with respect to the MS image while incorporating the spatial resolution of the PAN image; a good fusion method should increase the spectral fidelity while retaining the spatial resolution of the PAN. This study also focuses on comparing the best methods based on pixel-level fusion techniques (see Section 2) with the following feature-level fusion techniques: Segment Fusion (SF), Principal Component Analysis based Feature Fusion (PCA) and Edge Fusion (EF), described in [7].
The paper is organized as follows: Section 2 reviews the image fusion techniques; Section 3 describes the quality evaluation of the fused images; Section 4 covers the experimental results and analysis, followed by the conclusion.
2. IMAGE FUSION TECHNIQUES
Image fusion techniques can be divided into three levels of representation: pixel level, feature level and decision level [8-10]. Pixel-based image fusion techniques can be grouped into several categories depending on the tools or processing methods used in the fusion procedure, summarized as follows:
1) Arithmetic combination techniques: such as the Brovey Transform (BT) [11-13], Colour Normalized transformation (CN) [14, 15] and the Multiplicative Method (MLT) [17, 18].
2) Component Substitution fusion techniques: such as IHS, HSV, HLS and YIQ in [19].
3) Frequency filtering methods: such as, in [20], the High-Pass Filter Additive method (HPFA), the High-Frequency Addition method (HFA), the High-Frequency Modulation method (HFM) and the wavelet-transform-based fusion method (WT).
4) Statistical methods: such as, in [21], Local Mean Matching (LMM), Local Mean and Variance Matching (LMVM), Regression Variable Substitution (RVS) and Local Correlation Modelling (LCM).
All of the above techniques were employed in our previous studies [19-21]. Therefore, the best method of each group is selected for this study, as follows: HFA and HFM [20], RVS [21], and the IHS method of [22].
To explain the algorithms in this study, pixels from the two different sources should have the same spatial resolution before being manipulated to obtain the resultant image. Here, the PAN image has a different spatial resolution from that of the MS image. Therefore, re-sampling the MS image to the spatial resolution of the PAN is an essential step in some fusion methods, to bring the MS image to the same size as the PAN. The re-sampled MS image is denoted by M_k, the set of DN values of band k in the re-sampled MS image.
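As a minimal sketch of this re-sampling step (not the authors' code; the function and array names are illustrative, and nearest-neighbour interpolation is assumed, as used in Section 4), one MS band can be brought to the PAN grid as follows:

```python
import numpy as np

def upsample_nearest(ms_band, pan_shape):
    """Nearest-neighbour re-sampling of a 2-D MS band to the PAN grid size."""
    rows, cols = pan_shape
    r_idx = np.arange(rows) * ms_band.shape[0] // rows
    c_idx = np.arange(cols) * ms_band.shape[1] // cols
    return ms_band[np.ix_(r_idx, c_idx)]

# Example: a 120 x 105 MS band re-sampled to the 600 x 525 PAN grid
ms = np.random.randint(0, 256, (120, 105))
M_k = upsample_nearest(ms, (600, 525))
```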
3. QUALITY EVALUATION OF THE FUSED IMAGES
This section describes the various spatial and spectral quality metrics used to evaluate the fused images. The spectral fidelity of the fused images is assessed with respect to the MS images: when analyzing the spectral quality of the fused images, the spectral characteristics of the images obtained from the different methods are compared with the spectral characteristics of the re-sampled MS images. Since the goal is to preserve the radiometry of the original MS images, any metric used must measure the amount of change in DN values in the pan-sharpened image F_k compared with the original image M_k. To evaluate the spatial properties of the fused images, the PAN image and the intensity image of the fused image have to be compared, since the goal is to retain the high spatial resolution of the PAN image. In the following, F_k(i, j) and M_k(i, j) are the brightness values of the pixels of the fused image and the original MS image of band k, F̄_k and M̄_k are the mean brightness values of the two images, both of size n × m, and BV denotes the brightness value of the image data M_k and F_k.
3.1 SPECTRAL QUALITY METRICS:
1) Standard Deviation (SD)

The SD, which is the square root of the variance, reflects the spread in the data: a high-contrast image has a large variance and a low-contrast image a low variance. Used as a quality metric, it indicates the closeness of the fused image to the original MS image at the pixel level; the ideal value is zero.
$$\sigma=\sqrt{\frac{\sum_{i=1}^{m}\sum_{j=1}^{n}\bigl(BV(i,j)-\mu\bigr)^{2}}{m\times n}} \qquad (1)$$
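As a minimal illustration of eq. (1) (not the authors' code; `band` is a placeholder for a 2-D array of DN values):

```python
import numpy as np

def standard_deviation(band):
    """Eq. (1): square root of the mean squared deviation from the band mean."""
    band = band.astype(float)
    return np.sqrt(np.mean((band - band.mean()) ** 2))
```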
2) Entropy (En)

The En of an image is a measure of its information content, but it has not commonly been used to assess the effects of information change in fused images. En reflects the capacity of the information carried by an image: a larger En means more information in the image [6]. Applying Shannon's entropy to evaluate the information content of an image, the formula is [23]:
$$\mathrm{En}=-\sum_{i=0}^{255}P(i)\log_{2}P(i) \qquad (2)$$

where P(i) is the ratio of the number of pixels with grey value equal to i to the total number of pixels.
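A short sketch of eq. (2) for an 8-bit band (illustrative only; the histogram-based estimate of P(i) is assumed):

```python
import numpy as np

def entropy(band):
    """Eq. (2): Shannon entropy of an 8-bit band from its normalized grey-level histogram."""
    hist = np.bincount(band.astype(np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log2(0) is taken as 0
    return -np.sum(p * np.log2(p))
```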
3) Signal-to-Noise Ratio (SNR)

The signal is the information content of the MS image M_k, while the merging process can add noise to F_k, as an error added to the signal. The SNR of band k can be calculated as [24]:
$$SNR_{k}=\sqrt{\frac{\sum_{i}^{n}\sum_{j}^{m}\bigl(F_{k}(i,j)\bigr)^{2}}{\sum_{i}^{n}\sum_{j}^{m}\bigl(F_{k}(i,j)-M_{k}(i,j)\bigr)^{2}}} \qquad (3)$$
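A direct sketch of eq. (3) (illustrative; `fused` and `ms` stand for the band-k fused and re-sampled MS arrays):

```python
import numpy as np

def snr(fused, ms):
    """Eq. (3): square root of the fused-band energy over the fusion-error energy."""
    fused = fused.astype(float)
    ms = ms.astype(float)
    return np.sqrt(np.sum(fused ** 2) / np.sum((fused - ms) ** 2))
```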
4) Correlation Coefficient (CC)

The CC measures the closeness or similarity between two images. It can vary between -1 and +1: a value close to +1 indicates that the two images are very similar, while a value close to -1 indicates that they are highly dissimilar. The formula for the correlation between F_k and M_k is:
$$CC=\frac{\sum_{i}^{n}\sum_{j}^{m}\bigl(F_{k}(i,j)-\bar{F}_{k}\bigr)\bigl(M_{k}(i,j)-\bar{M}_{k}\bigr)}{\sqrt{\sum_{i}^{n}\sum_{j}^{m}\bigl(F_{k}(i,j)-\bar{F}_{k}\bigr)^{2}}\;\sqrt{\sum_{i}^{n}\sum_{j}^{m}\bigl(M_{k}(i,j)-\bar{M}_{k}\bigr)^{2}}} \qquad (4)$$
Since the pan-sharpened image is larger (has more pixels) than the original MS image, it is not possible to compute the CC or apply any other mathematical operation between them directly. Thus, the up-sampled MS image M_k is used for this comparison.
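Eq. (4) can be computed directly once the MS band has been up-sampled to the fused-image size; a minimal sketch (names illustrative, not the authors' code):

```python
import numpy as np

def correlation_coefficient(fused, ms_upsampled):
    """Eq. (4): Pearson correlation between a fused band and the up-sampled MS band."""
    f = fused.astype(float) - fused.mean()
    m = ms_upsampled.astype(float) - ms_upsampled.mean()
    return np.sum(f * m) / np.sqrt(np.sum(f ** 2) * np.sum(m ** 2))
```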
5) Normalized Root Mean Square Error (NRMSE)

The NRMSE is used to assess the effect of information change in the fused image. The level of information loss can be expressed as a function of the original MS pixel M_k and the fused pixel F_k, by using the NRMSE between the M_k and F_k images in band k. The normalized root-mean-square error NRMSE_k between F_k and M_k is a point analysis in multispectral space representing the amount of change between the original MS pixels and the corresponding output pixels, computed using the following equation [27]:
$$NRMSE_{k}=\sqrt{\frac{1}{n\,m\times 255^{2}}\sum_{i}^{n}\sum_{j}^{m}\bigl(F_{k}(i,j)-M_{k}(i,j)\bigr)^{2}} \qquad (5)$$
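A sketch of eq. (5), assuming 8-bit data so that the normalization constant is 255 (names illustrative):

```python
import numpy as np

def nrmse(fused, ms):
    """Eq. (5): RMSE between the fused and MS bands, normalized by the 8-bit range."""
    diff = fused.astype(float) - ms.astype(float)
    return np.sqrt(np.mean(diff ** 2)) / 255.0
```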
6) The Histogram Analysis
The histograms of the original MS bands and the fused bands must be evaluated [4]. If the spectral information is preserved in the fused image, its histogram will closely resemble the histogram of the MS image. The analysis deals with the brightness-value histograms of all RGB colour bands and the L-component of the re-sampled MS image and of the fused image; a greater difference in the shape of the corresponding histograms represents a greater spectral change [31].
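The normalized histograms P(i) used for this comparison can be obtained as in the following sketch (illustrative only; the function name and the simple shape-difference score are assumptions, not part of the paper):

```python
import numpy as np

def band_histogram(band, bins=256):
    """Normalized grey-level histogram P(i) of an 8-bit band."""
    hist, _ = np.histogram(band.ravel(), bins=bins, range=(0, 256))
    return hist / hist.sum()

# e.g. a simple shape-difference score between the MS and fused histograms:
# np.abs(band_histogram(ms_band) - band_histogram(fused_band)).sum()
```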
3.2 SPATIAL QUALITY METRICS
1) Mean Gradient (MG)
MG has been used as a measure of image sharpness by [27, 28]. The gradient at any pixel is the
derivative of the DN values of neighboring pixels. Generally, sharper images have higher
gradient values. Thus, any image fusion method should result in increased gradient values
because this process makes the images sharper compared to the low-resolution image. The
calculation formula is [6]:
$$\bar{G}=\frac{1}{(m-1)(n-1)}\sum_{i=1}^{m-1}\sum_{j=1}^{n-1}\sqrt{\frac{\Delta I_{x}^{2}+\Delta I_{y}^{2}}{2}} \qquad (6)$$

where

$$\Delta I_{x}=f(i+1,j)-f(i,j),\qquad \Delta I_{y}=f(i,j+1)-f(i,j) \qquad (7)$$
where ΔI_x and ΔI_y are the horizontal and vertical gradients per pixel of the fused image f(i, j). Generally, the larger Ḡ, the richer the detail hierarchy and the clearer the fused image.
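A compact sketch of eqs. (6)-(7) using forward differences (illustrative only; `img` is a 2-D array of brightness values):

```python
import numpy as np

def mean_gradient(img):
    """Eqs. (6)-(7): mean of the RMS of the forward differences in x and y."""
    f = img.astype(float)
    dx = f[1:, :-1] - f[:-1, :-1]   # f(i+1, j) - f(i, j)
    dy = f[:-1, 1:] - f[:-1, :-1]   # f(i, j+1) - f(i, j)
    return np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0))
```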
2) Sobel Gradient (SG)

This approach is developed in this study using the Sobel operator, which computes the discrete gradient in the horizontal and vertical directions at pixel location (i, j) of an image f(i, j). The Sobel operator was the most popular edge-detection operator until the development of edge-detection techniques with a theoretical basis. It proved popular because it gave a better performance than contemporaneous edge-detection operators such as the Prewitt operator [30]. For this operator, which is clearly more costly to evaluate, the orthogonal components of the gradient are as follows [31]:
$$G_{x}=\{f(i-1,j+1)+2f(i-1,j)+f(i-1,j-1)\}-\{f(i+1,j+1)+2f(i+1,j)+f(i+1,j-1)\}$$

and

$$G_{y}=\{f(i-1,j+1)+2f(i,j+1)+f(i+1,j+1)\}-\{f(i-1,j-1)+2f(i,j-1)+f(i+1,j-1)\} \qquad (8)$$
It can be seen that the Sobel operator is equivalent to the simultaneous application of the following templates [32]:
$$G_{x}=\begin{bmatrix}1&2&1\\0&0&0\\-1&-2&-1\end{bmatrix},\qquad G_{y}=\begin{bmatrix}-1&0&1\\-2&0&2\\-1&0&1\end{bmatrix} \qquad (9)$$
Then the discrete gradient Ḡ of an image f(i, j) is given by

$$\bar{G}=\frac{1}{(m-1)(n-1)}\sum_{i=1}^{m-1}\sum_{j=1}^{n-1}\sqrt{\frac{G_{x}^{2}+G_{y}^{2}}{2}} \qquad (10)$$
where G_x and G_y are the horizontal and vertical gradients per pixel. Generally, the larger the value of Ḡ, the richer the detail hierarchy and the clearer the fused image.
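A sketch of eqs. (8)-(10) evaluated on the image interior (illustrative only; the border rows and columns are simply excluded here):

```python
import numpy as np

def sobel_gradient(img):
    """Eqs. (8)-(10): mean Sobel gradient magnitude over the interior pixels."""
    f = img.astype(float)
    gx = (f[:-2, 2:] + 2 * f[:-2, 1:-1] + f[:-2, :-2]) - \
         (f[2:, 2:] + 2 * f[2:, 1:-1] + f[2:, :-2])
    gy = (f[:-2, 2:] + 2 * f[1:-1, 2:] + f[2:, 2:]) - \
         (f[:-2, :-2] + 2 * f[1:-1, :-2] + f[2:, :-2])
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))
```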
3) Filtered Correlation Coefficients (FCC)
This approach was introduced in [33]. In Zhou's approach, the correlation coefficients between the high-pass filtered fused PAN and TM images and the high-pass filtered PAN image are taken as an index of the spatial quality. The high-pass filter is the Laplacian filter illustrated in eq. (11):
$$\text{mask}=\begin{bmatrix}-1&-1&-1\\-1&8&-1\\-1&-1&-1\end{bmatrix} \qquad (11)$$
However, the magnitudes of the edges do not necessarily have to coincide, which is why Zhou et al. proposed looking at their correlation coefficients [33]. In this method, the average correlation coefficient between the filtered PAN image and all the filtered bands is calculated to obtain the FCC. An FCC value close to one indicates high spatial quality.
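A sketch of the FCC computation as described above (illustrative, not the authors' code; it assumes SciPy's ndimage.convolve for the Laplacian filtering and that `fused_bands` is a list of 2-D arrays already registered to the PAN):

```python
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[-1, -1, -1],
                      [-1,  8, -1],
                      [-1, -1, -1]], dtype=float)      # eq. (11)

def fcc(pan, fused_bands):
    """Mean correlation between the Laplacian-filtered PAN and each Laplacian-filtered band [33]."""
    hp_pan = convolve(pan.astype(float), LAPLACIAN)
    b = hp_pan - hp_pan.mean()
    ccs = []
    for band in fused_bands:
        hp_band = convolve(band.astype(float), LAPLACIAN)
        a = hp_band - hp_band.mean()
        ccs.append(np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2)))
    return float(np.mean(ccs))
```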
4) HPDI: A New Scheme for Spatial Quality Evaluation

The new proposed HPDI technique evaluates the quality of the spatial resolution by specifying the edges in the image with the Laplacian filter (eq. 11). The Laplacian-filtered PAN image is taken as an index of the spatial quality, to measure how much edge information from the PAN image is transferred into the fused images. The deviation index between the high-pass filtered PAN image P and the fused image F_k indicates how much spatial information from the PAN image has been incorporated into the MS image; the HPDI is obtained as follows:
$$HPDI_{k}=\frac{1}{n\,m}\sum_{i}^{n}\sum_{j}^{m}\frac{\bigl|F_{k}(i,j)-P(i,j)\bigr|}{P(i,j)} \qquad (12)$$
The larger the HPDI value, the better the image quality, indicating that the fusion result has a high spatial-resolution quality.
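One possible reading of eq. (12) is sketched below (illustrative, not the authors' implementation): both the PAN image and the fused band are Laplacian-filtered as in eq. (11), and pixels where the filtered PAN is zero are skipped, a safeguard added here that the paper does not specify:

```python
import numpy as np
from scipy.ndimage import convolve

def hpdi(pan, fused):
    """Eq. (12): mean of |F_k - P| / P over the Laplacian-filtered PAN and fused band."""
    laplacian = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
    hp_pan = convolve(pan.astype(float), laplacian)
    hp_fused = convolve(fused.astype(float), laplacian)
    valid = hp_pan != 0                 # avoid division by zero (assumption, not in the paper)
    return np.mean(np.abs(hp_fused[valid] - hp_pan[valid]) / hp_pan[valid])
```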
4. EXPERIMENTAL RESULTS
The above assessment techniques are tested on the fusion of the Indian IRS-1C PAN image (5.8 m resolution panchromatic band) with the Landsat TM red (0.63-0.69 µm), green (0.52-0.60 µm) and blue (0.45-0.52 µm) bands of the 30 m resolution multispectral image. Fig. 1 shows the IRS-1C PAN and multispectral TM images. This work is thus an attempt to study the quality of images fused from different sensors with various characteristics. The size of the PAN image is 600 × 525 pixels at 6 bits per pixel and the size of the original multispectral image is 120 × 105 pixels at 8 bits per pixel; the latter is up-sampled by nearest neighbour to the same size as the PAN image. The pairs of images were geometrically registered to each other. The HFA, HFM, IHS, RVS, PCA, EF and SF methods are employed to fuse the IRS-1C PAN and TM multispectral images.
Fig. 1: The Representation of Original PAN and MS Images

4.1. Spectral Quality Metrics Results

Table 1 and Fig. 2 show these parameters for the images fused by the various methods. It can be seen from Fig. 2a and Table 1 that the SD of the fused images remains nearly constant for all methods except IHS. According to the En results in Table 1, an increased En indicates a change in the quantity of spectral information content through the merging; from Table 1 and Fig. 2b it is obvious that the En of the fused images has changed compared with the original MS image, except for PCA. In Fig. 2c and Table 1 the maximum correlation values are obtained by PCA, and in Fig. 2d and Table 1 the maximum SNR results are obtained with SF and HFA. The SNR and NRMSE results change significantly. It can be observed from Table 1 and the diagrams in Fig. 2d and Fig. 2e that the SF and HFA methods give the best SNR and NRMSE results with respect to the other methods, meaning that these methods maintain most of the spectral information content of the original MS data set: they show the lowest NRMSE values together with high CC and SNR. Hence, the SF and HFA fused images preserve the spectral resolution of the original MS image much better than the other methods.
Fig. 2: Chart Representation of SD, En, CC, SNR and NRMSE of the Fused Images (Fig. 2a: SD; Fig. 2b: En; Fig. 2c: CC; Fig. 2d: SNR; Fig. 2e: NRMSE; values per band for ORG, EF, HFA, HFM, IHS, PCA, RVS and SF).
The spectral distortion introduced by the fusion can be analysed from the histograms of the R, G and B colour bands and the L-component, which change significantly. Fig. 3 shows that, among the image fusion methods examined in this study, the best matching of intensity values between the original MS image and the fused image for the R and G colour bands is obtained by SF. The B colour band and the L-component in Fig. 3 also match, except at intensity values in the range 253 to 255, where the original MS image has no values while the merged images clearly do. This does not imply conflicting values or spectral degradation if we note that the PAN band (0.50-0.75 µm) does not spectrally overlap the blue band of the MS image (0.45-0.52 µm): during the merging process, intensity values present in the PAN image but absent from the original MS image were added; these short wavelengths are affected by many factors during transfer that cannot be discussed in this context. Most researchers histogram-match the PAN band to each MS band before merging them and substitute the high-frequency coefficients of the PAN image for the MS image's coefficients, as in the IHS and PCA methods; however, the radiometric normalization is left out in the EF and PCA methods (Figs. 2 and 3). Also, analysing the histograms of the fused images in Fig. 3, the intensity values near 255 for the G and B colour bands of the original MS image become more significant, and the extreme luminosity values in Fig. 3 disappear.
Table 1: The Spectral Quality Metrics Results for the Original MS and Fused Image Methods
Method Band SD En CC SNR NRMSE
ORG
1 51.018 5.2093 -- --- ----
2 51.477 5.2263 --- --- ----
3 51.983 5.2326 --- --- ---
EF
1 55.184 6.0196 0.896 2.742 0.173
2 55.792 6.0415 0.896 2.546 0.173
3 56.308 6.0423 0.898 2.359 0.173
HFA
1 52.793 5.7651 0.943 9.107 0.068
2 53.57 5.7833 0.943 8.518 0.069
3 54.498 5.7915 0.943 7.946 0.070
HFM
1 52.76 5.9259 0.934 8.660 0.071
2 53.343 5.8979 0.94 8.581 0.068
3 54.136 5.8721 0.945 8.388 0.066
IHS
1 41.164 7.264 0.915 4.063 0.189
2 41.986 7.293 0.917 3.801 0.196
3 42.709 7.264 0.917 3.690 0.192
PCA
1 47.875 5.1968 0.984 21.611 0.030
2 49.313 5.2485 0.985 19.835 0.031
3 51.092 5.2941 0.986 18.515 0.031
RVS
1 51.323 5.8841 0.924 5.936 0.102
2 51.769 5.8475 0.932 5.887 0.098
3 52.374 5.8166 0.938 5.800 0.094
SF
1 51.603 5.687 0.944 11.422 0.056
2 52.207 5.7047 0.944 10.732 0.056
3 53.028 5.7123 0.945 10.144 0.056
Generally, from the preceding analysis of Fig. 2 and Fig. 3, the method that best preserves the spectral characteristics of the original image for each RGB band and the L-component is the SF method. This is because the edges are affected more than homogeneous regions during the merging process, which moves spatial details into the multispectral MS image and consequently affects its features, as shown in the fused image.
Fig. 3: Histogram Analysis of All RGB Colour Bands and the L-Component of the Fused Images Compared with the MS Image
[Fig. 3 panels: for each fusion method (EF, HFA, HFM, IHS, PCA, RVS and SF), normalized histograms P(R), P(G), P(B) and P(L) versus intensity, each overlaid with the corresponding histogram of the MS image.]
4.2 Spatial Quality Metrics Results

Table 2 and Fig. 5 show the results of the images fused by the various methods. It is clear that the seven fusion methods are all capable of improving the spatial resolution with respect to the original MS image. Table 2 and Fig. 4 show the spatial parameters for the fused images. From Fig. 4a and Table 2 it can be seen that the MG results indicate an increase in spatial resolution for all methods except PCA and IHS. Also, the maximum gradient measured by SG was about 64 while that measured by MG was 25 (Table 2, Figs. 4a and 4b), meaning that SG gave, overall, a better performance than MG for edge detection. In addition, the SG results show that the fused images increase the gradient for all methods except PCA, whose decreased gradient means it does not enhance the spatial quality. The maximum MG and SG results among the sharpened images were obtained by EF, but the method closest to the PAN was SF, which has approximately the same values as the PAN; in other words, SF adds the details of the PAN image to the MS image with the maximum preservation of the spatial resolution of the PAN.

According to the computed results, the increase in FCC in Table 2 and Fig. 4c indicates the amount of edge information transferred from the PAN image into the fused images through the merging. The maximum FCC results in Table 2 and Fig. 4c were obtained by HFA, HFM and SF. The proposed HPDI spatial quality metric is more effective than the other spatial quality metrics at distinguishing the best spatial enhancement through the merging. The HPDI analytical technique is also more useful for measuring the spatial enhancement with respect to the PAN image than the other methods, since FCC, SG and MG gave the same results for some methods, whereas HPDI gave the smallest difference ratio between those methods. It can be observed from Fig. 4d and Table 2 that the maximum results of the proposed HPDI approach were obtained with SF, followed by the HFM method.
Fig. 4: Chart Representation of MG, SG, FCC and HPDI of the Fused Images (Fig. 4a: MG; Fig. 4b: SG; Fig. 4c: FCC; Fig. 4d: HPDI; values per band for EF, HFA, HFM, IHS, PCA, RVS and SF).
Table 2: The Spatial Quality Results of Fused Images
Method Band MG SG HPDI FCC
EF
1 25 64 0.025 0.464
2 25 65 0.025 0.446
3 25 65 0.022 0.426
HFA
1 11 51 -0.017 0.857
2 12 52 -0.013 0.856
3 12 52 -0.009 0.854
HFM
1 12 54 0.058 0.845
2 12 54 0.069 0.834
3 12 53 0.077 0.821
IHS
1 9 36 0.029 0.853
2 9 36 0.032 0.857
3 9 36 0.034 0.856
PCA
1 6 33 -0.006 0.718
2 6 34 -0.006 0.710
3 6 35 -0.006 0.704
RVS
1 13 54 -0.064 0.539
2 12 53 -0.053 0.540
3 12 52 -0.046 0.536
SF
1 11 48 0.074 0.769
2 11 49 0.080 0.766
3 11 49 0.080 0.761
PAN 1 10 42 -- --
Fig. 5: The Representation of Fused Images: (a) HFA; (b) HFM; (c) IHS; (d) PCA; (e) RVS; (f) SF; (g) EF.
5. CONCLUSION
This study proposed a new measure to test the spatial-resolution efficiency of fused images, applied to a number of image-merging methods. These methods obtained the best results in previous studies; some of them are based on pixel-level fusion, including the HFA, HFM, IHS and RVS methods, while the others are based on feature-level fusion, such as the PCA, EF and SF methods. The results of the study show the value of the new HPDI as a criterion for evaluating the spatial-resolution quality of fused images: the results demonstrate its high effectiveness when compared with the other measurement criteria such as FCC. The proposed HPDI analytical technique is much more useful for measuring the spatial enhancement of a fused image with respect to the spatial resolution of the PAN image than the other methods, since FCC, SG and MG gave the same results for some methods, whereas HPDI gave the smallest difference ratio between those methods. It is therefore strongly recommended to use HPDI for measuring the spatial enhancement of a fused image with respect to the PAN image, because of its mathematical basis and greater precision as a quality indicator.
Experimental results with the spatial and spectral quality metrics further show that the SF technique, based on feature-level fusion, maintains the spectral integrity of the MS image while improving the spatial quality towards that of the PAN image as much as possible. The use of the SF-based fusion technique is strongly recommended if the goal of merging is to achieve the best representation of the spectral information of the multispectral image together with the spatial details of the high-resolution panchromatic image, because it is based on component-substitution fusion coupled with spatial-domain filtering. It utilizes the statistical variability of the brightness values of the image bands to adjust the contribution of the individual bands to the fusion result and so reduce the colour distortion. The SG analytical technique is much more useful for measuring the gradient than MG, since MG gave the smallest gradient results.
REFERENCES
[1] Leviner M., M. Maltz ,2009. “A new multi-spectral feature level image fusion method for human
interpretation”. Infrared Physics & Technology 52 (2009) pp. 79–88.
[2] Aiazzi B., S. Baronti , M. Selva,2008. “Image fusion through multiresolution oversampled
decompositions”. In Image Fusion: Algorithms and Applications “.Edited by: Stathaki T. “Image
Fusion: Algorithms and Applications”. 2008 Elsevier Ltd.
[3] Nedeljko C., A. Łoza, D. Bull and N. Canagarajah, 2006. “A Similarity Metric for Assessment of
Image Fusion Algorithms”. International Journal of Information and Communication Engineering Vol.
No. 3, pp. 178 – 182.
[4] ŠVab A.and Oštir K., 2006. “High-Resolution Image Fusion: Methods to Preserve Spectral and
Spatial Resolution”. Photogrammetric Engineering & Remote Sensing, Vol. 72, No. 5, May 2006, pp.
565–572.
[5] Shi W., Changqing Z., Caiying Z., and Yang X., 2003. “Multi-Band Wavelet For Fusing SPOT
Panchromatic And Multispectral Images”. Photogrammetric Engineering & Remote Sensing Vol. 69,
No. 5, May 2003, pp. 513–520.
[6] Hui Y. X.And Cheng J. L., 2008. “Fusion Algorithm For Remote Sensing Images Based On
Nonsubsampled Contourlet Transform”. ACTA AUTOMATICA SINICA, Vol. 34, No. 3.pp. 274-
281.
[7] Firouz A. Al-Wassai, N.V. Kalyankar, A. A. Al-zuky ,2011. “ Multisensor Images Fusion Based on
Feature-Level”. International Journal of Advanced Research in Computer Science, Volume 2, No. 4,
July-August 2011, pp. 354 – 362.
[8] Hsu S. H., Gau P. W., I-Lin Wu I., and Jeng J. H., 2009,“Region-Based Image Fusion with Artificial
Neural Network”. World Academy of Science, Engineering and Technology, 53, pp 156 -159.
[9] Zhang J., 2010. “Multi-source remote sensing data fusion: status and trends”, International Journal of
Image and Data Fusion, Vol. 1, No. 1, pp. 5–24.
[10] Ehlers M., S. Klonusa, P. Johan, A. strand and P. Rosso ,2010. “Multi-sensor image fusion for
pansharpening in remote sensing”. International Journal of Image and Data Fusion, Vol. 1, No. 1,
March 2010, pp. 25–45.
[11] Alparone L., Baronti S., Garzelli A., Nencini F., 2004. “Landsat ETM+ and SAR Image Fusion Based
on Generalized Intensity Modulation”. IEEE Transactions on Geoscience and Remote Sensing, Vol.
42, No. 12, pp. 2832-2839.
[12] Dong J.,Zhuang D., Huang Y.,Jingying Fu,2009. “Advances In Multi-Sensor Data Fusion: Algorithms
And Applications “. Review, ISSN 1424-8220 Sensors 2009, 9, pp.7771-7784.
[13] Amarsaikhan D., H.H. Blotevogel, J.L. van Genderen, M. Ganzorig, R. Gantuya and B. Nergui, 2010.
“Fusing high-resolution SAR and optical imagery for improved urban land cover study and
classification”. International Journal of Image and Data Fusion, Vol. 1, No. 1, March 2010, pp. 83–97.
[14] Vrabel J., 1996. “Multispectral imagery band sharpening study”. Photogrammetric Engineering and
Remote Sensing, Vol. 62, No. 9, pp. 1075-1083.
[15] Vrabel J., 2000. “Multispectral imagery Advanced band sharpening study”. Photogrammetric
Engineering and Remote Sensing, Vol. 66, No. 1, pp. 73-79.
[16] Wenbo W.,Y.Jing, K. Tingjun ,2008. “Study Of Remote Sensing Image Fusion And Its Application In
Image Classification” The International Archives of the Photogrammetry, Remote Sensing and Spatial
Information Sciences. Vol. XXXVII. Part B7. Beijing 2008, pp.1141-1146.
[17] Parcharidis I. and L. M. K. Tani, 2000. “Landsat TM and ERS Data Fusion: A Statistical Approach
Evaluation for Four Different Methods”. 0-7803-6359- 0/00/ 2000 IEEE, pp.2120-2122.
[18] Pohl C. and Van Genderen J. L., 1998. “Multisensor Image Fusion In Remote Sensing: Concepts,
Methods And Applications”.(Review Article), International Journal Of Remote Sensing, Vol. 19, No.5,
pp. 823-854.
[19] Firouz A. Al-Wassai, N.V. Kalyankar, A. A. Al-zuky, 2011b. “The IHS Transformations Based Image
Fusion”. Journal of Global Research in Computer Science, Volume 2, No. 5, May 2011, pp. 70 – 77.
[20] Firouz A. Al-Wassai, N.V. Kalyankar , A.A. Al-Zuky, 2011a. “Arithmetic and Frequency Filtering
Methods of Pixel-Based Image Fusion Techniques “.IJCSI International Journal of Computer Science
Issues, Vol. 8, Issue 3, No. 1, May 2011, pp. 113- 122.
[21] Firouz A. Al-Wassai, N.V. Kalyankar , A.A. Al-Zuky, 2011c.” The Statistical methods of Pixel-Based
Image Fusion Techniques”. International Journal of Artificial Intelligence and Knowledge Discovery
Vol.1, Issue 3, July, 2011 5, pp. 5- 14.
[22] Li S., Kwok J. T., Wang Y.., 2002. “Using the Discrete Wavelet Frame Transform To Merge Landsat
TM And SPOT Panchromatic Images”. Information Fusion 3 (2002), pp.17–23.
[23] Liao. Y. C., T.Y. Wang, and W. T. Zheng, 1998. “Quality Analysis of Synthesized High Resolution
Multispectral Imagery”. URL: http://www. gisdevelopment.net/AARS/ACRS 1998/DigitalImage
Processing (Last date accessed:28 Oct. 2008).
[24] Gonzales R. C. and R. Woods, 1992. Digital Image Processing. Addison-Wesley Publishing Company.
[25] De Béthume S., F. Muller, and J. P. Donnay, 1998. “Fusion of multi-spectral and panchromatic images
by local mean and variance matching filtering techniques”. In: Proceedings of the Second International
Conference: Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed
Images, Sophia-Antipolis, France, 1998, pp. 31–36.
[26] De Bèthune. S and F. Muller, 2002. “Multisource Data Fusion Applied research”. URL:http://www.
abricmuller.be/realisations/ fusion.html .(Last date accessed:28 Oct. 2002).
[27] Sangwine S. J., and R.E.N. Horne, 1989. The Colour Image Processing Handbook. Chapman & Hall.
[28] Ryan. R., B. Baldridge, R.A. Schowengerdt, T. Choi, D.L. Helder and B. Slawomir, 2003. “IKONOS
Spatial Resolution And Image Interpretability Characterization”, Remote Sensing of Environment,
Vol. 88, No. 1, pp. 37–52.
[29] Pradham P., Younan N. H. and King R. L., 2008. “Concepts of image fusion in remote sensing
applications”. Edited by: Stathaki T. “Image Fusion: Algorithms and Applications”. 2008 Elsevier Ltd.
[30] Nixon M. S. and A. S. Aguado, 2008. "Feature Extraction and Image Processing". Second edition, 2008 Elsevier Ltd.
[31] Richards J. A. · X. Jia, 2006. “Remote Sensing Digital Image Analysis An Introduction”.4th Edition,
Springer-Verlag Berlin Heidelberg 2006.
[32] Li S. and B. Yang, 2008. "Region-based multi-focus image fusion". In Image Fusion: Algorithms and Applications. Edited by: Stathaki T. 2008 Elsevier Ltd.
[33] Zhou J., D. L. Civico, and J. A. Silander, 1998. "A wavelet transform method to merge Landsat TM and SPOT panchromatic data". International Journal of Remote Sensing, 19(4), 1998.
Authors

Firouz Abdullah Al-Wassai received the B.Sc. degree in Physics from the University of Sana'a, Yemen, in 1993 and the M.Sc. degree from Baghdad University, Iraq, in 2003. Currently, she is a Ph.D. scholar in Computer Science at the Department of Computer Science (S.R.T.M.U.), Nanded, India.

Dr. N.V. Kalyankar is the Principal of Yeshwant Mahavidyalaya, Nanded (India). He completed his M.Sc. (Physics) at Dr. B.A.M.U., Aurangabad. In 1980 he joined the Department of Physics at Yeshwant Mahavidyalaya, Nanded, as a lecturer, and in 1984 he completed his DHE. He completed his Ph.D. at Dr. B.A.M.U., Aurangabad, in 1995. Since 2003 he has been working as Principal of Yeshwant Mahavidyalaya, Nanded. He is also a research guide for Physics and Computer Science at S.R.T.M.U., Nanded; 03 research students have been awarded a Ph.D. and 12 research students an M.Phil. in Computer Science under his guidance. He has also worked on various bodies of S.R.T.M.U., Nanded. He has published 34 research papers in various international/national journals and is a peer team member of NAAC (National Assessment and Accreditation Council, India). He published a book entitled "DBMS Concepts and Programming in FoxPro". He has received various educational awards, including the "Best Principal" award from S.R.T.M.U., Nanded, in 2009 and the "Best Teacher" award from the Govt. of Maharashtra, India, in 2010. He is a life member of the "Fellowship of the Linnaean Society of London (F.L.S.)", conferred at the 11th National Congress, Kolkata (India), in November 2009.
EXACT SOLUTIONS OF A FAMILY OF HIGHER-DIMENSIONAL SPACE-TIME FRACTIONAL KDV-T...cscpconf
 
AUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEWAUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEWcscpconf
 
CLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORK
CLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORKCLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORK
CLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORKcscpconf
 
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...cscpconf
 
PROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATA
PROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATAPROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATA
PROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATAcscpconf
 
CHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCH
CHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCHCHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCH
CHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCHcscpconf
 
SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...
SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...
SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...cscpconf
 
SOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGE
SOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGESOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGE
SOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGEcscpconf
 
GENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXT
GENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXTGENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXT
GENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXTcscpconf
 

More from cscpconf (20)

ANALYSIS OF LAND SURFACE DEFORMATION GRADIENT BY DINSAR
ANALYSIS OF LAND SURFACE DEFORMATION GRADIENT BY DINSAR ANALYSIS OF LAND SURFACE DEFORMATION GRADIENT BY DINSAR
ANALYSIS OF LAND SURFACE DEFORMATION GRADIENT BY DINSAR
 
4D AUTOMATIC LIP-READING FOR SPEAKER'S FACE IDENTIFCATION
4D AUTOMATIC LIP-READING FOR SPEAKER'S FACE IDENTIFCATION4D AUTOMATIC LIP-READING FOR SPEAKER'S FACE IDENTIFCATION
4D AUTOMATIC LIP-READING FOR SPEAKER'S FACE IDENTIFCATION
 
MOVING FROM WATERFALL TO AGILE PROCESS IN SOFTWARE ENGINEERING CAPSTONE PROJE...
MOVING FROM WATERFALL TO AGILE PROCESS IN SOFTWARE ENGINEERING CAPSTONE PROJE...MOVING FROM WATERFALL TO AGILE PROCESS IN SOFTWARE ENGINEERING CAPSTONE PROJE...
MOVING FROM WATERFALL TO AGILE PROCESS IN SOFTWARE ENGINEERING CAPSTONE PROJE...
 
PROMOTING STUDENT ENGAGEMENT USING SOCIAL MEDIA TECHNOLOGIES
PROMOTING STUDENT ENGAGEMENT USING SOCIAL MEDIA TECHNOLOGIESPROMOTING STUDENT ENGAGEMENT USING SOCIAL MEDIA TECHNOLOGIES
PROMOTING STUDENT ENGAGEMENT USING SOCIAL MEDIA TECHNOLOGIES
 
A SURVEY ON QUESTION ANSWERING SYSTEMS: THE ADVANCES OF FUZZY LOGIC
A SURVEY ON QUESTION ANSWERING SYSTEMS: THE ADVANCES OF FUZZY LOGICA SURVEY ON QUESTION ANSWERING SYSTEMS: THE ADVANCES OF FUZZY LOGIC
A SURVEY ON QUESTION ANSWERING SYSTEMS: THE ADVANCES OF FUZZY LOGIC
 
DYNAMIC PHONE WARPING – A METHOD TO MEASURE THE DISTANCE BETWEEN PRONUNCIATIONS
DYNAMIC PHONE WARPING – A METHOD TO MEASURE THE DISTANCE BETWEEN PRONUNCIATIONS DYNAMIC PHONE WARPING – A METHOD TO MEASURE THE DISTANCE BETWEEN PRONUNCIATIONS
DYNAMIC PHONE WARPING – A METHOD TO MEASURE THE DISTANCE BETWEEN PRONUNCIATIONS
 
INTELLIGENT ELECTRONIC ASSESSMENT FOR SUBJECTIVE EXAMS
INTELLIGENT ELECTRONIC ASSESSMENT FOR SUBJECTIVE EXAMS INTELLIGENT ELECTRONIC ASSESSMENT FOR SUBJECTIVE EXAMS
INTELLIGENT ELECTRONIC ASSESSMENT FOR SUBJECTIVE EXAMS
 
TWO DISCRETE BINARY VERSIONS OF AFRICAN BUFFALO OPTIMIZATION METAHEURISTIC
TWO DISCRETE BINARY VERSIONS OF AFRICAN BUFFALO OPTIMIZATION METAHEURISTICTWO DISCRETE BINARY VERSIONS OF AFRICAN BUFFALO OPTIMIZATION METAHEURISTIC
TWO DISCRETE BINARY VERSIONS OF AFRICAN BUFFALO OPTIMIZATION METAHEURISTIC
 
DETECTION OF ALGORITHMICALLY GENERATED MALICIOUS DOMAIN
DETECTION OF ALGORITHMICALLY GENERATED MALICIOUS DOMAINDETECTION OF ALGORITHMICALLY GENERATED MALICIOUS DOMAIN
DETECTION OF ALGORITHMICALLY GENERATED MALICIOUS DOMAIN
 
GLOBAL MUSIC ASSET ASSURANCE DIGITAL CURRENCY: A DRM SOLUTION FOR STREAMING C...
GLOBAL MUSIC ASSET ASSURANCE DIGITAL CURRENCY: A DRM SOLUTION FOR STREAMING C...GLOBAL MUSIC ASSET ASSURANCE DIGITAL CURRENCY: A DRM SOLUTION FOR STREAMING C...
GLOBAL MUSIC ASSET ASSURANCE DIGITAL CURRENCY: A DRM SOLUTION FOR STREAMING C...
 
IMPORTANCE OF VERB SUFFIX MAPPING IN DISCOURSE TRANSLATION SYSTEM
IMPORTANCE OF VERB SUFFIX MAPPING IN DISCOURSE TRANSLATION SYSTEMIMPORTANCE OF VERB SUFFIX MAPPING IN DISCOURSE TRANSLATION SYSTEM
IMPORTANCE OF VERB SUFFIX MAPPING IN DISCOURSE TRANSLATION SYSTEM
 
EXACT SOLUTIONS OF A FAMILY OF HIGHER-DIMENSIONAL SPACE-TIME FRACTIONAL KDV-T...
EXACT SOLUTIONS OF A FAMILY OF HIGHER-DIMENSIONAL SPACE-TIME FRACTIONAL KDV-T...EXACT SOLUTIONS OF A FAMILY OF HIGHER-DIMENSIONAL SPACE-TIME FRACTIONAL KDV-T...
EXACT SOLUTIONS OF A FAMILY OF HIGHER-DIMENSIONAL SPACE-TIME FRACTIONAL KDV-T...
 
AUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEWAUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEW
 
CLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORK
CLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORKCLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORK
CLASSIFICATION OF ALZHEIMER USING fMRI DATA AND BRAIN NETWORK
 
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
 
PROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATA
PROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATAPROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATA
PROBABILITY BASED CLUSTER EXPANSION OVERSAMPLING TECHNIQUE FOR IMBALANCED DATA
 
CHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCH
CHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCHCHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCH
CHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCH
 
SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...
SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...
SOCIAL MEDIA ANALYTICS FOR SENTIMENT ANALYSIS AND EVENT DETECTION IN SMART CI...
 
SOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGE
SOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGESOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGE
SOCIAL NETWORK HATE SPEECH DETECTION FOR AMHARIC LANGUAGE
 
GENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXT
GENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXTGENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXT
GENERAL REGRESSION NEURAL NETWORK BASED POS TAGGING FOR NEPALI TEXT
 

Recently uploaded

18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991RKavithamani
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsanshu789521
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application ) Sakshi Ghasle
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfchloefrazer622
 
Micromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersMicromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersChitralekhaTherkar
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docxPoojaSen20
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Celine George
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfUmakantAnnand
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAssociation for Project Management
 

Recently uploaded (20)

TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha elections
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application )
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
Micromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersMicromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of Powders
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docx
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across Sectors
 

Novel metric evaluates spatial enhancement of pan-sharpened images

  • 1. Natarajan Meghanathan, et al. (Eds): SIPM, FCST, ITCA, WSE, ACSIT, CS & IT 06, pp. 479–493, 2012. © CS & IT-CSCP 2012 DOI : 10.5121/csit.2012.2347 A NOVEL METRIC APPROACH EVALUATION FOR THE SPATIAL ENHANCEMENT OF PAN-SHARPENED IMAGES Firouz Abdullah Al-Wassai1 and Dr. N.V. Kalyankar2 1 Department of Computer Science, (SRTMU), Nanded, India fairozwaseai@yahoo.com 2 Principal, Yeshwant Mahavidyala College, Nanded, India drkalyankarnv@yahoo.com ABSTRACT Various and different methods can be used to produce high-resolution multispectral images from high-resolution panchromatic image (PAN) and low-resolution multispectral images (MS), mostly on the pixel level. The Quality of image fusion is an essential determinant of the value of processing images fusion for many applications. Spatial and spectral qualities are the two important indexes that used to evaluate the quality of any fused image. However, the jury is still out of fused image’s benefits if it compared with its original images. In addition, there is a lack of measures for assessing the objective quality of the spatial resolution for the fusion methods. So, an objective quality of the spatial resolution assessment for fusion images is required. Therefore, this paper describes a new approach proposed to estimate the spatial resolution improve by High Past Division Index (HPDI) upon calculating the spatial-frequency of the edge regions of the image and it deals with a comparison of various analytical techniques for evaluating the Spatial quality, and estimating the colour distortion added by image fusion including: MG, SG, FCC, SD, En, SNR, CC and NRMSE. In addition, this paper devotes to concentrate on the comparison of various image fusion techniques based on pixel and feature fusion technique. KEYWORDS image quality; spectral metrics; spatial metrics; HPDI, Image Fusion. 1. INTRODUCTION The Quality of image fusion is an essential determinant of the value of processing images fusion for many applications. Spatial and spectral qualities are the two important indexes that used to evaluate the quality of any fused image. Generally, one aims to preserve as much source information as possible in the fused image with the expectation that performance with the fused image will be better than, or at least as good as, performance with the source images [1]. Several authors describe different spatial and spectral quality analysis techniques of the fused images.
Such metrics compare the properties of the fused images and their ability to preserve similarity with respect to the MS image while incorporating the spatial resolution of the PAN image; a good fusion result should increase the spectral fidelity while retaining the spatial detail of the PAN. In addition, this study compares the best pixel-level fusion methods (see Section 2) with the following feature-level fusion techniques: Segment Fusion (SF), Principal Component Analysis based feature fusion (PCA) and Edge Fusion (EF) [7]. The paper is organized as follows: Section 2 reviews the image fusion techniques; Section 3 describes the quality evaluation of the fused images; Section 4 covers the experimental results and analysis, followed by the conclusion.

2. IMAGE FUSION TECHNIQUES

Image fusion techniques can be divided into three levels of representation, namely pixel level, feature level and decision level [8-10]. Pixel-based fusion techniques can be grouped according to the tools or processing methods used, as follows:

1) Arithmetic combination techniques: such as the Brovey Transform (BT) [11-13], Color Normalized Transformation (CN) [14, 15] and the Multiplicative Method (MLT) [17, 18].
2) Component substitution fusion techniques: such as IHS, HSV, HLS and YIQ in [19].
3) Frequency filtering methods: such as, in [20], the High-Pass Filter Additive Method (HPFA), the High-Frequency-Addition Method (HFA), the High-Frequency Modulation Method (HFM) and the wavelet transform-based fusion method (WT).
4) Statistical methods: such as, in [21], Local Mean Matching (LMM), Local Mean and Variance Matching (LMVM), Regression Variable Substitution (RVS) and Local Correlation Modeling (LCM).

All of the above techniques were employed in our previous studies [19-21]. Therefore, the best method of each group is selected for this study, namely HFA and HFM [20], RVS [21] and the IHS method of [22]; a minimal sketch of the idea shared by the frequency-filtering methods is given below.
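To make the frequency-filtering family concrete, the following is a minimal, illustrative sketch of an HFA-style fusion step: the high-frequency part of the PAN image is added to each re-sampled MS band. It is not the exact parameterisation used in [20]; the box-filter size, the clipping range and all function names are assumptions of this sketch.

```python
import numpy as np

def box_lowpass(img, size=5):
    # Simple box-filter low-pass; stands in for the smoothing filter
    # that high-pass based fusion methods subtract from the PAN image.
    pad = size // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def hfa_like_fusion(pan, ms_up):
    # High-frequency-addition style fusion: inject the PAN detail
    # (PAN minus its low-pass) into every re-sampled MS band.
    detail = pan.astype(float) - box_lowpass(pan)
    fused = ms_up.astype(float) + detail[..., None]
    return np.clip(fused, 0, 255)

# Usage (placeholder arrays): pan is (H, W), ms_up is (H, W, 3) at the PAN size.
# fused = hfa_like_fusion(pan, ms_up)
```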
To apply the algorithms in this study, the pixels of the two sources must have the same spatial resolution before they are combined into the resultant image. Here, the PAN image has a different spatial resolution from that of the MS image; therefore, re-sampling the MS image to the spatial resolution of the PAN is an essential preprocessing step in some fusion methods. The re-sampled MS image is denoted by $M_k$, the set of digital number (DN) values of band $k$.

3. QUALITY EVALUATION OF THE FUSED IMAGES

This section describes the spatial and spectral quality metrics used to evaluate the fused images. When analyzing spectral quality, the spectral characteristics of the images obtained by the different methods are compared with those of the re-sampled MS images. Since the goal is to preserve the radiometry of the original MS images, any metric used must measure the amount of change in DN values in the pan-sharpened image $F_k$ compared with the original image $M_k$. To evaluate the spatial properties of the fused images, the PAN image and the intensity image of the fused result are compared, since the goal is to retain the high spatial resolution of the PAN image. In the following, $F_k$ and $M_k$ denote the brightness values of the fused image and the original MS image of band $k$, $\bar{F}_k$ and $\bar{M}_k$ are the corresponding mean brightness values, the images are of size $n \times m$, and $BV$ denotes a brightness value.

3.1 SPECTRAL QUALITY METRICS

1) Standard Deviation (SD)

The SD, the square root of the variance, reflects the spread in the data: a high-contrast image has a large variance and a low-contrast image a small one. It indicates the closeness of the fused image to the original MS image at the pixel level; the ideal deviation from the original is zero.

$$\sigma = \sqrt{\frac{\sum_{i=1}^{m}\sum_{j=1}^{n}\left(BV(i,j)-\mu\right)^{2}}{m \times n}} \qquad (1)$$

2) Entropy (En)

The En of an image is a measure of its information content; here it is used to assess the effect of fusion on information content. En reflects the capacity of the information carried by the image: the larger the En, the more information the image contains [6]. Applying Shannon's entropy to evaluate the information content of an image, the formula is [23]:

$$En = -\sum_{i=0}^{255} P(i)\,\log_{2} P(i) \qquad (2)$$

where $P(i)$ is the ratio of the number of pixels with gray value $i$ to the total number of pixels.

3) Signal-to-Noise Ratio (SNR)

The signal is the information content of the MS image $M_k$, while the merging into $F_k$ can introduce noise as an error added to the signal. The SNR of band $k$ is given by [24]:

$$SNR_k = \sqrt{\frac{\sum_{i}^{n}\sum_{j}^{m} F_k(i,j)^{2}}{\sum_{i}^{n}\sum_{j}^{m}\left(F_k(i,j)-M_k(i,j)\right)^{2}}} \qquad (3)$$
4) Correlation Coefficient (CC)

The CC measures the closeness or similarity between two images. It varies between -1 and +1: a value close to +1 indicates that the two images are very similar, while a value close to -1 indicates that they are highly dissimilar. The correlation between $F_k$ and $M_k$ is computed as:

$$CC = \frac{\sum_{i}^{n}\sum_{j}^{m}\left(F_k(i,j)-\bar{F}_k\right)\left(M_k(i,j)-\bar{M}_k\right)}{\sqrt{\sum_{i}^{n}\sum_{j}^{m}\left(F_k(i,j)-\bar{F}_k\right)^{2}}\;\sqrt{\sum_{i}^{n}\sum_{j}^{m}\left(M_k(i,j)-\bar{M}_k\right)^{2}}} \qquad (4)$$

Since the pan-sharpened image is larger (has more pixels) than the original MS image, it is not possible to compute the CC, or to apply any other pixel-wise operation, between them directly; the up-sampled MS image $M_k$ is therefore used for this comparison.

5) Normalized Root Mean Square Error (NRMSE)

The NRMSE is used to assess the information change in the fused image. The level of information loss is expressed as a function of the original MS pixel $M_k$ and the fused pixel $F_k$. The NRMSE between $F_k$ and $M_k$ in band $k$ is a point analysis in multispectral space representing the amount of change between the original MS pixels and the corresponding output pixels [27]:

$$NRMSE_k = \sqrt{\frac{1}{n\,m \times 255^{2}}\sum_{i}^{n}\sum_{j}^{m}\left(F_k(i,j)-M_k(i,j)\right)^{2}} \qquad (5)$$

6) Histogram Analysis

The histograms of the original MS bands and of the fused bands must be evaluated [4]. If the spectral information is preserved in the fused image, its histogram will closely resemble the histogram of the MS image. The analysis considers the brightness-value histograms of all RGB colour bands and of the L-component of the re-sampled MS image and of the fused image; a greater difference between the shapes of the corresponding histograms represents a greater spectral change [31]. A short computational sketch of the metrics in this subsection is given below.
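The following is a minimal sketch, under the assumption that each band is an 8-bit 2-D NumPy array, of how the spectral metrics of Section 3.1 (eqs. 1-5) can be computed per band; the function and variable names are illustrative, not part of the original method.

```python
import numpy as np

def spectral_metrics(fused_band, ms_band):
    # fused_band, ms_band: 2-D arrays of DN values for one band (same size).
    F = fused_band.astype(float)
    M = ms_band.astype(float)

    sd = np.sqrt(np.mean((F - F.mean()) ** 2))                          # eq. (1)

    hist, _ = np.histogram(fused_band, bins=256, range=(0, 256))
    p = hist / hist.sum()                                               # P(i)
    p = p[p > 0]                                                        # skip log2(0)
    entropy = -np.sum(p * np.log2(p))                                   # eq. (2)

    snr = np.sqrt(np.sum(F ** 2) / np.sum((F - M) ** 2))                # eq. (3)

    f0, m0 = F - F.mean(), M - M.mean()
    cc = np.sum(f0 * m0) / np.sqrt(np.sum(f0 ** 2) * np.sum(m0 ** 2))   # eq. (4)

    nrmse = np.sqrt(np.mean((F - M) ** 2) / 255.0 ** 2)                 # eq. (5)

    return {'SD': sd, 'En': entropy, 'SNR': snr, 'CC': cc, 'NRMSE': nrmse}
```

Applied to each of the three bands of every fused result, a sketch of this kind yields per-band values of the type reported later in Table 1.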
3.2 SPATIAL QUALITY METRICS

1) Mean Gradient (MG)

MG has been used as a measure of image sharpness by [27, 28]. The gradient at any pixel is the derivative of the DN values of neighbouring pixels, and sharper images have higher gradient values; any image fusion method should therefore increase the gradient values, because fusion makes the image sharper than the low-resolution input. The calculation formula is [6]:

$$\bar{G} = \frac{1}{(m-1)(n-1)}\sum_{i=1}^{m-1}\sum_{j=1}^{n-1}\sqrt{\frac{\Delta I_x^{2}+\Delta I_y^{2}}{2}} \qquad (6)$$

where

$$\Delta I_x = f(i+1,j)-f(i,j), \qquad \Delta I_y = f(i,j+1)-f(i,j) \qquad (7)$$

are the horizontal and vertical gradients per pixel of the fused image $f(i,j)$. Generally, the larger $\bar{G}$, the richer the detail hierarchy and the more definite the fused image.

2) Sobel Gradient (SG)

This measure, developed in this study, uses the Sobel operator to compute the discrete gradient in the horizontal and vertical directions at pixel location $(i,j)$ of an image $f(i,j)$. The Sobel operator was the most popular edge-detection operator until the development of edge-detection techniques with a theoretical basis; it proved popular because it performed better than contemporaneous operators such as the Prewitt operator [30]. Although more costly to evaluate, its orthogonal gradient components are [31]:

$$G_x = \{f(i-1,j+1)+2f(i-1,j)+f(i-1,j-1)\} - \{f(i+1,j+1)+2f(i+1,j)+f(i+1,j-1)\}$$
$$G_y = \{f(i-1,j+1)+2f(i,j+1)+f(i+1,j+1)\} - \{f(i-1,j-1)+2f(i,j-1)+f(i+1,j-1)\} \qquad (8)$$

The Sobel operator is equivalent to the simultaneous application of the templates [32]:

$$G_x = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix}, \qquad G_y = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \qquad (9)$$

The discrete gradient $\bar{G}$ of an image $f(i,j)$ is then

$$\bar{G} = \frac{1}{(m-1)(n-1)}\sum_{i=1}^{m-1}\sum_{j=1}^{n-1}\sqrt{\frac{G_x^{2}+G_y^{2}}{2}} \qquad (10)$$

where $G_x$ and $G_y$ are the horizontal and vertical gradients per pixel. Again, the larger $\bar{G}$, the richer the detail hierarchy and the more definite the fused image.

3) Filtered Correlation Coefficients (FCC)

This approach was introduced in [33]. In Zhou's approach, the correlation coefficients between the high-pass filtered fused bands and the high-pass filtered PAN image are taken as an index of spatial quality. The high-pass filter is the Laplacian filter:

$$\text{mask} = \begin{bmatrix} -1 & -1 & -1 \\ -1 & 8 & -1 \\ -1 & -1 & -1 \end{bmatrix} \qquad (11)$$

The magnitude of the edges does not necessarily have to coincide, which is why Zhou et al. proposed to look at their correlation coefficients [33]. In this method, the average correlation coefficient between the filtered PAN image and all filtered bands is calculated to obtain the FCC; an FCC value close to one indicates high spatial quality.

4) HPDI: a New Scheme for Spatial Quality Evaluation

The proposed HPDI technique evaluates the quality of the spatial resolution by isolating the edges in the image with the Laplacian filter of eq. (11). The Laplacian-filtered PAN image is taken as an index of spatial quality, measuring how much edge information from the PAN image is transferred into the fused images. The deviation index between the high-pass filtered PAN image $P$ and the fused image $F_k$ indicates how much spatial information from the PAN image has been incorporated into the MS image:

$$HPDI_k = \frac{1}{n\,m}\sum_{i}^{n}\sum_{j}^{m}\frac{\left|F_k(i,j)-P(i,j)\right|}{P(i,j)} \qquad (12)$$

The larger the HPDI value, the better the image quality, indicating that the fusion result has high spatial resolution. A brief computational sketch of these spatial metrics follows.
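The following is a minimal sketch of the spatial metrics of Section 3.2 (eqs. 6-12), again assuming 2-D NumPy arrays of equal size for the fused band and the PAN image. The 3x3 filtering helper, the epsilon guard in HPDI (to keep the ratio finite where the filtered PAN is zero) and all names are assumptions of this sketch, not part of the original formulation.

```python
import numpy as np

SOBEL_X = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=float)         # eq. (9)
SOBEL_Y = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
LAPLACIAN = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)  # eq. (11)

def filter3x3(img, kernel):
    # Apply a 3x3 template to the valid interior region (no padding).
    f = img.astype(float)
    out = np.zeros((f.shape[0] - 2, f.shape[1] - 2))
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * f[i:i + out.shape[0], j:j + out.shape[1]]
    return out

def mean_gradient(F):
    # MG, eqs. (6)-(7): forward differences averaged over (m-1)(n-1) pixels.
    f = F.astype(float)
    dIx = f[1:, :-1] - f[:-1, :-1]
    dIy = f[:-1, 1:] - f[:-1, :-1]
    return np.mean(np.sqrt((dIx ** 2 + dIy ** 2) / 2.0))

def sobel_gradient(F):
    # SG, eqs. (8)-(10): same average, but with Sobel gradient components.
    gx = filter3x3(F, SOBEL_X)
    gy = filter3x3(F, SOBEL_Y)
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))

def fcc(fused_band, pan):
    # FCC: correlation of the Laplacian-filtered fused band with the
    # Laplacian-filtered PAN image (Zhou's spatial quality index).
    hf = filter3x3(fused_band, LAPLACIAN).ravel()
    hp = filter3x3(pan, LAPLACIAN).ravel()
    return np.corrcoef(hf, hp)[0, 1]

def hpdi(fused_band, pan, eps=1e-6):
    # HPDI, eq. (12): mean relative deviation of the fused band from the
    # Laplacian-filtered PAN; eps avoids division by zero (sketch assumption).
    p = filter3x3(pan, LAPLACIAN)
    f = fused_band.astype(float)[1:-1, 1:-1]   # crop to the filtered size
    return np.mean(np.abs(f - p) / (np.abs(p) + eps))
```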
4. EXPERIMENTAL RESULTS

The above assessment techniques are tested on the fusion of an Indian IRS-1C PAN image, with a 5.8 m resolution panchromatic band, and Landsat TM imagery, using the red (0.63-0.69 µm), green (0.52-0.60 µm) and blue (0.45-0.52 µm) bands of the 30 m resolution multispectral image. Fig. 1 shows the IRS-1C PAN and multispectral TM images. This work is thus an attempt to study the quality of images fused from different sensors with various characteristics. The size of the PAN image is 600 x 525 pixels at 6 bits per pixel, and the size of the original multispectral image is 120 x 105 pixels at 8 bits per pixel; the MS image is up-sampled by nearest-neighbour interpolation to the size of the PAN image (sketched below), and the image pairs were geometrically registered to each other. The HFA, HFM, IHS, RVS, PCA, EF and SF methods are employed to fuse the IRS-1C PAN and TM multispectral images.

Fig. 1: The Representation of Original PAN and MS Images
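As a side note on the preprocessing step just described, nearest-neighbour up-sampling of the MS bands to the PAN grid can be sketched as follows; the integer zoom factor of 5 follows from the image sizes above (600/120 = 525/105 = 5), while the function name and the use of np.kron are choices of this sketch.

```python
import numpy as np

def nearest_neighbour_upsample(ms, factor=5):
    # Repeat every MS pixel factor x factor times so the MS bands
    # reach the PAN grid (120 x 105 -> 600 x 525 for factor 5).
    block = np.ones((factor, factor), dtype=ms.dtype)
    if ms.ndim == 2:          # single band
        return np.kron(ms, block)
    # multi-band array of shape (H, W, bands): up-sample each band
    return np.stack([np.kron(ms[..., b], block) for b in range(ms.shape[-1])], axis=-1)
```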
4.1. Spectral Quality Metrics Results

Table 1 and Fig. 2 show these parameters for the images fused by the various methods. From Fig. 2a and Table 1 it can be seen that the SD of the fused images remains almost constant, close to the original, for all methods except IHS. According to the En results in Table 1, an increased En indicates a change in the amount of spectral information content through the merging; from Table 1 and Fig. 2b it is obvious that the En of the fused images has changed with respect to the original MS for all methods except PCA. In Fig. 2c and Table 1 the maximum correlation values are obtained by PCA, while in Fig. 2d the best SNR results are obtained by SF and HFA. The SNR and NRMSE results change significantly: from Table 1 and the diagrams in Fig. 2d and Fig. 2e it can be observed that the SF and HFA methods give the best results with respect to the other methods. This means that these methods maintain most of the spectral information content of the original MS data set, showing the lowest NRMSE values together with high CC and SNR. Hence, the SF and HFA fused images preserve the spectral resolution of the original MS image much better than the other methods.

Fig. 2: Chart Representation of SD, En, CC, SNR and NRMSE of the Fused Images (panels 2a-2e: SD, En, CC, SNR, NRMSE)
The spectral distortion introduced by the fusion can also be analyzed through the histograms of the RGB colour bands and the L-component, which change significantly. Fig. 3 shows the match between the original MS image and the fused images for the R and G colour bands; among the fusion methods examined in this study, the best match of intensity values for the R and G bands is obtained by SF. The B band and the L-component also match in Fig. 3, except for intensity values in the range 253 to 255, which do not appear in the original MS image but clearly appear in the merged images. This does not indicate conflicting values or a loss of spectral resolution, since the PAN band (0.50-0.75 µm) does not spectrally overlap the blue band of the MS image (0.45-0.52 µm): during the merging, intensity values present in the PAN image but absent in the original MS image were added to the short-wavelength band, which is affected by many factors during acquisition that are not discussed further in this context. Most researchers histogram-match the PAN band to each MS band before merging them and before substituting the high-frequency coefficients of the PAN image in place of the MS coefficients, as in the IHS and PCA methods; however, the radiometric normalization is left out in the EF and PCA methods (Fig. 2 and Fig. 3). Analysing the histograms of the fused images in Fig. 3 also shows that the concentration of intensity values at 255 in the G and B bands of the original MS image is redistributed, so the extreme luminosity peaks disappear.

Table 1: The Spectral Quality Metrics Results for the Original MS and Fused Image Methods

Method  Band   SD       En       CC      SNR      NRMSE
ORG     1      51.018   5.2093   --      --       --
        2      51.477   5.2263   --      --       --
        3      51.983   5.2326   --      --       --
EF      1      55.184   6.0196   0.896   2.742    0.173
        2      55.792   6.0415   0.896   2.546    0.173
        3      56.308   6.0423   0.898   2.359    0.173
HFA     1      52.793   5.7651   0.943   9.107    0.068
        2      53.57    5.7833   0.943   8.518    0.069
        3      54.498   5.7915   0.943   7.946    0.070
HFM     1      52.76    5.9259   0.934   8.660    0.071
        2      53.343   5.8979   0.94    8.581    0.068
        3      54.136   5.8721   0.945   8.388    0.066
IHS     1      41.164   7.264    0.915   4.063    0.189
        2      41.986   7.293    0.917   3.801    0.196
        3      42.709   7.264    0.917   3.690    0.192
PCA     1      47.875   5.1968   0.984   21.611   0.030
        2      49.313   5.2485   0.985   19.835   0.031
        3      51.092   5.2941   0.986   18.515   0.031
RVS     1      51.323   5.8841   0.924   5.936    0.102
        2      51.769   5.8475   0.932   5.887    0.098
        3      52.374   5.8166   0.938   5.800    0.094
SF      1      51.603   5.687    0.944   11.422   0.056
        2      52.207   5.7047   0.944   10.732   0.056
        3      53.028   5.7123   0.945   10.144   0.056
Generally, from the analysis of Fig. 2 and Fig. 3, the best method for preserving as much as possible of the spectral characteristics of the original image, for each RGB band and for the L-component, is the SF method. This is because the edges are affected more than homogeneous regions when spatial detail is transferred to the multispectral MS image during the merge, and this effect is visible in the merged image.

Fig. 3: Histogram Analysis for All RGB Colour Bands and the L-Component of the Fused Images (EF, HFA, HFM, IHS, PCA, RVS, SF) against the MS Image
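A histogram comparison of the kind shown in Fig. 3 can be sketched as follows, assuming 8-bit R, G and B bands; treating the L-component as the mean of the three colour bands is an assumption of this sketch (the derivation of L is not specified above), as are the function names.

```python
import numpy as np

def normalised_histogram(band):
    # P(i): fraction of pixels at each of the 256 gray levels.
    hist, _ = np.histogram(band, bins=256, range=(0, 256))
    return hist / hist.sum()

def band_histograms(rgb):
    # rgb: (H, W, 3) array. Returns P(R), P(G), P(B) and P(L),
    # with L taken here as the mean of the three colour bands.
    r, g, b = (rgb[..., i] for i in range(3))
    l = rgb.astype(float).mean(axis=-1)
    return {name: normalised_histogram(band)
            for name, band in [('R', r), ('G', g), ('B', b), ('L', l)]}

# A simple closeness score between MS and fused histograms, e.g. for the R band:
# diff = np.abs(band_histograms(ms_rgb)['R'] - band_histograms(fused_rgb)['R']).sum()
```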
4.2 Spatial Quality Metrics Results

Table 2 and Fig. 5 show the results of the fused images for the various methods. It is clear that the seven fusion methods are capable of improving the spatial resolution with respect to the original MS image. From Table 2 and Fig. 4a it can be seen that the MG results of the fused images indicate an increase in spatial resolution for all methods except PCA and IHS. The maximum SG gradient (Fig. 4b, Table 2) was 64, whereas the maximum MG gradient was 25, which means that SG gave, overall, a better performance than MG for edge detection. In addition, the SG results show an increased gradient for all methods except PCA, whose decreased gradient indicates that it does not enhance the spatial quality. The maximum MG and SG values among the sharpened images were obtained by EF, but the results closest to the PAN image were obtained by SF, with approximately the same values; in other words, SF adds the details of the PAN image to the MS image while preserving the spatial resolution of the PAN as much as possible. According to the FCC results in Table 2 and Fig. 4c, an increased FCC indicates the amount of edge information transferred from the PAN image into the fused images through the merging; the maximum FCC values were obtained by HFA, HFM and SF. The proposed HPDI is more discriminating than the other spatial quality metrics for identifying the best spatial enhancement through the merging: it is more useful for measuring the spatial enhancement with respect to the PAN image, since FCC, SG and MG give the same results for some methods, whereas HPDI gives the smallest difference ratio between those methods. From Fig. 4d and Table 2 it can be observed that the maximum HPDI results were obtained by SF, followed by HFM.

Fig. 4: Chart Representation of MG, SG, FCC and HPDI of the Fused Images (panels 4a-4d: MG, SG, FCC, HPDI)
Fig. 5a: HFA    Fig. 5b: HFM    Fig. 5c: IHS    Fig. 5d: PCA

Table 2: The Spatial Quality Results of Fused Images

Method  Band   MG    SG    HPDI     FCC
EF      1      25    64     0.025   0.464
        2      25    65     0.025   0.446
        3      25    65     0.022   0.426
HFA     1      11    51    -0.017   0.857
        2      12    52    -0.013   0.856
        3      12    52    -0.009   0.854
HFM     1      12    54     0.058   0.845
        2      12    54     0.069   0.834
        3      12    53     0.077   0.821
IHS     1      9     36     0.029   0.853
        2      9     36     0.032   0.857
        3      9     36     0.034   0.856
PCA     1      6     33    -0.006   0.718
        2      6     34    -0.006   0.710
        3      6     35    -0.006   0.704
RVS     1      13    54    -0.064   0.539
        2      12    53    -0.053   0.540
        3      12    52    -0.046   0.536
SF      1      11    48     0.074   0.769
        2      11    49     0.080   0.766
        3      11    49     0.080   0.761
PAN     1      10    42     --      --
Fig. 5e: RVS    Fig. 5f: SF    Fig. 5g: EF

Fig. 5: The Representation of the Fused Images

5. CONCLUSION

This study proposed a new measure to test the efficiency of the spatial resolution of fused images, applied to a number of image-merging methods. These methods obtained the best results in previous studies; some of them are pixel-level fusion methods, namely HFA, HFM, IHS and RVS, while the others are feature-level fusion methods, namely PCA, EF and SF. The results of the study show the value of proposing the HPDI as a new criterion for measuring the quality of the spatial resolution of fused images, and they demonstrated its high effectiveness when compared with other measurement criteria such as the FCC.
The proposed analytical technique of HPDI is much more useful for measuring the spatial enhancement of the fused image with respect to the spatial resolution of the PAN image than the other methods, since the FCC, SG and MG give the same results for some methods, whereas the HPDI gives the smallest difference ratio between those methods; HPDI is therefore strongly recommended for measuring the spatial enhancement of a fused image against the PAN image because of its mathematical precision as a quality indicator. The experimental results with the spatial and spectral quality metrics further show that the SF technique, based on feature-level fusion, maintains the spectral integrity of the MS image while improving the spatial quality towards that of the PAN image as much as possible. The use of the SF-based fusion technique is strongly recommended if the goal of merging is to achieve the best representation of the spectral information of the multispectral image together with the spatial details of the high-resolution panchromatic image, because it is based on component-substitution fusion coupled with spatial-domain filtering and utilizes the statistical relationship between the brightness values of the image bands to adjust the contribution of each band to the fusion result and reduce colour distortion. The analytical technique of SG is much more useful for measuring the gradient than MG, since MG gives the smallest gradient results.

REFERENCES

[1] Leviner M., M. Maltz, 2009. "A new multi-spectral feature level image fusion method for human interpretation". Infrared Physics & Technology 52 (2009), pp. 79-88.
[2] Aiazzi B., S. Baronti, M. Selva, 2008. "Image fusion through multiresolution oversampled decompositions". In: Stathaki T. (Ed.), "Image Fusion: Algorithms and Applications". Elsevier Ltd., 2008.
[3] Nedeljko C., A. Łoza, D. Bull and N. Canagarajah, 2006. "A Similarity Metric for Assessment of Image Fusion Algorithms". International Journal of Information and Communication Engineering, No. 3, pp. 178-182.
[4] Švab A. and Oštir K., 2006. "High-Resolution Image Fusion: Methods to Preserve Spectral and Spatial Resolution". Photogrammetric Engineering & Remote Sensing, Vol. 72, No. 5, May 2006, pp. 565-572.
[5] Shi W., Changqing Z., Caiying Z., and Yang X., 2003. "Multi-Band Wavelet for Fusing SPOT Panchromatic and Multispectral Images". Photogrammetric Engineering & Remote Sensing, Vol. 69, No. 5, May 2003, pp. 513-520.
[6] Hui Y. X. and Cheng J. L., 2008. "Fusion Algorithm for Remote Sensing Images Based on Nonsubsampled Contourlet Transform". Acta Automatica Sinica, Vol. 34, No. 3, pp. 274-281.
[7] Firouz A. Al-Wassai, N.V. Kalyankar, A. A. Al-Zuky, 2011. "Multisensor Images Fusion Based on Feature-Level". International Journal of Advanced Research in Computer Science, Vol. 2, No. 4, July-August 2011, pp. 354-362.
[8] Hsu S. H., Gau P. W., I-Lin Wu I., and Jeng J. H., 2009. "Region-Based Image Fusion with Artificial Neural Network". World Academy of Science, Engineering and Technology, 53, pp. 156-159.
[9] Zhang J., 2010. "Multi-source remote sensing data fusion: status and trends". International Journal of Image and Data Fusion, Vol. 1, No. 1, pp. 5-24.
[10] Ehlers M., S. Klonus, P. Johan Åstrand and P. Rosso, 2010. "Multi-sensor image fusion for pansharpening in remote sensing". International Journal of Image and Data Fusion, Vol. 1, No. 1, March 2010, pp. 25-45.
[11] Alparone L., Baronti S., Garzelli A., Nencini F., 2004. "Landsat ETM+ and SAR Image Fusion Based on Generalized Intensity Modulation". IEEE Transactions on Geoscience and Remote Sensing, Vol. 42, No. 12, pp. 2832-2839.
[12] Dong J., Zhuang D., Huang Y., Jingying Fu, 2009. "Advances in Multi-Sensor Data Fusion: Algorithms and Applications". Review, Sensors 2009, 9, ISSN 1424-8220, pp. 7771-7784.
[13] Amarsaikhan D., H.H. Blotevogel, J.L. van Genderen, M. Ganzorig, R. Gantuya and B. Nergui, 2010. "Fusing high-resolution SAR and optical imagery for improved urban land cover study and classification". International Journal of Image and Data Fusion, Vol. 1, No. 1, March 2010, pp. 83-97.
[14] Vrabel J., 1996. "Multispectral imagery band sharpening study". Photogrammetric Engineering and Remote Sensing, Vol. 62, No. 9, pp. 1075-1083.
[15] Vrabel J., 2000. "Multispectral imagery advanced band sharpening study". Photogrammetric Engineering and Remote Sensing, Vol. 66, No. 1, pp. 73-79.
[16] Wenbo W., Y. Jing, K. Tingjun, 2008. "Study of Remote Sensing Image Fusion and Its Application in Image Classification". The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVII, Part B7, Beijing 2008, pp. 1141-1146.
[17] Parcharidis I. and L. M. K. Tani, 2000. "Landsat TM and ERS Data Fusion: A Statistical Approach Evaluation for Four Different Methods". 0-7803-6359-0/00, 2000 IEEE, pp. 2120-2122.
[18] Pohl C. and Van Genderen J. L., 1998. "Multisensor Image Fusion in Remote Sensing: Concepts, Methods and Applications" (Review Article). International Journal of Remote Sensing, Vol. 19, No. 5, pp. 823-854.
[19] Firouz A. Al-Wassai, N.V. Kalyankar, A. A. Al-Zuky, 2011b. "The IHS Transformations Based Image Fusion". Journal of Global Research in Computer Science, Vol. 2, No. 5, May 2011, pp. 70-77.
[20] Firouz A. Al-Wassai, N.V. Kalyankar, A.A. Al-Zuky, 2011a. "Arithmetic and Frequency Filtering Methods of Pixel-Based Image Fusion Techniques". IJCSI International Journal of Computer Science Issues, Vol. 8, Issue 3, No. 1, May 2011, pp. 113-122.
[21] Firouz A. Al-Wassai, N.V. Kalyankar, A.A. Al-Zuky, 2011c. "The Statistical Methods of Pixel-Based Image Fusion Techniques". International Journal of Artificial Intelligence and Knowledge Discovery, Vol. 1, Issue 3, July 2011, pp. 5-14.
[22] Li S., Kwok J. T., Wang Y., 2002. "Using the Discrete Wavelet Frame Transform to Merge Landsat TM and SPOT Panchromatic Images". Information Fusion 3 (2002), pp. 17-23.
[23] Liao Y. C., T. Y. Wang, and W. T. Zheng, 1998. "Quality Analysis of Synthesized High Resolution Multispectral Imagery". URL: http://www.gisdevelopment.net/AARS/ACRS 1998/Digital Image Processing (last date accessed: 28 Oct. 2008).
[24] Gonzales R. C. and R. Woods, 1992. Digital Image Processing. Addison-Wesley Publishing Company.
[25] De Béthune S., F. Muller, and J. P. Donnay, 1998. "Fusion of multi-spectral and panchromatic images by local mean and variance matching filtering techniques". In: Proceedings of the Second International Conference: Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed Images, Sophia-Antipolis, France, 1998, pp. 31-36.
[26] De Bèthune S. and F. Muller, 2002. "Multisource Data Fusion Applied Research". URL: http://www.abricmuller.be/realisations/fusion.html (last date accessed: 28 Oct. 2002).
[27] Sangwine S. J. and R. E. N. Horne, 1989. The Colour Image Processing Handbook. Chapman & Hall.
[28] Ryan R., B. Baldridge, R. A. Schowengerdt, T. Choi, D. L. Helder and B. Slawomir, 2003. "IKONOS Spatial Resolution and Image Interpretability Characterization". Remote Sensing of Environment, Vol. 88, No. 1, pp. 37-52.
[29] Pradham P., Younan N. H. and King R. L., 2008. "Concepts of image fusion in remote sensing applications". In: Stathaki T. (Ed.), "Image Fusion: Algorithms and Applications". Elsevier Ltd., 2008.
[30] Nixon M. S. and Aguado A. S., 2008. "Feature Extraction and Image Processing". Second edition, Elsevier Ltd., 2008.
[31] Richards J. A. and X. Jia, 2006. "Remote Sensing Digital Image Analysis: An Introduction". 4th Edition, Springer-Verlag, Berlin Heidelberg.
[32] Li S. and B. Yang, 2008. "Region-based multi-focus image fusion". In: Stathaki T. (Ed.), "Image Fusion: Algorithms and Applications". Elsevier Ltd.
[33] Zhou J., D. L. Civico and J. A. Silander, 1998. "A wavelet transform method to merge Landsat TM and SPOT panchromatic data". International Journal of Remote Sensing, Vol. 19, No. 4.

Authors

Firouz Abdullah Al-Wassai received the B.Sc. degree in Physics from the University of Sana'a, Yemen, in 1993 and the M.Sc. degree from Baghdad University, Iraq, in 2003. She is currently a Ph.D. scholar in Computer Science at the Department of Computer Science, S.R.T.M.U., Nanded, India.

Dr. N. V. Kalyankar is the Principal of Yeshwant Mahavidyalaya, Nanded (India). He completed his M.Sc. (Physics) at Dr. B.A.M.U., Aurangabad, and in 1980 joined the Department of Physics at Yeshwant Mahavidyalaya, Nanded, as a lecturer. In 1984 he completed his DHE, and he completed his Ph.D. at Dr. B.A.M.U., Aurangabad, in 1995. Since 2003 he has been working as Principal of Yeshwant Mahavidyalaya, Nanded. He is also a research guide for Physics and Computer Science at S.R.T.M.U., Nanded; 03 research students have been awarded the Ph.D. and 12 research students the M.Phil. in Computer Science under his guidance. He has worked on various bodies of S.R.T.M.U., Nanded, and has published 34 research papers in international and national journals. He is a peer team member of NAAC (National Assessment and Accreditation Council, India) and has published a book entitled "DBMS concepts and programming in Foxpro". He has received various educational awards, including the "Best Principal" award from S.R.T.M.U., Nanded, in 2009 and the "Best Teacher" award from the Government of Maharashtra, India, in 2010. He was honored with the life "Fellowship of the Linnaean Society of London (F.L.S.)" at the National Congress, Kolkata (India), in November 2009.