International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XL-1/W3, 2013
SMPR 2013, 5 – 8 October 2013, Tehran, Iran
A New Developed GIHS-BT-SFIM Fusion Method Based On Edge and Class Data
S. Dehnavi1, A. Mohammadzadeh1
1
Faculty of Geodesy and Geomatics, K.N.Toosi University of Technology, Tehran, Iran.
dehnavi.kntu@gmail.com
almoh2@gmail.com
KEY WORDS: GIHS_BT_SFIM, Fusion, Edge adaptive, Class based
ABSTRACT
The objective of image fusion (sometimes called pan-sharpening) is to produce a single image containing the best aspects of the source images, such as high spatial resolution and high spectral resolution. With the development of spaceborne imaging sensors, a unified image fusion approach suitable for all employed imaging sources has become necessary. Among the various image fusion methods, the intensity-hue-saturation (IHS) and Brovey transforms (BT) can quickly merge huge amounts of imagery; however, they often suffer from color distortion in the fused images. SFIM fusion is one of the approaches most frequently employed in practice to control the trade-off between spatial and spectral information; it preserves more spectral information but suffers more spatial information loss, and its effectiveness depends heavily on the filter design. In this work, two modifications were tested to improve the spectral quality of the fused images, and class-based fusion results were also investigated. First, a combined Generalized Intensity-Hue-Saturation (GIHS), Brovey transform (BT) and smoothing-filter-based intensity modulation (SFIM) approach was implemented. This kind of algorithm has computational advantages over other fusion methods such as wavelets and, as discussed in the literature, can be extended to any number of bands. The IHS-BT-SFIM algorithm used here incorporates the IHS, IHS-BT, BT, BT-SFIM and SFIM methods through two adjustable parameters. Second, a method was proposed to add edge information, extracted from the panchromatic image, to the previous GIHS-BT-SFIM product; adding these panchromatic data yielded little improvement. Third, an edge-adaptive GIHS-BT-SFIM was proposed to enforce fidelity away from the edges; using the MS image off edges showed spectral improvement in some fusion methods. Fourth, a class-based fusion was tested, which assigns different coefficients to each method according to its class. The best parameters for vegetated areas were k1=0.6, k2=0.8, and for the urban region k1=0.4, k2=0.4. These results might be useful for future studies on fusion methods and their generalization.
1. INTRODUCTION
In recent years, the number of spectral bands in a satellite image has increased significantly, and the spatial resolution of the images has also been enhanced. For remote sensing applications, satellite images that simultaneously offer high spatial and high spectral resolution are often required; that is, the ground and spectral details must be accurately presented in a single image (Tu, 2012). The benefit of merged images has been demonstrated in many practical applications, especially vegetation, land-use, precision farming, and urban studies (Tu, 2001). For this reason, various methods for image fusion have been described (Amolins, 2007; Tu, 2001, 2004, 2005, 2012; Dou, 2007; Sh. Li, 2002; Z. Li, 2005; Ling, 2007; Pajares, 2004; Yang, 2007; Rahmani, 2010; Chu, 2008). Among them, the intensity-hue-saturation (IHS) and Brovey transforms (BT) can quickly merge huge amounts of imagery while preserving most of the spatial information, provided that the color distortion is mitigated in the fusion, as they often produce color distortion in the fused images (Tu, 2005).
Furthermore, IHS-like and BT-like approaches are efficient fusion methods, since they require only arithmetic operations, without any statistical analysis or filter design. Smoothing-filter-based intensity modulation (SFIM) fusion is one of the approaches most frequently employed in practice to control the trade-off between spatial and spectral information. However, SFIM is not as efficient as BT-like and IHS-like methods in the fusion of high-resolution images; it preserves more spectral information but suffers more spatial information loss, and its effectiveness depends heavily on the filter design (Tu, 2012). GIHS-BT-SFIM is an approach proposed by Tu et al. as an adjustable pan-sharpening method for high-resolution satellite imagery. It has computational advantages over other fusion methods such as wavelets and can be extended to any number of bands. The IHS-BT-SFIM algorithm incorporates the IHS, IHS-BT, BT, BT-SFIM and SFIM methods through two adjustable parameters. On the other hand, Rahmani et al. proposed an adaptive IHS pan-sharpening method, whereby the edges of the panchromatic image are combined with the multispectral image to increase the spatial information while preserving the spectral quality. Another method is to incorporate edge information into the GIHS-BT-SFIM fused image. An improvement is proposed in this study that combines the previous approaches and tests the results. Furthermore, a class-based image fusion method is introduced to choose the best parameters for each class.
This contribution has been peer-reviewed. The peer-review was conducted on the basis of the abstract.
2. GIHS-BT-SFIM

2.1. Generalized Intensity-Hue-Saturation (GIHS)

In the IHS color space, intensity (I) is a measure of brightness, sometimes called luminance (L); hue (H) is the color, measured as the angle around a color wheel or color hexagon (Amolins, 2007); and saturation (S) is the amount of color, representing the orthogonal distance from the intensity axis in the IHS color space (Gonzalez, 2002). Generalized IHS (GIHS), or Fast IHS (FIHS), is a unifying image fusion method in which the low-resolution intensity component (I_0) in IHS space is replaced by a gray-level image with higher spatial resolution (I_new, or pan) and the result is transformed back to the original RGB space with the original H and S components (Tu, 2001):

\begin{bmatrix} R_{new} \\ G_{new} \\ B_{new} \end{bmatrix} =
\begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2} \\ 1 & -1/\sqrt{2} & -1/\sqrt{2} \\ 1 & \sqrt{2} & 0 \end{bmatrix}
\begin{bmatrix} I_{new} \\ v_1 \\ v_2 \end{bmatrix}   (1)

To develop a computationally efficient method without the coordinate transformation, (1) can be rewritten as

\begin{bmatrix} R_{new} \\ G_{new} \\ B_{new} \end{bmatrix} =
\begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2} \\ 1 & -1/\sqrt{2} & -1/\sqrt{2} \\ 1 & \sqrt{2} & 0 \end{bmatrix}
\begin{bmatrix} I_0 + (I_{new} - I_0) \\ v_1 \\ v_2 \end{bmatrix}
= \begin{bmatrix} R_0 + \delta \\ G_0 + \delta \\ B_0 + \delta \end{bmatrix}   (2)

where \delta = I_{new} - I_0. So the fused image [R_{new}, G_{new}, B_{new}]^T can easily be obtained from the resized original image [R_0, G_0, B_0]^T simply by addition. A merit of the method lies not only in its fast computation of the fused image but also in its ability to extend the traditional three-band transformation to an arbitrary number of bands; that is, F_i = M_i + \delta, where M_i denotes the resized hyperspectral (HS) or multispectral (MS) image of band i, Pan \approx I_0, and I_0 = (1/l) \sum_{i=1}^{l} M_i, with l the order, i.e. the number of bands (Tu, 2001). The I_0 value can be computed from the band distances, as a weighted (adjusted) average, or simply as the plain average given above, as for WorldView-1 (Tu, 2012). Using weighting coefficients on the green and blue bands (for an IKONOS image) in an attempt to minimize the difference between I and the panchromatic image leads to the spectrally adjusted IHS (SA-IHS) method.

2.2. Brovey Transform (BT)

In contrast to the IHS method, the BT method is a ratio fusion technique that preserves the relative spectral contribution of each pixel but replaces its overall brightness with the high-resolution panchromatic image (Tu, 2005). It is operated by

\begin{bmatrix} R_{BT} \\ G_{BT} \\ B_{BT} \end{bmatrix} = \frac{Pan}{I} \begin{bmatrix} R \\ G \\ B \end{bmatrix}   (3)

where I is the intensity of the resized MS or HS image, calculated as above.

2.3. Smoothing-Filter-Based Intensity Modulation (SFIM)

SFIM is a BT-like approach that uses a smoothed version of the pan image (P_L) in lieu of the MS image intensity as the denominator of the ratio defined in (3). Thereupon, this method is defined by

\begin{bmatrix} R_{SFIM} \\ G_{SFIM} \\ B_{SFIM} \end{bmatrix} = \frac{Pan}{P_L} \begin{bmatrix} R \\ G \\ B \end{bmatrix}   (4)

where P_L is often obtained with a 7×7 mean filter. It is known that the spatial resolution can be improved by increasing the mask size of the low-pass filter in the SFIM method (Tu, 2012).

2.4. An Adjustable GIHS-BT-SFIM

As will be mentioned in the next part, the IHS technique generates significant saturation compression, and the BT method suffers from saturation stretch. Motivated by this, an adjustable IHS-BT approach was used (Tu, 2005), and the SFIM method was then incorporated into the IHS-BT-SFIM approach (Tu, 2012) through the following equation:

\begin{bmatrix} R_{new} \\ G_{new} \\ B_{new} \end{bmatrix} =
\frac{P}{I + k_1 (\hat{P} - I)}
\begin{bmatrix} R + k_2 (\hat{P} - I) \\ G + k_2 (\hat{P} - I) \\ B + k_2 (\hat{P} - I) \end{bmatrix},
\qquad \hat{P} = P \text{ or } P_L   (5)

where the k_i are the module selection parameters, defined such that 0 \le k_i \le 1.
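To make the role of the two parameters concrete, the following Python sketch implements Eq. (5) directly; the function name, the use of the band average as I, and the SciPy mean filter for P_L are illustrative choices, not part of the original implementation.

```python
import numpy as np

def gihs_bt_sfim(ms, pan, k1, k2, use_pl=False, win=7):
    """Adjustable GIHS-BT-SFIM fusion of Eq. (5).

    ms     : (rows, cols, bands) resized multispectral image
    pan    : (rows, cols) panchromatic image P
    k1, k2 : module-selection parameters, 0 <= k_i <= 1
    use_pl : if True, P-hat is the mean-filtered pan (P_L); else P-hat = P
    """
    eps = 1e-12                       # guard against division by zero
    intensity = ms.mean(axis=2)       # I = (1/l) * sum_i M_i
    if use_pl:
        # P_L: pan smoothed with a win x win mean filter
        from scipy.ndimage import uniform_filter
        p_hat = uniform_filter(pan.astype(float), size=win)
    else:
        p_hat = pan.astype(float)
    delta = p_hat - intensity         # (P-hat - I)
    gain = pan / (intensity + k1 * delta + eps)
    return gain[..., None] * (ms + k2 * delta[..., None])
```

Substituting into Eq. (5) shows the module selection: (k1, k2) = (1, 1) with P̂ = P reduces the gain to 1 and gives GIHS; (0, 0) gives the BT ratio P/I; and (1, 0) with P̂ = P_L gives the SFIM ratio P/P_L.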
3. EDGE ADAPTIVE IHS

An edge-adaptive procedure was introduced by Rahmani et al. to enforce fidelity away from the edges. They used an edge-detecting function h(x) in the IHS fusion method in the manner of

F_i = M_i + h(x)(P - I)   (6)

where F_i is the fused image and M_i is band i of the MS image. In this case h(x) is equal to one on edges and equal to zero off edges. The edge-adaptive method produces images with high spectral resolution while maintaining the high-quality spatial resolution of the original IHS.
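A minimal sketch of Eq. (6), assuming a binary edge indicator derived from the pan gradient; the threshold and the function names are hypothetical, and the paper itself evaluates dedicated detectors rather than this simple thresholding.

```python
import numpy as np

def binary_edge_map(pan, thresh):
    """Stand-in edge indicator: h(x) = 1 where the gradient magnitude of
    the pan image exceeds `thresh` (a hypothetical threshold), else 0."""
    gy, gx = np.gradient(pan.astype(float))
    return (np.hypot(gx, gy) > thresh).astype(float)

def edge_adaptive_ihs(ms, pan, h):
    """Edge-adaptive IHS of Eq. (6): F_i = M_i + h(x) * (P - I).
    Pan detail is injected only where h(x) = 1 (on edges)."""
    intensity = ms.mean(axis=2)       # I of the resized MS image
    return ms + (h * (pan - intensity))[..., None]
```

Off edges (h = 0) the output is exactly the MS band, which is what preserves the spectral fidelity away from the edges.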
4. EDGE ADAPTIVE GIHS-BT-SFIM

In many applications, such as precision farming and urban classification, more precise information is needed on the edges, because errors occur especially there. Using the MS image data off edges and the minimum saturation on edges gives good spatial and spectral quality for image-based analysis purposes such as classification. So, in this improved method, the edges are transferred from the GIHS-BT-SFIM image to the MS image. The approach extracts the edges from the panchromatic image; where there are edges, the IHS-BT-SFIM product is imposed; otherwise, the MS image is used. In this case the fused multichannel image can be formed by the new formula

F_i = \frac{P}{I''} \left( M_i + k_2\, h(x)\, (\hat{P} - I) \right),
\quad I'' = P + (I - P)\, h(x) + k_1\, h(x)\, (\hat{P} - I),
\quad \hat{P} = P \text{ or } P_L   (7)

so that on edges (h(x) = 1) the GIHS-BT-SFIM product of (5) is obtained, while off edges (h(x) = 0) the result reduces to the MS image itself.

Another approach, which uses the panchromatic image on edges and GIHS-BT-SFIM off edges, can be introduced as equation (8):

F_i = \frac{P}{I''} \left( M_i + k_2\, h(x)\, (\hat{P} - I) \right),
\quad I'' = I + k_1\, h(x)\, (\hat{P} - I),
\quad \hat{P} = P \text{ or } P_L   (8)

The edges can be extracted using standard edge-detection methods such as the Canny detector (Canny, 1986). Another edge-detecting function was suggested by Perona and Malik (1990), with which considerable results can be achieved in a fusion task; it was introduced as the best among the edge detectors evaluated in (Rahmani, et al. 2008):

h(x) = \exp\left( -\frac{\lambda}{|\nabla P|^4} \right),
\quad \lambda = 10^9 \text{ and } 10^{10}   (9)

In this study, however, the best edge detector as experienced was the Roberts operator.

5. CLASS-BASED FUSION APPROACH

Because the distortion extent differs for each class, it was interesting to test every method in a class-based manner. This part was tested only on the urban image, because of the more distinct nature of its classes, and only two types of objects were defined: urban and vegetation cover. A class-based fusion was tested, which assigns different coefficients to each method according to its class. That is, the two adjustable parameters introduced in the first part for defining the combination of fusion methods were varied in a LUT form, and their best values were selected according to the fusion result assessment in each class. Sixteen miscellaneous cases were tested and are presented in the following part.

6. RESULTS

Two datasets were tested in this study: the first was a high-resolution image of an urban area, and the second a medium-resolution image of an agricultural field. Setting the k_i parameters of the GIHS-BT-SFIM leads to the different fusion techniques, so a comparison of the spectral and spatial quality of each method was simply implemented. Spatial quality can be judged visually, but subtle color changes are more difficult to notice in this manner (Rahmani et al, 2010). In order to compare the methods, spatial and spectral quality was evaluated by relying on both visual inspection and metric performance data. The band-to-band correlation coefficient (CC), distortion extent, ERGAS, SAM, Q Index (Q-average), RMSE and RASE are the spectral metrics, and CC and the average gradient are the spatial metrics used for analyzing the fulfillment of the fusion methods (Rahmani, 2008; Alparone, 2008; Zhang, 2008).

6.1. Urban area

The first study area is an urban region located in Shahriar (longitude 51°3′, latitude 35°39′), Tehran, Iran. An IKONOS image with four bands (red, green, blue and near infrared) at 3 m spatial resolution (MS) and one panchromatic band at 1 m spatial resolution was used for this region (Figure 1). The results are reported in Table 1.
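As a reference for how two of the listed spectral metrics are commonly computed, the sketch below follows the standard textbook definitions of SAM (mean spectral angle, in degrees) and ERGAS; these are standard formulations and may differ in detail from the exact implementation used for the tables.

```python
import numpy as np

def sam_degrees(fused, ref):
    """Mean Spectral Angle Mapper over all pixels, in degrees (0 is ideal)."""
    f = fused.reshape(-1, fused.shape[-1]).astype(float)
    r = ref.reshape(-1, ref.shape[-1]).astype(float)
    cos = (f * r).sum(1) / (np.linalg.norm(f, axis=1)
                            * np.linalg.norm(r, axis=1) + 1e-12)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))).mean())

def ergas(fused, ref, ratio):
    """ERGAS (0 is ideal); `ratio` is the pan-to-MS resolution ratio,
    e.g. 1/3 for a 1 m pan and 3 m MS image."""
    total = 0.0
    for b in range(ref.shape[-1]):
        rmse = np.sqrt(np.mean((fused[..., b] - ref[..., b]) ** 2))
        total += (rmse / (ref[..., b].mean() + 1e-12)) ** 2
    return float(100.0 * ratio * np.sqrt(total / ref.shape[-1]))
```

Both metrics compare the fused product against a reference MS image, band by band for ERGAS and pixel-spectrum by pixel-spectrum for SAM.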
Table 1: Assessing the fusion methods in an urban region. [SID, SAM, RASE, RMSE, QI, ERGAS, CC and average-gradient values for the IHS, IHS-BT, BT, BT-SFIM and SFIM methods and their edge variants; the individual cell values are not recoverable in this copy.]

Figure 1: Results for the urban area (panels: pan, MS, IHS, IHS+Edge, IHS-BT, IHS-BT+Edge, BT, BT+Edge, BT-SFIM, BT-SFIM+Edge, SFIM, SFIM+Edge).

It is clear that the edge-adaptive method improved the spectral information in all procedures. The fusion results are provided in Figure 1 and Table 1.
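The average gradient used above as a spatial metric can be sketched as follows, using one common formulation (mean per-pixel gradient magnitude normalized by the square root of 2); the exact normalization used for the tables is not specified in the source.

```python
import numpy as np

def average_gradient(band):
    """Average gradient of one image band: mean per-pixel gradient
    magnitude (normalized by sqrt(2)); larger = more spatial detail."""
    gy, gx = np.gradient(band.astype(float))
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))
```

A perfectly flat band scores 0, while a band with strong local contrast scores higher, which is why the metric is read as spatial sharpness.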
6.2. Agricultural fields

The area of interest covers agricultural fields located around the village of Biddinghuizen in Flevoland, the Netherlands, which is characterized by the cultivation of arable crops (Abkar, 1994). The area represents a typical agricultural region in the Netherlands; the main crops are grass, potatoes, cereals, sugar beets, beans, peas, and onions. Spectral bands 3, 4, and 5 of a Landsat Thematic Mapper (TM) image acquired on 7 July 1987 were used in this study. The results for this region are given in Table 2 and Figure 2.

Table 2: Assessing the fusion methods in an agricultural field. [SID, SAM, RASE, RMSE, QI, ERGAS, CC and average-gradient values for the same methods and edge variants as Table 1; the individual cell values are not recoverable in this copy.]

Figure 2: Results for the agricultural field (panels: pan, MS, IHS, IHS+Edge, IHS-BT, IHS-BT+Edge, BT, BT+Edge, BT-SFIM, BT-SFIM+Edge, SFIM, SFIM+Edge).
6.3. Class-Based Results (LUT form)

In this part, only the outputs are presented for each group of modulation parameters. It is worth noting that the class-based results were computed on selected ROIs for each class, owing to memory limitations in processing the whole image, which may have some effect on the final results. The look-up table for these parameters is given in Table 3.
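The LUT-form selection can be sketched as a per-class dispatch; `class_based_fusion`, `BEST_PARAMS` and the label numbering are illustrative names, `fuse` stands in for the Eq. (5) operator, and the (k1, k2) pairs are the best values reported for the vegetation and urban classes.

```python
import numpy as np

def class_based_fusion(ms, pan, class_map, lut, fuse):
    """Per-class fusion: look up (k1, k2) for each class label and apply
    the fusion operator only to the pixels of that class.

    class_map : (rows, cols) integer class labels
    lut       : dict mapping class label -> (k1, k2)
    fuse      : callable fuse(ms, pan, k1, k2) -> (rows, cols, bands)
    """
    out = np.zeros_like(ms, dtype=float)
    for label, (k1, k2) in lut.items():
        mask = class_map == label
        out[mask] = fuse(ms, pan, k1, k2)[mask]   # keep this class's pixels
    return out

# Best parameters reported in the paper:
# class 0 = vegetation cover, class 1 = urban region
BEST_PARAMS = {0: (0.6, 0.8), 1: (0.4, 0.4)}
```

In practice `fuse` would be the GIHS-BT-SFIM operator of Eq. (5), so each class is fused with its own blend of the IHS, BT and SFIM modules.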
Table 3: Class-based and modulation-parameter effects on the fusion results; c1 denotes class 1 (vegetation cover) and c2 denotes class 2 (urban region). [CC (against MS and pan) and RMSE values per class for all combinations of K1, K2 ∈ {0.2, 0.4, 0.6, 0.8}; the individual cell values are not recoverable in this copy.]
7. CONCLUSION

In contrast with previous studies (Rahmani, 2010 & Tu, 2012), which considered only spectral metrics, both spatial and spectral metrics are needed, given the goal of fusion. In this study, different spatial and spectral metrics were therefore presented. Considering all of them, it is evident that adding edge information from the panchromatic image to the GIHS-BT-SFIM product (Eq. 8) brings little improvement in the various processes. Using edge information on edges and MS data off edges (Eq. 7) yielded spectrally enhanced fused images according to the metrics in the various methods; nonetheless, there is some loss of spatial information. It is therefore preferable to choose the fusion method according to the aims of the task and the image data. In both images, the BT method offered the most balanced results with respect to spectral and spatial information together, while SFIM preserved the most spectral information. Moreover, the performance of several edge detectors was tested, and the results show the strong dependence of the edge-adaptive method on the edge detector used.

It is clear that the choice of method (the modulation parameters) is important in the image fusion process and that the fusion results are affected by the class type. So, choosing the best modulation parameters (methods) should be considered in a fusion task. Further study of class-based results is recommended for future work.
8. REFERENCES
Abkar, A.A., 1994. Knowledge-based classification method for crop inventory using high spatial resolution data. Thesis, ITC, Enschede, The Netherlands.

Alparone, L., Aiazzi, B., Baronti, S., Garzelli, A., Nencini, F., Selva, M., 2008. Multispectral and panchromatic data fusion assessment without reference. In: Photogrammetric Engineering & Remote Sensing.

Amolins, K., Zhang, Y., Dare, P., 2007. Wavelet based image fusion techniques: an introduction, review and comparison. In: ISPRS Journal of Photogrammetry & Remote Sensing.

Chu, H., Zhu, W., 2008. Fusion of IKONOS satellite imagery using IHS transform and local variation. In: IEEE Geoscience and Remote Sensing Letters.

Dou, W., Chen, Y., Li, X., Sui, D., 2007. A general framework for component substitution image fusion: an implementation using the fast image fusion method. In: Computers & Geosciences.

Gonzalez, R., Woods, R., 2002. Digital Image Processing. Prentice Hall, pp. 282-302.

Li, Sh., Kwok, J., Wang, Y., 2002. Using the discrete wavelet frame transform to merge Landsat TM and SPOT panchromatic images. In: Information Fusion.

Li, Z., Jing, Z., Yang, X., Sun, Sh., 2005. Color transfer based remote sensing image fusion using non-separable wavelet frame transform. In: Pattern Recognition Letters.

Rahmani, Sh., Strait, M., Merkurjev, D., 2008. Evaluation of pan-sharpening methods.

Rahmani, Sh., Strait, M., Merkurjev, D., Moeller, M., Wittman, T., 2010. An adaptive IHS pan-sharpening method. In: IEEE Geoscience and Remote Sensing Letters.

Tu, T., Su, Sh., Shyu, H., Huang, P.S., 2001. A new look at IHS-like image fusion methods. In: Information Fusion.

Tu, T., Huang, P.S., Hung, Ch., Chang, Ch., 2004. A fast intensity-hue-saturation fusion technique with spectral adjustment for IKONOS imagery. In: IEEE Geoscience and Remote Sensing Letters.

Tu, T., Lee, Y., Chang, Ch., Huang, P., 2005. Adjustable intensity-hue-saturation and Brovey transform fusion technique for IKONOS/QuickBird imagery. In: Optical Engineering.

Tu, T., Hsu, Ch., Lee, Ch., 2012. An adjustable pan-sharpening approach for IKONOS/QuickBird/GeoEye-1/WorldView-2 imagery. In: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing.

Yang, X., Jing, Z., Liu, G., Hua, L., Ma, D., 2007. Fusion of multispectral and panchromatic images using fuzzy rule. In: Communications in Nonlinear Science and Numerical Simulation.

Zhang, Y., 2008. Methods for image fusion quality assessment: a review, comparison and analysis. In: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences.

Zhang, Y., Hong, G., 2005. An IHS and wavelet integrated approach to improve pan-sharpening visual quality of natural colour IKONOS and QuickBird images. In: Information Fusion.
SMPR 2013, 5 – 8 October 2013, Tehran, Iran
A New Developed GIHS-BT-SFIM Fusion Method Based On Edge and Class Data
S. Dehnavi1, A. Mohammadzadeh1
1
Faculty of Geodesy and Geomatics, K.N.Toosi University of Technology, Tehran, Iran.
dehnavi.kntu@gmail.com
almoh2@gmail.com
KEY WORDS: GIHS_BT_SFIM, Fusion, Edge adaptive, Class based
ABSTRACT
The objective of image fusion (or sometimes pan sharpening) is to produce a single image containing the best aspects of the
source images. Some desirable aspects are high spatial resolution and high spectral resolution. With the development of space
borne imaging sensors, a unified image fusion approach suitable for all employed imaging sources becomes necessary. Among
various image fusion methods, intensity-hue-saturation (IHS) and Brovey Transforms (BT) can quickly merge huge amounts of
imagery. However they often face color distortion problems with fused images. The SFIM fusion is one of the most frequently
employed approaches in practice to control the tradeoff between the spatial and spectral information. In addition it preserves
more spectral information but suffer more spatial information loss. Its effectiveness is heavily depends on the filter design. In this
work, two modifications were tested to improve the spectral quality of the images and also investigating class-based fusion
results. First, a Generalized Intensity-Hue-Saturation (GIHS), Brovey Transform (BT) and smoothing-filter based intensity
modulation (SFIM) approach was implemented. This kind of algorithm has shown computational advantages among other fusion
methods like wavelet, and can be extended to different number of bands as in literature discussed. The used IHS-BT-SFIM
algorithm incorporates IHS, IHS-BT, BT, BT-SFIM and SFIM methods by two adjustable parameters. Second, a method was
proposed to plus edge information in previous GIHS_BT_SFIM and edge enhancement by panchromatic image. Adding
panchromatic data to images had no much improvement. Third, an edge adaptive GIHS_BT_SFIM was proposed to enforce
fidelity away from the edges. Using MS image off edges has shown spectral improvement in some fusion methods. Fourth, a
class based fusion was tested, which tests different coefficients for each method due to its class. The best parameters for
vegetated areas was k1=0.6, k2=0.8; and for urban region it was k1=0.4, k2=0.4. Results might be useful for future studies on
fusion methods and their generalization.
1. INTRODUCTION
I
N RECENT years, the number of spectral bands in a
satellite
image
has
increased
significantly.
Additionally, the spatial resolution of the image has
also been enhanced. For remote sensing applications, the
satellite images which can simultaneously offer high spatial
and spectral resolutions are often required. It means that the
ground and spectrum details must be accurately presented
by a single image (Tu, 2012). The benefit of merged image
has been demonstrated in many practical applications,
especially for vegetation, land-use, precision farming, and
urban studies (Tu, 2001). Due to this, various methods for
image fusion have been described earlier (Amolins, 2007;
Tu, 2001, 2004, 2005, 2012; Duo, 2007; Sh. Li, 2002; Z.
Li,2005; Ling, 2007; Pajares, 2004; Yang, 2007; Rahmani,
2010; Chu, 2008). Among various image fusion methods,
intensity-hue-saturation (IHS) and Brovey Transforms (BT)
can quickly merge huge amounts of imagery and
simultaneously preserving most of the spatial information
provided that the color distortion is mitigated in the fusion;
as they often face color distortion problems with fused
images (Tu, 2005).
Furthermore, IHS-like and BT-like approaches are indeed
efficient fusion methods since they require only
arithmetical operations without any statistical analysis or
filter design. The Smoothing-filter-based intensity
modulation (SFIM) fusion is one of the most frequently
employed approaches in practice to control the trade of
between the spatial and spectral information. However,
SFIM is not as efficient as BT-like and IHS-like methods in
the fusion of high resolution images. In addition it
preserves more spectral information but suffer more spatial
information loss. Its effectiveness is heavily depends on the
filter design (Tu, 2012). GIHS-BT-SFIM is an approach
which was proposed by Tu et al. as an adjustable pansharpening approach for high resolution satellite imageries.
This method has shown computational advantages among
other fusion methods like wavelet, and could be extended
to different number of bands. The introduced IHS-BTSFIM algorithm incorporates IHS, IHS-BT, BT, BT-SFIM
and SFIM methods by two adjustable parameters. On the
other hand Rahmani et al. proposed an adaptive HIS Pansharpening method, whereby the edges of panchromatic
image combine with the multispectral image to increase the
spatial information as well as preserving the spectral
quality. Another method is to incorporate edge information
on GIHS_BT_SFIM fused image. An improvement is
proposed in this study to combine previous approaches and
test the results. Furthermore, a class-based image fusion
method is introduced in this work to choose the best
parameters for each class.
This contribution has been peer-reviewed. The peer-review was conducted on the basis of the abstract.
139
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XL-1/W3, 2013
SMPR 2013, 5 – 8 October 2013, Tehran, Iran
RBT
R
Pan
G
G BT
BT
I
BBT
B
2. GIHS-BT-SFIM
2.1. Generalized Intensity-Hue-Saturation (GIHS)
In the IHS color space, intensity (I) is a measure of
brightness, sometimes called luminance (L); hue (H) is the
color, measured as the angle around a color wheel or color
hexagon (Amolins, 2007); and saturation (S) is the amount of
color, representing the orthogonal distance from the
intensity axis in the IHS color space (Gonzalez, 2002).
Generalized IHS (GIHS), or Fast IHS (FIHS), is a unifying
image fusion method in which the low-resolution intensity
component (I0) in IHS space is replaced by a gray-level
image with higher spatial resolution (Inew, or the pan
image) and transformed back to the original RGB space with
the original H and S components (Tu, 2001):

\[
\begin{bmatrix} R_{new}\\ G_{new}\\ B_{new} \end{bmatrix} =
\begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2}\\
                1 & -1/\sqrt{2} & -1/\sqrt{2}\\
                1 & \sqrt{2} & 0 \end{bmatrix}
\begin{bmatrix} I_{new}\\ v_{1}\\ v_{2} \end{bmatrix}
\tag{1}
\]

To develop a computationally efficient method without the
coordinate transformation, (1) can be rewritten as

\[
\begin{bmatrix} R_{new}\\ G_{new}\\ B_{new} \end{bmatrix} =
\begin{bmatrix} 1 & -1/\sqrt{2} & 1/\sqrt{2}\\
                1 & -1/\sqrt{2} & -1/\sqrt{2}\\
                1 & \sqrt{2} & 0 \end{bmatrix}
\begin{bmatrix} I_{0} + (I_{new} - I_{0})\\ v_{1}\\ v_{2} \end{bmatrix} =
\begin{bmatrix} R_{0} + \delta\\ G_{0} + \delta\\ B_{0} + \delta \end{bmatrix}
\tag{2}
\]

where \(\delta = I_{new} - I_{0}\). So, the fused image
\([R_{new}, G_{new}, B_{new}]^{T}\) can easily be obtained
from the resized original image \([R_{0}, G_{0}, B_{0}]^{T}\)
simply by addition. The merit of the method lies not only in
its fast computation of fused images but also in its ability
to extend the traditional three-band transformation to an
arbitrary number of bands. That is,
\(F_{i} = M_{i} + \delta\), where \(M_{i}\) denotes the
resized hyperspectral (HS) or multispectral (MS) image of
band i, \(\delta = Pan - I_{0}\), and
\(I_{0} = (1/l)\sum_{i=1}^{l} M_{i}\), where l is the number
of bands (Tu, 2001). The I0 value can be computed from the
band distances, from a weighted (adjusted) average, or
simply from the plain average as above, as in WorldView-1
(Tu, 2012). Using weighting coefficients on the green and
blue bands (for IKONOS imagery), in an attempt to minimize
the difference between I and the panchromatic image, leads
to the spectrally adjusted IHS (SA-IHS) method.

2.2. Brovey Transform (BT)

In contrast to the IHS method, the BT method is a ratio
fusion technique that preserves the relative spectral
contribution of each pixel but replaces its overall
brightness by the high-resolution panchromatic image (Tu,
2005). It is operated by

\[
\begin{bmatrix} R_{BT}\\ G_{BT}\\ B_{BT} \end{bmatrix} =
\frac{Pan}{I}
\begin{bmatrix} R\\ G\\ B \end{bmatrix}
\tag{3}
\]

where I is the intensity of the resized MS or HS image,
calculated as above.

2.3. Smoothing-Filter-Based Intensity Modulation (SFIM)

SFIM is a BT-like approach that uses a smoothed version of
the pan image (PL), in lieu of the MS image intensity
component, as the denominator of the ratio defined in (3).
Thereupon, this method is defined by

\[
\begin{bmatrix} R_{SFIM}\\ G_{SFIM}\\ B_{SFIM} \end{bmatrix} =
\frac{Pan}{P_{L}}
\begin{bmatrix} R\\ G\\ B \end{bmatrix}
\tag{4}
\]

where PL is often obtained from a 7×7 mean filter. It is
known that the spatial resolution can be improved by
increasing the mask size of the low-pass filter in the SFIM
method (Tu, 2012).

2.4. An Adjustable GIHS-BT-SFIM

As will be discussed in the next part, the IHS technique
generates significant saturation compression, and the BT
method suffers from saturation stretch. Motivated by this
observation, an adjustable IHS-BT approach was used (Tu,
2005); the SFIM method was then incorporated into the
IHS-BT-SFIM approach (Tu, 2012) through the following
equation:

\[
\begin{bmatrix} R_{new}\\ G_{new}\\ B_{new} \end{bmatrix} =
\frac{P}{I + k_{1}(\hat{P} - I)}
\begin{bmatrix} R + k_{2}(\hat{P} - I)\\
                G + k_{2}(\hat{P} - I)\\
                B + k_{2}(\hat{P} - I) \end{bmatrix},
\qquad \hat{P} = P \text{ or } P_{L}
\tag{5}
\]

where the \(k_{i}\) are the module selection parameters,
defined such that \(0 \le k_{i} \le 1\).
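As a concrete illustration, Eq. (5) can be sketched in a few lines of NumPy. This is a minimal, hypothetical implementation, not the authors' code: the band-stacked array layout, the edge-padded mean filter used to build PL, and the small guard against division by zero are our own choices.

```python
import numpy as np

def mean_filter(img, size=7):
    """Simple size x size mean filter (edge-padded) to build the smoothed pan P_L."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (size, size))
    return windows.mean(axis=(-2, -1))

def gihs_bt_sfim(ms, pan, k1, k2, smoothed_pan=False, size=7):
    """Adjustable GIHS-BT-SFIM fusion, Eq. (5).

    ms  : (bands, H, W) multispectral image resampled to the pan grid
    pan : (H, W) panchromatic image
    k1, k2 in [0, 1] select the module: k1 = k2 = 1 with P_hat = P gives IHS,
    k1 = k2 = 0 gives BT, and k1 = 1, k2 = 0 with P_hat = P_L gives SFIM.
    """
    I = ms.mean(axis=0)                    # I0 taken as the plain band average
    p_hat = mean_filter(pan, size) if smoothed_pan else pan
    denom = I + k1 * (p_hat - I)
    ratio = pan / np.maximum(denom, 1e-9)  # guard against division by zero
    return ratio * (ms + k2 * (p_hat - I))
```

With k1 = k2 = 1 and P̂ = P the ratio collapses to 1 and the result is the GIHS sum M_i + (P − I); with k1 = k2 = 0 it reduces to the Brovey ratio (P/I)·M_i.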
3. EDGE ADAPTIVE IHS
An edge-adaptive procedure was introduced by Rahmani et
al. to enforce fidelity away from the edges. They used an
edge-detecting function, h(x), in the IHS fusion method in
the manner of

\[
F_{i} = M_{i} + h(x)\,(P - I)
\tag{6}
\]

where \(F_{i}\) is the fused image and \(M_{i}\) is band i of
the MS image. In this case h(x) is equal to one on edges and
equal to zero off edges. The edge-adaptive method produces
images with high spectral resolution while maintaining the
high-quality spatial resolution of the original IHS.
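The procedure above can be sketched directly. The code below is a minimal illustration, not the authors' implementation: it builds a binary h(x) by thresholding the pan gradient magnitude (the threshold value is our own assumption) and then applies Eq. (6).

```python
import numpy as np

def binary_edge_map(pan, threshold=50.0):
    """Hypothetical h(x): 1 on edges (strong pan gradient), 0 elsewhere."""
    gy, gx = np.gradient(pan.astype(float))
    return (np.hypot(gx, gy) > threshold).astype(float)

def edge_adaptive_ihs(ms, pan, h):
    """Edge-adaptive IHS fusion, Eq. (6): F_i = M_i + h(x) * (P - I)."""
    I = ms.mean(axis=0)  # intensity of the resized MS image
    return ms + h * (pan - I)
```

Off edges (h = 0) the MS bands pass through unchanged; on edges (h = 1) the method reduces to plain GIHS.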
This contribution has been peer-reviewed. The peer-review was conducted on the basis of the abstract.
140
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XL-1/W3, 2013
SMPR 2013, 5 – 8 October 2013, Tehran, Iran
4. EDGE ADAPTIVE GIHS-BT-SFIM

In many applications, such as precision farming and urban
classification, there is a need for more precise information
on edges, because that is especially where errors occur.
Using the MS image data off edges and the minimum saturation
on edges gives good spatial and spectral quality for
image-based analysis purposes such as classification. So, in
this improved method, edges are transferred from the
GIHS-BT-SFIM image to the MS image. The approach extracts
the edges from the panchromatic image; where there are
edges, the GIHS-BT-SFIM product is imposed; otherwise, the
MS image is used. In this case the fused multichannel image
can be formed by the new formula

\[
\begin{bmatrix} R_{new}\\ G_{new}\\ B_{new} \end{bmatrix} =
\frac{P}{I + k_{1}h(x)(\hat{P} - I)}
\begin{bmatrix} R + k_{2}h(x)(\hat{P} - I)\\
                G + k_{2}h(x)(\hat{P} - I)\\
                B + k_{2}h(x)(\hat{P} - I) \end{bmatrix},
\qquad \hat{P} = P \text{ or } P_{L}
\tag{7}
\]

Another approach, which uses the panchromatic image on edges
and GIHS-BT-SFIM off edges, can be introduced as

\[
\begin{bmatrix} R_{new}\\ G_{new}\\ B_{new} \end{bmatrix} =
\frac{P + (I - P)h(x)}{I + k_{1}h(x)(\hat{P} - I)}
\begin{bmatrix} R + k_{2}h(x)(\hat{P} - I)\\
                G + k_{2}h(x)(\hat{P} - I)\\
                B + k_{2}h(x)(\hat{P} - I) \end{bmatrix},
\qquad \hat{P} = P \text{ or } P_{L}
\tag{8}
\]

The edges can be extracted using standard edge detection
methods such as the Canny detector (Canny, 1986). Another
edge detector, suggested by Perona and Malik (1990), can
achieve considerable results in a fusion task and was
introduced as the best among the edge detectors tested in
(Rahmani, et al. 2008):

\[
h(x) = \exp\!\left(-\frac{\lambda}{|\nabla P|^{4}}\right),
\qquad \lambda = 10^{9} \text{ or } 10^{10}
\tag{9}
\]

The best edge detector in this study, as found
experimentally, was Roberts.

5. CLASS-BASED FUSION APPROACH

Because each class shows a different extent of distortion,
it was interesting to test every method in a class-based
manner. This part was tested only on the urban image,
because of the more distinct nature of its classes, and only
two different types of objects were defined: urban and
vegetation cover. A class-based fusion was tested, which
assigns different coefficients to each method according to
its class. That is, the two adjustable parameters,
introduced in the first part for defining the combination of
fusion methods, were varied in a look-up-table (LUT) form,
and their best values were selected according to the
assessment of the fusion results in each class. Sixteen
miscellaneous cases were tested and are reported in the
following part.

6. RESULTS

Two datasets have been tested in this study. The first was a
high-resolution image of an urban area and the other a
medium-resolution image of an agricultural field. Setting
the ki parameters of the GIHS-BT-SFIM leads to the different
fusion techniques, so a comparison of the spectral and
spatial quality of each method was simply implemented.
Spatial quality can be judged visually, but subtle color
changes are more difficult to notice in this manner (Rahmani
et al, 2010). In order to compare the methods' results,
spatial and spectral quality were evaluated by relying on
both visual inspection and metric performance data.
Band-to-band correlation coefficient (CC), distortion
extent, ERGAS, SAM, Q Index (Q-average), RMSE and RASE are
the spectral metrics, and CC with the panchromatic band and
average gradient are the spatial metrics for analyzing the
fusion methods' fulfillment (Rahmani, 2008; Alparone, 2008;
Zhang, 2008).

6.1. Urban area

The first study area is an urban region located in Shahriar
(longitude 51°3′, latitude 35°39′), Tehran, Iran. An IKONOS
image with four bands (red, green, blue and near-infrared)
at 3 m spatial resolution (MS) and one panchromatic band at
1 m spatial resolution was used in this region (Figure 1).
Results are given in Table 1.
It is clear that the edge-adaptive method has improved the
spectral information in all procedures. Fusion results are
provided in Figure 1.

Figure 1: Results for the urban area (pan, MS, IHS,
IHS+Edge, IHS-BT, IHS-BT+Edge, BT, BT+Edge, BT-SFIM,
BT-SFIM+Edge, SFIM, SFIM+Edge).

Table 1: Assessing the fusion methods in an urban region
(spectral metrics SID, SAM, RASE, RMSE, QI, ERGAS and CC
with the MS bands, and spatial metrics CC with the pan band,
average gradient and spatial quality, for IHS, IHS-BT, BT,
BT-SFIM and SFIM, with and without the edge-adaptive
modification).
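Among the spectral metrics reported in the tables, SAM can serve as a worked example. The sketch below (mean spectral angle in degrees over all pixels; the pixel-averaging choice is our own) compares a reference and a fused MS cube band-wise:

```python
import numpy as np

def sam_degrees(ref, fused, eps=1e-12):
    """Mean Spectral Angle Mapper between two (bands, H, W) image cubes."""
    dot = (ref * fused).sum(axis=0)
    norms = np.linalg.norm(ref, axis=0) * np.linalg.norm(fused, axis=0)
    angles = np.arccos(np.clip(dot / np.maximum(norms, eps), -1.0, 1.0))
    return float(np.degrees(angles).mean())
```

SAM is invariant to a per-pixel scaling of the spectrum, so a spectrally faithful but globally brightened fusion still scores near 0°.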
6.2. Agricultural fields

The area of interest deals with agricultural fields located
around the village of Biddinghuizen in Flevoland, the
Netherlands, which is characterized by the cultivation of
arable crops (Abkar, 1994). This area represents a typical
agricultural region in the Netherlands. The main crops are
grass, potatoes, cereals, sugar beets, beans, peas, and
onions. The three spectral bands 3, 4 and 5 of a Landsat
Thematic Mapper (TM) image acquired on 7 July 1987 have been
used in this study. Results for this region are given in
Table 2 and Figure 2.

Figure 2: Results for the agricultural field (pan, MS, IHS,
IHS+Edge, IHS-BT, IHS-BT+Edge, BT, BT+Edge, BT-SFIM,
BT-SFIM+Edge, SFIM, SFIM+Edge).

Table 2: Assessing the fusion methods in an agricultural
field (the same spectral and spatial metrics as in Table 1,
for each method with and without the edge-adaptive
modification).
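ERGAS, used in both tables, can likewise be sketched. A minimal version under our own assumptions: the resolution ratio (e.g. 1/3 for the 1 m pan vs. 3 m MS IKONOS pair of this study) is passed in explicitly, and band means are taken over the reference cube.

```python
import numpy as np

def ergas(ref, fused, ratio):
    """ERGAS = 100 * ratio * sqrt(mean_i((RMSE_i / mu_i)^2)).

    ref, fused : (bands, H, W) image cubes.
    ratio      : pan resolution / MS resolution (e.g. 1/3 for 1 m pan, 3 m MS).
    """
    rmse = np.sqrt(((ref - fused) ** 2).mean(axis=(1, 2)))  # per-band RMSE
    mu = ref.mean(axis=(1, 2))                              # per-band mean
    return float(100.0 * ratio * np.sqrt(((rmse / mu) ** 2).mean()))
```

A perfect fusion gives ERGAS = 0, and larger values indicate stronger relative global distortion, which is why it is listed with reference value 0 in the tables.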
6.3. Class-Based Results (LUT form)

In this part, only the outputs are presented for each group
of modulation parameters. It is worth noting that the
class-based results were computed over some ROIs for each
class, because of the memory limitation of processing the
whole image, and this may have some effect on the final
results. The look-up table for these parameters is given in
Table 3.
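The per-class LUT selection described above can be sketched as follows. The scoring rule (favoring high CC and low normalized RMSE) is our own hypothetical criterion for illustration, not the paper's exact assessment procedure:

```python
def select_lut_params(metrics):
    """Pick the (k1, k2) pair with the best class-wise score.

    metrics: {(k1, k2): {"cc": float, "rmse": float}} for one class.
    """
    rmse_max = max(m["rmse"] for m in metrics.values()) or 1.0

    def score(m):
        # high correlation is good, high (normalized) RMSE is bad
        return m["cc"] - m["rmse"] / rmse_max

    return max(metrics, key=lambda k: score(metrics[k]))
```

One such pair per class then populates a LUT like Table 3, so each class is fused with its own modulation parameters.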
Table 3: Class-based and modulation parameter effects on the
fusion results (CC with the pan band, CC with the MS bands
and RMSE per class, for the sixteen combinations of K1, K2
in {0.2, 0.4, 0.6, 0.8}; c1 denotes class one, vegetation
cover, and c2 denotes class two, the urban region).
7. CONCLUSION
In contrast with previous studies (Rahmani, 2010; Tu, 2012),
which contemplated only spectral metrics, both spatial and
spectral metrics are needed because of the goal of fusion.
In this study, different spatial and spectral metrics were
therefore presented. Considering all of them, it is evident
that adding edge information from the panchromatic image
into the GIHS-BT-SFIM product (Eq. 8) does not bring much
improvement in the various processes. Using the fused edge
information on edges and the MS data off edges (Eq. 7)
produced spectrally enhanced fused images according to the
metrics for the various methods; nonetheless, there is some
spatial information loss. So, it is preferable to choose the
best fusion method according to the aims of the task and the
image data. The BT method offers the most balanced results
in both images according to both spectral and spatial
information, and SFIM preserves the most spectral
information. Moreover, the performance of several edge
detectors was tested, and the results show the high
dependence of the edge-adaptive method on the edge detector
used.

It is clear that the sort of method (the modulation
parameters) is important in the image fusion process and
that the fusion results are affected by the class type. So,
choosing the best modulation parameters (methods) in a
fusion task should be considered. It is recommended that
class-based results be studied further in the future.
8. REFERENCES
Abkar, A.A., 1994. Knowledge-based classification method
for crop inventory using high spatial resolution data.
Thesis, ITC, Enschede, The Netherlands.
Alparone, L., Aiazzi, B., Baronti, S., Garzelli, A.,
Nencini, F., Selva, M., 2008. Multispectral and panchromatic
data fusion assessment without reference. In:
Photogrammetric Engineering & Remote Sensing.

Amolins, K., Zhang, Y., Dare, P., 2007. Wavelet based image
fusion techniques - an introduction, review and comparison.
In: ISPRS Journal of Photogrammetry and Remote Sensing.

Chu, H., Zhu, W., 2008. Fusion of IKONOS satellite imagery
using IHS transform and local variation. In: IEEE
Geoscience and Remote Sensing Letters.

Dou, W., Chen, Y., Li, X., Sui, D., 2007. A general
framework for component substitution image fusion: an
implementation using the fast image fusion method. In:
Computers & Geosciences.

Gonzalez, R., Woods, R., 2002. Digital Image Processing.
Prentice Hall, pp. 282-302.

Li, Sh., Kwok, J., Wang, Y., 2002. Using the discrete
wavelet frame transform to merge Landsat TM and SPOT
panchromatic images. In: Information Fusion.

Li, Z., Jing, Z., Yang, X., Sun, Sh., 2005. Color transfer
based remote sensing image fusion using non-separable
wavelet frame transform. In: Pattern Recognition Letters.

Rahmani, Sh., Strait, M., Merkurjev, D., 2008. Evaluation of
pan-sharpening methods.

Rahmani, Sh., Strait, M., Merkurjev, D., Moeller, M.,
Wittman, T., 2010. An adaptive IHS pan-sharpening method.
In: IEEE Geoscience and Remote Sensing Letters.

Tu, T., Su, S., Shyu, H., Huang, P.S., 2001. A new look at
IHS-like image fusion methods. In: Information Fusion.

Tu, T., Huang, P.S., Hung, Ch., Chang, Ch., 2004. A fast
intensity-hue-saturation fusion technique with spectral
adjustment for IKONOS imagery. In: IEEE Geoscience and
Remote Sensing Letters.

Tu, T., Lee, Y., Chang, Ch., Huang, P., 2005. Adjustable
intensity-hue-saturation and Brovey transform fusion
technique for IKONOS/QuickBird imagery. In: Optical
Engineering.

Tu, T., Hsu, Ch., Lee, Ch., 2012. An adjustable
pan-sharpening approach for IKONOS/QuickBird/GeoEye-1/
WorldView-2 imagery. In: IEEE Journal of Selected Topics in
Applied Earth Observations and Remote Sensing.

Yang, X., Jing, Z., Liu, G., Hua, L., Ma, D., 2007. Fusion
of multispectral and panchromatic images using fuzzy rule.
In: Communications in Nonlinear Science and Numerical
Simulation.

Zhang, Y., 2008. Methods for image fusion quality
assessment - a review, comparison and analysis. In: The
International Archives of the Photogrammetry, Remote Sensing
and Spatial Information Sciences.

Zhang, Y., Hong, G., 2005. An IHS and wavelet integrated
approach to improve pan-sharpening visual quality of
natural colour IKONOS and QuickBird images. In: Information
Fusion.