
A new method of Quickbird image fusion

Ying HAN a, Hong JIANG a,b*, Xiuying ZHANG a
a International Institute for Earth System Science, Nanjing University, Nanjing 210093, China;
b International Center of Spatial Ecology and Ecosystem Ecology, Zhejiang Forestry University, Hangzhou 311300, China

With the rapid development of remote sensing technology, the means of acquiring remote sensing data have become increasingly abundant, so that a large number of multi-temporal image sequences at different resolutions can be formed for the same area. At present, the main fusion methods are HPF, the IHS transform, PCA, Brovey, the Mallat algorithm and the wavelet transform. The IHS transform suffers from serious spectral distortion; the Mallat algorithm omits the low-frequency information of the high spatial resolution image, so its fusion results show obvious blocking effects. Wavelet multi-scale decomposition can achieve very good results for details and edges of different sizes and directions, but different fusion rules and algorithms produce different effects. This article takes Quickbird image fusion as an example and, based on the wavelet transform, compares integration with the HVS model against integration with the IHS transform. The results show that the former is better. This paper introduces the correlation coefficient, the relative average spectral error (RASE) index and other usual indices to evaluate image quality. Key words: Image fusion; Quickbird; IHS transform and wavelet transform; HVS transform and wavelet transform



In the field of remote sensing applications, it is often efficient and economic to acquire a high spatial-resolution multispectral image by combining high spatial-resolution and low-resolution data [1]. At the same time, most earth resources satellites such as Landsat, SPOT and Quickbird provide both panchromatic and multispectral images. To utilize these images effectively, it is important to merge the higher-resolution panchromatic images with the lower-resolution multispectral images. Popular data merging methods include intensity-hue-saturation (IHS) [2], principal component analysis (PCA) [3], high pass filtering (HPF) [4], Brovey, Mallat and wavelet transforms [5-7]. However, every fusion method has its specific application, especially for very high resolution image fusion. For example, Zhang et al [8] proposed Quickbird panchromatic and multispectral image fusion based on the wavelet packet transformation; Zhang et al [9] studied the information influence on Quickbird images of Brovey and wavelet fusion, and the results showed that wavelet fusion kept higher spatial resolution and better spectral character, while Brovey fusion may keep higher spatial resolution but loses spectral information relatively heavily. Ye et al [10] proposed an integrated technique for IKONOS image fusion combining the IHS transformation and wavelet transformation algorithms. Therefore, combining different methods has advantages for merging images. The human vision system (HVS) characteristic theory has also been applied in this field; Zhou et al [11] used wavelet decomposition to estimate the HVS characteristics quantitatively.
*Email:; Phone 86-025-83595969

In order to improve the visual-interpretation effect of Quickbird multispectral images and overcome the color distortion shortcoming of most fusion methods, this paper focuses on a new image fusion method based on the IHS and wavelet transformations, and on an analysis of the different fusion results. Some usual indices are used as quality measures.

2.1 Wavelet transformation


Features that are important for a certain application can be of arbitrary size or, in other words, can be dominant at any particular scale. Therefore a multi-resolution analysis such as the wavelet transform provides a very useful mathematical tool for image analysis and computer vision [12]. The wavelet transform represents any arbitrary function as a superposition of wavelets, which are functions generated from a mother wavelet \psi by dilations and translations:

\psi_{a,b}(t) = |a|^{-1/2}\,\psi\left(\frac{t-b}{a}\right)    (1)

where a is the dilation of the wavelet and b is the time translation. Figure 1 shows the basic stage of the wavelet transform: wavelet decomposition and wavelet reconstruction. In Figure 1, A is the original image, H is a low-pass filter and G is a high-pass filter. The details of the wavelet transform are mostly based on [13-14]. Through the stage below, the final image keeps the original signals.

Fig 1. One stage of the wavelet transform and the inverse wavelet transform.
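The single decomposition/reconstruction stage of Fig 1 can be sketched as follows. This is an illustrative example using the PyWavelets library, an assumption on our part (the paper's experiments were implemented as a Matlab program), not the authors' code:

```python
import numpy as np
import pywt

# One stage of 2-D wavelet decomposition: the image A is split into an
# approximation sub-band (low-pass filter H along both directions) and
# three detail sub-bands (horizontal, vertical, diagonal, involving the
# high-pass filter G).
A = np.random.rand(512, 512)
cA, (cH, cV, cD) = pywt.dwt2(A, 'haar')

# Each sub-band is half the size of the original image.
assert cA.shape == (256, 256)

# Inverse transform: the perfect-reconstruction filter bank recovers A,
# i.e. the final image keeps the original signals.
A_rec = pywt.idwt2((cA, (cH, cV, cD)), 'haar')
print(np.allclose(A, A_rec))
```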

2.2 Fusion Scheme

Starting from the method based on the IHS transform and wavelet transform, this study uses the HVS color model to perform fusion. The process is as follows:
Step 1. Select the most suitable bands as R, G, B by calculating the correlation.
Step 2. The multispectral image is super-sampled and geometrically registered onto the panchromatic image. This ensures that they occupy the same geographic space and the same pixel size.
Step 3. The multispectral image is transformed with the HVS model, obtaining the H, V and S components.
Step 4. The histograms of the V component and the panchromatic image are matched.
Step 5. The V component and the panchromatic image are decomposed using the wavelet transform with the same number of decomposition levels. In this paper, the decomposition level is 2.
Step 6. The approximation sub-band of the fused stationary wavelet representation is the mean of the sub-bands derived from the V and panchromatic images. The detail sub-bands are the maximum of those of the V and panchromatic images.
Step 7. The inverse wavelet transformation is performed on the fused sub-bands to obtain one band of the final fusion image.
Step 8. The inverse HVS transformation is performed.
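Steps 5-7 above (two-level decomposition, mean of the approximations, maximum of the details, inverse transform) can be sketched as a minimal example. This is our reconstruction with PyWavelets under the stated fusion rules, with a `db2` wavelet as an assumed choice, not the authors' program:

```python
import numpy as np
import pywt

def fuse_v_pan(v, pan, wavelet='db2', level=2):
    """Fuse the V component and the panchromatic band (Steps 5-7)."""
    cv = pywt.wavedec2(v, wavelet, level=level)
    cp = pywt.wavedec2(pan, wavelet, level=level)
    # Step 6a: fused approximation sub-band = mean of the two approximations.
    fused = [(cv[0] + cp[0]) / 2.0]
    # Step 6b: fused detail sub-bands = element-wise maximum, as the paper
    # states (a common variant keeps the coefficient of larger magnitude).
    for dv, dp in zip(cv[1:], cp[1:]):
        fused.append(tuple(np.maximum(a, b) for a, b in zip(dv, dp)))
    # Step 7: inverse wavelet transform gives one band of the fusion image.
    return pywt.waverec2(fused, wavelet)

v = np.random.rand(512, 512)    # histogram-matched V component (Step 4)
pan = np.random.rand(512, 512)  # registered panchromatic band (Step 2)
fused = fuse_v_pan(v, pan)
print(fused.shape)
```

In the full scheme this function is applied to the V component only; H and S are carried through unchanged and recombined in Step 8.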



Firstly, the experiment was performed on the Quickbird sample images distributed with ENVI. The original panchromatic Quickbird image has 0.7 m pixels while the original multispectral Quickbird image has 2.8 m pixels. The original Quickbird bands assigned to the (R, G, B) system were selected on the basis of the smallest correlation (Table 1). To keep the information and match the wavelength ranges of the multispectral and panchromatic images participating in the fusion, bands 4, 3 and 1 were used for R, G and B, respectively. Secondly, the HVS transform and wavelet transform were performed with a program compiled in the Matlab software; the decomposition level was fixed to 2. The re-sampled and registered multispectral image and the panchromatic image of the same area are presented in Fig 2(a) and Fig 2(b), respectively. Their size is 512 pixels * 512 pixels. Obviously, the spatial resolution of the Pan image was better than that of the Mul image, while the spectral information of the Mul image was richer than the Pan's, for example for the objects within the green label.
Table 1. The correlation between the different bands of the original Quickbird image

Correlation   Band 2   Band 3   Band 4
Band 1        0.972    0.857    0.641
Band 2        1.000    0.919    0.718
Band 3        0.919    1.000    0.884
Band 4        0.718    0.884    1.000
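The band selection of Step 1 rests on an inter-band correlation matrix like Table 1. A minimal sketch of how such a matrix can be computed, with a synthetic 4-band array standing in for the Quickbird multispectral image (an assumption for illustration), is:

```python
import numpy as np

def band_correlation(bands):
    """Correlation matrix between spectral bands.

    bands: array of shape (n_bands, rows, cols).
    """
    flat = bands.reshape(bands.shape[0], -1)  # one row per band
    return np.corrcoef(flat)

# Synthetic 4-band stand-in for the multispectral image.
rng = np.random.default_rng(0)
ms = rng.random((4, 128, 128))
corr = band_correlation(ms)
# The RGB combination is then chosen from the bands with the smallest
# mutual correlation (bands 4, 3 and 1 in Table 1).
print(np.round(corr, 3))
```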

The purpose of image fusion was to enhance the spatial and spectral resolution of the low-resolution image. It was therefore necessary to propose quality indicators to measure the quality of the images generated by the different fusion methods. In this study, several usual indices were selected to assess the quality of the fused images with both qualitative (or subjective) and quantitative (or objective) methods.

3.1 Qualitative analysis

The remotely sensed imagery contains spectral and spatial information. In Fig 2, especially within the green label, the objects in Fig 2(c) were obviously much clearer than those in Fig 2(a). Similarly, Fig 2(c) had more color than Fig 2(b), which showed that the HVS+wavelet image kept more spectral information from the Mul image (Fig 2(a)) and obtained more detail information from the Pan image (Fig 2(b)), but not totally; in other words, the HVS transform and wavelet transform also lost some spectral and spatial information. Moreover, comparing Fig 2(d) with Fig 2(a) and Fig 2(b), the color distortion of Fig 2(d) was serious, possibly because the IHS color transform was performed in ERDAS.

Fig 2. The re-sampled and registered multispectral image, the panchromatic image and the fusion results.

3.2 Quantitative analysis

From Table 2, every band correlation value of the HVS+wavelet image was higher than that of the IHS+wavelet image, which showed that the first method was better at preserving the spectral and spatial information of the source images than the second. Similarly, there were some negative correlations for the IHS transform and wavelet transform, which means serious color distortion in the R band and G band. This is mainly because the spectral range of the Quickbird multispectral bands differed from that of the panchromatic band, so the radiation energy of the objects in the panchromatic image was different from that in the multispectral image. In addition, the information difference between the source images and the fused images was used to evaluate the quality of the fusion results (Table 3). From the indicator data, the Std, G and Sf [15] values of the fusion images were higher than those of the source images, and the HVS+wavelet image was higher than the IHS+wavelet image, indicating that both transform methods increased the information relative to the source and improved the contrast, detail, texture and definition, and that HVS+wavelet was better. Furthermore, the HVS+wavelet image was smaller than the IHS+wavelet image in the values of D, D index [15] and CEN [16]; in particular RASE [17], a comprehensive index mirroring spectral quality, was closer to 0 and smaller, therefore the color distortion was smaller and the spectral quality was better.
Table 2. The correlation of the fusion images and the source images for Fig 2

              HVS+wavelet                 IHS+wavelet
Correlation   R       G       B          R        G        B
Mul R         0.896   0.672   0.508     -0.483    0.344    0.301
Mul G         0.670   0.910   0.883      0.069   -0.222    0.643
Mul B         0.463   0.816   0.932      0.326   -0.422    0.603
Pan R         0.922   0.672   0.508     -0.213    0.090    0.449
Pan G         0.670   0.910   0.883     -0.213    0.090    0.449
Pan B         0.463   0.816   0.932     -0.213    0.090    0.449

Table 3. The difference between the fusion images and the source images for Fig 2

           Pan       Mul                          HVS+wavelet                  IHS+wavelet
                     R        G        B         R        G        B         R        G        B
Entropy    7.698     7.492    7.506    7.720     7.610    7.579    7.758     7.268    7.289    6.724
Mean       142.398   158.668  140.381  137.722   145.445  128.774  126.031   129.966  137.436  82.039
Std        58.235    48.467   46.754   57.738    52.077   49.605   57.315    44.864   47.384   82.039
Sf         29.084    11.834   11.534   13.751    24.818   22.652   23.335    16.316   17.420   18.041
G          13.363    6.746    6.536    7.785     11.475   10.427   10.845    7.167    7.020    7.340
D          -         -        -        -         3.913    3.458    3.253     17.134   25.649   2.186
D index    -         -        -        -         0.003    0.003    0.003     0.117    0.214    0.200
CEN (Mul)  -         -        -        -         0.095    0.068    0.037     0.582    0.303    1.192
CEN (Pan)  -         -        -        -         0.138    0.096    0.104     0.546    0.498    1.257
RASE       -         -        -        -         0.002                       0.065


However, compared with the source multispectral image, the HVS+wavelet image increased in entropy, while the IHS+wavelet image lost entropy. In mean, both fused images decreased, because the approximation sub-band is the mean of the Mul and Pan images in the fusion scheme; the HVS+wavelet image was still higher than the IHS+wavelet image. In total, the HVS transform and wavelet transform could keep more complete spectral information, and better preserve the spatial and spectral characteristics of the source images, than the IHS transform and wavelet transform. At the same time, the land features in the images were relatively regular and not very abundant, so the method has some limitations.
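The quantitative indicators used above (standard deviation, entropy, spatial frequency and RASE) can be sketched as follows. These are the standard formulas from the cited literature restated as our own assumption, not code copied from the authors' program:

```python
import numpy as np

def entropy(img, bins=256):
    """Shannon entropy of the grey-level histogram in bits
    (assumes 8-bit grey levels)."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def spatial_frequency(img):
    """Sf = sqrt(row frequency^2 + column frequency^2)."""
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)

def rase(ms_bands, fused_bands):
    """Relative average spectral error: the closer to 0, the better."""
    m = np.mean([b.mean() for b in ms_bands])
    rmse2 = [np.mean((f - b) ** 2) for b, f in zip(ms_bands, fused_bands)]
    return 100.0 / m * np.sqrt(np.mean(rmse2))

# A flat image carries no information and no detail.
const = 100.0 * np.ones((64, 64))
print(entropy(const), spatial_frequency(const))
```

A perfectly spectrum-preserving fusion would give RASE = 0; larger values mean heavier color distortion, which is how Table 3 is read above.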



The combination of different transform methods was efficient for remote sensing image fusion. However, the spectral range can affect the fused image and determines the selection of the method and algorithm. In this study, the effects of the two methods on the fusion information were investigated, and the fusion image quality was measured with common indicators. Empirical results showed that the HVS transform combined with the wavelet transform was suitable for Quickbird images.

ACKNOWLEDGEMENT

Funding support came partially from the state key fundamental science funds of China (2005CB422207 & 2005CB422208), the NSF-China project (40671132), the state international cooperation project (200831810), the state data synthesis and analysis funds of China (2006DKA32300-08, 2007FY110300-08), and the Zhejiang province key project (2008C13G2100010).

REFERENCES
1. C. Pohl, Multisensor image fusion in remote sensing: Concepts and application, Int. J. Remote Sens. 19, 825-854 (1998).
2. W. J. Carper, T. W. Lilesand, et al. The use of intensity-hue-saturation transformation for merging SPOT panchromatic and multispectral image data, Photogramm. Eng. Remote Sensing 56, 459-467 (1990).
3. P. S. Chavez, S. C. Sildes, et al. Comparison of three different methods to merge multi-resolution and multispectral data: LANDSAT TM and SPOT panchromatic, Photogramm. Eng. Remote Sensing 57, 295-303 (1991).
4. V. K. Sheffigara, A generalized component substitution technique for spatial enhancement of multispectral images using a higher resolution data set, Photogramm. Eng. Remote Sensing 58, 561-567 (1992).
5. Z. J. Wang, D. R. Li, et al. Wavelet theory based IKONOS panchromatic and multispectral image fusion, Acta Geodaetica et Cartographica Sinica (AGCS) 5, 112-116 (2001). (In Chinese)
6. J. Nunez, Multiresolution-based image fusion with additive wavelet decomposition, IEEE Trans. on Geoscience and Remote Sensing 37(3), 1204-1211 (1999).
7. H. Li, B. S. Manjunath, et al. Multisensor image fusion using the wavelet transform, Graphical Models and Image Processing 27(3), 235-244 (1995).
8. W. J. Zhang, J. Y. Kang, Quickbird panchromatic and multi-spectral image fusion based on wavelet packet transformation, Space Electronic Technology 2, 48-52 (2005). (In Chinese)
9. N. Y. Zhang, Q. Y. Wu, Information influence on Quickbird images by Brovey fusion and wavelet fusion, Remote Sensing Technology and Application 21(1), 67-70 (2006). (In Chinese)

10. B. J. Ye, S. J. Xu, et al. Integrated technique for MITRAS: Multisensor image fusion using the wavelet transform [J].
Graphical models and image processing, 57, 234-245(1995).

11. L. Zhou, Zh. Y. Wang, et al., A new wavelet image fusion algorithm based on human visual system [J]. Journal of
image and graphics, 9(9): 1088-1094(2004). (In Chinese)

12. H. Li, B. S. Manjunath, S. MITRA, Multi-sensor image fusion using the wavelet transform [J]. Graphical model and
image processing.57, 234-245(1995).

13. S. G. Mallat, A theory for multi-resolution signal decomposition: The wavelet representation, IEEE Trans. Acoust.,
Speech, Signal Process. ASSP-37, 12, 2091-2110 (1989).

14. M. Lightstone, E. Majani, The wavelet transform and data compression, JPL technical report, IAS Group, Section
384, Oct. (1993).

15. H. H. Wang, J. X. Peng, et al. A study of evaluation methods on performance of the multi-source remote sensing
image fusion, Computer Engineering and Applications, 25, 33-37 (2003). (In Chinese)

16. S. X. Xu, Y. Q. Xu, Application of Matlab in remote sensing image fusion algorithm and quality evaluation,
Application of the Computer System, 11, 91-95 (2007). (In Chinese)

17. G. A. Maia, L. S. Jose, et al. Fusion of multispectral and panchromatic images using improved IHS and PCA mergers
based on wavelet decomposition, IEEE Transactions on Geoscience and Remote Sensing, 1291-1299 (2004).