Guided Image Filtering-Based Pan-Sharpening Method: A Case Study of GaoFen-2 Imagery
Figure 1. (a) The input image (one band of the multispectral image); (c) the guidance image (the panchromatic band); (b) the outputs: from left to right, the regularization parameter ε is set to 0.1² and the radius r is set to 2, 4, 6, and 8; (d) the outputs: from left to right, the radius r is set to 4 and the regularization parameter ε is set to 0.1², 0.2², 0.4², and 0.8². As the filter size r increases, the edge information gradually becomes visible. As the degree of smoothing ε increases, the changes in the filtering outputs are slight.
Figure 2. The flow chart of the proposed method. The numbers in parentheses reference the related equations.
Figure 3. 1-D illustrations of different detail injection models: (a) the proposed model, detail = SP × α_i; (b) the equal-proportion injection model, detail = SP (spatial detail) × 1; (c) the generic GS-based model [30].
Figure 4. Example of the fused results with different injection weights: (a) the original MS (multispectral) image; (b) the proposed model, detail = SP × α_i; (c) the equal-proportion injection model, detail = SP × 1; (d) the generic GS (Gram-Schmidt transformation)-based model [30]; (e) local patches from (b) to (d).
Figure 5. Analysis of the influence of the parameter r.
Figure 6. Analysis of the influence of the parameter ε.
Figure 7. Analysis of the influence of the parameter R.
Figure 8. (a) Details of the building edges with different window radii; (b) the corresponding spectral profile curves of specific parts in (a).
Figure 9. The fused results and some local patches using different methods on GF-2 images over an urban area: (a) original Pan (panchromatic) image; (b) original MS image; (c) GS; (d) NND (nearest-neighbor diffusion); (e) UNB (University of New Brunswick method); (f) GSA (adaptive GS method); (g) GD; and (h) the proposed method.
Figure 10. The fused results using different methods on GF-2 images over water bodies: (a) original Pan image; (b) original MS image; (c) GS; (d) NND; (e) UNB; (f) GSA; (g) GD; and (h) the proposed method.
Figure 11. The fused results using different methods on GF-2 images over cropland: (a) original Pan image; (b) original MS image; (c) GS; (d) NND; (e) UNB; (f) GSA; (g) GD; and (h) the proposed method.
Figure 12. The fused results using different methods on GF-2 images over a forest area: (a) original Pan image; (b) original MS image; (c) GS; (d) NND; (e) UNB; (f) GSA; (g) GD; and (h) the proposed method.
Figure 13. The classification results of the fused images produced by different methods: (a) the original image displayed in true color; (b) and (c) ground truth data; (d) GS; (e) NND; (f) UNB; (g) GSA; (h) GD; and (i) the proposed method.
Abstract
1. Introduction
2. Guided Image Filtering
2.1. Guided Image Filtering
2.2. Influence of Parameters
3. Proposed Algorithm for GaoFen-2 (GF-2) Datasets
3.1. Problem Formulation and Notations
3.2. Guided Filtering Based Pan-Sharpening
- (i)
- The original multispectral (MS) image is registered and resampled to the same size as the original panchromatic (Pan) image.
- (ii)
- By minimizing the residual sum of squares (Equation (5)), the band weights can be estimated. A synthetic low-resolution panchromatic image is then obtained with Equation (6).
- (iii)
- Each band of the resampled MS image is taken as the guidance image to guide the filtering of the low-resolution Pan image, and the corresponding filter output is obtained as follows:
- (iv)
- The pan-sharpening result is obtained by extracting the spatial information from the Pan image and injecting it into the resampled MS image according to the band-specific injection weights. This process is formulated in Equations (8) and (9). A minimal code sketch of these four steps is given after this list.
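To make the four steps concrete, the sketch below strings them together in Python. It is a minimal illustration under stated assumptions rather than the authors' implementation: the helper names (`guided_filter`, `pan_sharpen`), the cubic resampling, and the covariance-based gains `alpha` are ours, and Equations (5)–(9) should be consulted for the exact formulations.

```python
import numpy as np
import cv2


def guided_filter(guide, src, r, eps):
    """Gray-scale guided filter (He et al. [23]); box filters give the local window means."""
    ksize = (2 * r + 1, 2 * r + 1)
    mean = lambda x: cv2.blur(x, ksize)
    mean_g, mean_s = mean(guide), mean(src)
    cov_gs = mean(guide * src) - mean_g * mean_s
    var_g = mean(guide * guide) - mean_g * mean_g
    a = cov_gs / (var_g + eps)           # per-window linear coefficient a_k
    b = mean_s - a * mean_g              # per-window offset b_k
    return mean(a) * guide + mean(b)     # output q = a_bar * I + b_bar


def pan_sharpen(pan, ms_lr, r=4, eps=0.1 ** 2):
    """Steps (i)-(iv): pan is an (H, W) float32 array, ms_lr a list of low-resolution MS bands."""
    h, w = pan.shape
    # (i) resample each MS band onto the Pan grid (co-registration assumed done beforehand)
    ms = np.stack([cv2.resize(b.astype(np.float32), (w, h), interpolation=cv2.INTER_CUBIC)
                   for b in ms_lr])
    # (ii) least-squares band weights (Eq. (5)) and synthetic low-resolution Pan image (Eq. (6))
    A = ms.reshape(len(ms), -1).T
    weights, *_ = np.linalg.lstsq(A, pan.ravel(), rcond=None)
    pan_lr = np.tensordot(weights.astype(np.float32), ms, axes=1)
    # (iii) each resampled MS band guides the filtering of the low-resolution Pan image (Eq. (7))
    filtered = np.stack([guided_filter(band, pan_lr, r, eps) for band in ms])
    # (iv) extract the spatial detail and inject it band by band (Eqs. (8) and (9));
    # the covariance-based gains below are an illustrative stand-in for the paper's alpha_i
    detail = pan[None, :, :] - filtered
    alpha = np.array([np.cov(band.ravel(), pan_lr.ravel())[0, 1] / pan_lr.var()
                      for band in ms], dtype=np.float32)
    return ms + alpha[:, None, None] * detail
```

For GF-2 data, `pan` would be the 0.8 m Pan band and `ms_lr` the four 3.2 m MS bands, giving the 1:4 resolution ratio listed in the dataset table.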
3.3. Effectiveness of the Proposed Method
4. Datasets and Experimental Settings
4.1. GF-2 Datasets
4.2. Methods Considered for Comparison
- (1)
- Gram-Schmidt Transformation (GS) [30]: The general GS method uses the Blue, Green, Red, and NIR bands of an MS image to simulate a low-resolution Pan band according to predefined weights. Thereafter, the GS transformation is applied to the synthetic Pan band and the low-resolution MS bands, with the synthetic Pan band serving as the first component. Finally, the high-resolution Pan image replaces the first component of the GS-transformed bands, and the inverse GS transformation is employed to produce the fused MS image. This method is integrated into the ENVI 5.3 software. In this study, the average of the low-resolution multispectral bands was used as the simulated low-resolution Pan band.
- (2)
- Adaptive GS method (GSA) [12]: The GSA method follows the same processing procedure as the general GS method, except that the GS method uses equal weight coefficients for each MS band to obtain the intensity image, whereas the GSA method employs weight coefficients derived from a regression between the MS bands and the degraded low-resolution Pan image. Both methods use the injection gains given by Equation (10):
- (3)
- Nearest-neighbor Diffusion-based Pan-Sharpening (NND) [17]: The NND method first downsamples the high-resolution Pan image to match the size of the MS image. It then calculates the spectral band contribution vector using linear regression and obtains the difference factors from the neighboring superpixels of each pixel in the original Pan image. Finally, it applies a linear mixture model to produce the fused image. Two external parameters, an intensity smoothness factor and a spatial smoothness factor, are set according to the intended application. In this study, their default values were used in all experiments.
- (4)
- University of New Brunswick method (UNB) [14]: The UNB pan-sharpening method first equalizes the histograms of the MS and Pan images. The spectral bands of the MS image that are covered by the Pan band are then combined into a new synthetic image using the least-squares technique. Finally, all the equalized MS bands are fused with the synthesized image to obtain the high-resolution multispectral image. This method is integrated into the PCI Geomatica software and was executed with its default parameters in this study.
- (5)
- GD method: Zhao et al. [31] proposed a fusion method based on the guided image filter that takes the resampled MS image as the guidance image and the original Pan image as the input image for the filtering process, yielding a filtered image that carries the related spatial information. The spatial details of the original Pan image are then extracted and injected into each MS band according to the weight defined by Equation (11) [12] to obtain the fused image. Equation (11) gives the optimal coefficient, i.e., the amount of spatial detail that should be injected into the corresponding MS band (a brief sketch of this gain computation follows this list):
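Because Equations (10) and (11) are referenced above but not reproduced here, the snippet below illustrates the covariance-based injection gain commonly used in GSA-style fusion [12], g_i = cov(M_i, I) / var(I), where I denotes the intensity (synthetic low-resolution Pan) image. Treat it as an illustrative sketch; the function and variable names are ours, and the paper's exact expressions may differ.

```python
import numpy as np


def injection_gains(ms_bands, intensity):
    """GSA-style injection gains: g_i = cov(M_i, I) / var(I) for each MS band M_i."""
    i_flat = intensity.ravel()
    return np.array([np.cov(band.ravel(), i_flat)[0, 1] / i_flat.var()
                     for band in ms_bands])
```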
4.3. Evaluation Methods
- (1)
- Entropy measures the spatial information contained in a fused image; the higher the entropy, the richer the spatial information in the fused image. Entropy is expressed as follows:
- (2)
- CC [36] measures the correlation between the MS image and the fused image. The value of CC ranges from 0 to 1; a higher value indicates a better correspondence between the MS image and the fused image, and the ideal value is 1. CC is defined as follows:
- (3)
- UIQI [37] models any distortion as a combination of three different factors: loss of correlation, luminance distortion, and contrast distortion. It is suitable for most image evaluations, and the best value is 1. UIQI is given by:
- (4)
- ERGAS [32] evaluates the overall spectral distortion of the pan-sharpened image. The lower the ERGAS value, the better the spectral quality of the fused image; the best value is 0. ERGAS is defined as follows (a compact sketch of all four indices follows this list):
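For reference, the sketch below computes the four indices from their standard definitions (entropy of the grey-level histogram; CC [36]; UIQI [37], evaluated globally here rather than over sliding windows; and ERGAS [32] with the GF-2 resolution ratio of 4). Function and parameter names are ours.

```python
import numpy as np


def entropy(band, bins=256):
    """Shannon entropy of the grey-level histogram (bits)."""
    hist, _ = np.histogram(band, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))


def cc(ref, fused):
    """Correlation coefficient between a reference band and the fused band."""
    return float(np.corrcoef(ref.ravel(), fused.ravel())[0, 1])


def uiqi(ref, fused):
    """Universal Image Quality Index (Wang & Bovik), computed globally for brevity."""
    x, y = ref.ravel().astype(float), fused.ravel().astype(float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return float(4 * cov * x.mean() * y.mean()
                 / ((x.var() + y.var()) * (x.mean() ** 2 + y.mean() ** 2)))


def ergas(ref_bands, fused_bands, ratio=4):
    """ERGAS: 100/ratio * sqrt(mean_k[(RMSE_k / mean(ref_k))^2]); lower is better."""
    terms = [np.mean((r.astype(float) - f.astype(float)) ** 2) / (r.mean() ** 2)
             for r, f in zip(ref_bands, fused_bands)]
    return float(100.0 / ratio * np.sqrt(np.mean(terms)))
```

CC and UIQI are typically computed per band and then averaged over the four MS bands.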
5. Results and Discussion
5.1. Analysis of the Influence of Parameters
5.1.1. Parameter Influences in the Guided Filter
5.1.2. The Influence of the Window Radius for Calculating Weights
5.2. Comparison of Different Pan-Sharpening Approaches
5.3. Computational Complexity
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Zhang, Y.; Mishra, R.K. A review and comparison of commercially available pan-sharpening techniques for high resolution satellite image fusion. In Proceedings of the Geoscience and Remote Sensing Symposium, Munich, Germany, 22–27 July 2012; pp. 182–185.
- Dong, J.; Zhuang, D.; Huang, Y.; Fu, J. Advances in multi-sensor data fusion: Algorithms and applications. Sensors 2009, 9, 7771–7784.
- Ehlers, M.; Klonus, S.; Johan Åstrand, P.; Rosso, P. Multi-sensor image fusion for pansharpening in remote sensing. Int. J. Image Data Fusion 2010, 1, 25–45.
- Xu, Q.; Li, B.; Zhang, Y.; Ding, L. High-Fidelity Component Substitution Pansharpening by the Fitting of Substitution Data. IEEE Trans. Geosci. Remote Sens. 2014, 52, 7380–7392.
- Gillespie, A.R.; Kahle, A.B.; Walker, R.E. Color enhancement of highly correlated images. II. Channel ratio and chromaticity transformation techniques. Remote Sens. Environ. 1987, 22, 343–365.
- Zhang, Y. A new merging method and its spectral and spatial effects. Int. J. Remote Sens. 1999, 20, 2003–2014.
- Murga, J.N.D.; Otazu, X.; Fors, O.; Arbiol, R. Multiresolution-based image fusion with additive wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 1999, 37, 1204–1211.
- Tu, T.-M.; Huang, P.S.; Hung, C.-L.; Chang, C.-P. A fast intensity–hue–saturation fusion technique with spectral adjustment for IKONOS imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 309–312.
- Dou, W.; Chen, Y.; Li, X.; Sui, D.Z. A general framework for component substitution image fusion: An implementation using the fast image fusion method. Comput. Geosci. 2007, 33, 219–228.
- Aiazzi, B.; Baronti, S.; Selva, M. Improving Component Substitution Pansharpening Through Multivariate Regression of MS + Pan Data. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3230–3239.
- Vivone, G.; Alparone, L.; Chanussot, J. A Critical Comparison Among Pansharpening Algorithms. IEEE Trans. Geosci. Remote Sens. 2015, 53, 2565–2585.
- Zhang, Y. Understanding image fusion. Photogramm. Eng. Remote Sens. 2004, 70, 657–661.
- Pohl, C.; Van Genderen, J.L. Review article: Multisensor image fusion in remote sensing: Concepts, methods and applications. Int. J. Remote Sens. 1998, 19, 823–854.
- Xie, B.; Zhang, H.; Huang, B. Revealing Implicit Assumptions of the Component Substitution Pansharpening Methods. Remote Sens. 2017, 9, 443.
- Fryskowska, A.; Wojtkowska, M.; Delis, P.; Grochala, A. Some aspects of satellite imagery integration from EROS B and Landsat 8. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B7, 647–652.
- Santurri, L.; Carlà, R.; Fiorucci, F.; Aiazzi, B.; Baronti, S.; Guzzetti, F. Assessment of very high resolution satellite data fusion techniques for landslide recognition. In Proceedings of the ISPRS Centenary Symposium, Vienna, Austria, 5–7 July 2010; Volume 38, pp. 492–497.
- Sun, W.; Chen, B.; Messinger, D.W. Nearest-neighbor diffusion-based pan-sharpening algorithm for spectral images. Opt. Eng. 2014, 53, 013107.
- Hinton, G.E.; Salakhutdinov, R.R. Reducing the dimensionality of data with neural networks. Science 2006, 313, 504–507.
- Liu, Y.; Chen, X.; Peng, H.; Wang, Z. Multi-focus image fusion with a deep convolutional neural network. Inf. Fusion 2017, 36, 191–207.
- Jiang, W.; Baker, M.L.; Wu, Q.; Bajaj, C.; Chiu, W. Applications of a bilateral denoising filter in biological electron microscopy. J. Struct. Biol. 2003, 144, 114–122.
- Fukunaga, K.; Hostetler, L.D. The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. Inf. Theory 1975, 21, 32–40.
- Cheng, Y.Z. Mean shift, mode seeking, and clustering. IEEE Trans. Pattern Anal. Mach. Intell. 1995, 17, 790–799.
- He, K.; Sun, J.; Tang, X. Guided Image Filtering. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1397–1409.
- Levin, A.; Lischinski, D.; Weiss, Y. A closed form solution to natural image matting. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 30, 228–242.
- Durand, F.; Dorsey, J. Fast bilateral filtering for the display of high-dynamic-range images. ACM Trans. Graph. 2002, 21, 257–266.
- Petschnigg, G.; Szeliski, R.; Agrawala, M.; Cohen, M.; Hoppe, H.; Toyama, K. Digital photography with flash and no-flash image pairs. ACM Trans. Graph. 2004, 23, 664–672.
- He, K.; Sun, J.; Tang, X. Single Image Haze Removal Using Dark Channel Prior. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 2341–2353.
- Li, S.; Kang, X.; Hu, J. Image Fusion With Guided Filtering. IEEE Trans. Image Process. 2013, 22, 2864–2875.
- Draper, N.; Smith, H. Applied Regression Analysis, 2nd ed.; John Wiley: New York, NY, USA, 1981.
- Laben, C.A.; Brower, B.V. Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening. U.S. Patent 6011875 A, 4 January 2000.
- Zhao, W.; Dai, Q.; Zheng, Y.; Wang, L. A new pansharpen method based on guided image filtering: A case study over Gaofen-2 imagery. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Beijing, China, 10–15 July 2016; pp. 3766–3769.
- Wald, L.; Ranchin, T.; Mangolini, M. Fusion of satellite images of different spatial resolutions: Assessing the quality of resulting images. Photogramm. Eng. Remote Sens. 1997, 63, 691–699.
- Klonus, S.; Ehlers, M. Image fusion using the Ehlers spectral characteristics preserving algorithm. GIS Remote Sens. 2007, 44, 93–116.
- Bovik, A.; Wang, Z. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84.
- Ranchin, T.; Wald, L. Fusion of high spatial and spectral resolution images: The ARSIS concept and its implementation. Photogramm. Eng. Remote Sens. 2000, 66, 49–61.
- Yuhas, R.H.; Goetz, A.F.H.; Boardman, J.W. Discrimination among semi-arid landscape endmembers using the Spectral Angle Mapper (SAM) algorithm. In Proceedings of the Summaries of the 3rd Annual JPL Airborne Geoscience Workshop, Pasadena, CA, USA, 1–5 June 1992; pp. 147–149.
- Alparone, L.; Baronti, S.; Garzelli, A.; Nencini, F. A global quality measurement of Pan-sharpened multispectral imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 313–317.
- Boser, B.; Guyon, I.; Vapnik, V. A training algorithm for optimal margin classifiers. In Proceedings of the 5th Annual Workshop on Computational Learning Theory, Pittsburgh, PA, USA, 27–29 July 1992; pp. 144–152.
Item | Specification
---|---
Spatial resolution | MS: 3.2 m; Pan: 0.8 m
Spectral range | Blue: 450–520 nm; Green: 520–590 nm; Red: 630–690 nm; NIR: 770–890 nm; Pan: 450–900 nm
Image locations | Guangzhou
Land cover types | Urban, rural, water body, cropland, forest, concrete buildings, etc.
Image size | ① MS: 250 × 250, Pan: 1000 × 1000; ② MS: 1250 × 1250, Pan: 5000 × 5000; ③ MS: 1250 × 1250, Pan: 5000 × 5000; ④ MS: 1250 × 1250, Pan: 5000 × 5000
Method | Entropy | UIQI | CC | ERGAS |
---|---|---|---|---|
MS | 7.189 | – | – | –
GS | 6.877 | 0.848 | 0.878 | 22.504 |
NND | 6.863 | 0.779 | 0.881 | 36.835 |
UNB | 6.685 | 0.824 | 0.881 | 26.102 |
GSA | 6.912 | 0.893 | 0.888 | 21.001 |
GD | 6.891 | 0.878 | 0.902 | 25.731 |
Proposed | 7.156 | 0.959 | 0.962 | 14.150 |
Method | Entropy | UIQI | CC | ERGAS |
---|---|---|---|---|
MS | 4.113 | – | – | –
GS | 3.978 | 0.608 | 0.633 | 31.796 |
NND | 3.997 | 0.393 | 0.593 | 70.754 |
UNB | 3.802 | 0.606 | 0.661 | 40.418 |
GSA | 3.916 | 0.700 | 0.757 | 39.723 |
GD | 3.805 | 0.538 | 0.598 | 44.323 |
Proposed | 3.933 | 0.726 | 0.790 | 39.589 |
Method | Entropy | UIQI | CC | ERGAS |
---|---|---|---|---|
MS | 6.709 | – | – | –
GS | 6.642 | 0.855 | 0.859 | 14.697 |
NND | 6.765 | 0.661 | 0.766 | 40.106 |
UNB | 6.464 | 0.828 | 0.844 | 18.575 |
GSA | 6.625 | 0.908 | 0.912 | 15.126 |
GD | 6.658 | 0.777 | 0.841 | 29.136 |
Proposed | 6.563 | 0.905 | 0.921 | 16.460 |
Method | Entropy | UIQI | CC | ERGAS |
---|---|---|---|---|
MS | 5.882 | – | – | –
GS | 6.749 | 0.663 | 0.732 | 56.697 |
NND | 6.578 | 0.727 | 0.745 | 28.991 |
UNB | 6.482 | 0.688 | 0.752 | 56.064 |
GSA | 6.588 | 0.720 | 0.792 | 55.977 |
GD | 6.667 | 0.675 | 0.753 | 59.855 |
Proposed | 6.504 | 0.856 | 0.896 | 34.666 |
Method | GS | NND | UNB | GSA | GD | Proposed |
---|---|---|---|---|---|---|
OA (%) | 74.08 | 73.91 | 72.50 | 74.77 | 74.77 | 76.04 |
Kappa | 0.7029 | 0.7009 | 0.6844 | 0.7107 | 0.7103 | 0.7256 |
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Zheng, Y.; Dai, Q.; Tu, Z.; Wang, L. Guided Image Filtering-Based Pan-Sharpening Method: A Case Study of GaoFen-2 Imagery. ISPRS Int. J. Geo-Inf. 2017, 6, 404. https://doi.org/10.3390/ijgi6120404