Multi-Resolution Collaborative Fusion of SAR, Multispectral and Hyperspectral Images for Coastal Wetlands Mapping
Figure 1. Flowchart of the proposed method.
Figure 2. Coastal wetlands study areas. (a) Yellow River Estuary; (b) Yancheng; (c) Hangzhou Bay.
Figure 3. Sample distribution and proportions. (a) Yancheng; (b) Yellow River Estuary; (c) Hangzhou Bay; (d) sample proportions.
Figure 4. Original and fused images of Yancheng. (a) ZY-1 02D HSI; (b) Sentinel-2A MSI; (c) Sentinel-1A SAR; (d) proposed M2CF; (e) IHS; (f) GS; (g) Brovey; (h) PRACS; (i) CNMF; (j) NLSTF; (k) SARF; (l) GSA-Hysure; (m) GSA-BDSD; (n) GSA-ATWT.
Figure 5. Original and fused images of the Yellow River Estuary. (a) ZY-1 02D HSI; (b) Sentinel-2A MSI; (c) Sentinel-1A SAR; (d) proposed M2CF; (e) IHS; (f) GS; (g) Brovey; (h) PRACS; (i) CNMF; (j) NLSTF; (k) SARF; (l) GSA-Hysure; (m) GSA-BDSD; (n) GSA-ATWT.
Figure 6. Original and fused images of Hangzhou Bay. (a) GF-5B AHSI; (b) GF-5B VIMI; (c) Sentinel-1A SAR; (d) proposed M2CF; (e) IHS; (f) GS; (g) Brovey; (h) PRACS; (i) CNMF; (j) NLSTF; (k) SARF; (l) GSA-Hysure; (m) GSA-BDSD; (n) GSA-ATWT.
Figure 7. Comparison of the spectral profiles.
Figure 8. Classification results of Yancheng. (a) RF classifier; (b) SVM classifier.
Figure 9. Classification results of the Yellow River Estuary. (a) RF classifier; (b) SVM classifier.
Figure 10. Classification results of Hangzhou Bay. (a) RF classifier; (b) SVM classifier.
Figure 11. Effect of parameter settings on M2CF.
Abstract
1. Introduction
- This paper proposes a multi-modal collaborative fusion method for SAR, MS, and HS images that produces fused images with high spatial resolution, rich spectral information, and the geometric and polarimetric properties of SAR;
- In the proposed M2CF, optimal spectral-spatial weighted modulation and spectral compensation are designed to reconstruct high-fidelity fusion images. Furthermore, the backscattering gradients of the SAR image, computed as edge-preserving saliency gradients, are used to guide the fusion, making the method better suited to cross-modal fusion over complex surface features;
- The fusion yields consistent, visible benefits: it achieves the lowest spectral loss with high PSNR while robustly improving classification results in coastal wetlands.
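The edge-preserving extraction of SAR saliency gradients builds on the guided filter (Section 2.3.2). The paper's exact formulation is not reproduced in this excerpt, so the following is only a minimal sketch of a standard guided filter used for a base/detail decomposition of one SAR band; `radius` and `eps` are illustrative parameters, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=4, eps=1e-3):
    """Edge-preserving guided filter (He et al.): smooths `src` while
    keeping the edges present in `guide`."""
    size = 2 * radius + 1
    mean_I = uniform_filter(guide, size)
    mean_p = uniform_filter(src, size)
    corr_Ip = uniform_filter(guide * src, size)
    corr_II = uniform_filter(guide * guide, size)
    var_I = corr_II - mean_I ** 2          # local variance of the guide
    cov_Ip = corr_Ip - mean_I * mean_p     # local covariance guide/source
    a = cov_Ip / (var_I + eps)             # eps controls edge preservation
    b = mean_p - a * mean_I
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

# Base/detail decomposition of a synthetic SAR backscatter band: the
# detail part plays the role of the high-frequency component injected
# during multi-resolution fusion.
rng = np.random.default_rng(0)
sar = rng.random((64, 64))
base = guided_filter(sar, sar, radius=4, eps=1e-3)
detail = sar - base
```

Self-guidance (`guide == src`) gives the edge-preserving smoothing used for detail extraction; a different guide image would transfer its structure instead.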
2. Methodology
2.1. Framework Description
2.2. Multi-Frequency Extraction of MS
2.3. Multi-Frequency Extraction of SAR
2.3.1. Polarization Synthesis
2.3.2. Guided Filter
2.4. Weight Configuration
2.4.1. Correlation Coefficient Weight
2.4.2. Saliency Gradient Weight
2.4.3. Hyperspectral Band Weight
2.4.4. Spectral Transformation Weight
2.5. M2CF: Multi-Modal MRA Collaborative Fusion
3. Study Area and Datasets
3.1. Study Area
3.2. Preprocess and Datasets
3.3. Sample Selection and Distribution
4. Experimental Results
4.1. Image Fusion Results
4.1.1. Fusion Results of Yancheng
4.1.2. Fusion Results of Yellow River Estuary
4.1.3. Fusion Results of Hangzhou Bay
4.2. Spectral Profiles Comparison
4.3. Classification Results
4.3.1. Classification Results of Yancheng
4.3.2. Classification Results of Yellow River Estuary
4.3.3. Classification Results of Hangzhou Bay
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Notation | Description | Dimensions |
---|---|---|
The x-th remote sensing image. | ||
Low-frequency components of the original image. | ||
High-frequency components, obtained as the difference between the original image and its low-frequency components. | |
The Frobenius norm of , . | ||
The normalized SAR images, obtained by Equation (7). | ||
The SAR image after polarization synthesis and histogram matching. | |
The correlation coefficient, obtained by Equation (20). | ||
Visual weight of the x-th image matrix. | ||
Saliency gradient weight of the x-th image matrix. | ||
Band weight of the hyperspectral image. | ||
Spectral transformation weight for MS image. | ||
The upsampled image. | |
Spatial penalty term to balance the equation. | ||
The fusion coefficients of subimages. | ||
The final fusion image after spectral compensation. |
Experiment Datasets | Acquisition Date (UTC+08:00) | Sensor | Mode | Spatial Resolution (m) | Dimensions | Bands
---|---|---|---|---|---|---
Yancheng | 6 September 2020 10:56 | ZY-1 02D | HSI | 30 | 500 × 500 | 147
Yancheng | 5 September 2020 10:35 | Sentinel-2B | MSI | 10 | 1500 × 1500 | 4
Yancheng | 9 September 2020 17:54 | Sentinel-1B | SAR | 10 | 1500 × 1500 | 2
Yellow River Estuary | 28 June 2020 11:08 | ZY-1 02D | HSI | 30 | 500 × 500 | 117
Yellow River Estuary | 28 June 2020 10:55 | Sentinel-2A | MSI | 10 | 1500 × 1500 | 4
Yellow River Estuary | 28 June 2020 18:05 | Sentinel-1A | SAR | 10 | 1500 × 1500 | 2
Hangzhou Bay | 29 December 2021 10:39 | GF-5B AHSI | HSI | 30 | 600 × 600 | 210
Hangzhou Bay | 29 December 2021 10:39 | GF-5B VIMI | MSI | 20 | 900 × 900 | 6
Hangzhou Bay | 27 December 2021 17:55 | Sentinel-1A | SAR | 10 | 1800 × 1800 | 2
Fusion Methods | PSNR | SAM | CC | RMSE | ERGAS | Running Time
---|---|---|---|---|---|---
IHS | 23.6280 | 9.6515 | 0.7155 | 0.04242 | 38.5805 | 68.9314 |
GS | 23.7360 | 11.8058 | 0.7459 | 0.04789 | 38.1233 | 119.6040 |
Brovey | 23.5187 | 8.9363 | 0.7634 | 0.04641 | 35.1931 | 868.6111 |
PRACS | 24.9276 | 5.4016 | 0.76366 | 0.27632 | 84.9932 | 1030.0152 |
CNMF | 25.9117 | 5.1363 | 0.8664 | 0.03635 | 26.6683 | 1472.3720 |
NLSTF | 31.5899 | 4.6968 | 0.9792 | 0.01721 | 14.7011 | 700.5636 |
SARF | 28.7493 | 6.2618 | 0.9629 | 0.02574 | 27.1026 | 144.1752 |
GSA-Hysure | 23.6181 | 10.1104 | 0.8377 | 0.04544 | 38.4659 | 3686.5776 |
GSA-BDSD | 29.9061 | 7.8741 | 0.9677 | 0.02918 | 18.4949 | 295.5739 |
GSA-ATWT | 32.4186 | 4.9149 | 0.9772 | 0.02110 | 13.4138 | 192.6265 |
M2CF (Proposed) | 38.1668 | 2.5525 | 0.9938 | 0.01149 | 7.0877 | 355.8300 |
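The quantitative metrics in these tables (PSNR, SAM, CC, RMSE, ERGAS) are standard reference-based fusion measures. As a sketch only (the paper's exact per-band averaging conventions are not given in this excerpt, so details such as the PSNR peak value and the ERGAS resolution ratio are assumptions), they can be computed as:

```python
import numpy as np

def fusion_metrics(ref, fus, ratio=3):
    """Reference-based fusion quality metrics.
    ref, fus: (H, W, B) images in the same radiometric range;
    ratio: spatial-resolution ratio (e.g., 3 for 30 m HSI vs. 10 m MSI)."""
    err = ref - fus
    rmse = float(np.sqrt(np.mean(err ** 2)))
    psnr = float(20 * np.log10(ref.max() / rmse))
    # SAM: mean angle (degrees) between reference and fused spectra
    cos = np.sum(ref * fus, axis=2) / (
        np.linalg.norm(ref, axis=2) * np.linalg.norm(fus, axis=2) + 1e-12)
    sam = float(np.degrees(np.mean(np.arccos(np.clip(cos, -1.0, 1.0)))))
    # CC: correlation coefficient averaged over bands
    cc = float(np.mean([np.corrcoef(ref[..., b].ravel(),
                                    fus[..., b].ravel())[0, 1]
                        for b in range(ref.shape[2])]))
    # ERGAS: relative dimensionless global error in synthesis
    band_rmse = np.sqrt(np.mean(err ** 2, axis=(0, 1)))
    band_mean = ref.mean(axis=(0, 1))
    ergas = float(100.0 / ratio * np.sqrt(np.mean((band_rmse / band_mean) ** 2)))
    return {"PSNR": psnr, "SAM": sam, "CC": cc, "RMSE": rmse, "ERGAS": ergas}

# Sanity check on synthetic data (illustrative only)
rng = np.random.default_rng(1)
ref = rng.random((16, 16, 4)) + 0.5
fus = ref + 0.01 * rng.random((16, 16, 4))
scores = fusion_metrics(ref, fus)
```

Lower SAM, RMSE, and ERGAS and higher PSNR and CC indicate better fusion quality, which is how the tables should be read.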
Fusion Methods | PSNR | SAM | CC | RMSE | ERGAS | Running Time
---|---|---|---|---|---|---
IHS | 17.9721 | 9.3361 | 0.4591 | 0.07252 | 43.9962 | 32.0599 |
GS | 18.1503 | 9.4228 | 0.4497 | 0.07692 | 44.0660 | 47.2686 |
Brovey | 17.8701 | 8.1697 | 0.4392 | 0.07590 | 44.2001 | 35.2524 |
PRACS | 25.1538 | 3.5619 | 0.8404 | 0.18539 | 79.1565 | 1219.1587 |
CNMF | 23.8400 | 3.6861 | 0.8306 | 0.04641 | 22.3898 | 1377.0568 |
NLSTF | 28.3358 | 3.9234 | 0.9274 | 0.02911 | 13.0927 | 563.4189 |
SARF | 19.2741 | 6.6625 | 0.8641 | 0.05676 | 25.2628 | 132.0532 |
GSA-Hysure | 20.9068 | 5.1851 | 0.7879 | 0.05665 | 29.0530 | 2869.8089 |
GSA-BDSD | 30.2073 | 3.2387 | 0.9757 | 0.02257 | 9.6567 | 255.6573 |
GSA-ATWT | 30.5702 | 3.0832 | 0.9752 | 0.01888 | 9.3029 | 175.9395 |
M2CF (Proposed) | 30.8864 | 1.9004 | 0.9782 | 0.01914 | 8.9354 | 138.6756 |
Fusion Methods | PSNR | SAM | CC | RMSE | ERGAS | Running Time
---|---|---|---|---|---|---
IHS | 29.8409 | 7.5310 | 0.8036 | 0.03450 | 27.2223 | 167.5567 |
GS | 29.7873 | 4.1202 | 0.8206 | 0.02530 | 17.3417 | 129.8105 |
Brovey | 29.6520 | 3.9635 | 0.8103 | 0.01618 | 17.3140 | 108.3487 |
PRACS | 34.7274 | 5.7495 | 0.8289 | 0.02332 | 25.6469 | 1096.1787 |
CNMF | 31.1619 | 9.3361 | 0.8440 | 0.02993 | 24.5317 | 2061.8031 |
NLSTF | 31.6462 | 3.3214 | 0.9390 | 0.01790 | 10.3979 | 3982.5931 |
SARF | 23.0536 | 7.1064 | 0.8842 | 0.08799 | 21.2838 | 297.1281 |
GSA-Hysure | 35.8663 | 3.7498 | 0.9213 | 0.02050 | 12.2087 | 5982.9252 |
GSA-BDSD | 37.6267 | 3.3107 | 0.9499 | 0.01678 | 7.2889 | 1445.8150 |
GSA-ATWT | 39.5996 | 2.6122 | 0.9607 | 0.01526 | 9.8749 | 352.9102 |
M2CF (Proposed) | 43.4879 | 2.5395 | 0.9732 | 0.01537 | 5.0824 | 285.8053 |
Method | RF (AA) | RF (OA) | RF (Kappa) | SVM (AA) | SVM (OA) | SVM (Kappa)
---|---|---|---|---|---|---
ZY-1 02D | 0.7624 | 0.8427 | 0.8182 | 0.7946 | 0.8601 | 0.8384 |
MS + HS | 0.8565 | 0.8879 | 0.8716 | 0.9230 | 0.9495 | 0.9423 |
IHS | 0.8138 | 0.9194 | 0.8992 | 0.9448 | 0.9736 | 0.9673 |
GS | 0.8136 | 0.9112 | 0.8894 | 0.9412 | 0.9713 | 0.9643 |
Brovey | 0.7710 | 0.9063 | 0.8830 | 0.8704 | 0.9174 | 0.8979 |
PRACS | 0.8962 | 0.9421 | 0.8350 | 0.7751 | 0.9404 | 0.7972 |
CNMF | 0.6847 | 0.8148 | 0.7695 | 0.7159 | 0.8035 | 0.7562 |
NLSTF | 0.8526 | 0.8686 | 0.8628 | 0.9244 | 0.9474 | 0.9399 |
SARF | 0.8342 | 0.8502 | 0.8420 | 0.9281 | 0.9540 | 0.9484 |
GSA-Hysure | 0.8766 | 0.9375 | 0.9224 | 0.9410 | 0.9713 | 0.9643 |
GSA-BDSD | 0.8635 | 0.9331 | 0.9172 | 0.9513 | 0.9759 | 0.9701 |
GSA-ATWT | 0.7357 | 0.8808 | 0.8509 | 0.9127 | 0.9579 | 0.9478 |
M2CF | 0.8972 | 0.9484 | 0.9359 | 0.9589 | 0.9775 | 0.9721 |
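AA, OA, and Kappa in these classification tables are standard confusion-matrix statistics. A minimal sketch, assuming integer class labels 0..C-1 (the per-class sample handling in the paper may differ):

```python
import numpy as np

def accuracy_metrics(y_true, y_pred, n_classes):
    """Overall accuracy (OA), average accuracy (AA), and Cohen's kappa
    from reference vs. predicted labels."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    total = cm.sum()
    oa = np.trace(cm) / total                   # fraction correctly classified
    aa = np.mean(np.diag(cm) / cm.sum(axis=1))  # mean per-class recall
    # Chance agreement from the row/column marginals
    pe = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / total ** 2
    kappa = (oa - pe) / (1 - pe)                # agreement corrected for chance
    return float(oa), float(aa), float(kappa)

# Tiny worked example: 3 of 4 labels correct, class 1 half-missed
oa, aa, kappa = accuracy_metrics([0, 0, 1, 1], [0, 0, 1, 0], n_classes=2)
```

Kappa discounts the agreement expected by chance, which is why it sits below OA in every row of the tables.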
Method | RF (AA) | RF (OA) | RF (Kappa) | SVM (AA) | SVM (OA) | SVM (Kappa)
---|---|---|---|---|---|---
ZY-1 02D | 0.7903 | 0.8719 | 0.8504 | 0.8563 | 0.9245 | 0.9118 |
MS + HS | 0.8404 | 0.9163 | 0.9023 | 0.9385 | 0.9720 | 0.9674 |
IHS | 0.8791 | 0.9313 | 0.9199 | 0.9487 | 0.9759 | 0.9719 |
GS | 0.8750 | 0.9260 | 0.9138 | 0.9444 | 0.9735 | 0.9691 |
Brovey | 0.8016 | 0.8576 | 0.8339 | 0.9107 | 0.9427 | 0.9332 |
PRACS | 0.8793 | 0.9309 | 0.9195 | 0.8456 | 0.92275 | 0.9096 |
CNMF | 0.8231 | 0.8876 | 0.8690 | 0.8750 | 0.9230 | 0.9103 |
NLSTF | 0.8544 | 0.9144 | 0.9069 | 0.9437 | 0.9587 | 0.9569 |
SARF | 0.8342 | 0.8602 | 0.8400 | 0.9281 | 0.9540 | 0.9474 |
GSA-Hysure | 0.7483 | 0.8509 | 0.8257 | 0.8896 | 0.9432 | 0.9338 |
GSA-BDSD | 0.8818 | 0.9366 | 0.9261 | 0.9407 | 0.9723 | 0.9678 |
GSA-ATWT | 0.8566 | 0.9207 | 0.9075 | 0.9363 | 0.9715 | 0.9668 |
M2CF | 0.8964 | 0.9426 | 0.9331 | 0.9542 | 0.9789 | 0.9754 |
Method | RF (AA) | RF (OA) | RF (Kappa) | SVM (AA) | SVM (OA) | SVM (Kappa)
---|---|---|---|---|---|---
ZY-1 02D | 0.8549 | 0.9137 | 0.9021 | 0.8863 | 0.9331 | 0.9308 |
MS + HS | 0.9041 | 0.9221 | 0.9219 | 0.9278 | 0.9464 | 0.9426 |
IHS | 0.9167 | 0.9392 | 0.9356 | 0.9397 | 0.9532 | 0.9514 |
GS | 0.9196 | 0.9472 | 0.9432 | 0.9411 | 0.9543 | 0.9532 |
Brovey | 0.9212 | 0.9469 | 0.9428 | 0.9529 | 0.9620 | 0.9607 |
PRACS | 0.9273 | 0.9435 | 0.9407 | 0.9377 | 0.9425 | 0.9410 |
CNMF | 0.8368 | 0.9190 | 0.9047 | 0.8306 | 0.9184 | 0.9056 |
NLSTF | 0.9298 | 0.9437 | 0.9392 | 0.9339 | 0.9514 | 0.9499 |
SARF | 0.9205 | 0.9526 | 0.9461 | 0.9508 | 0.9689 | 0.9657 |
GSA-Hysure | 0.9288 | 0.9489 | 0.9434 | 0.9419 | 0.9590 | 0.9571 |
GSA-BDSD | 0.9282 | 0.9571 | 0.9528 | 0.9547 | 0.9616 | 0.9602 |
GSA-ATWT | 0.9375 | 0.9611 | 0.9583 | 0.9515 | 0.9636 | 0.9619 |
M2CF | 0.9411 | 0.9675 | 0.9650 | 0.9550 | 0.9719 | 0.9705 |
Fusion Methods | PSNR | SAM | CC | RMSE | ERGAS | Running Time
---|---|---|---|---|---|---
Low-pass Filter | 29.4262 | 3.9817 | 0.81032 | 0.01518 | 17.3140 | 268.9314 |
Gaussian Filter | 33.9373 | 3.2910 | 0.96010 | 0.01380 | 12.1189 | 319.6040 |
Uncompensated | 23.5187 | 8.9363 | 0.76335 | 0.04641 | 35.1931 | 283.6111 |
M2CF (Proposed) | 38.1668 | 2.5525 | 0.9938 | 0.01150 | 7.0877 | 355.8300 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Yuan, Y.; Meng, X.; Sun, W.; Yang, G.; Wang, L.; Peng, J.; Wang, Y. Multi-Resolution Collaborative Fusion of SAR, Multispectral and Hyperspectral Images for Coastal Wetlands Mapping. Remote Sens. 2022, 14, 3492. https://doi.org/10.3390/rs14143492