A measure for the evaluation of multi-focus image fusion at feature level

Published in Multimedia Tools and Applications (2022)

Abstract

Most multi-focus image fusion evaluation methods rely on focus detection and measure the similarity between the fused image and the entire source images, including their defocused regions, which can cause the evaluation result to deviate from the real fusion quality. To overcome this problem, we propose a novel objective measure for multi-focus image fusion assessment at the feature level. First, the corners in the source images and in the fused image are detected separately with the Smallest Univalue Segment Assimilating Nucleus (SUSAN) algorithm. Then, a corner similarity measure based on the overlapping rate is proposed to quantify fusion quality. The proposed method avoids focus detection in the assessment procedure, which makes the evaluation results more reliable. Experimental results demonstrate that the proposed measure is more consistent with subjective evaluation. Compared with other objective metrics, two meta-measures, correct ranking (CR) and subjective relevance (R), give the proposed measure the highest scores, 0.8377 and 0.7384, respectively, while the area under the ROC curve (AUC) gives it the second-best score, 0.8428.

Acknowledgements

This work was supported by the Youth Growth Science and Technology Plan Project of the Jilin Provincial Department of Science and Technology (No. 20210508039RQ), the “Thirteenth Five-Year Plan” Scientific Research Planning Project of the Education Department of Jilin Province (No. JJKH20200678KJ, No. JJKH20210752KJ, No. JJKH20200677KJ), the Fundamental Research Funds for the Central Universities, JLU (No. 93K172020K05), and the National Natural Science Foundation of China (No. 61806024, No. 61876070, No. 61801190).

Author information

Corresponding author

Correspondence to Xiaoli Zhang.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Feng, Y., Guo, R., Shen, X. et al. A measure for the evaluation of multi-focus image fusion at feature level. Multimed Tools Appl 81, 18053–18071 (2022). https://doi.org/10.1007/s11042-022-11976-3
