DOI: 10.1145/3480651.3480656

Infrared and visible image fusion method based on LatLRR and ICA

Published: 12 October 2021

Abstract

To address the loss of texture detail in fused images, this paper proposes a new fusion method for infrared and visible images based on latent low-rank representation (LatLRR) and independent component analysis (ICA). First, each source image is decomposed by LatLRR into low-rank, sparse, and noise components. Second, ICA is applied to the low-rank components of the infrared and visible images to extract the main differences between the two source images. Then, the source image containing more information, determined by comparing the entropies of the two source images, is taken as the benchmark. Finally, the fused image is obtained by combining the benchmark with the low-rank and sparse components of the other image according to the ICA result. Experimental results demonstrate that, compared with other fusion methods, the proposed method yields better visual effects and evaluation indicators.
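
For illustration, the pipeline described above can be sketched in Python. This is a minimal sketch, not the authors' implementation: it substitutes a truncated-SVD decomposition for a real LatLRR solver, uses scikit-learn's FastICA for the ICA step, and applies a simple entropy comparison with a weighted blending rule for the final fusion. All function names (latlrr_decompose, image_entropy, fuse) and the blending rule are assumptions made for this sketch.

import numpy as np
from sklearn.decomposition import FastICA


def image_entropy(img, bins=256):
    """Shannon entropy of a grayscale image with values in [0, 1]."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())


def latlrr_decompose(img, rank=30):
    """Stand-in for LatLRR: a truncated SVD gives the low-rank part and the
    residual plays the role of the sparse part. A real implementation would
    solve the LatLRR problem (e.g. by inexact ALM, Liu & Yan, 2011)."""
    u, s, vt = np.linalg.svd(img, full_matrices=False)
    low = (u[:, :rank] * s[:rank]) @ vt[:rank]
    return low, img - low


def fuse(ir, vis):
    """Sketch of the fusion rule described in the abstract (illustrative only)."""
    ir_low, ir_sparse = latlrr_decompose(ir)
    vis_low, vis_sparse = latlrr_decompose(vis)

    # ICA on the two low-rank components to expose their main differences.
    stacked = np.stack([ir_low.ravel(), vis_low.ravel()], axis=1)
    sources = FastICA(n_components=2, random_state=0).fit_transform(stacked)
    diff_map = np.abs(sources[:, 0]).reshape(ir.shape)

    # The source image with higher entropy serves as the benchmark; the
    # low-rank and sparse parts of the other image are folded in, weighted
    # by the ICA difference map (an assumed rule, not the paper's).
    if image_entropy(ir) >= image_entropy(vis):
        benchmark, other_low, other_sparse = ir, vis_low, vis_sparse
    else:
        benchmark, other_low, other_sparse = vis, ir_low, ir_sparse

    w = diff_map / (diff_map.max() + 1e-12)
    fused = (1.0 - w) * benchmark + w * other_low + other_sparse
    return np.clip(fused, 0.0, 1.0)

Given two registered grayscale images ir and vis scaled to [0, 1], fuse(ir, vis) returns a fused image; in practice the stand-in decomposition and blending rule would be replaced by an actual LatLRR solver and the connection rule used in the paper.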


Cited By

  • (2024) Using Sparse Parts in Fused Information to Enhance Performance in Latent Low-Rank Representation-Based Fusion of Visible and Infrared Images. Sensors 24(5), 1514. DOI: 10.3390/s24051514. Online publication date: 26-Feb-2024.

Published In

PRIS '21: Proceedings of the 2021 International Conference on Pattern Recognition and Intelligent Systems
July 2021
91 pages
ISBN: 9781450390392
DOI: 10.1145/3480651

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. image fusion
  2. independent component analysis
  3. infrared image
  4. latent low-rank representation
  5. visible image

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

PRIS 2021
