IPLF: A Novel Image Pair Learning Fusion Network for Infrared and Visible Image

Bibliographic Details
Published in: IEEE Sensors Journal, 2022-05, Vol. 22 (9), pp. 8808-8817
Main Authors: Zhu, Depeng; Zhan, Weida; Jiang, Yichun; Xu, Xiaoyu; Guo, Renzhong
Format: Article
Language: English
Online Access: Order full text
Description
Abstract: In this paper, a novel fusion network for infrared and visible images is proposed, named Image Pair Learning Fusion Network (IPLF). At present, most published deep learning-based fusion models are trained with unsupervised learning; this approach lacks ground truth and cannot guide the network's learning in a targeted manner. First, we propose using supervised learning to guide the training of the fusion model (IPLF), with generated image pairs serving as the ground truth. Second, we propose a model learning strategy based on paired images. This strategy strengthens the complementary constraints on the network so that the final fused image retains the rich features of both the infrared and visible images. Third, we design a structure measurement loss function and an edge preservation loss function to ensure that the generated fusion image has rich edge information and a visually pleasing appearance. In addition, we introduce a spatial attention module that makes the final fused image highlight target information. Finally, the first convolution block uses a $7 \times 7$ convolution kernel, mainly to enlarge the receptive field. Experiments on the TNO and CVC-14 datasets show that the proposed method outperforms existing state-of-the-art methods in both qualitative and quantitative evaluation. The source code is available at https://github.com/depeng6/IPLF.
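The abstract describes the architecture only at a high level. As a minimal PyTorch sketch, assuming a CBAM-style spatial attention module, a ReLU-activated first block, and a Sobel-based formulation of the edge preservation loss (all illustrative stand-ins; the paper's exact design may differ and is available in the linked repository), the described components could look like this:

```python
# Minimal sketch of the components named in the abstract. Module names,
# channel widths, and the Sobel-based edge loss are assumptions, not the
# exact design from the IPLF paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialAttention(nn.Module):
    """CBAM-style spatial attention (assumed): pool across channels,
    then learn a per-pixel attention map to highlight salient targets."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_pool = torch.mean(x, dim=1, keepdim=True)       # (B, 1, H, W)
        max_pool, _ = torch.max(x, dim=1, keepdim=True)     # (B, 1, H, W)
        attn = torch.sigmoid(self.conv(torch.cat([avg_pool, max_pool], dim=1)))
        return x * attn


class FirstConvBlock(nn.Module):
    """First convolution block with a 7x7 kernel, as stated in the
    abstract, to enlarge the receptive field early in the network."""

    def __init__(self, in_channels: int = 2, out_channels: int = 16):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.relu(self.conv(x))


def edge_preservation_loss(fused: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Hypothetical edge-preservation loss: L1 distance between the Sobel
    gradient magnitudes of the fused image and the ground-truth image."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                      device=fused.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)

    def grad_mag(img: torch.Tensor) -> torch.Tensor:
        gx = F.conv2d(img, kx, padding=1)
        gy = F.conv2d(img, ky, padding=1)
        return torch.sqrt(gx ** 2 + gy ** 2 + 1e-8)

    return F.l1_loss(grad_mag(fused), grad_mag(target))


if __name__ == "__main__":
    # Toy forward pass on a single-channel infrared/visible pair.
    ir = torch.rand(1, 1, 64, 64)
    vis = torch.rand(1, 1, 64, 64)
    feats = SpatialAttention()(FirstConvBlock()(torch.cat([ir, vis], dim=1)))
    print(feats.shape)  # torch.Size([1, 16, 64, 64])
```

Pairing a $7 \times 7$ stem with spatial attention reflects the abstract's two stated design goals: a large receptive field early in the network and fused images that highlight target information.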
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2022.3161733