DUGAN: Infrared and visible image fusion based on dual fusion paths and a U-type discriminator



Bibliographic Details
Published in: Neurocomputing (Amsterdam) 2024-04, Vol. 578, p. 127391, Article 127391
Main Authors: Chang, Le; Huang, Yongdong; Li, Qiufu; Zhang, Yuduo; Liu, Lijun; Zhou, Qingjian
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Existing infrared and visible image fusion techniques based on generative adversarial networks (GANs) generally disregard local and texture detail features, which tends to limit fusion performance. We therefore propose a GAN model based on dual fusion paths and a U-type discriminator, denoted DUGAN. Specifically, an image path and a gradient path are integrated into the generator to fully extract content and texture detail features from the source images and their corresponding gradient images; by merging the output features of these two paths, the generator produces fusion results rich in information. In addition, we construct a U-type discriminator that attends to both the global and local information of the input images, which drives the network to generate fusion results visually consistent with the source images. Furthermore, we integrate attention blocks into the discriminator to improve the representation of salient information. Experimental results demonstrate that DUGAN outperforms other state-of-the-art methods in both qualitative and quantitative evaluations. The source code has been released at https://github.com/chang-le-11/DUGAN.
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2024.127391
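
The abstract describes the architecture only at a high level: a generator with separate image and gradient fusion paths whose output features are merged, and a U-type discriminator with attention blocks that scores inputs both globally and per pixel. The following is a minimal PyTorch sketch of that structure, not the authors' implementation (which is at the GitHub link above); the layer widths, the Laplacian stand-in for the gradient images, the squeeze-and-excitation form of the attention block, and the module names (DualPathGenerator, UDiscriminator) are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def gradient(img: torch.Tensor) -> torch.Tensor:
    # Laplacian filter as a stand-in for the paper's gradient images
    # (assumption; the abstract does not specify the operator).
    k = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]],
                     device=img.device).view(1, 1, 3, 3)
    k = k.repeat(img.shape[1], 1, 1, 1)
    return F.conv2d(img, k, padding=1, groups=img.shape[1])


class ConvBlock(nn.Sequential):
    def __init__(self, c_in, c_out):
        super().__init__(nn.Conv2d(c_in, c_out, 3, padding=1),
                         nn.BatchNorm2d(c_out),
                         nn.LeakyReLU(0.2))


class DualPathGenerator(nn.Module):
    """Image path fuses source intensities, gradient path fuses texture
    detail from the gradient images; their output features are merged."""

    def __init__(self, ch=32):
        super().__init__()
        self.image_path = nn.Sequential(ConvBlock(2, ch), ConvBlock(ch, ch))
        self.grad_path = nn.Sequential(ConvBlock(2, ch), ConvBlock(ch, ch))
        self.merge = nn.Sequential(ConvBlock(2 * ch, ch),
                                   nn.Conv2d(ch, 1, 1),
                                   nn.Tanh())

    def forward(self, ir, vis):
        f_img = self.image_path(torch.cat([ir, vis], dim=1))
        f_grd = self.grad_path(torch.cat([gradient(ir), gradient(vis)], dim=1))
        return self.merge(torch.cat([f_img, f_grd], dim=1))


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style attention (assumed form of the
    paper's attention blocks)."""

    def __init__(self, c, r=8):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(c, c // r), nn.ReLU(),
                                nn.Linear(c // r, c), nn.Sigmoid())

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))
        return x * w.view(x.shape[0], -1, 1, 1)


class UDiscriminator(nn.Module):
    """U-type discriminator: the encoder bottleneck yields a global
    real/fake score, the decoder yields a per-pixel (local) score map."""

    def __init__(self, ch=32):
        super().__init__()
        self.enc1 = ConvBlock(1, ch)
        self.enc2 = ConvBlock(ch, 2 * ch)
        self.att = ChannelAttention(2 * ch)
        self.global_head = nn.Linear(2 * ch, 1)
        self.dec1 = ConvBlock(2 * ch, ch)
        self.local_head = nn.Conv2d(ch, 1, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.att(self.enc2(F.avg_pool2d(e1, 2)))
        g = self.global_head(e2.mean(dim=(2, 3)))              # global score
        d1 = self.dec1(F.interpolate(e2, size=e1.shape[-2:]))  # decode back up
        local = self.local_head(d1 + e1)                       # local score map
        return g, local
```

A quick smoke test of the sketch: the generator maps a pair of single-channel images to one fused image, and the discriminator returns a scalar score plus a same-resolution score map.

```python
ir, vis = torch.rand(1, 1, 64, 64), torch.rand(1, 1, 64, 64)
fused = DualPathGenerator()(ir, vis)          # shape (1, 1, 64, 64)
g_score, local_map = UDiscriminator()(fused)  # (1, 1) and (1, 1, 64, 64)
```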