Two-Branch Feature Interaction Fusion Method Based on Generative Adversarial Network
Published in: Electronics (Basel), 2023-08, Vol. 12 (16), p. 3442
Main authors: , , , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: This study proposes a fusion method for infrared and visible images based on feature interaction. Existing fusion methods can be classified into two categories: those based on a single-branch network and those based on a two-branch network. Generative adversarial networks are widely used in single-branch fusion methods, which ignore the differences in feature extraction caused by different input images. Most two-branch fusion methods use convolutional neural networks, which do not account for the inverse promotion of fusion results and lack interaction between the different input features. To remedy the shortcomings of these fusion methods and better utilize the features from the source images, this study proposes a two-branch feature interaction method based on a generative adversarial network for visible and infrared image fusion. In the generator, a two-branch feature interaction approach was designed to extract features from the different inputs and realize feature interaction through network connections between the branches. In the discriminator, a double-classification discriminator was used for visible and infrared images. Extensive comparison experiments with state-of-the-art methods demonstrate the advantages of the proposed two-branch feature-interaction generative adversarial network, which can enhance the texture details of objects in the fusion results and reduce interference from noise in the source inputs. These advantages were also confirmed in generalization experiments on object detection.
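At a very high level, the two-branch feature interaction idea from the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the paper uses convolutional GAN branches, while here each branch is a single hypothetical linear-plus-ReLU layer, and all names, weights, and dimensions are assumptions chosen for demonstration.

```python
import numpy as np

def extract(x, w):
    """One hypothetical feature-extraction layer: linear map + ReLU."""
    return np.maximum(x @ w, 0.0)

def two_branch_fuse(ir, vis, w_ir, w_vis, w_fuse):
    # Each branch first extracts features from its own modality.
    f_ir = extract(ir, w_ir)
    f_vis = extract(vis, w_vis)
    # Feature interaction: each branch also receives the other branch's
    # features (channel concatenation stands in for the cross-branch
    # network connections described in the abstract).
    ir_in = np.concatenate([f_ir, f_vis], axis=-1)
    vis_in = np.concatenate([f_vis, f_ir], axis=-1)
    # A fusion head combines both interacted representations into one output.
    return extract(np.concatenate([ir_in, vis_in], axis=-1), w_fuse)

# Toy usage with random stand-ins for image features (batch of 4, dim 8).
rng = np.random.default_rng(0)
ir, vis = rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
w_ir, w_vis = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
w_fuse = rng.normal(size=(64, 8))
fused = two_branch_fuse(ir, vis, w_ir, w_vis, w_fuse)
print(fused.shape)  # (4, 8)
```

In an adversarial setup, `fused` would be the generator output scored by a discriminator that, per the abstract, classifies it against both the visible and the infrared source separately.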
ISSN: 2079-9292
DOI: 10.3390/electronics12163442