Colorization of infrared images based on feature fusion and contrastive learning
Published in: Optics and Lasers in Engineering, 2023-03, Vol. 162, p. 107395, Article 107395
Main authors: , , , ,
Format: Article
Language: eng
Online access: Full text
Highlights:
• A novel method for the infrared image colorization task is proposed.
• An improved generator is designed to better generate colorized images by fusing shallow detail features and deep semantic features.
• A contrastive loss function is designed to ensure consistent object structure after colorization.
• Comparison analysis demonstrates the superiority over state-of-the-art methods.
Abstract: Converting infrared images to RGB images that match human visual perception is a challenging task. Current infrared image colorization techniques bring visual improvements but still suffer from texture distortion, blurred details, and poor image quality. In this paper, we address these problems. First, we design an improved generator structure: on the basis of U-Net, we add dense convolutional blocks and skip connections to integrate low-level detail information with high-level semantic information. The resulting generator captures features at different levels and integrates them by feature fusion, ensuring that the captured features are not lost. Second, we design a new contrastive loss function. Built on the contrastive learning framework, this loss focuses on learning common features between similar instances and distinguishing differences between dissimilar instances, which ensures consistency of image content and structure. Finally, an in-depth comparative analysis on commonly used datasets demonstrates the superior colorization performance of our method against state-of-the-art approaches.
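The abstract describes the generator idea only at a high level. The following is a minimal sketch of that idea, not the authors' code: a small U-Net-style encoder-decoder in which dense convolutional blocks and a skip connection fuse shallow detail features with deep semantic features. All channel counts, growth rates, and layer counts are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Dense convolutional block: every layer receives all earlier feature maps."""
    def __init__(self, in_ch, growth=32, layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, 3, padding=1),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True),
            ))
            ch += growth
        self.out_ch = ch

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)  # concatenation fuses shallow and deep features

class FusionGenerator(nn.Module):
    """Tiny U-Net-like generator: 1-channel infrared input -> 3-channel RGB output."""
    def __init__(self):
        super().__init__()
        self.enc1 = DenseBlock(1)                                   # shallow detail features
        self.down = nn.Conv2d(self.enc1.out_ch, 64, 4, stride=2, padding=1)
        self.enc2 = DenseBlock(64)                                  # deeper semantic features
        self.up = nn.ConvTranspose2d(self.enc2.out_ch, 64, 4, stride=2, padding=1)
        # Skip connection: decoder sees upsampled deep features plus shallow features.
        self.fuse = nn.Conv2d(64 + self.enc1.out_ch, 64, 3, padding=1)
        self.head = nn.Conv2d(64, 3, 1)

    def forward(self, ir):
        s = self.enc1(ir)
        d = self.enc2(self.down(s))
        u = self.up(d)
        rgb = self.head(torch.relu(self.fuse(torch.cat([u, s], dim=1))))
        return torch.tanh(rgb)

x = torch.randn(1, 1, 128, 128)      # dummy single-channel infrared image
print(FusionGenerator()(x).shape)    # -> torch.Size([1, 3, 128, 128])
```

The sketch keeps only one downsampling stage for brevity; the point is the dense blocks plus the skip connection, which let low-level detail reach the output head without being lost.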
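The contrastive loss is likewise only outlined in the abstract. Below is a hedged sketch of one common formulation it resembles, a patch-wise InfoNCE-style loss: features of corresponding patches in the input and the colorized output are pulled together as positives, while other patches act as negatives. The feature dimension, patch count, and temperature are assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(query_feats, key_feats, temperature=0.07):
    """
    query_feats: (N, D) features from patches of the colorized output
    key_feats:   (N, D) features from the corresponding input patches
    Row i of each tensor describes the same spatial patch, so pair (i, i)
    is the positive and every (i, j != i) pair is a negative.
    """
    q = F.normalize(query_feats, dim=1)
    k = F.normalize(key_feats, dim=1)
    logits = q @ k.t() / temperature           # (N, N) cosine-similarity logits
    targets = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, targets)    # InfoNCE: diagonal entries are positives

# Usage with dummy patch features (256 patches, 128-dimensional each):
q = torch.randn(256, 128)
k = q + 0.1 * torch.randn(256, 128)            # keys loosely aligned with queries
print(contrastive_loss(q, k).item())
```

Minimizing this loss keeps corresponding patches structurally aligned, which matches the stated goal of preserving object structure and content after colorization.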
ISSN: 0143-8166, 1873-0302
DOI: 10.1016/j.optlaseng.2022.107395