GACA: A Gradient-Aware and Contrastive-Adaptive Learning Framework for Low-Light Image Enhancement

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2024, Vol. 73, pp. 1-14
Main authors: Yao, Zishu; Su, Jian-Nan; Fan, Guodong; Gan, Min; Chen, C. L. Philip
Format: Article
Language: English
Online access: Full text
Description
Abstract: Image gradients contain crucial information about images. However, the gradient information of low-light images is often concealed in darkness and is susceptible to noise contamination. This imprecise gradient information poses a significant obstacle to low-light image enhancement (LLIE) tasks. At the same time, methods relying solely on pixel-level reconstruction loss struggle to accurately correct the mapping from dimly lit images to normal images, resulting in restored outcomes with color abnormalities or artifacts. In this article, we propose a gradient-aware and contrastive-adaptive (GACA) learning framework to address these issues. GACA first estimates more accurate gradient information and employs it as a structural prior to guide image generation. Simultaneously, we introduce a novel regularization constraint to better rectify the image mapping. Extensive experiments on benchmark datasets and downstream segmentation tasks demonstrate the state-of-the-art performance and generalization of our method. Compared to existing approaches, our method achieves an average 4.7% reduction in the natural image quality evaluator (NIQE) score on benchmark datasets. The code is available at https://github.com/iijjlk/GACA.
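
To make the gradient-as-prior idea in the abstract concrete, the following is a minimal illustrative PyTorch sketch, not the authors' GACA implementation (see the linked repository for that): it computes a Sobel gradient-magnitude map from a low-light input and concatenates it with the image as an extra conditioning channel. The function name sobel_gradient_map and the toy tensors are assumptions introduced purely for illustration.

    import torch
    import torch.nn.functional as F

    def sobel_gradient_map(img: torch.Tensor) -> torch.Tensor:
        # img: (B, C, H, W) tensor in [0, 1]; returns a (B, 1, H, W) gradient-magnitude map.
        gray = img.mean(dim=1, keepdim=True)  # collapse to a single luminance-like channel
        kx = torch.tensor([[-1., 0., 1.],
                           [-2., 0., 2.],
                           [-1., 0., 1.]], device=img.device).view(1, 1, 3, 3)
        ky = kx.transpose(2, 3)               # Sobel kernel for the vertical direction
        gx = F.conv2d(gray, kx, padding=1)
        gy = F.conv2d(gray, ky, padding=1)
        return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)

    # Toy usage: attach the estimated gradient map to the low-light image as a
    # structural prior before feeding an enhancement network (hypothetical setup).
    low_light = torch.rand(1, 3, 256, 256)
    prior = sobel_gradient_map(low_light)
    conditioned_input = torch.cat([low_light, prior], dim=1)  # shape (1, 4, 256, 256)

Note that the abstract states GACA estimates a more accurate gradient map rather than relying on a fixed filter; the sketch only illustrates how such a map can be attached to the input as a structural prior.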
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2024.3353285