LE-GAN: Unsupervised low-light image enhancement network using attention module and identity invariant loss


Bibliographic Details
Published in: Knowledge-Based Systems 2022-03, Vol. 240, p. 108010, Article 108010
Main authors: Fu, Ying; Hong, Yang; Chen, Linwei; You, Shaodi
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Low-light image enhancement aims to recover normal-light images from images captured in very dim environments. Existing methods cannot handle noise, color bias, and over-exposure well, and fail to ensure visual quality when paired training data is unavailable. To address these problems, we propose a novel unsupervised low-light image enhancement network named LE-GAN, which is based on generative adversarial networks and is trained with unpaired low/normal-light images. Specifically, we design an illumination-aware attention module that enhances the feature extraction of the network to address the problems of noise and color bias, as well as to improve visual quality. We further propose a novel identity invariant loss that addresses the over-exposure problem by making the network learn to enhance low-light images adaptively. Extensive experiments show that the proposed method achieves promising results. Furthermore, we collect a large-scale low-light dataset named Paired Normal/Lowlight Images (PNLI). It consists of 2,000 pairs of low/normal-light images captured in various real-world scenes, providing the research community with a high-quality dataset to advance the development of this field.
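The abstract does not give the formula for the identity invariant loss, but the general idea behind identity-style losses in image-to-image GANs can be sketched as follows: a normal-light image fed through the generator should come back essentially unchanged, which discourages the network from over-brightening inputs that are already well exposed. This is a minimal illustrative sketch under that assumption, not the authors' actual implementation; the function names and the L1 distance choice are hypothetical.

```python
import numpy as np

def identity_invariant_loss(generator, normal_light_batch):
    """Hypothetical identity-style loss: penalize the generator for
    altering images that are already normal-light (L1 distance between
    input and output), discouraging over-exposure of bright inputs."""
    enhanced = generator(normal_light_batch)
    return float(np.mean(np.abs(enhanced - normal_light_batch)))

# Toy check with a random normal-light batch (values in [0, 1]).
rng = np.random.default_rng(0)
batch = rng.random((2, 8, 8, 3)).astype(np.float32)

# A generator that leaves normal-light images untouched incurs zero loss,
# while one that brightens everything is penalized.
loss_identity = identity_invariant_loss(lambda x: x, batch)
loss_brighten = identity_invariant_loss(lambda x: np.clip(x * 1.5, 0.0, 1.0), batch)
print(loss_identity, loss_brighten)
```

In the full training objective this term would be weighted against the adversarial loss, so the generator still learns to brighten genuinely dark inputs while leaving well-exposed ones alone.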
ISSN:0950-7051
1872-7409
DOI:10.1016/j.knosys.2021.108010