Seeing in the Dark by Component-GAN

Detailed Description

Bibliographic Details
Published in: IEEE signal processing letters 2021, Vol. 28, pp. 1250-1254
Main Authors: Rao, Ning; Lu, Tao; Zhou, Qiang; Zhang, Yanduo; Wang, Zhongyuan
Format: Article
Language: English
Subjects:
Online Access: Order full text
Description
Summary: Recently, Retinex-theory-based low-light image enhancement (LLIE) algorithms have achieved impressive results in controlled environments. However, most deep-learning-based LLIE algorithms perform relighting by enhancing the illumination component, which directly determines image brightness; regrettably, they ignore the information in the reflectance component, which can cause problems such as noise and color distortion in the reconstructed images. To tackle this problem, in this letter we propose a component enhancement network based on a Generative Adversarial Network (Component-GAN) for recovering clear images from low-light ones. Specifically, the network is composed of a decomposition part, which divides paired low-/normal-light images into illumination and reflectance components, and an enhancement part, which generates high-quality images. Notably, the component enhancement network has two parallel branches that improve the two components simultaneously. We treat the reconstruction part as the generative network and adopt a discriminative network to boost image-reconstruction performance. Extensive experiments show that the proposed approach outperforms some state-of-the-art LLIE methods in both visual and subjective quality.
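To make the Retinex decomposition the abstract refers to concrete, the sketch below shows a minimal classical (non-learned) version in NumPy: an image is split into an illumination map and a reflectance map, and the relighting-only strategy the letter criticizes enhances just the illumination. This is an illustrative assumption (max-over-channels illumination prior, gamma-based relighting), not the authors' network; the function names are hypothetical.

```python
import numpy as np

def retinex_decompose(img, eps=1e-6):
    """Split an RGB image (H, W, 3), floats in [0, 1], into an
    illumination map L (max over channels, a common Retinex prior)
    and a reflectance map R such that img ~= R * L."""
    L = img.max(axis=2, keepdims=True)   # (H, W, 1) illumination
    R = img / (L + eps)                  # per-pixel reflectance
    return R, L

def relight(img, gamma=0.45):
    """Brighten only the illumination component via gamma correction,
    leaving reflectance untouched -- the illumination-only relighting
    that, per the abstract, can leave noise and color distortion."""
    R, L = retinex_decompose(img)
    return np.clip(R * (L ** gamma), 0.0, 1.0)
```

Component-GAN instead enhances both components in parallel learned branches; the sketch only fixes the decomposition vocabulary (illumination vs. reflectance) used throughout the abstract.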
ISSN:1070-9908
1558-2361
DOI:10.1109/LSP.2021.3079848