Luminance-Aware Pyramid Network for Low-Light Image Enhancement
Published in: IEEE Transactions on Multimedia, 2021, Vol. 23, pp. 3153-3165
Main Authors: , , , ,
Format: Article
Language: English
Keywords:
Online Access: Order full text
Abstract: Low-light image enhancement based on deep convolutional neural networks (CNNs) has achieved prominent performance in recent years. However, it remains a challenging task, since underexposed regions and details are often imperceptible. Moreover, deep learning models are frequently accompanied by complex structures and an enormous computational burden, which hinders their deployment on mobile devices. To remedy these issues, this paper presents a lightweight and efficient Luminance-aware Pyramid Network (LPNet) that reconstructs normal-light images in a coarse-to-fine strategy. The architecture comprises two coarse feature extraction branches and a luminance-aware refinement branch, with an auxiliary subnet learning the luminance maps of the input and target images. In addition, a multi-scale contrast feature block (MSCFB) is proposed, which involves channel split, channel shuffle strategies, and a contrast attention mechanism. The MSCFB is the essential component of the network and achieves an excellent balance between image quality and model size. In this way, the method can not only brighten low-light images with rich details and high contrast but also significantly improve execution speed. Extensive experiments demonstrate that LPNet outperforms state-of-the-art methods both qualitatively and quantitatively.
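The abstract names channel split and channel shuffle as ingredients of the MSCFB. As a minimal, non-authoritative sketch of the shuffle step only (the paper's exact group count and tensor layout are not given in this record, so the `groups=2` setting and the toy 6-channel tensor below are illustrative assumptions), channel shuffle can be written with plain NumPy reshapes:

```python
import numpy as np

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    """Interleave channels across groups (ShuffleNet-style shuffle).

    x: array of shape (C, H, W); C must be divisible by `groups`.
    """
    c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    # Split channels into `groups` groups, swap the group and
    # per-group axes, then flatten back to a single channel axis.
    x = x.reshape(groups, c // groups, h, w)
    x = x.transpose(1, 0, 2, 3)
    return x.reshape(c, h, w)

# Toy input: 6 channels, channel i filled with the value i.
x = np.stack([np.full((2, 2), i, dtype=np.int64) for i in range(6)])
y = channel_shuffle(x, groups=2)
order = [int(y[i, 0, 0]) for i in range(6)]
print(order)  # channels from the two groups are now interleaved: [0, 3, 1, 4, 2, 5]
```

The reshape/transpose/reshape trick is the standard zero-cost way to mix information between channel groups after a grouped or split convolution.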
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2020.3021243