Learning an adaptive model for extreme low-light raw image processing
Saved in:
Published in: | IET Image Processing, 2020-12, Vol. 14 (14), pp. 3433-3443 |
Main authors: | , , |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text |
Abstract: | Low-light images suffer from severe noise and low illumination. In this work, the authors propose an adaptive low-light raw image enhancement network that avoids the hand-crafted parameters of current deep learning models and improves image quality. The proposed method consists of two sub-models: brightness prediction and exposure shifting (ES). The former controls the brightness of the resulting image by estimating a guideline exposure time $t_1$. The latter learns to approximate an exposure-shifting operator ES, converting a low-light image with real exposure time $t_0$ into a noise-free image with guideline exposure time $t_1$. Additionally, a structural similarity loss and an image enhancement vector are introduced to improve image quality, and a new campus image dataset (CID) is introduced for training the proposed model, overcoming the limitations of existing datasets. In quantitative tests, the proposed method achieves the lowest noise level estimation score compared with state-of-the-art low-light algorithms, suggesting superior denoising performance. The tests also show that the proposed method adaptively controls global image brightness according to the content of the image scene. Lastly, a potential application in video processing is briefly discussed. |
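The abstract describes a two-stage design: a brightness-prediction network estimates a guideline exposure time $t_1$ from the low-light raw input, and an exposure-shifting network maps the image captured at real exposure $t_0$ to a clean image as if it had been exposed for $t_1$. The following is a minimal PyTorch sketch of that idea; the module names (`BrightnessPredictor`, `ExposureShift`), the network shapes, and the way the exposure ratio is injected are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the two-stage pipeline described in the abstract.
# All architectural details below are assumptions for illustration only.
import torch
import torch.nn as nn

class BrightnessPredictor(nn.Module):
    """Predicts a guideline exposure time t1 from a low-light raw image (hypothetical)."""
    def __init__(self, in_channels: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, raw: torch.Tensor) -> torch.Tensor:
        # Softplus keeps the predicted exposure time strictly positive.
        return nn.functional.softplus(self.head(self.features(raw).flatten(1)))

class ExposureShift(nn.Module):
    """Maps a raw image at exposure t0 to a clean image at exposure t1 (hypothetical)."""
    def __init__(self, in_channels: int = 4, out_channels: int = 3):
        super().__init__()
        # The exposure ratio t1 / t0 is injected as an extra input channel.
        self.net = nn.Sequential(
            nn.Conv2d(in_channels + 1, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, out_channels, 3, padding=1),
        )

    def forward(self, raw: torch.Tensor, t0: torch.Tensor, t1: torch.Tensor) -> torch.Tensor:
        ratio = (t1 / t0).view(-1, 1, 1, 1).expand(-1, 1, *raw.shape[-2:])
        return self.net(torch.cat([raw, ratio], dim=1))

# Usage: predict the guideline exposure, then shift the low-light input to it.
bp, es = BrightnessPredictor(), ExposureShift()
raw = torch.rand(1, 4, 128, 128)   # packed Bayer raw input, toy size
t0 = torch.tensor([0.04])          # real (short) exposure time in seconds
t1 = bp(raw)                       # adaptive guideline exposure time
enhanced = es(raw, t0, t1)         # enhanced output at guideline exposure t1
```

Under this sketch, the structural similarity loss mentioned in the abstract would be computed during training between `enhanced` and a long-exposure reference image.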
ISSN: | 1751-9659 1751-9667 |
DOI: | 10.1049/iet-ipr.2020.0100 |