Extending Camera’s Capabilities in Low Light Conditions Based on LIP Enhancement Coupled with CNN Denoising

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2021-11, Vol. 21 (23), p. 7906
Main Authors: Carré, Maxime; Jourlin, Michel
Format: Article
Language: English
Online Access: Full text
Abstract: Using a sensor in variable lighting conditions, especially very low-light conditions, requires image enhancement followed by denoising to retrieve correct information. The limits of such a process are explored in the present paper, with the objective of preserving the quality of enhanced images. The LIP (Logarithmic Image Processing) framework was initially created to process images acquired in transmission. The compatibility of this framework with the human visual system makes its application to images acquired in reflection possible. Previous works have established the ability of the LIP laws to perform a precise simulation of exposure time variation. Such a simulation permits the enhancement of low-light images, but a denoising step is required, realized by using a CNN (Convolutional Neural Network). A main contribution of the paper is the use of rigorous tools (metrics) to estimate the enhancement reliability in terms of noise reduction, visual image quality, and color preservation. Thanks to these tools, it has been established that the standard exposure time can be significantly reduced, which considerably enlarges the use of a given sensor. Moreover, the contributions of the LIP enhancement and of the denoising step are evaluated separately.
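For illustration, the exposure-time simulation mentioned in the abstract can be sketched with the classical LIP scalar multiplication law, λ ⊗ f = M − M(1 − f/M)^λ, applied per pixel. The Python sketch below is not the authors' implementation: the scale bound M = 256, the conversion between standard 8-bit levels and LIP gray tones, the function names (lip_multiply, simulate_exposure), and the mapping between λ and the exposure-time ratio are assumptions made for the example.

    import numpy as np

    M = 256.0  # upper bound of the LIP gray-tone scale (assumed; a common choice)

    def lip_multiply(f: np.ndarray, lam: float) -> np.ndarray:
        """Classical LIP scalar multiplication: lam (x) f = M - M*(1 - f/M)**lam."""
        return M - M * np.power(1.0 - f / M, lam)

    def simulate_exposure(img_u8: np.ndarray, lam: float) -> np.ndarray:
        """Hypothetical enhancement step: map a standard 8-bit image (0 = black)
        to LIP gray tones (0 = white, as in the transmission model), apply LIP
        multiplication, and map back. Under this convention, lam < 1 brightens,
        i.e. it plays the role of a longer simulated exposure time."""
        f = (M - 1.0) - img_u8.astype(np.float64)  # assumed gray-tone conversion
        g = lip_multiply(f, lam)
        out = (M - 1.0) - g
        return np.clip(np.rint(out), 0, 255).astype(np.uint8)

    # Example: brighten a dark image as if it had been exposed roughly three
    # times longer (the precise lam/exposure mapping is the paper's subject).
    dark = np.random.randint(0, 40, size=(64, 64), dtype=np.uint8)
    bright = simulate_exposure(dark, lam=1.0 / 3.0)

In the pipeline the abstract describes, such a brightened image would then be passed to a CNN denoiser; neither that network nor the evaluation metrics are reproduced here.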
ISSN: 1424-8220
DOI: 10.3390/s21237906