Robust Depth Estimation Using Auto-Exposure Bracketing


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2019-05, Vol. 28 (5), pp. 2451-2464
Main Authors: Im, Sunghoon; Jeon, Hae-Gon; Kweon, In So
Format: Article
Language: English
Description
Abstract: As the computing power of handheld devices grows, there has been increasing interest in the capture of depth information to enable a variety of photographic applications. However, under low-light conditions, most devices still suffer from low imaging quality and inaccurate depth acquisition. To address the problem, we present a robust depth estimation method from a short burst shot with varied intensity (i.e., auto-exposure bracketing) and/or strong noise (i.e., high ISO). Our key idea synergistically combines deep convolutional neural networks with a geometric understanding of the scene. We introduce a geometric transformation between optical flow and depth tailored for burst images, enabling our learning-based multi-view stereo matching to be performed effectively. We then describe our depth estimation pipeline that incorporates this geometric transformation into our residual-flow network. It allows our framework to produce an accurate depth map even with a bracketed image sequence. We demonstrate that our method outperforms state-of-the-art methods on various datasets captured by a smartphone and a DSLR camera. Moreover, we show that the estimated depth is applicable for image quality enhancement and photographic editing.
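The abstract does not spell out the paper's flow-to-depth transformation, but the underlying idea rests on the classical relation between optical flow and depth. As a heavily simplified sketch: if two burst frames are related by a pure lateral camera translation (rotation compensated), the horizontal flow behaves like stereo disparity, and depth follows from Z = f·b/d. The function name and parameters below are hypothetical, chosen only to illustrate this standard relation, not the paper's actual pipeline.

```python
import numpy as np

def depth_from_flow(flow_x, focal_px, baseline_m, eps=1e-6):
    """Recover per-pixel depth from horizontal optical flow, assuming
    two burst frames differ by a pure lateral camera translation
    (a stereo-like configuration). Flow then acts as disparity:
        Z = focal_px * baseline_m / |flow_x|
    """
    disparity = np.maximum(np.abs(flow_x), eps)  # guard against division by zero
    return focal_px * baseline_m / disparity

# Toy example: 1500 px focal length, 1 cm baseline between frames.
flow = np.array([[15.0, 7.5],
                 [5.0, 3.0]])  # horizontal flow in pixels
depth = depth_from_flow(flow, focal_px=1500.0, baseline_m=0.01)
# A 15 px flow maps to 1500 * 0.01 / 15 = 1.0 m.
```

In practice the paper handles general burst motion and learns a residual flow on top of this geometry; the closed-form relation above only covers the idealized translational case.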
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2018.2886777