Depth annotations: Designing depth of a single image for depth-based effects

Detailed Description

Bibliographic Details
Published in: Computers & Graphics, 2018-04, Vol. 71, pp. 180-188
Main Authors: Liao, Jingtang; Shen, Shuheng; Eisemann, Elmar
Format: Article
Language: English
Online Access: Full text
Description
Summary:
•A fast depth-map creation solution from a single image
•Various additional tools to refine the depth map
•A selection of effective effects, including wiggle stereoscopy and unsharp masking

We present a novel pipeline to generate a depth map from a single image that can be used as input for a variety of artistic depth-based effects. In such a context, the depth maps do not have to be perfect but are rather designed with respect to a desired result. Consequently, our solution centers around user interaction and relies on scribble-based depth editing. The annotations can be sparse, as the depth map is generated by a diffusion process guided by image features. We support a variety of controls, such as a non-linear depth mapping and a steering mechanism for the diffusion (e.g., directionality, emphasis, or reduction of the influence of image cues); besides absolute depth indications, we also support relative ones. If a depth estimate is available from an automatic solution, we illustrate how this information can be integrated in the form of a depth palette, which allows the user to transfer depth values via a painting metaphor. We demonstrate a variety of artistic 3D results, including wiggle stereoscopy, artistic abstractions, haze, unsharp masking, and depth of field.
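The diffusion step described in the abstract, propagating sparse scribble depths across the image under the guidance of image features, can be illustrated with a minimal sketch. The version below is a stand-in, not the authors' implementation: it uses a standard edge-aware graph Laplacian over 4-connected pixels (all names, such as diffuse_depth, beta, and lam, are illustrative), and it omits the paper's additional controls such as directional steering and cue emphasis.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def diffuse_depth(image, scribble_depth, scribble_mask, beta=10.0, lam=100.0):
    """Propagate sparse user depth scribbles over an image (sketch only).

    Depth is diffused over a 4-connected pixel graph whose edge weights
    fall off across strong intensity edges, so depth discontinuities tend
    to align with image features.

    image          -- (H, W) grayscale guidance image in [0, 1]
    scribble_depth -- (H, W) annotated depth, valid only under the mask
    scribble_mask  -- (H, W) bool, True where the user scribbled
    beta           -- edge sensitivity (larger = diffusion stops at edges)
    lam            -- strength of the scribble constraints
    """
    H, W = image.shape
    n = H * W
    idx = np.arange(n).reshape(H, W)

    # Edge-aware affinities between horizontal and vertical neighbors.
    rows, cols, vals = [], [], []
    for di, dj in ((0, 1), (1, 0)):
        a = idx[:H - di, :W - dj].ravel()
        b = idx[di:, dj:].ravel()
        g = (image[:H - di, :W - dj] - image[di:, dj:]).ravel()
        w = np.exp(-beta * g ** 2)          # weak affinity across edges
        rows += [a, b]; cols += [b, a]; vals += [w, w]

    A = sp.coo_matrix((np.concatenate(vals),
                       (np.concatenate(rows), np.concatenate(cols))),
                      shape=(n, n)).tocsr()
    L = sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A   # graph Laplacian

    # Minimize d^T L d + lam * ||d - scribbles||^2 over scribbled pixels;
    # the tiny ridge term keeps components without scribbles solvable.
    m = scribble_mask.ravel().astype(float)
    system = L + lam * sp.diags(m) + 1e-8 * sp.eye(n)
    d = spsolve(system.tocsc(), lam * m * scribble_depth.ravel())
    return d.reshape(H, W)
```

The non-linear depth mapping mentioned in the abstract would then amount to applying a monotone remapping (e.g., a gamma curve) to the diffused result before it feeds a depth-based effect.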
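Among the listed effects, wiggle stereoscopy is easy to sketch: alternate between the input image and a synthesized second viewpoint whose pixels are shifted horizontally in proportion to their nearness. The forward warp below is again a hypothetical helper (shift_view and max_disp are not from the paper), assuming depth in [0, 1] with larger values meaning farther away.

```python
def shift_view(image, depth, max_disp=8):
    """Synthesize a second viewpoint for wiggle stereoscopy (sketch only).

    Each pixel moves horizontally by a disparity proportional to its
    nearness; far pixels are painted first so nearer ones win overlaps.
    Disocclusions stay black and would need inpainting in practice.
    """
    W = depth.shape[1]
    out = np.zeros_like(image)
    disp = np.round((1.0 - depth) * max_disp).astype(int)  # near moves more
    order = np.argsort(depth, axis=None)[::-1]             # far-to-near
    ys, xs = np.unravel_index(order, depth.shape)
    xt = np.clip(xs + disp[ys, xs], 0, W - 1)
    out[ys, xt] = image[ys, xs]
    return out
```

Flipping between image and shift_view(image, depth) at a few frames per second yields the parallax "wiggle"; the other effects named in the abstract (haze, depth of field, unsharp masking) similarly reduce to per-pixel operations modulated by the diffused depth map.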
ISSN: 0097-8493, 1873-7684
DOI: 10.1016/j.cag.2017.11.005