New Hole-Filling Method Using Extrapolated Spatio-Temporal Background Information for a Synthesized Free-View

Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE transactions on multimedia 2019-06, Vol.21 (6), p.1345-1358
Main authors: Nguyen, Tien-Dat; Kim, Beomsu; Hong, Min-Cheol
Format: Article
Language: English
Subjects:
Online access: Order full text
Description
Summary: This paper introduces a new hole-filling method using extrapolated spatio-temporal background information to obtain a synthesized free-view. A new temporal background model is proposed, which incorporates stationary temporal information into the hole-filling process and preserves the temporal consistency of synthesized views. A background codebook is distinguished from a non-overlapped patch-based codebook, which contributes to extracting reliable temporal background information. Furthermore, a depth-map-driven spatial local background estimation is also addressed to discriminate the background holes in each disocclusion and to define two spatial background constraints that represent the lower and upper bounds of a background candidate. Holes are filled by comparing the similarities between the temporal background information and the spatial background constraints. In addition, a depth-map-based ghost removal filter is described to resolve the mismatch between a color image and the corresponding depth map of a virtual view. Finally, an exemplar-based inpainting is applied to fill in the remaining holes, using a priority function that includes a new depth term. The experimental results demonstrate that the proposed method yields subjective and objective improvements over state-of-the-art methods.
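The final stage described in the abstract extends the classic exemplar-based inpainting priority (confidence term times data term, in the style of Criminisi et al.) with a depth term so that far (background) patches are filled first. The abstract does not give the exact formula, so the Python sketch below is illustrative only: the multiplicative form `P(p) = C(p) * D(p) * Z(p)`, the normalization constant, and the depth term `Z(p)` are assumptions, not the paper's actual definitions.

```python
import numpy as np

def data_term(gradient, normal, alpha=255.0):
    # Criminisi-style data term: |isophote . boundary normal| / alpha,
    # where alpha is a normalization constant (255 for 8-bit images).
    return abs(float(np.dot(gradient, normal))) / alpha

def priority(confidence, gradient, normal, depth, max_depth):
    """Hypothetical depth-augmented priority P(p) = C(p) * D(p) * Z(p).

    Z(p) = depth / max_depth is an assumed depth term that grows with
    distance from the camera (larger = farther), biasing the fill order
    toward background patches. The real paper's depth term may differ.
    """
    z = depth / max_depth
    return confidence * data_term(gradient, normal) * z
```

For example, a fully confident boundary pixel with a strong isophote aligned to the normal and maximal depth gets priority 1.0, while the same pixel at half the depth drops to half the priority, so nearer (foreground) patches are deferred.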
ISSN: 1520-9210; 1941-0077
DOI: 10.1109/TMM.2018.2880954