Automatic objects segmentation with RGB-D cameras

Bibliographic Details
Published in: Journal of Visual Communication and Image Representation, 2014-05, Vol. 25 (4), p. 709-718
Main authors: Liu, Haowei; Philipose, Matthai; Sun, Ming-Ting
Format: Article
Language: English
Online access: Full text
Description
Abstract:
Highlights:
• A new trilateral filter incorporating distance, color, and boundary information.
• A new Graph Cuts formulation based on color, depth, and boundary information.
• A method to rectify the misaligned depth and RGB data of fast-moving objects.
Automatic object segmentation is a fundamentally difficult problem due to issues such as shadows, lighting, and semantic gaps. Edges play a critical role in object segmentation; however, it is almost impossible for a computer to know which edges correspond to object boundaries and which are caused by internal texture discontinuities. Active 3-D cameras, which provide streams of depth and RGB frames, are poised to become inexpensive and widespread. The depth discontinuities provide useful information for identifying object boundaries, which makes automatic object segmentation possible. However, the depth frames are extremely noisy. In addition, the depth and RGB information often lose synchronization when the object is moving fast, due to the different response times of the RGB and depth sensors. We show how to use the combined depth and RGB information to mitigate these problems and produce an accurate silhouette of the object. On a large dataset (24 objects with 1500 images), we provide both qualitative and quantitative evidence that the proposed techniques are effective.
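
Note on the highlighted techniques: this record does not specify the filter's exact formulation. Below is a minimal Python/NumPy sketch of a trilateral depth filter in the spirit of the first highlight, assuming a bilateral-style filter extended with a third (boundary) kernel; the function name trilateral_filter and the parameters radius, sigma_s, sigma_c, and sigma_b are illustrative assumptions, not the authors' implementation.

import numpy as np

def trilateral_filter(depth, rgb, boundary, radius=3,
                      sigma_s=2.0, sigma_c=10.0, sigma_b=0.2):
    """Smooth a noisy depth frame guided by spatial distance, color,
    and boundary information (illustrative sketch, not the paper's code).

    depth    : (H, W) float array, noisy depth values
    rgb      : (H, W, 3) float array, registered color frame
    boundary : (H, W) float array in [0, 1], e.g. an edge-strength map
    """
    H, W = depth.shape
    out = np.zeros_like(depth)
    # Spatial (distance) kernel over the filter window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))

    pad = radius
    d = np.pad(depth, pad, mode='edge')
    c = np.pad(rgb, ((pad, pad), (pad, pad), (0, 0)), mode='edge')
    b = np.pad(boundary, pad, mode='edge')

    for i in range(H):
        for j in range(W):
            dw = d[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            cw = c[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            bw = b[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            # Color kernel: down-weight pixels whose color differs
            # from the center pixel's color.
            color_w = np.exp(-np.sum((cw - c[i + pad, j + pad]) ** 2, axis=-1)
                             / (2 * sigma_c ** 2))
            # Boundary kernel: down-weight pixels across strong edges,
            # so smoothing stays within one object.
            bound_w = np.exp(-(bw - b[i + pad, j + pad]) ** 2
                             / (2 * sigma_b ** 2))
            w = spatial * color_w * bound_w
            out[i, j] = np.sum(w * dw) / np.sum(w)
    return out

Under these assumptions, the boundary kernel suppresses contributions from pixels on the far side of a strong edge, so depth noise is smoothed within an object without blurring the depth discontinuities at its silhouette; the filtered depth could then feed the data and smoothness terms of a Graph Cuts segmentation as in the second highlight.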
ISSN: 1047-3203, 1095-9076
DOI: 10.1016/j.jvcir.2013.03.012