Relationship-Based Point Cloud Completion

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics, 2022-12, Vol. 28 (12), pp. 4940-4950
Authors: Zhao, Xi; Zhang, Bowen; Wu, Jinji; Hu, Ruizhen; Komura, Taku
Format: Article
Language: English
Abstract: We propose a partial point cloud completion approach for scenes that are composed of multiple objects. We focus on pairwise scenes where two objects are in close proximity and are contextually related to each other, such as a chair tucked into a desk, a fruit in a basket, a hat on a hook, and a flower in a vase. Unlike existing point cloud completion methods, which mainly focus on single objects, we design a network that encodes not only the geometry of the individual shapes but also the spatial relations between different objects. More specifically, we complete missing parts of the objects in a conditional manner, where the partial or completed point cloud of the other object is used as an additional input to help predict the missing parts. Based on this idea of conditional completion, we further propose a two-path network that is guided by a consistency loss between different sequences of completion. Our method can handle difficult cases where the objects heavily occlude each other, and it requires only a small set of training data to reconstruct the interaction area compared to existing completion approaches. We evaluate our method qualitatively and quantitatively via ablation studies and in comparison to state-of-the-art point cloud completion methods.
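To make the conditional, two-path idea from the abstract concrete, the following is a minimal PyTorch-style sketch. Everything in it is an illustrative assumption inferred from the abstract, not the authors' released implementation: the ConditionalCompleter module, the two_path_loss function, the externally supplied chamfer distance, and all feature dimensions are hypothetical.

```python
# Hypothetical sketch of conditional, two-path point cloud completion.
# All module names, dimensions, and loss weights are assumptions for
# illustration; they do not reproduce the paper's actual architecture.
import torch
import torch.nn as nn


class ConditionalCompleter(nn.Module):
    """Completes one partial cloud conditioned on its partner cloud."""

    def __init__(self, feat_dim=256, num_out=2048):
        super().__init__()
        # Shared per-point MLP encoder (PointNet-style), used for both clouds.
        self.encoder = nn.Sequential(
            nn.Conv1d(3, 128, 1), nn.ReLU(),
            nn.Conv1d(128, feat_dim, 1),
        )
        # Decoder maps concatenated [target, partner] features to a point set.
        self.decoder = nn.Sequential(
            nn.Linear(2 * feat_dim, 512), nn.ReLU(),
            nn.Linear(512, num_out * 3),
        )
        self.num_out = num_out

    def encode(self, pts):  # pts: (B, N, 3) -> (B, feat_dim)
        f = self.encoder(pts.transpose(1, 2))  # (B, feat_dim, N)
        return f.max(dim=2).values             # global max-pool

    def forward(self, target_partial, partner):
        # The partner cloud (partial or already completed) conditions
        # the completion of the target cloud.
        cond = torch.cat([self.encode(target_partial),
                          self.encode(partner)], dim=1)
        return self.decoder(cond).view(-1, self.num_out, 3)


def two_path_loss(net, part_a, part_b, gt_a, gt_b, chamfer, w_cons=0.1):
    # Path 1: complete A conditioned on partial B, then B on completed A.
    comp_a1 = net(part_a, part_b)
    comp_b1 = net(part_b, comp_a1)
    # Path 2: the symmetric completion order.
    comp_b2 = net(part_b, part_a)
    comp_a2 = net(part_a, comp_b2)
    # Reconstruction terms against ground truth, plus a consistency term
    # penalizing disagreement between the two completion sequences.
    recon = (chamfer(comp_a1, gt_a) + chamfer(comp_a2, gt_a)
             + chamfer(comp_b1, gt_b) + chamfer(comp_b2, gt_b))
    cons = chamfer(comp_a1, comp_a2) + chamfer(comp_b1, comp_b2)
    return recon + w_cons * cons
```

The two symmetric completion orders correspond to the two-path design described in the abstract; the consistency term ties them together so that each completed object can serve as a reliable condition for completing the other. A real implementation would use stronger encoder and decoder networks and the paper's exact loss formulation.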
ISSN: 1077-2626 (print), 1941-0506 (electronic)
DOI: 10.1109/TVCG.2021.3109392