View selection for sketch-based 3D model retrieval using visual part shape description

Bibliographic Details
Published in: The Visual Computer, 2017-05, Vol. 33 (5), p. 565-583
Main authors: Yasseen, Zahraa; Verroust-Blondet, Anne; Nasri, Ahmad
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: Hand drawings are the imprints of shapes in the human mind. How a human expresses a shape is a consequence of how he or she visualizes it. A query-by-sketch 3D object retrieval application is closely tied to this concept in two respects. First, describing sketches must involve the elements of a figure that matter most to a human. Second, the representative 2D projections of the target 3D objects should be limited to “the canonical views” from a human-cognition perspective. We advocate for these two rules by presenting a new approach to sketch-based 3D object retrieval that describes a 2D shape by the visually protruding parts of its silhouette. Furthermore, we present a list of candidate 2D projections that represent the canonical views of a 3D object. The general rule is that humans visually avoid part occlusion and symmetry. We quantify the extent of part occlusion in the projected silhouettes of 3D objects by skeletal length computations. Sorting the projected views in decreasing order of skeletal length gives access to a subset of the best representative views. We show experimentally how views that cause misinterpretation and mismatching can be detected according to the part occlusion criteria. We also propose criteria for locating side, off-axis, or asymmetric views.
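The summary outlines a concrete ranking step: project the 3D object from candidate viewpoints, measure the skeletal length of each resulting silhouette as a proxy for part occlusion, and keep the views with the longest skeletons. The following minimal Python sketch illustrates that ranking loop under stated assumptions; `render_silhouette` and `skeleton_length` are hypothetical placeholders for the projection and skeletonization steps, not functions from the paper.

```python
from typing import Callable, List, Sequence, Tuple

Viewpoint = Tuple[float, float]  # e.g. (azimuth, elevation) in degrees


def rank_views(
    mesh: object,
    viewpoints: Sequence[Viewpoint],
    render_silhouette: Callable[[object, Viewpoint], object],  # hypothetical: mesh + view -> 2D silhouette
    skeleton_length: Callable[[object], float],                # hypothetical: silhouette -> total skeletal length
    top_k: int = 7,
) -> List[Viewpoint]:
    """Return the top_k viewpoints whose projected silhouettes have the
    longest skeletons, i.e. the least part occlusion under this proxy."""
    scored = []
    for vp in viewpoints:
        sil = render_silhouette(mesh, vp)
        scored.append((skeleton_length(sil), vp))
    # Longer skeletal length first: protruding parts are visible rather than occluded.
    scored.sort(key=lambda s: s[0], reverse=True)
    return [vp for _, vp in scored[:top_k]]
```

The caller supplies the renderer and skeletonization routine; the sketch only captures the sort-by-skeletal-length selection described above, not the paper's full view-selection or symmetry criteria.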
ISSN: 0178-2789, 1432-2315, 1432-8726
DOI: 10.1007/s00371-016-1328-7