Saliency Detection for 3D Surface Geometry Using Semi-regular Meshes

Bibliographic Details
Published in: IEEE Transactions on Multimedia, Dec. 2017, Vol. 19 (12), pp. 2692-2705
Authors: Jeong, Se-Won; Sim, Jae-Young
Format: Article
Language: English
Abstract: In this paper, a unified algorithm for detecting view-independent and view-dependent saliency of three-dimensional mesh models is proposed. Whereas conventional techniques use irregular meshes, we adopt semi-regular meshes to overcome the drawback of irregular connectivity for saliency computation. We employ the angular deviation of normal vectors between neighboring faces as a geometric curvature feature, evaluated on hierarchically structured triangle faces. We construct a fully connected graph at each level of the semi-regular mesh, where the face patches serve as graph nodes. At the base mesh level, we estimate the saliency as the stationary distribution of a random walk. At the finer mesh levels, we take the maximum of the stationary distribution of the random walk at the current level and the saliency map upsampled from the previous, coarser level. Moreover, we propose a view-dependent saliency detection method that employs a visibility feature in addition to the geometric features to estimate saliency with respect to a selected viewpoint. Experimental results demonstrate that the proposed algorithm captures globally conspicuous regions reliably and detects locally detailed geometric features faithfully, compared with conventional techniques.
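
As a rough illustration of the graph-based saliency idea described in the abstract, the following Python sketch computes a per-face curvature feature from the angular deviation of neighboring face normals, builds a fully connected graph over faces, takes the stationary distribution of a random walk as the saliency, and fuses mesh levels by a pointwise maximum. The edge-weight kernel, the neighborhood structure, and all function names are illustrative assumptions, not the authors' implementation.

    # Minimal illustrative sketch (assumed names and kernels, not the authors' code):
    # per-face curvature from angular deviation of neighboring normals, random-walk
    # saliency on a fully connected face graph, and max-fusion across mesh levels.
    import numpy as np

    def face_normals(vertices, faces):
        # Unit normal of each triangle face (vertices: (V,3) floats, faces: (F,3) ints).
        v0, v1, v2 = vertices[faces[:, 0]], vertices[faces[:, 1]], vertices[faces[:, 2]]
        n = np.cross(v1 - v0, v2 - v0)
        return n / np.linalg.norm(n, axis=1, keepdims=True)

    def curvature_feature(normals, adjacency):
        # Mean angular deviation between each face normal and its neighbors' normals.
        # 'adjacency' is a list of neighbor-face index lists (assumed precomputed).
        feat = np.zeros(len(normals))
        for f, nbrs in enumerate(adjacency):
            if nbrs:
                cosang = np.clip(normals[nbrs] @ normals[f], -1.0, 1.0)
                feat[f] = np.arccos(cosang).mean()
        return feat

    def random_walk_saliency(features, iters=200):
        # Stationary distribution of a random walk on a fully connected graph whose
        # nodes are faces (or face patches) and whose edge weights grow with feature
        # dissimilarity, so the walker spends more time at conspicuous faces.
        W = np.abs(features[:, None] - features[None, :]) + 1e-12
        np.fill_diagonal(W, 0.0)
        P = W / W.sum(axis=1, keepdims=True)          # row-stochastic transition matrix
        pi = np.full(len(features), 1.0 / len(features))
        for _ in range(iters):                        # power iteration converges to the
            pi = pi @ P                               # stationary distribution
        return pi / pi.sum()

    def fuse_levels(current_level, upsampled_coarser):
        # Keep the larger saliency value at each face: the current level versus the
        # map upsampled from the previous, coarser level.
        return np.maximum(current_level, upsampled_coarser)

In this simplified form, the base-level saliency is just random_walk_saliency applied to the coarsest mesh, and each finer level calls fuse_levels with its own random-walk result and the coarser map propagated to its faces; the paper's view-dependent variant would additionally weight faces by a visibility term, which is omitted here.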
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2017.2710802