M-GCN: Multi-Branch Graph Convolution Network for 2D Image-based 3D Model Retrieval

Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2021, Vol. 23, pp. 1962-1976
Main authors: Nie, Wei-Zhi; Ren, Min-Jie; Liu, An-An; Mao, Zhendong; Nie, Jie
Format: Article
Language: English
Description
Summary: 2D image-based 3D model retrieval is a challenging research topic in the field of 3D model retrieval. The large gap between the two modalities, 2D images and 3D models, severely constrains retrieval performance. To handle this problem, we propose a novel multi-branch graph convolution network (M-GCN) to address the 2D image-based 3D model retrieval problem. First, we compute the similarity between 2D images and 3D models based on visual information to construct a cross-modality graph, which provides the original relationships between images and 3D models. However, these relationships are not accurate because of the difference between modalities. Thus, a multi-head attention mechanism is employed to generate a set of fully connected edge-weighted graphs, which predict the hidden relationships between 2D images and 3D models to further strengthen the correlations used for generating node embeddings. Finally, we apply a max-pooling operation to fuse the multi-graph information and generate the fused node embeddings for retrieval. To validate the performance of our method, we evaluated M-GCN on the MI3DOR dataset, the SHREC 2018 track, and the SHREC 2014 track. The experimental results demonstrate the superiority of our proposed method over state-of-the-art methods.
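
The following is a minimal, illustrative sketch (in PyTorch) of the pipeline outlined in the abstract; it is not the authors' released implementation. The helper names (build_cross_modal_graph, AttentionGraphs, GCNBranch, MGCNSketch), feature dimensions, and the cosine-similarity graph construction are assumptions chosen only to make the flow concrete: an initial cross-modal similarity graph, multi-head attention producing a set of edge-weighted graphs, one graph-convolution branch per graph, and max-pooling fusion of the branch embeddings.

```python
# Illustrative M-GCN-style sketch (hypothetical names and sizes, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_cross_modal_graph(img_feats, model_feats):
    """Stack image and 3D-model features into one node set and use cosine
    similarity as the initial (approximate) cross-modal adjacency."""
    x = torch.cat([img_feats, model_feats], dim=0)            # (N, D) node features
    x_norm = F.normalize(x, dim=1)
    adj = torch.clamp(x_norm @ x_norm.t(), min=0.0)           # (N, N) similarity graph
    return x, adj


class AttentionGraphs(nn.Module):
    """Multi-head attention that predicts one fully connected,
    edge-weighted graph per head (the 'hidden relationships')."""
    def __init__(self, dim, num_heads):
        super().__init__()
        self.num_heads = num_heads
        self.q = nn.Linear(dim, dim * num_heads)
        self.k = nn.Linear(dim, dim * num_heads)
        self.scale = dim ** -0.5

    def forward(self, x):
        n, d = x.shape
        q = self.q(x).view(n, self.num_heads, d).transpose(0, 1)   # (H, N, D)
        k = self.k(x).view(n, self.num_heads, d).transpose(0, 1)   # (H, N, D)
        return torch.softmax((q @ k.transpose(1, 2)) * self.scale, dim=-1)  # (H, N, N)


class GCNBranch(nn.Module):
    """One graph-convolution branch operating on a single edge-weighted graph."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return F.relu(self.lin((adj / deg) @ x))                # row-normalized propagation


class MGCNSketch(nn.Module):
    def __init__(self, dim=512, out_dim=256, num_heads=4):
        super().__init__()
        self.attn_graphs = AttentionGraphs(dim, num_heads)
        self.branches = nn.ModuleList([GCNBranch(dim, out_dim) for _ in range(num_heads)])

    def forward(self, img_feats, model_feats):
        x, base_adj = build_cross_modal_graph(img_feats, model_feats)
        graphs = self.attn_graphs(x)                            # (H, N, N)
        # One GCN branch per attention-weighted graph, then max-pooling fusion.
        embeds = [branch(x, base_adj * graphs[h])
                  for h, branch in enumerate(self.branches)]
        return torch.stack(embeds, dim=0).max(dim=0).values     # (N, out_dim)


if __name__ == "__main__":
    # Toy example: 8 query images and 12 candidate 3D models with 512-d features.
    imgs, models = torch.randn(8, 512), torch.randn(12, 512)
    fused = MGCNSketch()(imgs, models)
    # Retrieval: rank 3D models by cosine similarity to each image embedding.
    sims = F.normalize(fused[:8], dim=1) @ F.normalize(fused[8:], dim=1).t()
    print(sims.shape)  # torch.Size([8, 12])
```

In this sketch the attention-predicted graphs are multiplied element-wise with the initial similarity graph before propagation; the paper's exact way of combining the original and predicted relationships, and its loss functions, are not reproduced here.
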
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2020.3006371