DRGCNN: Dynamic region graph convolutional neural network for point clouds
Saved in:
Published in: | Expert systems with applications 2022-11, Vol.205, p.117663, Article 117663 |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | •DRGConv allows each point to gather different regional features. •The DRGConv module can be integrated into other backbone networks. •Information fusion from both local and global contexts on point clouds. •The DRGConv module is translation invariant.
Convolutional Neural Networks (CNNs) are good at processing regular data, but point clouds are irregular and discretely distributed in space. To handle point clouds, we use a graph to relate points to one another and apply graph convolution to process the point cloud. In neurology, it is well known that the receptive field size of visual cortical neurons is regulated by stimulation, and receptive fields of different sizes carry different information. We therefore construct different dynamic graphs from each point's neighbors to expand each point's receptive field and adaptively select information from receptive fields of different sizes. In this paper, we propose a new convolution method, Dynamic Region Graph Convolution (DRGConv). It consists of three parts: graph construction, region selection, and feature fusion, and it is easily integrated into other popular networks. In experiments, to demonstrate effectiveness and robustness, we construct graphs with different values of k and train with different numbers of points. We achieve encouraging performance on the point cloud datasets ModelNet40 and ShapeNetPart, with scores of 93.3% and 86.2%, respectively. |
---|---|
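The abstract describes graph construction over each point's neighbors followed by feature aggregation. As a minimal NumPy sketch of that general idea (not the authors' DRGConv implementation; the function names `knn_graph` and `edge_conv` are illustrative), the following builds a k-nearest-neighbor graph and performs one EdgeConv-style aggregation step, whose relative term `neigh - center` is translation invariant:

```python
import numpy as np

def knn_graph(points, k):
    """Build a k-nearest-neighbor graph over a point cloud.

    points: (N, 3) array of coordinates.
    Returns an (N, k) array of neighbor indices for each point.
    """
    # Pairwise squared Euclidean distances between all points.
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sum(diff ** 2, axis=-1)
    np.fill_diagonal(dist, np.inf)  # exclude self-loops
    return np.argsort(dist, axis=1)[:, :k]

def edge_conv(features, neighbors):
    """One EdgeConv-style aggregation: concatenate each center feature
    with its neighbors' features relative to the center, then max-pool
    over the neighborhood. The relative term is translation invariant.

    features: (N, C) per-point features; neighbors: (N, k) indices.
    Returns an (N, 2C) aggregated feature array.
    """
    k = neighbors.shape[1]
    center = features[:, None, :]                  # (N, 1, C)
    neigh = features[neighbors]                    # (N, k, C)
    edge_feat = np.concatenate(
        [np.repeat(center, k, axis=1), neigh - center], axis=-1
    )                                              # (N, k, 2C)
    return edge_feat.max(axis=1)                   # (N, 2C)
```

Varying `k` changes the receptive field of each point, which is the knob the paper's robustness experiments turn; stacking such layers on recomputed graphs gives the "dynamic" behavior.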
ISSN: | 0957-4174 1873-6793 |
DOI: | 10.1016/j.eswa.2022.117663 |