Point Cloud Classification Model Based on a Dual-Input Deep Network Framework

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 55991-55999
Authors: Zhai, Ruifeng; Li, Xueyan; Wang, Zhenxin; Guo, Shuxu; Hou, Shuzhao; Hou, Yu; Gao, Fengli; Song, Junfeng
Format: Article
Language: English
Online Access: Full text
Description
Abstract: The disorder, sparseness, irregularity, noise, and background of point clouds pose significant challenges for point cloud classification. Deep learning methods that operate on raw point cloud data have recently achieved good performance on simulated data, but many of them struggle on realistic data containing heavy noise and complex background information. This paper proposes an end-to-end dual-input network (DINet) point cloud classification model based on deep learning. In the proposed model, a feature extractor obtains high-dimensional features, a feature comparator aggregates homogeneous point clouds and disperses heterogeneous ones in the feature space, and a feature analyzer performs the final classification. The two-channel data input makes the DINet framework flexible and universal: it can be extended to other models and generalizes across different datasets. DINet improves performance, achieving an overall accuracy of 81.3% and an average accuracy of 79.6% in experiments on the real-world ScanObjectNN dataset. The code for the proposed point cloud classification model is available at https://github.com/zhairf/DINet.
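To make the dual-input idea concrete, the following is a minimal, hypothetical PyTorch sketch of the three roles named in the abstract: a shared feature extractor, a contrastive comparator that pulls same-class feature pairs together and pushes different-class pairs apart, and a classifier head standing in for the feature analyzer. All class names, layer sizes, and the contrastive margin are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureExtractor(nn.Module):
    """Maps a point cloud (B, N, 3) to a global feature vector (B, D)."""
    def __init__(self, dim=1024):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU(),
            nn.Conv1d(128, dim, 1),
        )

    def forward(self, pts):
        x = self.mlp(pts.transpose(1, 2))   # (B, 3, N) -> (B, D, N)
        return x.max(dim=2).values          # order-invariant pooling -> (B, D)

class DINetSketch(nn.Module):
    """Two input clouds share one extractor; a contrastive comparator loss
    aggregates homogeneous pairs and disperses heterogeneous pairs in
    feature space; a classifier head ('feature analyzer') predicts labels."""
    def __init__(self, num_classes=15, dim=1024, margin=1.0):
        super().__init__()
        self.extractor = FeatureExtractor(dim)
        self.analyzer = nn.Sequential(
            nn.Linear(dim, 256), nn.ReLU(), nn.Linear(256, num_classes))
        self.margin = margin  # hypothetical value, not from the paper

    def forward(self, pts_a, pts_b, same_class):
        fa, fb = self.extractor(pts_a), self.extractor(pts_b)
        dist = F.pairwise_distance(fa, fb)
        # Contrastive comparator: shrink distance for same-class pairs,
        # enforce at least `margin` separation for different-class pairs.
        cmp_loss = torch.where(
            same_class.bool(),
            dist.pow(2),
            F.relu(self.margin - dist).pow(2)).mean()
        logits = self.analyzer(fa)
        return logits, cmp_loss

# Usage sketch: pair each training cloud with a sampled partner cloud
# and combine classification and comparator losses.
model = DINetSketch(num_classes=15)  # ScanObjectNN has 15 categories
a, b = torch.randn(4, 1024, 3), torch.randn(4, 1024, 3)
same = torch.tensor([1, 0, 1, 0])
logits, cmp_loss = model(a, b, same)
loss = F.cross_entropy(logits, torch.tensor([0, 1, 2, 3])) + cmp_loss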
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2981357