PL-Net3d: Robust 3D Object Class Recognition Using Geometric Models
Published in: IEEE Access, 2019, Vol. 7, pp. 163757-163766
Format: Article
Language: English
Online access: Full text
Abstract: Three-dimensional point clouds produced by 3D scanners are often noisy and contain outliers. Such data inaccuracies can significantly degrade current deep learning-based methods and reduce their ability to classify objects. Most deep neural network-based object classification methods target high classification accuracy without considering classification robustness. Thus, despite their great success, they still fail to achieve good classification accuracy even at low levels of noise and outliers. This work develops a robust network structure that can reliably identify objects. The proposed method uses patches of planar segments, which robustly capture object appearance. The planar segment information is then fed into a deep neural network for classification. We base our approach on the PointNet deep learning architecture. Our method was tested against several kinds of data inaccuracies, such as scattered outliers, clustered outliers, noise, and missing points. The proposed method shows excellent performance in the presence of these inaccuracies compared to state-of-the-art techniques. By decomposing objects into planes, the proposed method is simple and fast, provides good classification accuracy, and can handle different kinds of point cloud data inaccuracies. The code can be found at https://github.com/AymanMukh/Pl-Net3D.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2952638
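The abstract describes decomposing an object's point cloud into planar segments and classifying the per-segment information with a PointNet-based network. Below is a minimal, hypothetical Python sketch of the plane-decomposition stage only: a simple RANSAC loop peels off planar segments and summarises each with a small descriptor that a PointNet-style classifier could consume. All function names, thresholds, and descriptor choices here are illustrative assumptions, not the authors' method; see https://github.com/AymanMukh/Pl-Net3D for the actual implementation.

```python
# Hypothetical sketch: greedy RANSAC plane extraction producing per-plane
# descriptors. Thresholds and feature choices are assumptions, not Pl-Net3D's.
import numpy as np

def fit_plane_ransac(points, n_iters=200, threshold=0.02, rng=None):
    """Fit one plane to points (N x 3) with RANSAC; return ((normal, d), inlier mask)."""
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = (np.array([0.0, 0.0, 1.0]), 0.0)
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                      # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        dist = np.abs(points @ normal + d)   # point-to-plane distances
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane, best_inliers

def decompose_into_planes(points, max_planes=10, min_points=100):
    """Greedily peel off planar segments; describe each by a small feature vector."""
    remaining = points.copy()
    segments = []
    for _ in range(max_planes):
        if len(remaining) < min_points:
            break
        (normal, _), inliers = fit_plane_ransac(remaining)
        if inliers.sum() < min_points:
            break
        seg = remaining[inliers]
        centroid = seg.mean(axis=0)
        extent = seg.max(axis=0) - seg.min(axis=0)
        # per-plane descriptor: orientation, position, size, point support
        segments.append(np.concatenate([normal, centroid, extent, [len(seg)]]))
        remaining = remaining[~inliers]
    # the stacked (num_planes x feature_dim) matrix is the kind of input a
    # PointNet-style network would classify instead of the raw points
    return np.stack(segments) if segments else np.empty((0, 10))

if __name__ == "__main__":
    # toy example: two noisy perpendicular planes
    rng = np.random.default_rng(1)
    xy = rng.uniform(-1, 1, (500, 2))
    plane1 = np.c_[xy, 0.01 * rng.standard_normal(500)]                   # z ~ 0
    plane2 = np.c_[xy[:, 0], 0.01 * rng.standard_normal(500), xy[:, 1]]   # y ~ 0
    features = decompose_into_planes(np.vstack([plane1, plane2]))
    print(features.shape)  # e.g. (2, 10): one descriptor per recovered plane
```

Because each plane is reduced to a fixed-length descriptor, heavy outliers or missing points tend to perturb only individual segments rather than the whole input, which is the intuition behind the robustness claim in the abstract.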