Bioinspired point cloud representation: 3D object tracking


Bibliographic Details
Published in: Neural Computing & Applications 2018-05, Vol. 29 (9), p. 663-672
Main authors: Orts-Escolano, Sergio, Garcia-Rodriguez, Jose, Cazorla, Miguel, Morell, Vicente, Azorin, Jorge, Saval, Marcelo, Garcia-Garcia, Alberto, Villena, Victor
Format: Article
Language: English
Online access: Full text
Description
Abstract: The problem of processing point cloud sequences is considered in this work. In particular, a system that represents and tracks objects in dynamic scenes acquired using low-cost sensors such as the Kinect is presented. An efficient neural network-based approach is proposed to represent and estimate the motion of 3D objects. This system addresses multiple computer vision tasks such as object segmentation, representation, motion analysis and tracking. The use of a neural network allows the unsupervised estimation of motion and the representation of objects in the scene. This proposal avoids the problem of finding corresponding features while tracking moving objects. A set of experiments is presented that demonstrates the validity of our method for tracking 3D objects. Moreover, an optimization strategy is applied to achieve real-time processing rates. Favorable results are presented, demonstrating the capabilities of the GNG-based algorithm for this task. Some videos of the proposed system are available on the project website ( http://www.dtic.ua.es/~sorts/3d_object_tracking/ ).
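
The abstract refers to a GNG (Growing Neural Gas) representation of the point cloud. As a rough illustration only, and not the authors' implementation, the sketch below shows a minimal standard GNG adaptation loop fitting a small node graph to a synthetic 3D point cloud; the class name, parameter values, and the synthetic input are assumptions made for this example, and the paper's segmentation, tracking, and real-time optimization stages are not covered here. Node pruning of isolated nodes is omitted for brevity.

import numpy as np

class GrowingNeuralGas:
    """Minimal GNG sketch: adapts a graph of reference nodes to 3D samples."""
    def __init__(self, eps_b=0.05, eps_n=0.005, age_max=50,
                 lam=100, alpha=0.5, beta=0.0005, max_nodes=100):
        self.eps_b, self.eps_n = eps_b, eps_n    # winner / neighbour step sizes
        self.age_max, self.lam = age_max, lam    # edge-age limit, insertion period
        self.alpha, self.beta = alpha, beta      # error decay factors
        self.max_nodes = max_nodes
        self.w = [np.random.rand(3), np.random.rand(3)]  # node positions
        self.err = [0.0, 0.0]                            # accumulated errors
        self.edges = {}                                  # (i, j) -> age, with i < j

    def _key(self, i, j):
        return (i, j) if i < j else (j, i)

    def adapt(self, x, step):
        # Find the two nodes closest to the sample.
        d = [float(np.sum((wi - x) ** 2)) for wi in self.w]
        order = np.argsort(d)
        s1, s2 = int(order[0]), int(order[1])
        # Age the winner's edges and accumulate its quantisation error.
        for k in list(self.edges):
            if s1 in k:
                self.edges[k] += 1
        self.err[s1] += d[s1]
        # Move the winner and its topological neighbours towards the sample.
        self.w[s1] = self.w[s1] + self.eps_b * (x - self.w[s1])
        for (i, j) in list(self.edges):
            if s1 in (i, j):
                n = j if i == s1 else i
                self.w[n] = self.w[n] + self.eps_n * (x - self.w[n])
        # Connect (or refresh) the edge between the two winners, drop old edges.
        self.edges[self._key(s1, s2)] = 0
        self.edges = {k: a for k, a in self.edges.items() if a <= self.age_max}
        # Periodically insert a node between the highest-error node and its
        # highest-error neighbour.
        if step % self.lam == 0 and len(self.w) < self.max_nodes:
            q = int(np.argmax(self.err))
            nbrs = [j if i == q else i for (i, j) in self.edges if q in (i, j)]
            if nbrs:
                f = max(nbrs, key=lambda n: self.err[n])
                self.w.append(0.5 * (self.w[q] + self.w[f]))
                r = len(self.w) - 1
                self.edges.pop(self._key(q, f), None)
                self.edges[self._key(q, r)] = 0
                self.edges[self._key(f, r)] = 0
                self.err[q] *= self.alpha
                self.err[f] *= self.alpha
                self.err.append(self.err[q])
        # Globally decay the accumulated errors.
        self.err = [e * (1.0 - self.beta) for e in self.err]

# Usage: fit the graph to a synthetic cloud standing in for one Kinect frame.
cloud = np.random.rand(5000, 3)
gng = GrowingNeuralGas()
for t, x in enumerate(cloud, start=1):
    gng.adapt(x, t)
print(len(gng.w), "nodes,", len(gng.edges), "edges")

The resulting graph is a compact, topology-preserving summary of the cloud, and the abstract's claim that the approach avoids explicit feature correspondences is consistent with re-adapting such a node graph from frame to frame instead of matching features between frames.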
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-016-2585-0