Real-Time People Tracking and Identification From Sparse mm-Wave Radar Point-Clouds

Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 78504-78520
Main authors: Pegoraro, Jacopo; Rossi, Michele
Format: Article
Language: English
Online access: Full text
Abstract: Mm-wave radars have recently gathered significant attention as a means to track human movement and identify subjects from their gait characteristics. A widely adopted identification method is the extraction of the targets' micro-Doppler signature, which is computationally demanding when multiple targets co-exist within the monitored physical space. This computational complexity is the main shortcoming of state-of-the-art approaches and makes them unsuitable for real-time use. In this work, we present an end-to-end, low-complexity yet highly accurate method to track and identify multiple subjects in real time using the sparse point-cloud sequences obtained from a low-cost mm-wave radar. Our proposed system features an extended object tracking Kalman filter, used to estimate the position, shape, and extension of the subjects, integrated with a novel deep learning classifier specifically tailored for effective feature extraction and fast inference on radar point-clouds. The proposed method is thoroughly evaluated on an edge-computing platform from NVIDIA (Jetson series), obtaining greatly reduced execution times (reduced complexity) compared with the best approaches from the literature. Specifically, it achieves accuracies as high as 91.62%, operating at 15 frames per second, in identifying three subjects that concurrently and freely move in an unseen indoor environment, among a group of eight.
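
The extended object tracking Kalman filter mentioned in the abstract refines, frame by frame, an estimate of each subject's state from noisy point-cloud measurements. As a rough illustration of the underlying predict/update cycle only, the following minimal Python sketch tracks a single target centroid with a constant-velocity model; the kf_step helper, the 15 fps frame period, and the noise covariances Q and R are illustrative assumptions, and the paper's actual filter additionally estimates each subject's shape and extension.

import numpy as np

# Minimal constant-velocity Kalman filter for one target centroid.
# Generic illustration only: the parameter values below are assumptions,
# not the settings used in the paper.

dt = 1.0 / 15.0                      # frame period at 15 fps (from the abstract)
F = np.array([[1, 0, dt, 0],         # state transition for [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
H = np.array([[1, 0, 0, 0],          # we observe only the (x, y) centroid
              [0, 1, 0, 0]])
Q = 0.01 * np.eye(4)                 # process noise covariance (assumed)
R = 0.05 * np.eye(2)                 # measurement noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle given centroid z of the point-cloud cluster."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Usage: feed centroid measurements from successive radar frames.
x = np.zeros(4)                      # initial state
P = np.eye(4)
for z in [np.array([0.10, 0.05]), np.array([0.18, 0.12])]:
    x, P = kf_step(x, P, z)
print(x[:2])                         # estimated position after two frames

In the full system, one such filter would run per detected subject, with the cluster of radar points assigned to that subject feeding both the tracker and the identification classifier.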
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3083980