Use of Spatiotemporal Parameters of Gait for Automated Classification of Pedestrian Gender and Age

Bibliographic Details
Published in: Transportation Research Record, 2013-01, Vol. 2393 (1), pp. 31-40
Authors: Hediyeh, Houman; Sayed, Tarek; Zaki, Mohamed H.
Format: Article
Language: English
Online Access: Full text
Description

Abstract: This study investigates the feasibility of using the spatiotemporal parameters of gait (step frequency and step length) as cues for classifying pedestrians according to their gender and age. The gait parameters are automatically extracted from the pedestrian walking speed profile. Computer vision techniques are used for the automatic detection and tracking of pedestrians in an open (uncontrolled) environment. The classification is undertaken with a simple k-nearest-neighbor algorithm. For demonstration, two case studies are used: Vancouver, British Columbia, Canada, and Oakland, California. For gender, correct classification rates of 78% and 81% were achieved for the Vancouver and Oakland case studies, respectively. The Vancouver gender classification considered pedestrians walking alone or in groups, whereas the Oakland gender classification considered only pedestrians walking alone. Age classification achieved a correct classification rate of 86% for the Oakland case study. A second classification measure, the kappa statistic, showed that the results were significant beyond what would be expected by chance. The method has the advantages of relying only on the pedestrian speed profile and using a simple classification algorithm.
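
The abstract states that step frequency and step length are extracted automatically from the pedestrian walking speed profile. As a minimal sketch of that idea, and not the authors' implementation, the Python snippet below assumes a uniformly sampled speed profile that oscillates at the stepping rate, picks the dominant frequency in a plausible stepping band with an FFT, and derives step length from mean speed. The sampling rate, frequency band, and all numeric values are illustrative assumptions.

    # Sketch: recovering step frequency and step length from a pedestrian
    # walking speed profile, assuming the profile oscillates at the
    # stepping rate. Illustration only, not the paper's implementation.
    import numpy as np

    def gait_parameters(speed, fps):
        """speed: walking speed samples (m/s); fps: sampling rate (Hz).
        Returns (step_frequency_hz, step_length_m)."""
        speed = np.asarray(speed, dtype=float)
        detrended = speed - speed.mean()          # keep only the oscillation
        spectrum = np.abs(np.fft.rfft(detrended))
        freqs = np.fft.rfftfreq(speed.size, d=1.0 / fps)
        # Search a plausible adult stepping band (roughly 1.2-2.5 Hz).
        band = (freqs >= 1.2) & (freqs <= 2.5)
        step_freq = freqs[band][np.argmax(spectrum[band])]
        step_length = speed.mean() / step_freq    # speed = frequency x length
        return step_freq, step_length

    # Synthetic example: 1.3 m/s mean speed oscillating at 1.9 steps/s.
    fps = 30.0
    t = np.arange(0, 10, 1 / fps)
    speed = 1.3 + 0.15 * np.sin(2 * np.pi * 1.9 * t) \
            + 0.02 * np.random.randn(t.size)
    print(gait_parameters(speed, fps))  # ~ (1.9 Hz, ~0.68 m)

The last step relies on the kinematic identity walking speed = step frequency x step length, which also ties together the two gait features the paper classifies on.
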
ISSN: 0361-1981 (print), 2169-4052 (online)
DOI: 10.3141/2393-04
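
For the classification step, the abstract names a simple k-nearest-neighbor classifier on the two gait features, evaluated by the correct classification rate and the kappa statistic. The following scikit-learn sketch reproduces that setup under stated assumptions: the feature distributions are synthetic placeholders, and the choice of k = 5 and the train/test split are illustrative, not taken from the paper.

    # Sketch of the paper's classification setup, not the authors' code:
    # k-NN classification of pedestrian gender from two gait features
    # (step frequency in Hz, step length in m), scored with accuracy and
    # Cohen's kappa. Feature values here are synthetic placeholders.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score, cohen_kappa_score

    rng = np.random.default_rng(0)

    # Synthetic gait parameters; the tendency of longer, slower steps for
    # men than for women at similar walking speed is illustrative only.
    n = 200
    male = np.column_stack([rng.normal(1.85, 0.12, n),   # step freq (Hz)
                            rng.normal(0.78, 0.06, n)])  # step length (m)
    female = np.column_stack([rng.normal(2.00, 0.12, n),
                              rng.normal(0.68, 0.06, n)])
    X = np.vstack([male, female])
    y = np.array([0] * n + [1] * n)  # 0 = male, 1 = female

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    # Simple k-NN classifier, as in the paper; k = 5 is an assumed value.
    knn = KNeighborsClassifier(n_neighbors=5)
    knn.fit(X_train, y_train)
    y_pred = knn.predict(X_test)

    print(f"correct classification rate: {accuracy_score(y_test, y_pred):.2f}")
    # Kappa measures agreement beyond chance (0 = chance, 1 = perfect).
    print(f"kappa statistic: {cohen_kappa_score(y_test, y_pred):.2f}")

Because step frequency (Hz) and step length (m) live on different scales, a distance-based classifier like k-NN is usually preceded by feature standardization; the sketch omits that step for brevity.
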