Detection and classification of vehicles from omnidirectional videos using multiple silhouettes

Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), 2017-08, Vol. 20 (3), pp. 893-905
Main Authors: Karaimer, Hakki Can; Baris, Ipek; Bastanlar, Yalin
Format: Article
Language: English
Description
Abstract: To detect and classify vehicles in omnidirectional videos, we propose an approach based on the shape (silhouette) of the moving object obtained by background subtraction. Unlike other shape-based classification techniques, we exploit the information available in multiple frames of the video. We investigated two different approaches for this purpose. One is combining silhouettes extracted from a sequence of frames to create an average silhouette; the other is making individual decisions for each frame and using the consensus of these decisions. Using multiple frames eliminates most of the wrong decisions caused by a poorly extracted silhouette in a single video frame. The vehicle types we classify are motorcycle, car (sedan) and van (minibus). The features extracted from the silhouettes are convexity, elongation, rectangularity and Hu moments. We applied two separate classification methods: the first is a flowchart-based method that we developed, and the second is K-nearest neighbour classification. 60% of the samples in the dataset are used for training, and to ensure randomization in the experiments, threefold cross-validation is applied. The results indicate that using multiple silhouettes increases the classification performance.
ISSN: 1433-7541, 1433-755X
DOI: 10.1007/s10044-017-0593-z
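The abstract names two multi-frame strategies (averaging silhouettes versus per-frame consensus) and four shape features (convexity, elongation, rectangularity, Hu moments). The sketch below is a rough illustration of those ideas using OpenCV, not the authors' implementation: the function names, the 0.5 averaging threshold and the convexity/elongation/rectangularity formulas are common textbook definitions assumed here and may differ from the exact ones used in the paper.

import cv2
import numpy as np


def average_silhouette(masks, threshold=0.5):
    # Combine per-frame binary silhouettes (equal-size uint8 arrays, 0/255)
    # into one averaged silhouette; the 0.5 threshold is an assumed value.
    stack = np.stack([m.astype(np.float32) / 255.0 for m in masks])
    return (stack.mean(axis=0) >= threshold).astype(np.uint8) * 255


def silhouette_features(mask):
    # Convexity, elongation, rectangularity and the 7 Hu moments of the
    # largest contour in a binary silhouette mask (OpenCV 4.x API).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnt = max(contours, key=cv2.contourArea)

    area = cv2.contourArea(cnt)
    hull = cv2.convexHull(cnt)
    # Convexity taken as the ratio of convex-hull perimeter to contour perimeter.
    convexity = cv2.arcLength(hull, True) / cv2.arcLength(cnt, True)

    # Elongation and rectangularity derived from the minimum-area bounding rectangle.
    (_, _), (w, h), _ = cv2.minAreaRect(cnt)
    long_side, short_side = max(w, h), min(w, h)
    elongation = 1.0 - short_side / long_side if long_side > 0 else 0.0
    rectangularity = area / (w * h) if w * h > 0 else 0.0

    hu = cv2.HuMoments(cv2.moments(cnt)).flatten()
    return np.concatenate(([convexity, elongation, rectangularity], hu))

For the consensus alternative described in the abstract, the same feature vector would be computed for each frame, each frame classified individually (e.g. with K-nearest neighbours), and the majority label across frames taken as the final decision.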