A multi-object tracker using dynamic Bayesian networks and a residual neural network based similarity estimator
Published in: Computer Vision and Image Understanding, 2022-12, Vol. 225, Article 103569
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: In this paper we introduce a novel multi-object tracker based on the tracking-by-detection paradigm. This tracker utilises a Dynamic Bayesian Network (DBN) to predict objects' positions through filtering and updating in real time. The algorithm is trained and then tested on the MOTChallenge (https://motchallenge.net/) benchmark of video sequences. After initial testing, a state-of-the-art residual neural network for extracting feature descriptors is added: this ResNet feature extractor is integrated into the tracking algorithm for object similarity estimation to further enhance tracker performance. Finally, we demonstrate the effect of object detection quality on tracker performance using a custom-trained state-of-the-art You Only Look Once (YOLO) v5 object detector. Results are analysed and evaluated with the MOTChallenge Evaluation Kit and compared to state-of-the-art methods.
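The abstract does not specify the DBN beyond a predict/update (filtering) cycle. As a minimal sketch, assuming a linear-Gaussian DBN (i.e. a Kalman filter) over a constant-velocity position state, one filtering step could look as follows; the frame interval and the noise covariances are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Minimal linear-Gaussian (Kalman) predict/update cycle over a
# constant-velocity state [x, y, vx, vy]. Illustrative only: the
# paper's DBN structure and noise parameters are not given here.

dt = 1.0                                   # frame interval (assumed)
F = np.array([[1, 0, dt, 0],               # state transition (constant velocity)
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # only position is observed
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                       # process noise (assumed)
R = 1.0 * np.eye(2)                        # measurement noise (assumed)

def predict(x, P):
    """Propagate the state estimate one frame ahead."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Correct the prediction with a detection z = [x, y]."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

# One filtering step: a track at (10, 20) sees a detection at (11, 21)
x, P = np.array([10., 20., 0., 0.]), np.eye(4)
x, P = predict(x, P)
x, P = update(x, P, np.array([11., 21.]))
```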
Highlights:
• A new multi-object tracker using a dynamic Bayesian network is introduced.
• A state-of-the-art residual neural network extracts feature descriptors.
• The DBN is trained on MOTChallenge data and the feature extractor on the MARS pedestrian dataset.
• IoU and similarity distances are combined with the Hungarian algorithm for maximum-cost assignment (see the sketch after this list).
• A custom-trained object detector is used to analyse the effect of detection quality on tracker performance.
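The association step named in the highlights (IoU and similarity distances fed to the Hungarian algorithm for maximum-score assignment) can be sketched as below. The score weighting, the box format, and the use of SciPy's linear_sum_assignment are assumptions for illustration; the paper's exact cost construction is not reproduced here.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """IoU of two boxes in [x1, y1, x2, y2] format."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def associate(tracks, detections, track_feats, det_feats, w=0.5):
    """Match tracks to detections by maximising a combined IoU and
    cosine-similarity score with the Hungarian algorithm. The weight w
    and the linear combination rule are illustrative assumptions."""
    score = np.zeros((len(tracks), len(detections)))
    for i, t in enumerate(tracks):
        for j, d in enumerate(detections):
            cos = (track_feats[i] @ det_feats[j] /
                   (np.linalg.norm(track_feats[i]) *
                    np.linalg.norm(det_feats[j]) + 1e-9))
            score[i, j] = w * iou(t, d) + (1 - w) * cos
    rows, cols = linear_sum_assignment(score, maximize=True)  # max-score matching
    return list(zip(rows, cols))
```

Here the ResNet embeddings stand in as the "similarity distances" via cosine similarity; in practice a gating threshold on the combined score would typically reject low-quality matches before they spawn identity switches.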
ISSN: 1077-3142, 1090-235X
DOI: 10.1016/j.cviu.2022.103569