Vision-Based Method Integrating Deep Learning Detection for Tracking Multiple Construction Machines

Detailed Description

Saved in:
Bibliographic Details
Published in: Journal of Computing in Civil Engineering, 2021-03, Vol. 35 (2)
Authors: Xiao, Bo; Kang, Shih-Chung
Format: Article
Language: English
Subjects:
Online Access: Full Text
Description
Summary: Tracking construction machines in videos is a fundamental step in the automated surveillance of construction productivity, safety, and project progress. However, existing vision-based tracking methods are not able to achieve high tracking precision, robustness, and practical processing speed simultaneously. Occlusions and illumination variations on construction sites also prevent vision-based tracking methods from obtaining optimal tracking performance. To address these challenges, this research proposes a vision-based method, called construction machine tracker (CMT), to track multiple construction machines in videos. CMT consists of three main modules: detection, association, and assignment. The detection module detects construction machines in each frame using the deep learning algorithm YOLOv3. The association module then relates the detection results of two consecutive frames, and the assignment module produces the tracking results. In testing on four construction videos, CMT achieved 93.2% multiple object tracking accuracy (MOTA) and 86.5% multiple object tracking precision (MOTP) at a processing speed of 20.8 frames per second. The proposed CMT was also integrated into a framework for analyzing excavator productivity in earthmoving cycles, where it achieved 96.9% accuracy.
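The abstract outlines a per-frame detect, associate, assign pipeline but does not state how the association cost or the assignment is computed. The sketch below is not the authors' implementation: it assumes an IoU-based cost between bounding boxes in consecutive frames and one-to-one matching via the Hungarian algorithm, and detect_machines is a hypothetical stand-in for a YOLOv3 detector.

```python
# Minimal sketch of a detect-associate-assign tracking loop.
# Assumptions (not given in the abstract): IoU association cost and
# Hungarian assignment; `detect_machines` is a hypothetical detector.
import numpy as np
from scipy.optimize import linear_sum_assignment


def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)


def track(frames, detect_machines, iou_threshold=0.3):
    """Assign persistent IDs to machine detections across video frames."""
    tracks = {}      # track_id -> last known bounding box
    next_id = 0
    history = []     # per-frame list of (track_id, box)

    for frame in frames:
        detections = detect_machines(frame)   # list of boxes for this frame
        track_ids = list(tracks.keys())

        if track_ids and detections:
            # Association: IoU-based cost between existing tracks and detections.
            cost = np.array([[1.0 - iou(tracks[tid], det) for det in detections]
                             for tid in track_ids])
            # Assignment: optimal one-to-one matching (Hungarian algorithm).
            rows, cols = linear_sum_assignment(cost)
        else:
            rows, cols = np.array([], dtype=int), np.array([], dtype=int)

        matched = set()
        frame_result = []
        for r, c in zip(rows, cols):
            if 1.0 - cost[r, c] >= iou_threshold:
                tid = track_ids[r]
                tracks[tid] = detections[c]
                matched.add(c)
                frame_result.append((tid, detections[c]))

        # Unmatched detections start new tracks.
        for c, det in enumerate(detections):
            if c not in matched:
                tracks[next_id] = det
                frame_result.append((next_id, det))
                next_id += 1

        history.append(frame_result)
    return history
```

In this sketch, tracks that go unmatched are kept indefinitely; a practical tracker would age them out after a few missed frames and add appearance cues to survive occlusions.

For reference, MOTA and MOTP are the standard CLEAR MOT metrics. A common formulation, assuming the overlap-based MOTP variant (consistent with MOTP being reported as a percentage), is:

\[
\mathrm{MOTA} = 1 - \frac{\sum_t \left(\mathrm{FN}_t + \mathrm{FP}_t + \mathrm{IDSW}_t\right)}{\sum_t \mathrm{GT}_t},
\qquad
\mathrm{MOTP} = \frac{\sum_{t,i} d_{t,i}}{\sum_t c_t}
\]

where FN_t, FP_t, and IDSW_t are the false negatives, false positives, and identity switches in frame t; GT_t is the number of ground-truth objects in frame t; d_{t,i} is the bounding-box overlap of the i-th matched pair; and c_t is the number of matches in frame t.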
ISSN: 0887-3801, 1943-5487
DOI: 10.1061/(ASCE)CP.1943-5487.0000957