Proposing Smart System for Detecting and Monitoring Vehicle Using Multiobject Multicamera Tracking
Published in: International Journal of Digital Multimedia Broadcasting, 2024-04, Vol. 2024, p. 1-14
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: The rapid development of the economy and society has driven an increase in traffic participation, making traffic management increasingly difficult. To address this issue, AI is increasingly applied to improve urban traffic management and operations. We therefore propose a smart system that detects and monitors vehicles across multiple surveillance cameras. Our system leverages data collected from traffic surveillance cameras and deep learning to detect and track vehicles reliably. Specifically, it uses a YOLO model for detection in conjunction with the DeepSORT algorithm for precise vehicle tracking on each camera. A ResNet backbone extracts appearance features of the objects within each camera's frames, and cosine distance is used to identify the same objects in other cameras, enabling multicamera tracking (see the sketch following this record). The system is implemented with the NVIDIA DeepStream SDK, achieving a speed of 21 fps per camera and an average precision of approximately 85% across its three modules. The results of our study affirm the system's suitability and its potential for practical application in urban traffic management.
ISSN: 1687-7578, 1687-7586
DOI: | 10.1155/2024/6667738 |
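
The abstract describes matching ResNet appearance embeddings across cameras by cosine distance. Below is a minimal sketch of that matching step, not the authors' implementation: the 512-dimensional embeddings, the greedy nearest-neighbor assignment, and the 0.3 distance threshold are illustrative assumptions.

```python
import numpy as np


def cosine_distance(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pairwise cosine distance between rows of a (n, d) and rows of b (m, d)."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return 1.0 - a @ b.T  # (n, m); 0 = identical direction, 2 = opposite


def match_across_cameras(feats_cam1: np.ndarray,
                         feats_cam2: np.ndarray,
                         threshold: float = 0.3):
    """Greedy cross-camera association: for each camera-1 track embedding,
    take the nearest camera-2 embedding if its cosine distance is below the
    (assumed) threshold. Returns a list of (idx_cam1, idx_cam2) pairs."""
    dist = cosine_distance(feats_cam1, feats_cam2)
    matches = []
    for i, row in enumerate(dist):
        j = int(np.argmin(row))
        if row[j] < threshold:
            matches.append((i, j))
    return matches


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-ins for ResNet appearance embeddings of tracked vehicles (512-d assumed).
    cam1 = rng.normal(size=(4, 512))
    cam2 = np.vstack([
        cam1[1] + 0.05 * rng.normal(size=512),  # same vehicle seen by camera 2
        rng.normal(size=(2, 512)),              # unrelated vehicles
    ])
    print(match_across_cameras(cam1, cam2))  # expect [(1, 0)]
```

In a deployed pipeline like the DeepStream-based one the abstract describes, the embeddings would come from the ResNet feature extractor on each camera, and a matched pair would propagate a shared global ID between the two per-camera DeepSORT tracks.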