High-Performance YOLOv5s: Traffic Sign Detection Algorithm for Small Target
Published in: | IEEE Access, 2024, Vol. 12, pp. 191527-191536 |
---|---|
Format: | Article |
Language: | English |
Online access: | Full text |
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2024.3513445 |
Abstract: | Traffic sign detection is one of the essential foundations of intelligent transportation and intelligent driving systems. In autonomous driving, traffic signs captured by cameras are usually small, and detecting low-resolution images of traffic signs at long distances remains a significant challenge. To improve the accuracy of small traffic sign detection, this paper proposes an improved lightweight model based on You Only Look Once version 5 small (YOLOv5s). First, dense connections are employed to reduce the number of parameters in the backbone network, allowing large-scale feature maps to be reused multiple times and strengthening the network's ability to extract information from small targets. Second, a new C3 module is constructed by combining it with the receptive-field coordinate attention convolution (RFCAConv) mechanism for feature fusion in the neck network, making the network focus more on the details of small targets in feature maps. Finally, the original complete intersection over union (CIoU) loss function is replaced with the inner-shape intersection over union (Inner-SIoU) loss function, which improves both the training speed and the accuracy of the model. Test results on the public traffic sign datasets CCTSDB2021 and TT100K indicate that the proposed model reduces the parameter count by 30%, increases mAP@0.5 by 4-5%, and boosts FPS by 16% compared with the original YOLOv5s model. |
---|---|
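
The record's abstract names the techniques but not their implementations, so the sketches below are illustrative rather than the authors' code. First, the dense connections used to slim the backbone: a minimal DenseNet-style block in PyTorch, assuming standard Conv-BN-SiLU layers; the class name and the `growth` and `n_layers` defaults are hypothetical placeholders, and the paper's actual connection topology may differ.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """DenseNet-style block: each layer consumes the concatenation of the
    input and all previous layers' outputs, so large-scale feature maps
    are reused many times while each 3x3 convolution only has to produce
    a small number of new ("growth") channels."""

    def __init__(self, in_ch: int, growth: int = 32, n_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(growth),
                nn.SiLU(inplace=True),
            ))
            ch += growth  # the next layer also sees this layer's output

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        # Output shape: (N, in_ch + n_layers * growth, H, W)
        return torch.cat(feats, dim=1)
```

Because every new layer emits only `growth` channels rather than a full-width feature map, a dense block reaches a given effective width with fewer parameters than a plain convolution stack, which is the parameter saving the abstract attributes to dense connections. The RFCAConv-based C3 module is not described in enough detail in this record to sketch faithfully, so it is omitted here.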
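Likewise, the abstract gives no formula for Inner-SIoU. The sketch below implements the Inner-IoU component as described in the Inner-IoU literature: auxiliary boxes share each real box's centre but have width and height scaled by a `ratio` factor, and IoU is computed on those auxiliary boxes; Inner-SIoU then substitutes this value into the SIoU loss. The function name, box format, and default `ratio` are assumptions.

```python
import torch

def inner_iou(box1: torch.Tensor, box2: torch.Tensor,
              ratio: float = 0.7, eps: float = 1e-7) -> torch.Tensor:
    """Inner-IoU for boxes given as (..., 4) tensors in (x1, y1, x2, y2) format."""
    # Centres and sizes of the original boxes.
    cx1, cy1 = (box1[..., 0] + box1[..., 2]) / 2, (box1[..., 1] + box1[..., 3]) / 2
    w1, h1 = box1[..., 2] - box1[..., 0], box1[..., 3] - box1[..., 1]
    cx2, cy2 = (box2[..., 0] + box2[..., 2]) / 2, (box2[..., 1] + box2[..., 3]) / 2
    w2, h2 = box2[..., 2] - box2[..., 0], box2[..., 3] - box2[..., 1]

    # Half-extents of the scaled "inner" auxiliary boxes.
    hw1, hh1 = w1 * ratio / 2, h1 * ratio / 2
    hw2, hh2 = w2 * ratio / 2, h2 * ratio / 2

    # Intersection of the two auxiliary boxes (clamped at zero overlap).
    inter_w = (torch.min(cx1 + hw1, cx2 + hw2) - torch.max(cx1 - hw1, cx2 - hw2)).clamp(min=0)
    inter_h = (torch.min(cy1 + hh1, cy2 + hh2) - torch.max(cy1 - hh1, cy2 - hh2)).clamp(min=0)
    inter = inter_w * inter_h

    # Areas of the auxiliary boxes scale by ratio squared.
    union = w1 * h1 * ratio ** 2 + w2 * h2 * ratio ** 2 - inter + eps
    return inter / union
```

A ratio below 1 shrinks the auxiliary boxes, which enlarges the gradient on already-overlapping (high-IoU) samples and is reported to speed up convergence; a ratio above 1 does the opposite for low-IoU samples. In a YOLOv5 regression head this value would stand in for the plain IoU term, e.g. `loss_box = (1.0 - inner_iou(pred_boxes, target_boxes)).mean()`, with the SIoU angle, distance, and shape penalties added on top as in the paper.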