YOLO-Global: a real-time target detector for mineral particles


Detailed Description

Bibliographic Details
Published in: Journal of real-time image processing 2024-05, Vol. 21 (3), p. 85, Article 85
Main authors: Wang, Zihao; Zhou, Dong; Guo, Chengjun; Zhou, Ruihao
Format: Article
Language: English
Description
Summary: Recently, deep learning methodologies have achieved significant advancements in mineral automatic sorting and anomaly detection. However, the limited features of minerals transported in the form of small particles pose significant challenges to accurate detection. To address this challenge, we propose an enhanced mineral particle detection algorithm based on the YOLOv8s model. Initially, a C2f-SRU block is introduced to enable the feature extraction network to process spatially redundant information more effectively. Additionally, we designed the GFF module to enhance information propagation between non-adjacent scale features, thereby enabling deep networks to more fully leverage spatial positional information from shallower networks. Finally, we adopted the Wise-IoU loss function to optimize the detection performance of the model. We also re-designed the positions of the prediction heads to achieve precise detection of small-scale targets. The experimental results substantiate the effectiveness of the algorithm, with YOLO-Global achieving a mAP@.5 of 95.8%. In comparison to the original YOLOv8s, the improved model exhibits a 2.5% increase in mAP and achieves a model inference speed of 81 fps, meeting the requirements for real-time processing and accuracy.
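The abstract adopts the Wise-IoU loss to optimize detection. As a rough illustration only, here is a minimal sketch of the Wise-IoU v1 formulation (a distance-based focusing factor scaling the plain IoU loss); the function names and the axis-aligned (x1, y1, x2, y2) box format are our assumptions, not taken from the paper:

```python
import math

def iou_xyxy(a, b):
    # Intersection over union for axis-aligned boxes in (x1, y1, x2, y2) format.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def wise_iou_v1(pred, gt):
    # Wise-IoU v1 sketch: loss = R * (1 - IoU), where the focusing factor
    # R = exp(center_distance^2 / enclosing_box_diagonal^2) penalizes
    # predictions whose centers drift from the ground truth. In actual
    # training the enclosing-box term is treated as a constant (detached).
    xc_p, yc_p = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    xc_g, yc_g = (gt[0] + gt[2]) / 2, (gt[1] + gt[3]) / 2
    wg = max(pred[2], gt[2]) - min(pred[0], gt[0])  # enclosing-box width
    hg = max(pred[3], gt[3]) - min(pred[1], gt[1])  # enclosing-box height
    r = math.exp(((xc_p - xc_g) ** 2 + (yc_p - yc_g) ** 2)
                 / (wg ** 2 + hg ** 2 + 1e-9))
    return r * (1.0 - iou_xyxy(pred, gt))
```

For a perfectly overlapping prediction the loss is zero; for an offset prediction the exponential factor makes the loss strictly larger than the plain 1 - IoU loss, which is what steers training attention toward poorly localized boxes.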
ISSN: 1861-8200; 1861-8219
DOI: 10.1007/s11554-024-01468-y