CA-YOLOv5: A YOLO model for apple detection in the natural environment

Bibliographic details
Published in: Systems Science & Control Engineering, 2024-12, Vol. 12 (1)
Authors: Yang, Ruotong; He, Yuanbo; Hu, Zhiwei; Gao, Ruibo; Yang, Hua
Format: Article
Language: English
Online access: Full text
Abstract: Improving the effectiveness of harvesting robots requires quick and accurate apple detection in natural environments. The colour and shape features of apples are degraded by reflected light and by incomplete coverage of the fruit bag, which makes apple detection difficult. To address this issue, Coordinate Attention You Only Look Once version 5 (CA-YOLOv5) is designed to detect bagged and unbagged apples simultaneously in the natural environment. Firstly, 1525 apple images are collected from apple orchards to build a dataset. Secondly, to address the reflected-light problem, all C3 modules in the Backbone are replaced with Coordinate Attention modules, which improve the feature representation of objects. Finally, to address the incomplete-bagging problem, the Path Aggregation Network in the Neck is replaced with a Bidirectional Feature Pyramid Network, which better fuses features at different scales. CA-YOLOv5 reaches 82.7%, 89.8%, 48.6%, and 87.0% for recall, mAP@0.5, mAP@0.5:0.95, and F1 score, respectively, which is 2.3%, 1.2%, 1.9%, and 2.9% higher than YOLOv5. The results show that CA-YOLOv5 achieves markedly better detection performance than the original YOLOv5 and can serve as a technical reference for the development of automatic orchard-picking robots.
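
The central architectural change named in the abstract is swapping the Backbone's C3 blocks for Coordinate Attention modules. As a rough illustration only (a minimal PyTorch sketch of a Coordinate Attention block in the style of Hou et al., 2021, not code from this paper; the reduction ratio, activation, and tensor sizes are assumptions), such a module can look like this:

```python
# Minimal sketch of a Coordinate Attention block (assumed hyperparameters, not from the paper).
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)  # bottleneck width (assumed default)
        # Directional pooling: keep one spatial axis, squeeze the other.
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1, bias=False)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        # Separate 1x1 convs produce attention maps along height and width.
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                      # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)  # (B, C, W, 1)
        y = torch.cat([x_h, x_w], dim=2)          # (B, C, H+W, 1)
        y = self.act(self.bn1(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        y_w = y_w.permute(0, 1, 3, 2)             # back to (B, mid, 1, W)
        a_h = torch.sigmoid(self.conv_h(y_h))     # attention along height
        a_w = torch.sigmoid(self.conv_w(y_w))     # attention along width
        return x * a_h * a_w


# Example: a feature map of the size a YOLOv5 backbone stage might emit.
if __name__ == "__main__":
    feat = torch.randn(1, 256, 40, 40)
    print(CoordinateAttention(256)(feat).shape)  # torch.Size([1, 256, 40, 40])
```

The two pooled branches preserve positional information along height and width separately, which is why the mechanism is called coordinate attention.
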
ISSN: 2164-2583
DOI: 10.1080/21642583.2023.2278905