Weed Detection and Classification with Computer Vision Using a Limited Image Dataset

Bibliographic Details
Published in: Applied sciences 2024-06, Vol. 14 (11), p. 4839
Main authors: Moldvai, László; Mesterházi, Péter Ákos; Teschner, Gergely; Nyéki, Anikó
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: In agriculture, as precision farming increasingly employs robots to monitor crops, weeding and harvesting robots are expanding the need for computer vision. Currently, most researchers and companies address these computer vision tasks with CNN-based deep learning. This technology requires large datasets of plant and weed images labeled by experts, as well as substantial computational resources. However, traditional feature-based approaches to computer vision can extract meaningful parameters and achieve comparably good classification results with only a tenth of the dataset size. This study presents these methods and seeks to determine the minimum number of training images required for reliable classification. We tested classification performance with 5, 10, 20, 40, 80, and 160 images per weed type in a four-class classification system. We extracted shape features, distance transformation features, color histograms, and texture features. Each type of feature was tested individually and in various combinations to determine the best results. Using six types of classifiers, we achieved a 94.56% recall rate with 160 images per weed. Better results were obtained with more training images and a greater variety of features.
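The feature-based pipeline the abstract describes (extract hand-crafted features per image, then classify) can be illustrated with a minimal sketch. This is not the authors' implementation: the synthetic images, the nearest-centroid classifier, and the 8-bin per-channel color histogram are assumptions chosen to keep the example self-contained; the paper additionally uses shape, distance-transform, and texture features and six classifier types.

```python
import numpy as np

def color_histogram(img, bins=8):
    """One of the paper's feature types: per-channel color histogram,
    concatenated across channels and normalized to sum to 1 per channel."""
    feats = []
    for c in range(img.shape[2]):
        h, _ = np.histogram(img[:, :, c], bins=bins, range=(0, 256))
        feats.append(h / h.sum())
    return np.concatenate(feats)

rng = np.random.default_rng(0)

def make_class_images(mean_rgb, n, size=32):
    """Synthetic stand-ins for one weed class (hypothetical data,
    NOT the paper's dataset): noisy images around a dominant color."""
    return [np.clip(rng.normal(mean_rgb, 20.0, (size, size, 3)), 0, 255)
            for _ in range(n)]

# Four classes, mirroring the paper's four-class setup.
means = [(60, 120, 40), (150, 90, 60), (40, 60, 150), (120, 120, 120)]
train = {k: make_class_images(m, 20) for k, m in enumerate(means)}
test = {k: make_class_images(m, 10) for k, m in enumerate(means)}

# Simple stand-in classifier: nearest centroid in feature space.
centroids = {k: np.mean([color_histogram(im) for im in ims], axis=0)
             for k, ims in train.items()}

def predict(img):
    f = color_histogram(img)
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))

correct = sum(predict(im) == k for k, ims in test.items() for im in ims)
recall = correct / sum(len(ims) for ims in test.values())
print(f"recall on synthetic data: {recall:.2f}")
```

Because the feature vector here is only 24-dimensional, such a classifier can be trained from a handful of examples per class, which is the small-dataset regime the study investigates.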
ISSN: 2076-3417
DOI: 10.3390/app14114839