Tree Seedlings Detection and Counting Using a Deep Learning Algorithm

Bibliographic Details
Published in: Applied Sciences, 2023-01, Vol. 13 (2), p. 895
Authors: Moharram, Deema; Yuan, Xuguang; Li, Dan
Format: Article
Language: English
Online access: Full text
Abstract: Tree-counting methods based on computer vision are low-cost and efficient compared with traditional tree-counting methods, which are time-consuming, laborious, and often infeasible by hand. This study presents a deep-learning method for detecting and counting tree seedlings in images, a task with high economic value and broad application prospects in determining the type and quantity of tree seedlings. The dataset was built from three types of tree seedlings: dragon spruce, black chokeberry, and Scots pine. The data were augmented with several data augmentation methods to improve the accuracy of the detection model and prevent overfitting. A YOLOv5 object detection network was then built and trained on the three seedling classes to obtain the trained weights. The experimental results showed that the proposed method can effectively identify and count the tree seedlings in an image. Specifically, the mAP for dragon spruce, black chokeberry, and Scots pine seedlings was 89.8%, 89.1%, and 95.6%, respectively. The accuracy of the detection model reached 95.10% on average (98.58% for dragon spruce, 91.62% for black chokeberry, and 95.11% for Scots pine). The proposed method can provide technical support for statistical tree-counting tasks.
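
As a rough illustration of the detect-and-count step the abstract describes (a sketch, not the authors' code), the Python snippet below loads a YOLOv5 model through torch.hub and tallies detections per seedling class. The weights file seedling_best.pt and the image nursery_plot.jpg are hypothetical placeholders; in practice a model fine-tuned on the three seedling classes would be supplied.

import torch
from collections import Counter

# Load YOLOv5 via torch.hub; "custom" expects a fine-tuned weights file.
# The path below is a hypothetical placeholder for weights trained on the
# three seedling classes (dragon spruce, black chokeberry, Scots pine).
model = torch.hub.load("ultralytics/yolov5", "custom", path="seedling_best.pt")

results = model("nursery_plot.jpg")     # inference on one image (hypothetical file)
detections = results.pandas().xyxy[0]   # one DataFrame row per detected box

# Counting seedlings reduces to counting boxes, grouped by predicted class.
counts = Counter(detections["name"])
for species, n in sorted(counts.items()):
    print(f"{species}: {n}")
print(f"total seedlings: {len(detections)}")

Grouping by the predicted class name is what lets a single detector report per-species counts alongside the overall total, matching the per-class results quoted in the abstract.
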
ISSN: 2076-3417
DOI: 10.3390/app13020895