A recognition method for cucumber diseases using leaf symptom images based on deep convolutional neural network


Full description

Saved in:
Bibliographic details
Published in: Computers and Electronics in Agriculture, 2018-11, Vol. 154, p. 18-24
Authors: Ma, Juncheng, Du, Keming, Zheng, Feixiang, Zhang, Lingxian, Gong, Zhihong, Sun, Zhongfu
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract:
• A deep convolutional neural network (DCNN) is presented to recognize four cucumber diseases.
• Taking symptom images as input, the DCNN performs symptom-wise disease recognition.
• The method achieved good recognition results on disease images captured under field conditions.
Manual approaches to recognizing cucumber diseases are often time-consuming, laborious and subjective. A deep convolutional neural network (DCNN) is proposed to conduct symptom-wise recognition of four cucumber diseases: anthracnose, downy mildew, powdery mildew, and target leaf spot. The symptom images were segmented from cucumber leaf images captured under field conditions. To reduce the risk of overfitting, data augmentation was used to enlarge the datasets formed from the segmented symptom images. On the augmented datasets, containing 14,208 symptom images, the DCNN achieved good recognition results, with an accuracy of 93.4%. For comparison, experiments were also conducted with conventional classifiers (Random Forest and Support Vector Machines) as well as AlexNet. The results showed that the DCNN is a robust tool for recognizing cucumber diseases under field conditions.
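The record only describes the approach at a high level (segmented symptom images, data augmentation, a DCNN with four output classes), so the sketch below is an illustrative assumption rather than the architecture reported in the paper: a minimal PyTorch pipeline with standard flip/rotation augmentation and a small convolutional classifier. The input size, layer widths, and transforms are placeholders.

# Hedged sketch of a symptom-wise disease classifier; not the authors' exact DCNN.
import torch
import torch.nn as nn
from torchvision import transforms

NUM_CLASSES = 4  # anthracnose, downy mildew, powdery mildew, target leaf spot

# Illustrative augmentation pipeline; the paper's exact augmentation methods may differ.
train_transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(15),
    transforms.ToTensor(),
])

class SymptomCNN(nn.Module):
    """Small convolutional classifier over segmented symptom-image crops."""
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Example forward pass on a single augmented 128x128 RGB symptom image.
model = SymptomCNN()
logits = model(torch.randn(1, 3, 128, 128))  # shape: [1, 4]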
ISSN:0168-1699
1872-7107
DOI:10.1016/j.compag.2018.08.048