Comparison of Deep Learning Models for Corn Disease Region Location, Identification of Disease Type, and Severity Estimation Using Images Acquired From UAS-Mounted and Handheld Sensors

Bibliographic Details
Published in: Journal of the ASABE 2022, Vol. 65 (6), p. 1433-1442
Authors: Ahmad, Aanis; Saraswat, Dharmendra; Gamal, Aly; Johal, Gurmukh S.
Format: Article
Language: English
Online Access: Full text
Description
Highlights: An approach using deep learning was proposed for identifying diseased regions in UAS imagery of corn fields, with a testing accuracy of 97.23% using the VGG16 model. Disease types were identified within the diseased regions with a testing accuracy of 98.85% using the VGG16 model. On the diseased leaves, severity was estimated with a testing accuracy of 94.20% using the VGG16 model. Deep learning models have the potential to bring efficiency and accuracy to field scouting.
Abstract: Accurately locating diseased regions, identifying disease types, and estimating disease severity in corn fields are connected steps in developing an effective disease management system. Traditional disease management, which relied on manual scouting, was inefficient. The research community is therefore working on advanced disease management systems based on deep learning. However, most past studies trained deep learning models on public datasets of images with uniform backgrounds acquired under lab conditions, thus limiting their use under field conditions. In addition, few studies have addressed in-field corn disease analysis using Unmanned Aerial System (UAS) imagery. Therefore, UAS and handheld imaging sensors were used in this study to acquire corn disease images from fields located at Purdue University's Agronomy Center for Research and Education (ACRE) in the summer of 2020. A total of 55 UAS flights were conducted over three different corn fields from June 20 through September 29, resulting in a collection of approximately 59,000 images. A novel three-stage approach was proposed in which a total of nine image classification models were independently trained using three neural network architectures (VGG16, ResNet50, and InceptionV3) to locate diseased regions, identify disease types, and estimate disease severity under field conditions. Diseased regions were first located in UAS-acquired corn field imagery using sliding-window, deep learning-based image classification, with testing accuracies of up to 97.23%. Three common corn diseases, namely Northern Leaf Blight (NLB), Gray Leaf Spot (GLS), and Northern Leaf Spot (NLS), were then identified within the diseased regions with testing accuracies of up to 98.85%. Finally, the severity of the NLS disease on leaves was estimated with a testing accuracy of up to 94.20%. The VGG16 model achieved the highest testing accuracies across all three stages.
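The three-stage pipeline summarized in the abstract (sliding-window localization of diseased regions, disease-type classification, and severity estimation, each handled by a separately trained classifier) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the TensorFlow/Keras transfer-learning setup, window size, stride, number of severity levels, and classification heads are all assumptions chosen for illustration.

# Illustrative sketch of the three-stage approach described in the abstract.
# All hyperparameters (window size, stride, class counts, heads) are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.keras import layers, models

WINDOW = 224   # VGG16 input size
STRIDE = 112   # assumed 50% overlap between sliding windows

def build_classifier(num_classes: int) -> tf.keras.Model:
    """VGG16 backbone with a small classification head (transfer learning)."""
    base = VGG16(weights="imagenet", include_top=False,
                 input_shape=(WINDOW, WINDOW, 3))
    base.trainable = False  # fine-tuning strategy is an assumption
    return models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])

# Stage 1: diseased vs. healthy region; Stage 2: NLB / GLS / NLS;
# Stage 3: severity classes (the number of levels is an assumption).
region_model   = build_classifier(2)
disease_model  = build_classifier(3)
severity_model = build_classifier(4)

def sliding_windows(image: np.ndarray):
    """Yield (row, col, patch) tiles covering one UAS field image."""
    h, w, _ = image.shape
    for r in range(0, h - WINDOW + 1, STRIDE):
        for c in range(0, w - WINDOW + 1, STRIDE):
            yield r, c, image[r:r + WINDOW, c:c + WINDOW]

def analyze(image: np.ndarray):
    """Run the three stages on every window of one image."""
    results = []
    for r, c, patch in sliding_windows(image):
        x = preprocess_input(patch.astype("float32")[np.newaxis])
        if np.argmax(region_model.predict(x, verbose=0)) == 1:  # diseased window
            disease  = np.argmax(disease_model.predict(x, verbose=0))
            severity = np.argmax(severity_model.predict(x, verbose=0))
            results.append((r, c, disease, severity))
    return results

In practice, each of the three models would first be trained on its own labeled dataset (healthy vs. diseased windows, NLB/GLS/NLS patches, and NLS severity levels), consistent with the abstract's description of nine independently trained models, before being chained as above.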
ISSN: 2769-3287, 2769-3295
DOI: 10.13031/ja.14895