Tar Spot Disease Identification and Severity Estimation Using Deep Learning
Saved in:
Published in: | Journal of the ASABE 2024, Vol.67 (5), p.1353-1368 |
---|---|
Main authors: | Ahmad, Aanis; Aggarwal, Varun; Saraswat, Dharmendra; Johal, Gurmukh S. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 1368 |
---|---|
container_issue | 5 |
container_start_page | 1353 |
container_title | Journal of the ASABE |
container_volume | 67 |
creator | Ahmad, Aanis; Aggarwal, Varun; Saraswat, Dharmendra; Johal, Gurmukh S. |
description | Highlights
Image classification was used to identify tar spot, NLB, GLS, and NLS corn disease with 99.51% testing accuracy.
YOLOv7 object detection was used to locate and identify tar spot lesions on infected leaves with mAP@0.5 of 40.46%.
A novel tar spot severity estimation approach was developed using deep learning and color histogram thresholding.
A web-based disease diagnosis tool was developed to help with an accurate, in-field diagnosis of tar spot in corn.
Abstract.
Management of tar spot disease in corn has traditionally relied on manual field scouting and visual analysis since it was first observed in the US in 2015. The disease has been identified using deep learning (DL) models as the application of computer vision and DL techniques for disease management is increasing. The severity of the disease has been estimated using close-range images of infected corn leaves under lab conditions with uniform backgrounds. However, DL models trained using images acquired under uniform lab conditions are limited in their ability to generalize to field conditions. Although recent studies have shown success in quantifying the disease, its analysis under field conditions with complex backgrounds to provide a field-ready solution in the form of an application has not yet been developed. Therefore, this study acquired a custom handheld imagery dataset of 455 images for the tar spot disease in a greenhouse with noisy backgrounds to simulate field conditions for training DL-based disease identification models. The dataset was combined with a publicly available Corn Disease and Severity (CD&S) dataset consisting of field-acquired diseased images corresponding to Northern Leaf Blight (NLB), Gray Leaf Spot (GLS), and Northern Leaf Spot (NLS). Image classification models were first trained to accurately identify tar spot, NLB, GLS, and NLS diseases. To accurately locate and identify tar spot disease lesions, YOLOv7 object detection models were then trained. In addition, semantic segmentation models were trained using the UNet architecture for leaf and lesion segmentation. After training the image classification models, the highest testing accuracy of 99.51% was achieved with the InceptionV3 model. For YOLOv7 object detection, the highest mAP of 40.46% was achieved for locating and identifying tar spot disease lesions on infected leaves. 
Finally, for UNet semantic segmentation, the mIoU for leaf and lesion segmentation were 0.80 and 0.28, respectively. Therefore, traditional color histogram thresholding was used for segmenting the tar spot lesions, and DL techniques were used to develop a novel severity estimation framework. After evaluating the models, the image classification model was deployed on a progressive web application accessible through a smartphone to enable real-time analysis. In this study, different DL models were trained and evaluated for tar spot disease identification and its severity estimation. In addition, a smartphone-based disease diagnosis tool was developed that has the potential for an accurate, in-field diagnosis of tar spot in corn. Keywords: Computer vision, Deep learning, Disease identification, Image classification, Precision agriculture, Severity estimation, Tar spot. |
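The severity-estimation idea described in the abstract (dark tar spot lesions segmented by color histogram thresholding, with severity taken as the lesion-to-leaf area ratio) can be sketched as below. This is a minimal illustration, not the authors' implementation: the fixed intensity cutoff and the synthetic image are assumptions, whereas the paper derives its threshold from the color histogram and obtains the leaf region from a trained model.

```python
import numpy as np

def severity_from_threshold(gray_leaf, leaf_mask, lesion_thresh=60):
    """Estimate disease severity as the fraction of leaf pixels darker
    than an intensity threshold.

    gray_leaf     : 2-D uint8 grayscale image of the leaf
    leaf_mask     : 2-D bool array, True where a pixel belongs to the leaf
    lesion_thresh : hypothetical fixed cutoff; a histogram-derived value
                    (e.g., Otsu's method) would replace this in practice
    """
    # Tar spot lesions appear as dark stromata, so count dark leaf pixels.
    lesion = (gray_leaf < lesion_thresh) & leaf_mask
    leaf_area = leaf_mask.sum()
    if leaf_area == 0:
        return 0.0
    return lesion.sum() / leaf_area  # severity in [0, 1]

# Synthetic example: a 10x10 "leaf" of healthy tissue with a 2x2 dark patch.
img = np.full((10, 10), 120, dtype=np.uint8)
img[2:4, 2:4] = 30                      # 4 lesion pixels
mask = np.ones((10, 10), dtype=bool)
print(severity_from_threshold(img, mask))  # 4 / 100 = 0.04
```

Restricting the lesion count to pixels inside the leaf mask is what makes a complex background tolerable: background pixels, however dark, never contribute to the severity ratio.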
doi_str_mv | 10.13031/ja.15634 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 2769-3287 |
ispartof | Journal of the ASABE, 2024, Vol.67 (5), p.1353-1368 |
issn | 2769-3287; 2769-3295 |
language | eng |
recordid | cdi_proquest_journals_3149763619 |
source | ASABE Technical Library |
subjects | Applications programs; Classification; Computer vision; Corn; Datasets; Deep learning; Diagnosis; Disease; Disease management; Image acquisition; Image classification; Image processing; Image segmentation; Leaf blight; Leafspot; Leaves; Lesions; Medical imaging; Northern leaf blight; Object recognition; Observational learning; Real time; Semantic segmentation; Semantics; Smartphones; Tar; Tar spot; Training; Vegetables; Visual discrimination learning; Visual fields; Visual observation |
title | Tar Spot Disease Identification and Severity Estimation Using Deep Learning |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-25T20%3A57%3A57IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Tar%20Spot%20Disease%20Identification%20and%20Severity%20Estimation%20Using%20Deep%20Learning&rft.jtitle=Journal%20of%20the%20ASABE&rft.au=Ahmad,%20Aanis&rft.date=2024&rft.volume=67&rft.issue=5&rft.spage=1353&rft.epage=1368&rft.pages=1353-1368&rft.issn=2769-3287&rft.eissn=2769-3287&rft_id=info:doi/10.13031/ja.15634&rft_dat=%3Cproquest_cross%3E3149763619%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3149763619&rft_id=info:pmid/&rfr_iscdi=true |