SEV‐Net: Residual network embedded with attention mechanism for plant disease severity detection
Saved in:
Published in: Concurrency and Computation 2021-05, Vol. 33 (10), p. n/a
Main authors: , , , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract:
Early and accurate assessment of plant disease severity is key to preventing disease outbreaks. Traditional detection methods rely on manual visual inspection to distinguish between types of disease infection, but this is time-consuming, laborious, and inaccurate. To address this problem, this paper proposes a deep learning‐based attention network model (SEV‐Net) for plant disease severity identification and classification. The network embeds improved channel and spatial attention modules into the residual blocks of ResNet. The proposed attention module reduces information redundancy between channels and focuses on the most information‐rich regions of the feature map. In the experiments, SEV‐Net achieved accuracies of 97.59% and 95.37% for multi-plant and single-plant (tomato) disease severity classification, respectively, outperforming existing attention networks (SE‐Net and CBAM). Moreover, visualization techniques showed that SEV‐Net is adept at distinguishing small variations between plant diseases, demonstrating the feasibility and effectiveness of the network. Furthermore, we designed and developed an Android application for real‐time classification of plant disease severity. The application deploys the SEV‐Net model, which offers higher classification accuracy and faster recognition speed.
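The abstract describes embedding channel and spatial attention inside a ResNet residual block. As a rough illustration only (the paper's exact SEV-Net module is not specified here), the sketch below shows the general idea with NumPy: an SE-style channel gate followed by a CBAM-style spatial gate on the transform branch of an identity-shortcut residual block. The learned MLP and convolution weights of real attention modules are replaced with simple pooling for brevity; all function names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W). Squeeze: global average pool per channel,
    # then gate each channel with a sigmoid weight (SE-style;
    # the learned excitation MLP is omitted in this sketch).
    w = sigmoid(feat.mean(axis=(1, 2)))              # (C,)
    return feat * w[:, None, None]

def spatial_attention(feat):
    # Pool across channels (mean and max), then gate each spatial
    # location (CBAM-style; the learned conv layer is omitted).
    pooled = 0.5 * (feat.mean(axis=0) + feat.max(axis=0))  # (H, W)
    return feat * sigmoid(pooled)[None, :, :]

def attention_residual_block(feat):
    # Identity-shortcut residual block: the transform branch
    # (here just the two attention gates, no conv layers) is
    # added back onto the input, as in ResNet.
    out = spatial_attention(channel_attention(feat))
    return feat + out

feat = np.ones((4, 3, 3))
out = attention_residual_block(feat)
print(out.shape)  # (4, 3, 3): attention gates preserve the feature shape
```

In the real network the gates would wrap convolutional layers and use learned parameters; the point of the sketch is only the data flow: channel weighting, then spatial weighting, then the residual addition.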
ISSN: 1532-0626, 1532-0634
DOI: 10.1002/cpe.6161