DCTN: a dense parallel network combining CNN and transformer for identifying plant disease in field
Saved in:
Published in: Soft Computing (Berlin, Germany), 2023-11, Vol. 27 (21), p. 15549-15561
Main authors: , , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Summary: Crop diseases can have a detrimental impact on crop growth, resulting in a significant reduction in crop yield. Accurate detection of these diseases is therefore crucial for enhancing crop productivity. Despite notable advances in deep learning techniques for disease identification, most experiments have been conducted under simplified laboratory conditions, which makes it difficult to accurately identify crop diseases in complex real-world field environments. To bridge this gap, we draw inspiration from the Transformer model's ability to capture long-range global dependencies and handle occlusion. We propose a novel approach called Dense CNNs and Transformer Network (DCTN) for accurate detection of field crop diseases. Moreover, we introduce a new attention mechanism that computes multi-head self-attention via depthwise separable convolution projection and down-sampling, significantly enhancing computational efficiency (a brief illustrative sketch of this idea appears after the record below). Additionally, we have meticulously curated and cleaned a dataset of 45,547 images depicting healthy and diseased crops in real field environments. Our proposed method demonstrates superior performance, particularly in terms of its robustness against background interference in crop disease detection. Notably, DCTN achieves accuracies of 93.01% and 99.69% on our dataset and a publicly available dataset, respectively. For those who are interested, the code for our approach will be made available at https://github.com/wh9704/DCTN.
ISSN: 1432-7643, 1433-7479
DOI: 10.1007/s00500-023-09071-2
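The abstract describes an attention block that forms queries, keys, and values with depthwise separable convolution projections and down-samples the keys and values before multi-head self-attention, which shrinks the attention matrix (roughly from O(N^2) to O(N^2 / s^2) for a down-sampling stride s) and hence the compute cost. Below is a minimal PyTorch sketch of that general idea only; the module name, kernel sizes, stride, normalization, and head count are illustrative assumptions and are not taken from the paper or its repository.

```python
# Hypothetical sketch: multi-head self-attention whose Q/K/V projections use
# depthwise separable convolutions, with K/V down-sampled to cut attention cost.
# All names and hyperparameters are illustrative, not the authors' released code.
import torch
import torch.nn as nn


class ConvProjectionAttention(nn.Module):
    def __init__(self, dim, num_heads=4, kv_stride=2):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5

        def sep_conv(stride):
            # depthwise 3x3 convolution followed by a pointwise 1x1 convolution
            return nn.Sequential(
                nn.Conv2d(dim, dim, 3, stride=stride, padding=1, groups=dim, bias=False),
                nn.BatchNorm2d(dim),
                nn.Conv2d(dim, dim, 1, bias=False),
            )

        self.proj_q = sep_conv(stride=1)
        # keys and values are spatially down-sampled to shrink the attention map
        self.proj_k = sep_conv(stride=kv_stride)
        self.proj_v = sep_conv(stride=kv_stride)
        self.proj_out = nn.Linear(dim, dim)

    def forward(self, x):  # x: (B, C, H, W) feature map
        B, C, H, W = x.shape

        def to_heads(t):  # (B, C, h, w) -> (B, heads, h*w, head_dim)
            return t.flatten(2).transpose(1, 2).reshape(
                B, -1, self.num_heads, self.head_dim).transpose(1, 2)

        q = to_heads(self.proj_q(x))   # (B, heads, H*W, d)
        k = to_heads(self.proj_k(x))   # (B, heads, H*W/stride^2, d)
        v = to_heads(self.proj_v(x))

        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.softmax(dim=-1)
        out = attn @ v                              # (B, heads, H*W, d)
        out = out.transpose(1, 2).reshape(B, H * W, C)
        out = self.proj_out(out)
        # restore the spatial layout of the feature map
        return out.transpose(1, 2).reshape(B, C, H, W)


if __name__ == "__main__":
    block = ConvProjectionAttention(dim=64, num_heads=4, kv_stride=2)
    feats = torch.randn(2, 64, 28, 28)
    print(block(feats).shape)  # torch.Size([2, 64, 28, 28])
```

With `kv_stride=2`, the key/value sequence is four times shorter than the query sequence, so the attention matrix is four times smaller while the output keeps the input's spatial resolution; this is the efficiency trade-off the abstract alludes to, shown here only as a generic pattern.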