An efficient mobile model for insect image classification in the field pest management
Published in: Engineering Science and Technology, an International Journal, 2023-03, Vol. 39, p. 101335, Article 101335
Format: Article
Language: English
Online access: Full text
Abstract: Accurately recognizing insect pests in their larval phase is essential for treating infected crops early and thus for reducing yield losses in agricultural products in a timely manner. Convolutional neural network (CNN)-based classification methods have become the most competitive approach to many technical challenges in image recognition in the field. Focusing on accurate and compact models that can run on mobile devices, this study proposed PCNet (Pest Classification Network), a novel pest classification method based on a lightweight CNN with an embedded attention mechanism. PCNet was designed with EfficientNet V2 as the backbone, and the coordinate attention (CA) mechanism was incorporated into this architecture to learn inter-channel pest information and pest positional information from the input images. Moreover, a feature fusion module was developed by combining the feature maps output by the mobile inverted bottleneck (MBConv) blocks with the feature maps output by average pooling; it fuses shallow-layer and deep-layer features to address the loss of insect pest features during down-sampling. In addition, a stochastic, pipeline-based data augmentation approach was adopted to randomly enhance data diversity and thus avoid model overfitting. The experimental results show that PCNet achieved a recognition accuracy of 98.4 % on a self-built dataset of 30 classes of larvae, outperforming three classic CNN models (AlexNet, VGG16, and ResNet101) and four lightweight CNN models (ShuffleNet V2, MobileNet V3, EfficientNet V1, and EfficientNet V2). To further verify its robustness on different datasets, the proposed model was also tested on two public datasets, IP102 and miniImageNet. PCNet reached 73.7 % accuracy on IP102, outperforming the other models, and 94.0 % on miniImageNet, where it was surpassed only by ResNet101 and MobileNet V3. PCNet has 20.7 M parameters, fewer than the traditional classic CNN models. Its satisfactory accuracy and small size make it suitable for real-time pest recognition in the field on resource-constrained mobile devices. Our code will be available at https://github.com/pby521/PCNet/tree/master.
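The coordinate attention (CA) mechanism mentioned in the abstract is a published attention design; the authors' exact implementation is in the linked repository. Below is a minimal sketch of the standard CA formulation (directional average pooling along height and width, a shared 1x1 bottleneck, and per-axis sigmoid gates), assuming PyTorch; the class name, reduction ratio, and activation are chosen for illustration only.

```python
import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Standard coordinate attention: channel attention that preserves
    positional information along the height and width axes separately."""
    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        _, _, h, w = x.shape
        # Average-pool along each spatial axis to keep positional cues.
        x_h = x.mean(dim=3, keepdim=True)                          # (n, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)      # (n, c, w, 1)
        y = self.act(self.bn1(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (n, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (n, c, 1, w)
        return x * a_h * a_w                                       # broadcast gating
```

In a PCNet-style design, a block like this would typically sit after selected MBConv stages of the EfficientNet V2 backbone so that the attention weights reflect both the channel content and the position of the pest in the image.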
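The feature fusion module is only described at a high level here (MBConv feature maps combined with average-pooled feature maps to link shallow and deep layers). A hypothetical sketch of one common way to realize such shallow-deep fusion follows; the class name, the 1x1 channel projection, and the additive merge are assumptions, not details taken from the paper.

```python
import torch.nn as nn
import torch.nn.functional as F

class ShallowDeepFusion(nn.Module):
    """Fuse a high-resolution (shallow) feature map into a low-resolution
    (deep) one: pool to the deep resolution, match channels, then add."""
    def __init__(self, shallow_channels, deep_channels):
        super().__init__()
        self.proj = nn.Conv2d(shallow_channels, deep_channels, kernel_size=1)

    def forward(self, shallow_feat, deep_feat):
        # Average-pool the shallow map down to the deep map's spatial size,
        # so detail lost in the backbone's down-sampling can be re-injected.
        pooled = F.adaptive_avg_pool2d(shallow_feat, deep_feat.shape[-2:])
        return deep_feat + self.proj(pooled)
```

Concatenation followed by a convolution would serve equally well as the merge; addition is used here only to keep the sketch short.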
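The stochastic, pipeline-based augmentation is likewise described only in outline. One plausible realization with torchvision, in which a random subset of candidate transforms is sampled and applied in random order per image, is sketched below; the candidate operations, their parameters, and the helper name build_stochastic_augmentation are illustrative assumptions.

```python
import random
from torchvision import transforms

def build_stochastic_augmentation(img_size=224):
    """Return a callable that applies a randomly sampled, randomly ordered
    augmentation pipeline, so each training image is perturbed differently."""
    candidate_ops = [
        transforms.RandomHorizontalFlip(p=1.0),
        transforms.RandomVerticalFlip(p=1.0),
        transforms.RandomRotation(degrees=30),
        transforms.ColorJitter(brightness=0.3, contrast=0.3, saturation=0.3),
        transforms.GaussianBlur(kernel_size=3),
    ]

    def _augment(img):
        # Sample 1-3 operations and shuffle their order for this image.
        ops = random.sample(candidate_ops, k=random.randint(1, 3))
        random.shuffle(ops)
        pipeline = transforms.Compose(
            [transforms.RandomResizedCrop(img_size)] + ops + [transforms.ToTensor()]
        )
        return pipeline(img)

    return _augment
```

Such a callable can be passed as the `transform` argument of a torchvision `ImageFolder` dataset during training, while validation images receive only resizing and normalization.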
ISSN: 2215-0986
DOI: 10.1016/j.jestch.2023.101335