Insects identification with convolutional neural network technique in the sweet corn field

Detailed description

Bibliographic details
Published in: IOP Conference Series: Earth and Environmental Science, 2021-02, Vol. 653 (1), p. 012030
Authors: Naufal, A P; Kanjanaphachoat, C; Wijaya, A; Setiawan, N A; Masithoh, R E
Format: Article
Language: English
Description
Abstract: A method to identify insect species with accurate and precise results is important. Automatic object-identification systems with increased accuracy, improved speed, and lower cost have now been developed. A Convolutional Neural Network (CNN) can be applied to image identification or classification by collecting large-scale datasets, containing hundreds to millions of images, from which the many parameters of the network are learned. This research developed and applied a CNN model to identify eight insect species found in sweet corn fields in Thailand: Calomycterus sp., Rhopalosiphum maidis, Frankliniella williamsi, Spodoptera frugiperda, Spodoptera litura, Ostrinia furnacalis, Mythimna separata, and Helicoverpa armigera. The CNN model in this research was built with four convolutional layers, each consisting of Conv2D, batch normalization, max pooling, and dropout sublayers, followed by a fully connected layer. In total, 5568 images were used for training across 10 trials, with a different training configuration for each trial, and the model was then tested on 40 images. The results show that the CNN model succeeded in identifying images of sweet corn insects with 80% to 95% prediction accuracy for images with no background.
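The architecture described in the abstract (four convolutional layers, each with Conv2D, batch normalization, max pooling, and a dropout sublayer, followed by a fully connected layer) can be sketched in Keras. This is a minimal illustrative sketch, not the authors' exact model: the input size, filter counts, kernel sizes, dropout rates, and dense-layer width are assumptions, since the paper's record here does not specify them; only the layer types, the count of four convolutional blocks, and the eight output classes come from the abstract.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 8  # eight insect species listed in the abstract


def build_model(input_shape=(128, 128, 3)):
    """Four conv blocks (Conv2D -> BatchNorm -> MaxPool -> Dropout),
    then a fully connected head, as outlined in the abstract.
    Filter counts and dropout rates below are assumed, not from the paper."""
    inputs = layers.Input(shape=input_shape)
    x = inputs
    for filters in (32, 64, 128, 256):  # assumed filter progression
        x = layers.Conv2D(filters, (3, 3), padding="same", activation="relu")(x)
        x = layers.BatchNormalization()(x)
        x = layers.MaxPooling2D((2, 2))(x)
        x = layers.Dropout(0.25)(x)
    x = layers.Flatten()(x)
    x = layers.Dense(256, activation="relu")(x)  # assumed width
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return models.Model(inputs, outputs)


model = build_model()
```

Such a model would then be compiled with a categorical cross-entropy loss and trained on the labeled insect images, e.g. `model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])`.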
ISSN: 1755-1307, 1755-1315
DOI: 10.1088/1755-1315/653/1/012030