Evolutionary neural architecture search based on efficient CNN models population for image classification


Bibliographic Details
Published in: Multimedia Tools and Applications 2023-07, Vol. 82 (16), p. 23917-23943
Main authors: Termritthikun, Chakkrit, Jamtsho, Yeshi, Muneesawang, Paisarn, Zhao, Jia, Lee, Ivan
Format: Article
Language: eng
Subjects:
Online Access: Full text
Description
Abstract: The aim of this work is to search for a Convolutional Neural Network (CNN) architecture that performs optimally across all factors, including accuracy, memory footprint, and computing time, and is suitable for mobile devices. Although deep learning has evolved for use on devices with minimal resources, its deployment is hampered by the fact that such devices are not designed to run complex models such as CNN architectures. To address this limitation, a Neural Architecture Search (NAS) strategy is considered, which employs a Multi-Objective Evolutionary Algorithm (MOEA) to create an efficient and robust CNN architecture by focusing on three objectives: fast processing times, reduced storage, and high accuracy. Furthermore, we propose a new Efficient CNN Population Initialization (ECNN-PI) method that combines random models with selected strong models to generate the first-generation population. To validate the proposed method, CNN models are trained on the CIFAR-10, CIFAR-100, ImageNet, STL-10, FOOD-101, THFOOD-50, FGVC Aircraft, DTD, and Oxford-IIIT Pets benchmark datasets. The MOEA-Net algorithm outperformed other models on CIFAR-10, whereas MOEA-Net with the ECNN-PI method outperformed other models on both CIFAR-10 and CIFAR-100. Furthermore, both the MOEA-Net algorithm and MOEA-Net with the ECNN-PI method outperformed DARTS, P-DARTS, and Relative-NAS on small-scale multi-class and fine-grained datasets.
ISSN: 1380-7501
1573-7721
DOI: 10.1007/s11042-022-14187-y
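
The abstract above describes two mechanisms that a short sketch can make concrete: a three-objective fitness (accuracy, model size, latency) compared by Pareto dominance, and an ECNN-PI-style first generation that mixes random architectures with copies of selected strong models. The Python snippet below is a minimal illustrative sketch of those two ideas only; the genome encoding, operation names, strong_fraction parameter, and helper functions are assumptions made for illustration, not the published MOEA-Net implementation.

import random
from dataclasses import dataclass

OPS = ["conv3x3", "conv5x5", "dwconv3x3", "maxpool", "skip"]   # assumed search space
GENOME_LEN = 12                                                # assumed number of searchable layers

@dataclass
class Candidate:
    genome: list             # one operation choice per layer
    accuracy: float = 0.0    # objective 1: maximize
    params_mb: float = 0.0   # objective 2: minimize (storage)
    latency_ms: float = 0.0  # objective 3: minimize (processing time)

def random_genome():
    return [random.choice(OPS) for _ in range(GENOME_LEN)]

def init_population(pop_size, strong_genomes, strong_fraction=0.25):
    """ECNN-PI-style initialization: part of the first generation is seeded
    from lightly mutated copies of known strong models, the rest is random."""
    n_strong = int(pop_size * strong_fraction)
    population = []
    for _ in range(n_strong):
        genome = list(random.choice(strong_genomes))
        genome[random.randrange(GENOME_LEN)] = random.choice(OPS)  # light mutation keeps diversity
        population.append(Candidate(genome))
    population += [Candidate(random_genome()) for _ in range(pop_size - n_strong)]
    return population

def dominates(a, b):
    """Pareto dominance over (accuracy up, parameters down, latency down)."""
    no_worse = (a.accuracy >= b.accuracy and a.params_mb <= b.params_mb
                and a.latency_ms <= b.latency_ms)
    strictly_better = (a.accuracy > b.accuracy or a.params_mb < b.params_mb
                       or a.latency_ms < b.latency_ms)
    return no_worse and strictly_better

def pareto_front(population):
    """Keep the candidates not dominated by any other candidate."""
    return [c for c in population
            if not any(dominates(other, c) for other in population if other is not c)]

In a full evolutionary loop each candidate would be trained or estimated to fill in the three objective values before selection; here the fields default to zero so the helpers stay self-contained. For example, init_population(20, [["conv3x3"] * GENOME_LEN]) would seed a quarter of a 20-member first generation from that single strong genome and fill the remainder randomly.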