Parameter-less Pareto local search for multi-objective neural architecture search with the Interleaved Multi-start Scheme
Published in: Swarm and Evolutionary Computation, 2024-06, Vol. 87, p. 101573, Article 101573
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: With the emerging deployment of deep neural networks, such as in mobile devices and autonomous cars, there is a growing demand for neural architecture search (NAS) to automatically design powerful network architectures. It is more reasonable to formulate NAS as a multi-objective optimization problem. In addition to prediction performance, multi-objective NAS (MONAS) problems take into account other criteria such as the number of parameters and inference latency. Multi-objective evolutionary algorithms (MOEAs) are the preferred approach for tackling MONAS due to their effectiveness in dealing with multi-objective optimization problems. Recently, local search-based NAS algorithms have demonstrated their efficiency over MOEAs on MONAS problems. However, their performance has only been verified on bi-objective NAS problems. In this article, we propose a local search algorithm for multi-objective NAS (LOMONAS), an efficient local search framework for solving not only bi-objective NAS problems but also NAS problems with more than two objectives. We additionally present a parameter-less version of LOMONAS, namely IMS-LOMONAS, by combining LOMONAS with the Interleaved Multi-start Scheme (IMS) to help NAS practitioners avoid manual tuning of control parameters. Experimental results on a series of benchmark problems from the CEC'23 Competition demonstrate the competitiveness of LOMONAS and IMS-LOMONAS compared to MOEAs in tackling MONAS within both small-scale and large-scale search spaces. Source code is available at: https://github.com/ELO-Lab/IMS-LOMONAS.
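The abstract describes LOMONAS as a Pareto local search framework for MONAS and IMS-LOMONAS as its parameter-less variant built on the Interleaved Multi-start Scheme. Below is a minimal sketch of a generic Pareto local search loop, assuming hypothetical callables `random_architecture`, `neighbors`, and `evaluate` (returning a tuple of objectives to minimize, e.g. validation error and parameter count); it only illustrates the general technique and is not the authors' LOMONAS/IMS-LOMONAS implementation, which is available in the repository linked above.

```python
# Illustrative sketch of a generic Pareto local search (PLS) loop for a
# multi-objective NAS setting. The callables `random_architecture`,
# `neighbors`, and `evaluate` are hypothetical placeholders, not part of
# the authors' code.
import random


def dominates(f1, f2):
    """True if objective vector f1 Pareto-dominates f2 (minimization)."""
    return all(a <= b for a, b in zip(f1, f2)) and any(a < b for a, b in zip(f1, f2))


def update_archive(archive, cand, f_cand):
    """Insert (cand, f_cand) into the non-dominated archive if it is not dominated."""
    if any(dominates(f, f_cand) for _, f in archive):
        return archive, False
    # Drop archive members that the new candidate dominates, then add it.
    archive = [(x, f) for x, f in archive if not dominates(f_cand, f)]
    archive.append((cand, f_cand))
    return archive, True


def pareto_local_search(random_architecture, neighbors, evaluate, max_evals=1000):
    """Basic PLS: expand unexplored archive members until the evaluation budget is spent."""
    archive, evals = [], 0
    start = random_architecture()
    archive, _ = update_archive(archive, start, evaluate(start))
    evals += 1
    unexplored = [start]
    while unexplored and evals < max_evals:
        current = unexplored.pop(random.randrange(len(unexplored)))
        for nb in neighbors(current):
            if evals >= max_evals:
                break
            archive, added = update_archive(archive, nb, evaluate(nb))
            evals += 1
            if added:
                unexplored.append(nb)
    return archive
```

Under an interleaved multi-start scheme, several such local searches with different starting points and budgets would be launched and interleaved, so that no single control-parameter setting has to be fixed in advance; the sketch above covers only a single local search run.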
ISSN: 2210-6502
DOI: 10.1016/j.swevo.2024.101573