EBNAS: Efficient binary network design for image classification via neural architecture search
Published in: Engineering Applications of Artificial Intelligence, 2023-04, Vol. 120, p. 105845, Article 105845
Authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: To deploy Convolutional Neural Networks (CNNs) on resource-limited devices, binary CNNs with 1-bit activations and weights are a promising approach. Meanwhile, Neural Architecture Search (NAS), which can design lightweight networks beyond manually designed ones, has achieved strong performance in various tasks. To design high-performance binary networks, we propose an efficient binary neural architecture search algorithm, EBNAS. In this paper, we propose improvement strategies to address the information loss caused by binarization, the discretization error between search and evaluation, and the imbalanced operation advantages in the search space. Specifically, we adopt a new search space consisting of operations suited to the binary domain. An L2 path regularization and a variance-based edge regularization are introduced to guide the search process and drive the architecture parameters toward discretization. In addition, we present a search space simplification strategy and adjust the channel sampling proportions to balance the advantages of different operations. We perform extensive experiments on the CIFAR10, CIFAR100, and ImageNet datasets, and the results demonstrate the effectiveness of the proposed methods. For example, with binary weights and activations, EBNAS achieves a Top-1 accuracy of 95.61% on CIFAR10, 78.10% on CIFAR100, and 67.8% on ImageNet. With a similar number of model parameters, our algorithm outperforms other binary NAS methods in both accuracy and efficiency, and remains competitive with manually designed binary networks. The code is available at https://github.com/sscckk/EBNAS.
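The abstract refers to binary CNNs with 1-bit activations and weights. As background (not the paper's actual EBNAS implementation), the standard binarization used in binary CNNs maps each real-valued weight or activation to {-1, +1} with the sign function; a minimal NumPy sketch:

```python
import numpy as np

def binarize(x):
    """Binarize a tensor to {-1, +1} with the sign function.

    Zeros are mapped to +1, a common convention in binary CNNs,
    so that every element carries exactly one bit of information.
    """
    return np.where(x >= 0, 1.0, -1.0)

# Toy example: binarize a small weight matrix.
w = np.array([[0.3, -1.2],
              [0.0, 0.7]])
w_bin = binarize(w)
# w_bin == [[ 1., -1.],
#           [ 1.,  1.]]
```

In training, such binarization is typically paired with a straight-through estimator so gradients can flow through the non-differentiable sign; the sketch above shows only the forward mapping.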
ISSN: 0952-1976, 1873-6769
DOI: 10.1016/j.engappai.2023.105845