Efficient Re-parameterization Operations Search for Easy-to-Deploy Network Based on Directional Evolutionary Strategy

Bibliographic Details
Published in: Neural Processing Letters, 2023-12, Vol. 55 (7), p. 8903-8926
Authors: Yu, Xinyi; Wang, Xiaowei; Rong, Jintao; Zhang, Mingyang; Ou, Linlin
Format: Article
Language: English
Online access: Full text
Description
Summary: Traditional NAS methods improve performance at the cost of the architecture's deployability, and re-parameterization (Rep) technology is expected to solve this problem. However, most current Rep methods rely on prior knowledge to select the re-parameterization operations, which limits architecture performance to the chosen operation types and to that prior knowledge. At the same time, some re-parameterization operations hinder the optimization of the network. To break these restrictions, this work designs an improved re-parameterization search space that includes more types of re-parameterization operations; this search space can further enhance the performance of convolutional networks. An automatic re-parameterization enhancement strategy based on neural architecture search (NAS) is designed to explore this search space effectively and to find a strong re-parameterization architecture. We then solve the optimization problem caused by using certain re-parameterization operations to enhance ResNet-style networks. In addition, we visualize the output features of the architecture to analyze how the re-parameterization architecture forms. We achieve better results on public datasets: under the same training conditions as ResNet, we improve the accuracy of ResNet-50 by 1.82% on ImageNet-1k.
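The core idea behind the re-parameterization the abstract relies on is that several parallel linear branches used during training can be folded into a single convolution at inference time, so the deployed network keeps a plain topology. This is a minimal NumPy sketch of that equivalence (an illustrative example, not the paper's code): a parallel 3x3 + 1x1 branch pair is merged into one 3x3 kernel by zero-padding the 1x1 kernel to 3x3 and summing.

```python
import numpy as np

def conv2d_same(x, k):
    """Naive 'same'-padded 2-D cross-correlation for one image.
    x: (Cin, H, W), k: (Cout, Cin, Kh, Kw) with odd kernel sizes."""
    cout, cin, kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((0, 0), (ph, ph), (pw, pw)))
    H, W = x.shape[1], x.shape[2]
    out = np.zeros((cout, H, W))
    for o in range(cout):
        for i in range(H):
            for j in range(W):
                out[o, i, j] = np.sum(xp[:, i:i + kh, j:j + kw] * k[o])
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 5, 5))       # 2 input channels, 5x5 feature map
k3 = rng.standard_normal((4, 2, 3, 3))   # training-time 3x3 branch
k1 = rng.standard_normal((4, 2, 1, 1))   # training-time parallel 1x1 branch

# Training-time output: the two branches are summed
y_multi = conv2d_same(x, k3) + conv2d_same(x, k1)

# Re-parameterize: zero-pad the 1x1 kernel to 3x3 (value at the center tap)
# and fold it into the 3x3 kernel, yielding a single inference-time conv
k_merged = k3 + np.pad(k1, ((0, 0), (0, 0), (1, 1), (1, 1)))
y_single = conv2d_same(x, k_merged)

print(np.allclose(y_multi, y_single))  # True: the merged conv is exact
```

Because convolution is linear in its kernel, the merge is exact rather than approximate; a BN layer per branch would first be folded into each kernel and bias in the same linear fashion.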
ISSN: 1370-4621, 1573-773X
DOI: 10.1007/s11063-023-11184-6