Pretraining Neural Architecture Search Controllers with Locality-based Self-Supervised Learning
Format: Article
Language: English
Abstract: Neural architecture search (NAS) has fostered progress across various fields of machine learning. Despite its prominent contributions, it is widely criticized for its intrinsic limitation of high computational cost. We aim to ameliorate this by proposing a pretraining scheme that can be generally applied to controller-based NAS. Our method, a locality-based self-supervised classification task, leverages the structural similarity of network architectures to obtain good architecture representations. We incorporate our method into neural architecture optimization (NAO) to analyze the pretrained embeddings and their effectiveness, and highlight that adding a metric learning loss has a favorable impact on NAS. Our code is available at https://github.com/Multi-Objective-NAS/self-supervised-nas.
DOI: 10.48550/arxiv.2103.08157
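The abstract does not spell out the pretraining task, so the following is a minimal sketch of one plausible reading, not the paper's implementation. It assumes architectures are encoded as fixed-length operation-ID sequences, that "locality" means Hamming distance below a threshold, and that the controller encoder is a small NAO-style LSTM; all names and sizes here (`ArchEncoder`, `VOCAB`, `SEQ_LEN`, `TAU`, `make_pair`) are illustrative assumptions.

```python
# Hypothetical sketch of a locality-based self-supervised pretraining task.
# Assumptions (not from the paper): architectures are fixed-length op-ID
# sequences, locality = Hamming distance <= TAU, and the controller encoder
# is a simple LSTM in the style of NAO's encoder.
import random
import torch
import torch.nn as nn

VOCAB, SEQ_LEN, TAU = 8, 20, 3  # illustrative sizes, not the paper's settings

class ArchEncoder(nn.Module):
    def __init__(self, vocab=VOCAB, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.rnn = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, seq):          # seq: (B, SEQ_LEN) int64 op-ID sequences
        h, _ = self.rnn(self.embed(seq))
        return h[:, -1]              # last hidden state as architecture embedding

def hamming(a, b):
    # Structural (dis)similarity between two encoded architectures.
    return (a != b).sum(dim=1)

def make_pair(seq):
    """Perturb a few positions to create a structurally nearby architecture."""
    out = seq.clone()
    for i in range(seq.size(0)):
        for pos in random.sample(range(SEQ_LEN), k=random.randint(1, TAU + 2)):
            out[i, pos] = random.randrange(VOCAB)
    return out

encoder = ArchEncoder()
head = nn.Linear(64 * 2, 2)          # binary "local / not local" classifier
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), 1e-3)

for step in range(100):
    a = torch.randint(VOCAB, (32, SEQ_LEN))
    b = make_pair(a)
    label = (hamming(a, b) <= TAU).long()   # self-supervised target from structure
    logits = head(torch.cat([encoder(a), encoder(b)], dim=1))
    loss = nn.functional.cross_entropy(logits, label)
    opt.zero_grad(); loss.backward(); opt.step()
```

The metric learning loss mentioned in the abstract could plausibly be added as a contrastive or triplet term over the same embeddings, but its exact formulation would likewise be an assumption, since the abstract gives no details.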