SpiderNet: Hybrid Differentiable-Evolutionary Architecture Search via Train-Free Metrics
Format: Article
Language: English
Abstract: Neural Architecture Search (NAS) algorithms are intended to remove
the burden of manual neural network design, and have been shown to be capable
of designing excellent models for a variety of well-known problems. However,
these algorithms require a variety of design parameters in the form of user
configuration or hard-coded decisions, which limits the variety of networks
that can be discovered. This means that NAS algorithms do not eliminate model
design tuning; they merely shift where that tuning must be applied. In this
paper, we present SpiderNet, a hybrid differentiable-evolutionary and
hardware-aware algorithm that rapidly and efficiently produces
state-of-the-art networks. More importantly, SpiderNet is a proof of concept
of a minimally configured NAS algorithm; the majority of design choices seen
in other algorithms are incorporated into SpiderNet's dynamically evolving
search space, reducing the number of user choices to just two: reduction cell
count and initial channel count. SpiderNet produces models highly competitive
with the state of the art, and outperforms random search in accuracy,
runtime, memory size, and parameter count.
DOI: 10.48550/arxiv.2204.09320
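
The "train-free metrics" in the title are scores computed on an untrained
network that estimate how well it would perform after training, letting the
search rank candidate architectures without costly training runs. The record
does not specify which metric SpiderNet uses; as one illustrative example
from the train-free NAS literature, the sketch below implements a
NASWOT-style activation-pattern score (Mellor et al., 2021). The function
name and the synthetic input are illustrative assumptions, not details taken
from the paper.

```python
import numpy as np

def naswot_style_score(binary_codes: np.ndarray) -> float:
    """NASWOT-style train-free score (Mellor et al., 2021); illustrative,
    not necessarily the metric SpiderNet uses.

    binary_codes: (N, D) array where row i is the 0/1 pattern of which
    ReLU units fire for minibatch input i. The network is scored at
    initialization; no training is needed.
    """
    # Kernel of agreements between activation patterns:
    # K[i, j] = units where i and j both fire + units where neither fires.
    active = binary_codes @ binary_codes.T
    inactive = (1.0 - binary_codes) @ (1.0 - binary_codes).T
    kernel = active + inactive
    # A higher log|det K| means the inputs induce more distinguishable
    # activation patterns, which correlates with trained accuracy.
    _, logdet = np.linalg.slogdet(kernel)
    return float(logdet)

# Hypothetical usage: random activation codes for 8 inputs over 64 units.
rng = np.random.default_rng(0)
codes = rng.integers(0, 2, size=(8, 64)).astype(float)
print(naswot_style_score(codes))
```

In a hybrid differentiable-evolutionary search like the one the abstract
describes, a score of this kind can cheaply rank mutated candidates before
any gradient-based refinement is spent on them.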