Prune and Tune Ensembles: Low-Cost Ensemble Learning With Sparse Independent Subnetworks
Main Authors: ,
Format: Article
Language: English
Keywords:
Online Access: Order full text
Abstract: Ensemble Learning is an effective method for improving generalization in machine learning. However, as state-of-the-art neural networks grow larger, the computational cost of training several independent networks becomes expensive. We introduce a fast, low-cost method for creating diverse ensembles of neural networks without needing to train multiple models from scratch. We first train a single parent network. We then create child networks by cloning the parent and dramatically pruning the parameters of each child, yielding an ensemble of members with unique and diverse topologies. Each child network is then briefly trained for a small number of epochs; because the children start from the parent's weights, they converge significantly faster than networks trained from scratch. We explore several ways to maximize diversity among the child networks, including anti-random pruning and one-cycle tuning. This diversity enables "Prune and Tune" ensembles to achieve results competitive with traditional ensembles at a fraction of the training cost. We benchmark our approach against state-of-the-art low-cost ensemble methods and demonstrate marked improvements in both accuracy and uncertainty estimation on CIFAR-10 and CIFAR-100.
DOI: 10.48550/arxiv.2202.11782
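Below is a minimal PyTorch sketch of the procedure the abstract describes: train one parent, clone it into children, prune each child, and briefly tune each child with a one-cycle learning-rate schedule. It pairs each random mask with its complement as a stand-in for the paper's "anti-random" pruning. All function names, the 50% sparsity level, the optimizer settings, and the gradient-masking detail are illustrative assumptions based only on the abstract, not the authors' exact implementation.

```python
import copy
import torch
import torch.nn.functional as F

def random_masks(model, sparsity=0.5):
    """One random binary mask per weight tensor (biases left untouched)."""
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() > 1:  # prune only weight matrices / conv kernels
            scores = torch.rand_like(p)
            thresh = torch.quantile(scores.flatten(), sparsity)
            masks[name] = (scores > thresh).float()
    return masks

def apply_masks(parent, masks):
    """Clone the parent and zero the pruned weights to form one child."""
    child = copy.deepcopy(parent)
    with torch.no_grad():
        for name, p in child.named_parameters():
            if name in masks:
                p.mul_(masks[name])
    return child

def tune(child, train_loader, masks, epochs=1):
    """Briefly fine-tune one child with a one-cycle learning-rate schedule."""
    opt = torch.optim.SGD(child.parameters(), lr=0.01, momentum=0.9)
    sched = torch.optim.lr_scheduler.OneCycleLR(
        opt, max_lr=0.1, epochs=epochs, steps_per_epoch=len(train_loader))
    child.train()
    for _ in range(epochs):
        for x, y in train_loader:
            opt.zero_grad()
            F.cross_entropy(child(x), y).backward()
            # Keep pruned weights at zero by masking their gradients.
            with torch.no_grad():
                for name, p in child.named_parameters():
                    if name in masks and p.grad is not None:
                        p.grad.mul_(masks[name])
            opt.step()
            sched.step()

def prune_and_tune(parent, train_loader, n_pairs=2, sparsity=0.5, epochs=1):
    """Build an ensemble of 2 * n_pairs children from one trained parent."""
    children = []
    for _ in range(n_pairs):
        masks = random_masks(parent, sparsity)
        # Anti-random pairing: the sibling keeps exactly the weights its
        # partner pruned, maximizing topological diversity within the pair.
        anti = {k: 1.0 - m for k, m in masks.items()}
        for m in (masks, anti):
            child = apply_masks(parent, m)
            tune(child, train_loader, m, epochs)
            children.append(child)
    return children
```

At inference time the children would be combined as in a standard ensemble, e.g. by averaging their softmax outputs, which is also where the accuracy and uncertainty-estimation gains reported in the abstract would be measured.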