Multi-step Planning for Automated Hyperparameter Optimization with OptFormer
Format: Article
Language: English
Abstract: As machine learning permeates more industries and models become more expensive and time-consuming to train, the need for efficient automated hyperparameter optimization (HPO) has never been more pressing. Multi-step planning-based approaches to hyperparameter optimization promise improved efficiency over myopic alternatives by more effectively balancing exploration and exploitation. However, the potential of these approaches has not been fully realized due to their technical complexity and computational intensity. In this work, we leverage recent advances in Transformer-based, natural-language-interfaced hyperparameter optimization to circumvent these barriers. We build on the recently proposed OptFormer, which casts both hyperparameter suggestion and target function approximation as autoregressive generation, thus making planning via rollouts simple and efficient. We conduct an extensive exploration of different strategies for performing multi-step planning on top of the OptFormer model to highlight its potential for use in constructing non-myopic HPO strategies.
DOI: 10.48550/arxiv.2210.04971
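
To illustrate the idea in the abstract that autoregressive generation makes "planning via rollouts simple and efficient", here is a minimal sketch of generic Monte Carlo rollout planning. The `suggest` and `predict` callables are hypothetical stand-ins for a model that, like OptFormer, proposes hyperparameters and predicts the resulting objective value from the optimization history; they are not the OptFormer API, and the scoring rule is one generic non-myopic strategy, not necessarily one evaluated in the paper.

```python
from typing import Callable, Dict, List, Tuple

# Hypothetical interfaces: both suggestion and function-value prediction
# are generated from the same model, conditioned on the history so far.
History = List[Tuple[Dict[str, float], float]]
SuggestFn = Callable[[History], Dict[str, float]]
PredictFn = Callable[[History, Dict[str, float]], float]


def rollout_value(history: History, suggest: SuggestFn,
                  predict: PredictFn, horizon: int) -> float:
    """Simulate `horizon` future steps and return the best simulated value."""
    sim = list(history)
    best = max((y for _, y in sim), default=float("-inf"))
    for _ in range(horizon):
        x = suggest(sim)      # sample the next candidate autoregressively
        y = predict(sim, x)   # predict its value with the same model
        sim.append((x, y))    # feed the prediction back in as context
        best = max(best, y)
    return best


def plan_next(history: History, suggest: SuggestFn, predict: PredictFn,
              n_candidates: int = 8, n_rollouts: int = 4,
              horizon: int = 3) -> Dict[str, float]:
    """Pick the candidate whose simulated futures look best on average."""
    candidates = [suggest(history) for _ in range(n_candidates)]

    def score(x: Dict[str, float]) -> float:
        values = []
        for _ in range(n_rollouts):
            sim = history + [(x, predict(history, x))]
            values.append(rollout_value(sim, suggest, predict, horizon))
        return sum(values) / len(values)

    return max(candidates, key=score)


if __name__ == "__main__":
    import random
    # Toy stand-ins: random search over one parameter, noisy quadratic target.
    suggest = lambda h: {"lr": random.uniform(0.0, 1.0)}
    predict = lambda h, x: -(x["lr"] - 0.3) ** 2 + random.gauss(0.0, 0.01)
    print(plan_next([], suggest, predict))
```

Because both roles are served by forward passes of a single generative model, each rollout is just continued sampling; the candidate is chosen by its average simulated future rather than its immediate predicted value, which is what distinguishes this kind of planner from a myopic one.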