The Role of Adaptive Optimizers for Honest Private Hyperparameter Selection
Format: Article
Language: English
Abstract: Hyperparameter optimization is a ubiquitous challenge in machine learning, and the performance of a trained model depends crucially upon the effective selection of its hyperparameters. While a rich set of tools exists for this purpose, there are currently no practical hyperparameter selection methods under the constraint of differential privacy (DP). We study honest hyperparameter selection for differentially private machine learning, in which the process of hyperparameter tuning is accounted for in the overall privacy budget. To this end, we i) show that standard composition tools outperform more advanced techniques in many settings, ii) empirically and theoretically demonstrate an intrinsic connection between the learning rate and clipping norm hyperparameters, iii) show that adaptive optimizers like DPAdam enjoy a significant advantage in the process of honest hyperparameter tuning, and iv) draw upon novel limiting behaviour of Adam in the DP setting to design a new and more efficient optimizer.
DOI: 10.48550/arxiv.2111.04906
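The abstract refers to DPAdam and to an intrinsic coupling between the learning rate and the clipping norm. As a rough illustration only, the numpy sketch below shows a generic DPAdam-style step (per-example clipping, Gaussian noise, then Adam moment updates); the function name, default constants, and noise scaling are assumptions chosen for this sketch and are not taken from the paper.

```python
import numpy as np

def dpadam_step(params, per_example_grads, state, clip_norm=1.0,
                noise_multiplier=1.0, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
    """One illustrative DPAdam-style update: clip each per-example gradient
    to clip_norm, average, add Gaussian noise scaled by
    noise_multiplier * clip_norm / batch_size, then apply Adam moments."""
    batch_size = per_example_grads.shape[0]

    # Per-example clipping: rescale any gradient whose L2 norm exceeds clip_norm.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads * np.minimum(1.0, clip_norm / (norms + 1e-12))

    # Privatized mean gradient: average clipped gradients and add Gaussian noise.
    noisy_grad = clipped.mean(axis=0) + np.random.normal(
        scale=noise_multiplier * clip_norm / batch_size, size=params.shape)

    # Standard Adam moment estimation applied to the privatized gradient.
    beta1, beta2 = betas
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * noisy_grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * noisy_grad ** 2
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])

    return params - lr * m_hat / (np.sqrt(v_hat) + eps)

# Illustrative usage with random per-example gradients of shape (batch, dim):
# state = {"t": 0, "m": np.zeros(5), "v": np.zeros(5)}
# params = np.zeros(5)
# grads = np.random.randn(32, 5)
# params = dpadam_step(params, grads, state)
```

One informal intuition, offered here as a gloss rather than a claim from the paper: because Adam divides the first-moment estimate by the square root of the second moment, rescaling the clipping norm rescales numerator and denominator together, which dampens the learning-rate/clipping-norm coupling that the abstract highlights.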