Accelerated Learning with Robustness to Adversarial Regressors
Format: Article
Language: English
Abstract: High-order momentum-based parameter update algorithms have seen widespread application in training machine learning models. Recently, connections with variational approaches have led to the derivation of new learning algorithms with accelerated learning guarantees. Such methods, however, have only considered the case of static regressors. There is a significant need for parameter update algorithms that can be proven stable in the presence of adversarial time-varying regressors, as is commonplace in control theory. In this paper, we propose a new discrete-time algorithm which 1) provides stability and asymptotic convergence guarantees in the presence of adversarial regressors by leveraging insights from adaptive control theory and 2) provides non-asymptotic accelerated learning guarantees by leveraging insights from convex optimization. In particular, our algorithm reaches an $\epsilon$-sub-optimal point in at most $\tilde{\mathcal{O}}(1/\sqrt{\epsilon})$ iterations when regressors are constant, matching Nesterov's lower bound of $\Omega(1/\sqrt{\epsilon})$ up to a $\log(1/\epsilon)$ factor, and provides guaranteed stability bounds when regressors are time-varying. We provide numerical experiments for a variant of Nesterov's provably hard convex optimization problem with time-varying regressors, as well as for the problem of recovering an image subject to time-varying blur and noise from streaming data.
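The setting the abstract describes, streaming regression against a time-varying regressor, can be made concrete with a minimal sketch. The snippet below is not the paper's proposed algorithm; it shows the classic normalized-gradient update from adaptive control on a hypothetical rotating regressor, and the `regressor` function, the gain `gamma`, and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 5, 2000
theta_star = rng.standard_normal(d)       # unknown parameter to recover

def regressor(t):
    # Hypothetical time-varying regressor: the direction rotates and the
    # amplitude drifts, standing in for an "adversarial" phi_t.
    base = np.sin(0.01 * t * np.arange(1, d + 1))
    return (1.0 + 0.5 * np.sin(0.05 * t)) * base

# Normalized-gradient adaptive update (classic in adaptive control):
# the step is scaled by 1 / (1 + |phi|^2), which keeps each step's effect
# on the prediction error bounded no matter how large phi_t gets.
gamma = 0.5
theta = np.zeros(d)
for t in range(T):
    phi = regressor(t)
    y = phi @ theta_star                  # streamed measurement
    err = phi @ theta - y                 # prediction error
    theta -= gamma * err * phi / (1.0 + phi @ phi)

print("parameter error:", np.linalg.norm(theta - theta_star))
```

The $1/(1 + \|\phi_t\|^2)$ normalization is what keeps each update bounded regardless of how large the regressor grows; dropping it and using a fixed step size is exactly the kind of update that can destabilize under adversarial $\phi_t$, which is the failure mode the paper's stability guarantees target.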
DOI: 10.48550/arxiv.2005.01529