The Anytime Convergence of Stochastic Gradient Descent with Momentum: From a Continuous-Time Perspective

We study the stochastic optimization problem from a continuous-time perspective, with a focus on the Stochastic Gradient Descent with Momentum (SGDM) method. We show that the trajectory of SGDM, despite its \emph{stochastic} nature, converges in $L_2$-norm to a \emph{deterministic} second-order Ordinary Differential Equation (ODE).
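To make the statement concrete, one common form of the SGDM recursion, and of a second-order (heavy-ball) ODE that such iterates are typically compared with, is sketched below. The notation here ($\widehat{\nabla f}$ for an unbiased stochastic gradient, momentum coefficient $\beta$, step size $\eta$, damping constant $a$) and the scaling relating the discrete index $k$ to continuous time are assumptions of this sketch; the truncated abstract does not specify the exact ODE used in the paper.

$$
\begin{aligned}
v_{k+1} &= \beta\, v_k + \widehat{\nabla f}(x_k), \\
x_{k+1} &= x_k - \eta\, v_{k+1},
\end{aligned}
\qquad\text{compared with}\qquad
\ddot{X}(t) + a\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0 .
$$

In this continuous-time view, the stochastic iterates $x_k$ are matched against the deterministic trajectory $X(t)$ as the step size $\eta$ shrinks; the paper's result concerns convergence of this kind in $L_2$-norm.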

Bibliographic Details
Main authors: Feng, Yasong; Jiang, Yifan; Wang, Tianyu; Ying, Zhiliang
Format: Article
Language: English