Random Reshuffling with Momentum for Nonconvex Problems: Iteration Complexity and Last Iterate Convergence
Published in: arXiv.org, 2024-04
Main authors: ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Abstract: Random reshuffling with momentum (RRM) corresponds to the SGD optimizer with the momentum option enabled, as found in popular machine learning libraries such as PyTorch and TensorFlow. Despite its widespread use in practical applications, the understanding of its convergence properties in nonconvex scenarios remains limited. Under a Lipschitz smoothness assumption, this paper provides one of the first iteration complexity results for RRM. Specifically, we prove that RRM achieves the iteration complexity \(O(n^{-1/3}((1-\beta^n)T)^{-2/3})\), where \(n\) denotes the number of component functions \(f(\cdot;i)\) and \(\beta \in [0,1)\) is the momentum parameter. Furthermore, every accumulation point of a sequence of iterates \(\{x^k\}_k\) generated by RRM is shown to be a stationary point of the problem. In addition, under the Kurdyka-Łojasiewicz inequality, a local geometric property, the iterates \(\{x^k\}_k\) provably converge to a unique stationary point \(x^*\) of the objective function. Importantly, in our analysis, this last-iterate convergence is obtained without requiring convexity or a priori boundedness of the iterates. Finally, for polynomial step size schemes, convergence rates of the form \(\|x^k - x^*\| = O(k^{-p})\), \(\|\nabla f(x^k)\|^2 = O(k^{-q})\), and \(|f(x^k) - f(x^*)| = O(k^{-q})\), with \(p \in (0,1]\) and \(q \in (0,2]\), are derived.
ISSN: 2331-8422
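The correspondence stated in the abstract, RRM being the SGD optimizer with momentum applied over a randomly reshuffled pass through the \(n\) component functions each epoch, can be made concrete with a short sketch. The snippet below is an illustrative PyTorch example under assumed choices (a synthetic least-squares finite sum, momentum \(\beta = 0.9\), and a constant step size); none of these specifics come from the paper, and the paper's analysis targets general nonconvex, Lipschitz-smooth objectives rather than this toy problem.

```python
# Illustrative sketch of random reshuffling with momentum (RRM).
# Assumptions (not from the paper): synthetic least-squares components
# f(x; i) = 0.5 * (a_i^T x - b_i)^2, beta = 0.9, constant step size.
import torch

torch.manual_seed(0)
n, d = 32, 5                       # number of component functions and dimension
A = torch.randn(n, d)
b = torch.randn(n)

x = torch.zeros(d, requires_grad=True)             # iterate x^k
opt = torch.optim.SGD([x], lr=0.01, momentum=0.9)  # "momentum option enabled"

def component_loss(i):
    """f(x; i): a single component of the finite-sum objective."""
    return 0.5 * (A[i] @ x - b[i]) ** 2

for epoch in range(100):               # epochs k = 0, 1, ..., T-1
    perm = torch.randperm(n).tolist()  # reshuffle the n components each epoch
    for i in perm:                     # one pass through all components
        opt.zero_grad()
        loss = component_loss(i)
        loss.backward()
        opt.step()                     # m <- beta * m + grad;  x <- x - lr * m

full_loss = 0.5 * ((A @ x.detach() - b) ** 2).mean()
print(f"final mean loss: {full_loss:.4f}")
```

Setting momentum=0.0 in this sketch recovers plain random reshuffling; the complexity bound quoted in the abstract is stated in terms of the number of epochs \(T\) and the number of components \(n\).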