Riemannian SVRG: Fast Stochastic Optimization on Riemannian Manifolds
Published in: Advances in Neural Information Processing Systems 29 (NIPS 2016)
Saved in:
Main authors: | , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Summary: | We study optimization of finite sums of geodesically smooth functions on Riemannian manifolds. Although variance reduction techniques for optimizing finite sums have received tremendous attention in recent years, existing work is limited to vector-space problems. We introduce Riemannian SVRG (RSVRG), a new variance-reduced Riemannian optimization method. We analyze RSVRG for both geodesically convex and nonconvex (smooth) functions. Our analysis reveals that RSVRG inherits the advantages of the usual SVRG method, but with factors depending on the curvature of the manifold that influence its convergence. To our knowledge, RSVRG is the first provably fast stochastic Riemannian method. Moreover, our paper presents the first non-asymptotic complexity analysis (novel even for the batch setting) for nonconvex Riemannian optimization. Our results have several implications; for instance, they offer a Riemannian perspective on variance-reduced PCA, which promises a short, transparent convergence analysis. |
DOI: | 10.48550/arxiv.1605.07147 |
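The abstract builds on the SVRG variance-reduced gradient estimator, which RSVRG generalizes to manifolds (roughly, vector subtraction and the gradient step are replaced by parallel transport and the exponential map). As a minimal, hedged sketch of that underlying estimator in the vector-space case, the following applies standard SVRG to a least-squares finite sum; the problem, step size, and epoch length are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Illustrative finite-sum least-squares problem: f(x) = (1/n) sum_i f_i(x),
# with f_i(x) = 0.5 * (a_i . x - b_i)^2. All constants here are assumptions.
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of the i-th component f_i.
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Full (batch) gradient, equal to the average of the grad_i.
    return A.T @ (A @ x - b) / n

def svrg(x, step=0.01, epochs=30, m=100):
    for _ in range(epochs):
        x_snap = x.copy()
        mu = full_grad(x_snap)  # full gradient at the snapshot point
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: the control variate
            # grad_i(x_snap, i) - mu vanishes in expectation, and the
            # estimator's variance shrinks as x approaches the optimum.
            v = grad_i(x, i) - grad_i(x_snap, i) + mu
            x = x - step * v
    return x

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
x_hat = svrg(np.zeros(d))
err = np.linalg.norm(x_hat - x_star)
print(err)
```

In the Riemannian setting of the paper, the update `x - step * v` becomes an exponential-map step along `-step * v`, and the snapshot gradient must be parallel-transported to the tangent space at the current iterate before the subtraction is meaningful.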