S-ADDOPT: Decentralized Stochastic First-Order Optimization Over Directed Graphs
Published in: | IEEE Control Systems Letters, 2021-07, Vol. 5 (3), pp. 953-958 |
Main Authors: | , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Summary: | We study decentralized stochastic optimization to minimize a sum of smooth and strongly convex cost functions distributed over a directed network of nodes. In contrast to existing work, we use gradient tracking to improve certain aspects of the resulting algorithm. In particular, we propose the S-ADDOPT algorithm, which assumes a stochastic first-order oracle at each node, and show that for a constant step-size $\alpha$, each node converges linearly inside an error ball around the optimal solution, the size of which is controlled by $\alpha$. For decaying step-sizes $\mathcal{O}(1/k)$, we show that S-ADDOPT converges to the exact solution sublinearly at the rate $\mathcal{O}(1/k)$ and that its convergence is asymptotically network-independent. The asymptotic behavior of S-ADDOPT is thus comparable to that of centralized stochastic gradient descent. Numerical experiments on both strongly convex and non-convex problems illustrate the convergence behavior and performance of the proposed algorithm. |
ISSN: | 2475-1456 |
DOI: | 10.1109/LCSYS.2020.3006420 |
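The abstract describes S-ADDOPT as a gradient-tracking method driven by a stochastic first-order oracle over a directed network. Below is a minimal sketch of push-sum-based stochastic gradient tracking in that spirit; the directed ring topology, the column-stochastic weight matrix `B`, the toy least-squares objectives, the step-size value, and all names are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

# A hedged sketch of push-sum stochastic gradient tracking over a directed
# graph. The exact S-ADDOPT recursion in the paper may differ in details.
rng = np.random.default_rng(0)
n, d, T = 8, 5, 3000          # nodes, dimension, iterations (all assumed)
alpha = 0.02                  # constant step-size: linear rate to an error ball

# Column-stochastic weights B for a directed ring: each node splits its mass
# between itself and its successor, so every column of B sums to 1.
B = 0.5 * (np.eye(n) + np.roll(np.eye(n), 1, axis=0))

# Node i holds 20 samples (a, b); its local cost is the average of
# 0.5 * (a @ x - b)**2, which is smooth and strongly convex almost surely.
A = rng.normal(size=(n, 20, d))
x_star = rng.normal(size=d)
b = A @ x_star + 0.1 * rng.normal(size=(n, 20))

def stoch_grad(i, z):
    """Stochastic first-order oracle: gradient at one random local sample."""
    j = rng.integers(20)
    return (A[i, j] @ z - b[i, j]) * A[i, j]

x = np.zeros((n, d))                    # push-sum numerators
y = np.ones(n)                          # push-sum weights (de-bias B)
z = x / y[:, None]                      # bias-corrected estimates
w = np.array([stoch_grad(i, z[i]) for i in range(n)])  # gradient trackers
g_old = w.copy()

for k in range(T):
    x = B @ x - alpha * w               # mix numerators, step along tracker
    y = B @ y                           # mix push-sum weights
    z = x / y[:, None]                  # de-biased iterate at each node
    g_new = np.array([stoch_grad(i, z[i]) for i in range(n)])
    w = B @ w + g_new - g_old           # track the network-average gradient
    g_old = g_new

print("mean distance to x*:", np.mean(np.linalg.norm(z - x_star, axis=1)))
```

With the constant step-size above, the node estimates settle into a neighborhood of the minimizer whose radius shrinks with $\alpha$, matching the error-ball claim in the abstract; swapping `alpha` for a decaying $\mathcal{O}(1/k)$ schedule would drive the error to zero, as the abstract states.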