Robustness and Convergence Analysis of First-Order Distributed Optimization Algorithms over Subspace Constraints
Format: Article
Language: English
Abstract: This paper extends algorithms that remove the fixed-point bias of decentralized gradient descent to solve the more general problem of distributed optimization over subspace constraints. Leveraging the integral quadratic constraint framework, we analyze the performance of these generalized algorithms in terms of worst-case robustness and convergence rate. The utility of our framework is demonstrated by showing how one of the extended algorithms, originally designed for consensus, is now able to solve a multitask inference problem.
DOI: 10.48550/arxiv.2210.16277
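The fixed-point bias the abstract refers to can be sketched on a toy consensus problem. The instance below (local objectives, mixing matrix, step size) is an illustrative assumption, not taken from the paper: each of three agents holds a scalar quadratic f_i(x) = 0.5 (x - b_i)^2, so the network-wide optimum is mean(b). With a constant step size, decentralized gradient descent (DGD) settles at a biased fixed point, while a gradient-tracking scheme (one standard way such bias is removed) converges to the exact optimum:

```python
import numpy as np

# Toy instance (illustrative assumption): 3 agents, f_i(x) = 0.5*(x - b_i)^2,
# so the global minimizer is mean(b).
b = np.array([1.0, 2.0, 6.0])
x_star = b.mean()
grad = lambda x: x - b                 # stacked local gradients

W = np.full((3, 3), 1 / 3)             # doubly stochastic mixing (complete graph)
alpha = 0.1                            # constant step size

# Decentralized gradient descent: with constant alpha, its fixed point
# is biased away from x_star (the agents never fully agree on x_star).
x = np.zeros(3)
for _ in range(2000):
    x = W @ x - alpha * grad(x)
x_dgd = x

# Gradient tracking: an auxiliary variable y tracks the network-average
# gradient, so the only fixed point is exact consensus at x_star.
x = np.zeros(3)
y = grad(x)
for _ in range(2000):
    x_new = W @ x - alpha * y
    y = W @ y + grad(x_new) - grad(x)
    x = x_new
x_gt = x

print(np.abs(x_dgd - x_star).max())    # stays bounded away from 0: the bias
print(np.abs(x_gt - x_star).max())     # driven to numerical zero
```

This is only a sketch of the phenomenon in the consensus (all-agree) special case; the paper's setting generalizes the constraint from the consensus subspace to arbitrary subspace constraints.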