Linear Convergence of Consensus-Based Quantized Optimization for Smooth and Strongly Convex Cost Functions
Saved in:
Published in: | IEEE Transactions on Automatic Control, 2021-03, Vol. 66 (3), p. 1254-1261 |
---|---|
Main authors: | , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Summary: | This article proposes a distributed optimization method for minimizing the sum of smooth and strongly convex cost functions under a finite communication bandwidth. Each agent maintains a state and an auxiliary variable to estimate the optimal solution and the average gradient of the global cost function. To cooperatively estimate the optimal solution, agents exchange their states and auxiliary variables with their neighbors over weight-balanced networks via a dynamic encoding and decoding scheme. After the information exchange, each agent locally updates its own state and auxiliary variable by a quantized gradient-tracking algorithm. We show that the state updated by the proposed quantized algorithm converges to the optimal solution at a linear rate. We also give a sufficient condition that guarantees a finite communication bandwidth. |
---|---|
ISSN: | 0018-9286 (print), 1558-2523 (electronic) |
DOI: | 10.1109/TAC.2020.2989281 |