Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks
Main Authors:
Format: Article
Language: English
Online Access: Order full text
Abstract:
Network consensus optimization has received increasing attention in recent years and has found important applications in many scientific and engineering fields. To solve network consensus optimization problems, one of the most well-known approaches is the distributed gradient descent method (DGD). However, in networks with slow communication rates, DGD performs poorly on high-dimensional network consensus problems because of the communication bottleneck. This motivates us to design a communication-efficient DGD-type algorithm based on compressed information exchanges. Our contributions in this paper are three-fold: i) we develop a communication-efficient algorithm called amplified-differential compression DGD (ADC-DGD) and show that it converges under *any* unbiased compression operator; ii) we rigorously prove the convergence performance of ADC-DGD and show that it matches that of DGD without compression; iii) we reveal an interesting phase-transition phenomenon in the convergence speed of ADC-DGD. Collectively, our findings advance the state of the art of network consensus optimization theory.
DOI: 10.48550/arxiv.1812.04048
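
The abstract's key requirement is that the compression operator Q be unbiased, i.e. E[Q(x)] = x. The paper itself does not specify an operator in this record, so the following is only a minimal sketch, assuming one common example of such an operator (random sparsification with rescaling); the function name and parameters are hypothetical, not the paper's method.

```python
import numpy as np

def random_sparsify(x, p=0.1, rng=None):
    """Unbiased random sparsification: keep each coordinate of x with
    probability p and rescale it by 1/p, so that E[Q(x)] = x while only
    about p * len(x) coordinates need to be communicated."""
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) < p
    return np.where(mask, x / p, 0.0)

# Empirical check of unbiasedness: the average of many compressed
# copies of x should approach x itself.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
avg = np.mean([random_sparsify(x, p=0.1, rng=rng) for _ in range(20000)],
              axis=0)
print(np.max(np.abs(avg - x)))  # small, and shrinks with more samples
```

Operators of this form trade a higher per-message variance for a roughly 10x reduction in communicated coordinates, which is the regime the abstract targets for slow-communication networks.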