Byzantine-Robust Decentralized Learning via ClippedGossip
Format: Article
Language: English
Online access: Order full text
Abstract: In this paper, we study the challenging task of Byzantine-robust
decentralized training on arbitrary communication graphs. Unlike federated
learning, where workers communicate through a server, workers in the
decentralized environment can only talk to their neighbors, making it harder to
reach consensus and benefit from collaborative training. To address these
issues, we propose a ClippedGossip algorithm for Byzantine-robust consensus and
optimization, which is the first to provably converge to a
$O(\delta_{\max}\zeta^2/\gamma^2)$ neighborhood of the stationary point for
non-convex objectives under standard assumptions. Finally, we demonstrate the
encouraging empirical performance of ClippedGossip under a large number of
attacks.
DOI: 10.48550/arxiv.2202.01545
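The gossip-with-clipping update named in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the specific clipping rule (scaling each neighbor's displacement so its norm is at most a threshold `tau`) and the form of the mixing weights are assumptions based on the algorithm's name and standard gossip averaging, not details taken from this record.

```python
import numpy as np

def clip(v, tau):
    # Scale the vector v down so that its norm is at most tau (assumed rule).
    norm = np.linalg.norm(v)
    return v * (tau / norm) if norm > tau else v

def clipped_gossip_step(x, weights, i, tau):
    # One hypothetical consensus step for worker i:
    #   x       -- list of parameter vectors, one per worker
    #   weights -- mixing weights {j: w_ij} over worker i's neighbors
    #   tau     -- clipping radius limiting any single neighbor's influence
    # Worker i moves toward its neighbors, but each neighbor's pull is
    # clipped, so a Byzantine neighbor sending an extreme vector cannot
    # drag worker i arbitrarily far.
    return x[i] + sum(w * clip(x[j] - x[i], tau)
                      for j, w in weights.items())

# Toy example: worker 2 is Byzantine and reports an extreme value (100),
# but clipping bounds its pull to the same magnitude as an honest neighbor.
params = [np.array([0.0]), np.array([1.0]), np.array([100.0])]
updated = clipped_gossip_step(params, {1: 0.5, 2: 0.5}, i=0, tau=1.0)
```

Without clipping, the same step would move worker 0 to 50.5, i.e. the Byzantine neighbor would dominate the average; with `tau = 1.0` the update stays at 1.0.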