Papaya: Federated Learning, but Fully Decentralized
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: Federated Learning systems use a centralized server to aggregate model updates. This centralization is a bandwidth- and resource-heavy constraint and exposes the system to privacy concerns. We instead implement a peer-to-peer learning system in which nodes train on their own data and periodically compute a weighted average of their parameters with those of their peers according to a learned trust matrix. So far, we have created a model client framework and have used it to run experiments on the proposed system with multiple virtual nodes that in reality exist on the same computer. We used this strategy, as stated in Iteration 1 of our proposal, to prove the concept of peer-to-peer learning with shared parameters. We now plan to run further experiments and build a more deployable real-world version of the system.
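The summary describes nodes alternating local training with trust-weighted averaging of parameters across peers. The following is a minimal sketch of that mixing step only, not the authors' implementation: the parameter vectors, the placeholder local training step, and the fixed row-stochastic trust matrix are all illustrative assumptions (in the paper the trust matrix is learned, which is omitted here).

```python
# Minimal sketch (assumed, not from the paper): trust-weighted peer-to-peer averaging.
import numpy as np

rng = np.random.default_rng(0)

NUM_NODES = 4
DIM = 8  # toy parameter dimension

# One parameter vector per virtual node (all "nodes" live in the same process,
# mirroring the multiple-virtual-nodes-on-one-computer setup described above).
params = [rng.normal(size=DIM) for _ in range(NUM_NODES)]

# Assumed row-stochastic trust matrix: trust[i, j] is how much node i weights node j.
trust = rng.random((NUM_NODES, NUM_NODES))
trust /= trust.sum(axis=1, keepdims=True)


def local_train_step(theta: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """Placeholder for a node's local training pass on its own data."""
    grad = theta  # stands in for a real gradient; here it just shrinks the parameters
    return theta - lr * grad


def peer_average(params: list[np.ndarray], trust: np.ndarray) -> list[np.ndarray]:
    """Each node replaces its parameters with a trust-weighted average of all peers'."""
    stacked = np.stack(params)      # shape (NUM_NODES, DIM)
    mixed = trust @ stacked         # row i = sum_j trust[i, j] * params[j]
    return [mixed[i] for i in range(NUM_NODES)]


for _ in range(5):
    params = [local_train_step(theta) for theta in params]  # local training
    params = peer_average(params, trust)                    # periodic peer averaging

print("parameter norms after 5 rounds:",
      [round(float(np.linalg.norm(p)), 4) for p in params])
```

Because each row of the trust matrix sums to one, the mixing step is a convex combination of peer parameters, which is the sense in which the averaging is "weighted" per node.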
DOI: 10.48550/arxiv.2303.06189