Learning Across Domains and Devices: Style-Driven Source-Free Domain Adaptation in Clustered Federated Learning
Format: Article
Language: English
Abstract: Federated Learning (FL) has recently emerged as a possible way to tackle the domain shift in real-world Semantic Segmentation (SS) without compromising the private nature of the collected data. However, most of the existing works on FL unrealistically assume labeled data in the remote clients. Here we propose a novel task (FFREEDA) in which the clients' data is unlabeled and the server accesses a source labeled dataset for pre-training only. To solve FFREEDA, we propose LADD, which leverages the knowledge of the pre-trained model by employing self-supervision with ad-hoc regularization techniques for local training and by introducing a novel federated clustered aggregation scheme based on the clients' style. Our experiments show that the algorithm efficiently tackles the new task, outperforming existing approaches. The code is available at https://github.com/Erosinho13/LADD.
DOI: 10.48550/arxiv.2210.02326
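
The style-driven clustered aggregation mentioned in the abstract can be pictured as follows: each client summarizes its images with a style descriptor, the server groups clients by that descriptor, and model updates are averaged within each group. The sketch below is a minimal illustration only, assuming a simple channel-wise mean/std descriptor and k-means clustering; the helper names `style_signature`, `cluster_clients`, and `cluster_aggregate` are hypothetical and do not come from the LADD codebase, whose actual style extraction and aggregation may differ.

```python
# Hypothetical sketch of style-based client clustering with per-cluster
# federated averaging. Not the authors' implementation.
import numpy as np
from sklearn.cluster import KMeans


def style_signature(images: np.ndarray) -> np.ndarray:
    """Simple style descriptor for one client: per-channel mean and std
    over its images (array of shape N x H x W x C)."""
    means = images.mean(axis=(0, 1, 2))
    stds = images.std(axis=(0, 1, 2))
    return np.concatenate([means, stds])


def cluster_clients(signatures: list, n_clusters: int) -> np.ndarray:
    """Assign each client to a style cluster via k-means on its descriptor."""
    features = np.stack(signatures)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)


def cluster_aggregate(client_weights: list, assignments: np.ndarray,
                      n_clusters: int) -> list:
    """Average flattened parameter vectors within each style cluster
    (plain FedAvg restricted to cluster members)."""
    aggregated = []
    for c in range(n_clusters):
        members = [w for w, a in zip(client_weights, assignments) if a == c]
        aggregated.append(np.mean(members, axis=0))
    return aggregated


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake data: 6 clients, each with 4 RGB images and a toy weight vector.
    images_per_client = [rng.random((4, 32, 32, 3)) + i % 2 for i in range(6)]
    weights_per_client = [rng.random(10) for _ in range(6)]

    sigs = [style_signature(imgs) for imgs in images_per_client]
    groups = cluster_clients(sigs, n_clusters=2)
    per_cluster_models = cluster_aggregate(weights_per_client, groups, 2)
    print("cluster assignments:", groups)
    print("aggregated models per cluster:", len(per_cluster_models))
```

The design choice this sketch illustrates is that clients with similar visual style (e.g., similar lighting or sensor statistics) share an aggregated model, which limits interference between heterogeneous, unlabeled client distributions.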