Personalised Federated Learning On Heterogeneous Feature Spaces
Format: Article
Language: English
Abstract: Most personalised federated learning (FL) approaches assume that the raw data of all clients are defined in a common subspace, i.e., all clients store their data according to the same schema. For real-world applications, this assumption is restrictive, as clients that have their own systems to collect and then store data may use heterogeneous data representations. We aim to fill this gap. To this end, we propose a general framework, coined FLIC, that maps clients' data onto a common feature space via local embedding functions. The common feature space is learnt in a federated manner using Wasserstein barycenters, while the local embedding functions are trained on each client via distribution alignment. We integrate this distribution-alignment mechanism into a federated learning approach and provide the algorithmics of FLIC. We compare its performance against FL benchmarks involving heterogeneous input feature spaces. In addition, we provide theoretical insights supporting the relevance of our methodology.
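
The sketch below is a minimal illustration (not the authors' implementation) of the workflow the abstract describes: each client keeps its own embedding network that maps its heterogeneous feature space into a shared latent space, a classifier head on that space is averaged across clients FedAvg-style, and local training adds a distribution-alignment penalty pulling each client's latent distribution toward a common anchor. The diagonal-Gaussian anchor and 2-Wasserstein surrogate used here, as well as all names (`ClientEmbedder`, `gaussian_w2`, the loss weight), are illustrative assumptions rather than FLIC's actual algorithm.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
LATENT_DIM, NUM_CLASSES, ROUNDS, LOCAL_STEPS = 8, 3, 5, 20

class ClientEmbedder(nn.Module):
    """Client-specific map from its own feature space (in_dim varies) to the common latent space."""
    def __init__(self, in_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, LATENT_DIM))
    def forward(self, x):
        return self.net(x)

def gaussian_w2(mu_a, std_a, mu_b, std_b):
    """Squared 2-Wasserstein distance between diagonal Gaussians (alignment surrogate)."""
    return ((mu_a - mu_b) ** 2).sum() + ((std_a - std_b) ** 2).sum()

# Toy setup: three clients with heterogeneous feature spaces (different input dimensions).
clients = []
for d in (5, 12, 20):
    clients.append({"x": torch.randn(64, d),
                    "y": torch.randint(0, NUM_CLASSES, (64,)),
                    "embedder": ClientEmbedder(d)})

shared_head = nn.Linear(LATENT_DIM, NUM_CLASSES)   # federated (averaged) part
anchor_mu, anchor_std = torch.zeros(LATENT_DIM), torch.ones(LATENT_DIM)  # common latent distribution

for rnd in range(ROUNDS):
    head_states, stats = [], []
    for c in clients:
        head = copy.deepcopy(shared_head)          # start local training from the global head
        opt = torch.optim.SGD(list(c["embedder"].parameters()) + list(head.parameters()), lr=0.05)
        for _ in range(LOCAL_STEPS):
            z = c["embedder"](c["x"])              # embed client data into the common space
            loss = F.cross_entropy(head(z), c["y"])
            # Distribution alignment: pull this client's latent distribution toward the anchor.
            loss = loss + 0.1 * gaussian_w2(z.mean(0), z.std(0), anchor_mu, anchor_std)
            opt.zero_grad(); loss.backward(); opt.step()
        head_states.append(head.state_dict())
        with torch.no_grad():
            z = c["embedder"](c["x"])
            stats.append((z.mean(0), z.std(0)))
    with torch.no_grad():
        # Server: average the shared head (FedAvg) ...
        avg = {k: torch.stack([s[k] for s in head_states]).mean(0) for k in head_states[0]}
        shared_head.load_state_dict(avg)
        # ... and update the anchor as the barycenter of the clients' diagonal-Gaussian latent
        # distributions (mean of means, mean of standard deviations).
        anchor_mu = torch.stack([m for m, _ in stats]).mean(0)
        anchor_std = torch.stack([s for _, s in stats]).mean(0)
    print(f"round {rnd}: last local loss {loss.item():.3f}")
```

Only the head is communicated and averaged; the embedders stay personalised on each client, which is what makes heterogeneous input schemas workable in this sketch.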
DOI: 10.48550/arxiv.2301.11447