Heterogeneous client federated learning method based on channel distillation and decoupling knowledge distillation


Detailed description

Bibliographic details
Main authors: WEN XINTONG, QIAO SIHAI, AN MING, CHEN RONG
Format: Patent
Language: Chinese; English
Description
Abstract: The invention provides a heterogeneous client federated learning method based on channel distillation and decoupling knowledge distillation, comprising the following steps: a server is provided with a global model and an RC model selection strategy; the server is connected to K local clients, at least two of which have different structures; the global model, all local models, and the RC model are iterated through federated learning that uses channel distillation, decoupling knowledge distillation, and an alternating chairman system; when the iteration termination condition is met, all local models have completed mutual learning. Privacy protection of private data is achieved through channel distillation, and knowledge transfer is achieved through channel distillation and decoupling knowledge distillation without depending on the model structure, enabling federated learning between heterogeneous local models and improving the performance of the client group.
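
The abstract names channel distillation and decoupling knowledge distillation as the knowledge-transfer mechanisms but gives no formulas, so the sketch below shows one plausible reading of those two loss terms in PyTorch. The function names, the temperature T, and the weights alpha and beta are illustrative assumptions, not definitions taken from the patent.

import torch
import torch.nn.functional as F


def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    # Decoupled knowledge distillation: split the soft-label KL term into a
    # target-class part (TCKD) and a non-target-class part (NCKD).
    num_classes = student_logits.size(1)
    gt_mask = F.one_hot(target, num_classes=num_classes).bool()

    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # TCKD: KL between the binary (target class vs. rest) distributions.
    bin_s = torch.stack([p_s[gt_mask], 1.0 - p_s[gt_mask]], dim=1)
    bin_t = torch.stack([p_t[gt_mask], 1.0 - p_t[gt_mask]], dim=1)
    tckd = F.kl_div(bin_s.log(), bin_t, reduction="batchmean") * (T ** 2)

    # NCKD: KL over the non-target classes only (target logit masked out).
    mask = gt_mask.float() * 1000.0
    log_s = F.log_softmax(student_logits / T - mask, dim=1)
    q_t = F.softmax(teacher_logits / T - mask, dim=1)
    nckd = F.kl_div(log_s, q_t, reduction="batchmean") * (T ** 2)

    return alpha * tckd + beta * nckd


def channel_distillation_loss(student_feat, teacher_feat):
    # Channel distillation: match the per-channel spatial distribution of an
    # intermediate feature map, so only the channel count has to agree between
    # the two otherwise heterogeneous models.
    b, c = student_feat.shape[:2]
    log_s = F.log_softmax(student_feat.reshape(b, c, -1), dim=2)
    p_t = F.softmax(teacher_feat.reshape(b, c, -1), dim=2)
    return F.kl_div(log_s, p_t, reduction="batchmean")

Under this reading, each client would add dkd_loss on shared logits and channel_distillation_loss on a shared intermediate layer to its local objective, which transfers knowledge between structurally different models without exchanging private data. The RC model selection strategy and the alternating chairman system mentioned in the abstract are not specified in this record and are therefore not sketched here.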