Optimizing Multi-User Semantic Communication via Transfer Learning and Knowledge Distillation



Bibliographic Details
Published in: IEEE Communications Letters, 2025-01, Vol. 29 (1), p. 90-94
Main Authors: Nguyen, Loc X.; Kim, Kitae; Lin Tun, Ye; Salman Hassan, Sheikh; Kyaw Tun, Yan; Han, Zhu; Seon Hong, Choong
Format: Article
Language: English
Description
Abstract: Semantic Communication (SemCom), notable for ensuring quality of service by jointly optimizing source and channel coding, effectively extracts data semantics, eliminates redundant information, and mitigates noise effects from the wireless channel. However, most studies overlook multi-user scenarios and resource availability, limiting real-world applications. This letter addresses this gap by focusing on downlink communication from a base station to multiple users with varying computing capacities. Users employ variants of Swin transformer models for source decoding and a simple architecture for channel decoding. We propose a novel training procedure, FRENCA, which incorporates transfer learning and knowledge distillation to improve the performance of low-computing users. Extensive simulations validate the proposed methods.
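
The record gives no implementation details, but as a rough illustration of the knowledge-distillation idea mentioned in the abstract, the sketch below shows how a high-capacity "teacher" decoder could guide a low-compute "student" decoder so that resource-limited users still reconstruct the source well. The function name, model interfaces, loss choices, and weighting are assumptions made for illustration; they are not the paper's actual FRENCA procedure.

```python
# Illustrative sketch only: distilling a large semantic decoder (teacher)
# into a smaller one (student) for a low-computing user. Shapes, losses,
# and the alpha weighting are assumptions, not taken from the letter.
import torch
import torch.nn as nn

def distillation_step(student, teacher, optimizer, rx_symbols, source, alpha=0.5):
    """One training step: reconstruction loss plus teacher-output matching."""
    teacher.eval()
    with torch.no_grad():
        teacher_out = teacher(rx_symbols)          # teacher's reconstruction

    student_out = student(rx_symbols)              # student's reconstruction
    recon_loss = nn.functional.mse_loss(student_out, source)        # task loss
    distill_loss = nn.functional.mse_loss(student_out, teacher_out)  # KD loss
    loss = (1 - alpha) * recon_loss + alpha * distill_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In such a setup, the teacher would be a large Swin-transformer-based decoder trained at full capacity, while the student is a smaller variant deployed at a user with limited compute; the distillation term pulls the student's outputs toward the teacher's without requiring the student to match its size.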
ISSN: 1089-7798 (print); 1558-2558 (electronic)
DOI: 10.1109/LCOMM.2024.3499956