CsiGAN: Robust Channel State Information-Based Activity Recognition With GANs

Bibliographic Details
Published in: IEEE Internet of Things Journal, 2019-12, Vol. 6 (6), pp. 10191-10204
Authors: Xiao, Chunjing; Han, Daojun; Ma, Yongsen; Qin, Zhiguang
Format: Article
Language: English
Description
Abstract: As a cornerstone service for many Internet of Things applications, channel state information (CSI)-based activity recognition has received immense attention in recent years. However, the recognition performance of general approaches may decrease significantly when the trained model is applied to a left-out user whose CSI data were not used for training. To overcome this challenge, we propose CsiGAN, a semi-supervised generative adversarial network (GAN) for CSI-based activity recognition. Building on general semi-supervised GANs, we design three components for CsiGAN to address scenarios in which unlabeled data from left-out users are very limited and to enhance recognition performance: 1) we introduce a new complement generator, which uses limited unlabeled data to produce diverse fake samples for training a robust discriminator; 2) for the discriminator, we change the number of probability outputs from k + 1 to 2k + 1 (where k is the number of categories), which helps obtain the correct decision boundary for each category; and 3) based on the introduced generator, we propose a manifold regularization, which stabilizes the learning process. The experiments suggest that CsiGAN attains significant gains over state-of-the-art methods.
ISSN: 2327-4662
DOI: 10.1109/JIOT.2019.2936580
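
The 2k + 1 discriminator output described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the feature extractor, layer sizes, and CSI input shape below are placeholder assumptions; only the idea of reserving k + 1 extra logits for generated samples on top of the k real activity categories comes from the abstract.

import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Semi-supervised GAN discriminator head with 2k + 1 outputs."""

    def __init__(self, num_categories: int, feature_dim: int = 256):
        super().__init__()
        k = num_categories
        # Placeholder feature extractor; a real CSI model would use a CNN
        # over the CSI measurement frames rather than flat linear layers.
        self.features = nn.Sequential(
            nn.Linear(feature_dim, 128),
            nn.LeakyReLU(0.2),
            nn.Linear(128, 128),
            nn.LeakyReLU(0.2),
        )
        # 2k + 1 logits instead of the k + 1 used by standard
        # semi-supervised GANs: k for real categories, k + 1 reserved
        # for generated (fake) samples.
        self.classifier = nn.Linear(128, 2 * k + 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: with k = 6 activities, the head produces 13 logits per sample.
disc = Discriminator(num_categories=6)
logits = disc(torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 13])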