WiFlowCount: Device-Free People Flow Counting by Exploiting Doppler Effect in Commodity WiFi

Bibliographic Details
Published in: IEEE Systems Journal, 2020-12, Vol. 14 (4), pp. 4919-4930
Authors: Zhou, Rui; Gong, Ziyuan; Lu, Xiang; Fu, Yang
Format: Article
Language: English
Description
Abstract: People flow counting is the task of counting the number of people passing through a passage or gate. Conventional vision-based approaches require line-of-sight (LoS) and raise privacy concerns, while most radio-based approaches require dedicated equipment and incur high cost. In this article, we propose to exploit commodity WiFi to count the number of people in continuous flows in a device-free way, requiring only one pair of WiFi transmitter and receiver. Leveraging the Doppler effect induced by people passing by, the proposed method, named WiFlowCount, first constructs a spectrogram of Doppler shifts from channel state information. Based on this spectrogram, WiFlowCount detects people flows and subflows according to the power distribution in the spectrogram. An optimal rotation and segmentation algorithm is proposed to segment the spectrogram of a continuous flow into the subspectrograms of its subflows. The number of people in each subflow is estimated from its subspectrogram via convolutional neural networks, and the per-subflow estimates are summed to obtain the total people count of the continuous flow. Experimental evaluations demonstrate the effectiveness of WiFlowCount: it detects people flows and subflows accurately and counts the number of people in continuous flows with high accuracy, outperforming existing work.
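As a rough illustration of the spectrogram-construction step described in the abstract, the sketch below builds a Doppler spectrogram from a CSI amplitude stream via a short-time Fourier transform. It is not the authors' implementation: the CSI packet rate, the Doppler band, and the synthetic signal model are assumptions made purely for illustration.

import numpy as np
from scipy.signal import spectrogram

fs = 1000                                  # assumed CSI packet rate (samples/s)
t = np.arange(0, 4, 1 / fs)                # 4 seconds of samples

# Hypothetical Doppler-shift profile (Hz) induced by a person walking past the link.
doppler_hz = 30 + 10 * np.sin(2 * np.pi * 0.5 * t)

# Synthetic CSI amplitude: a static path plus a component modulated at the Doppler frequency.
csi_amp = 1.0 + 0.3 * np.cos(2 * np.pi * np.cumsum(doppler_hz) / fs)

# Remove the static (zero-Doppler) component before the time-frequency analysis.
csi_amp = csi_amp - csi_amp.mean()

# Short-time Fourier transform -> power distribution over time and frequency,
# i.e. the kind of Doppler spectrogram that is then segmented and fed to a CNN.
freqs, frames, Sxx = spectrogram(csi_amp, fs=fs, nperseg=256, noverlap=192)

# Keep only the low-frequency band where human-induced Doppler shifts are assumed to fall.
band = freqs <= 80
print("spectrogram shape (freq bins x time frames):", Sxx[band].shape)

In the method described above, the subsequent steps (flow and subflow detection, optimal rotation and segmentation, and CNN-based counting) all operate on such a time-frequency power map.
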
ISSN: 1932-8184, 1937-9234
DOI: 10.1109/JSYST.2019.2961735