Wi-MoID: Human and Nonhuman Motion Discrimination Using WiFi With Edge Computing


Bibliographic Details
Published in: IEEE Internet of Things Journal, 2024-04, Vol. 11 (8), pp. 13900-13912
Authors: Zhu, Guozhen; Hu, Yuqian; Wang, Beibei; Wu, Chenshu; Zeng, Xiaolu; Liu, K. J. Ray
Format: Article
Language: English
Abstract: Indoor intelligent perception systems have gained significant attention in recent years. However, accurately detecting human presence can be challenging in the presence of nonhuman subjects, such as pets, robots, and electrical appliances, limiting the practicality of these systems for widespread use. In this article, we propose a novel system ("Wi-MoID") that passively and unobtrusively distinguishes moving human and various nonhuman subjects using a single pair of commodity WiFi transceivers, without requiring any device on the subjects or restricting their movements. Wi-MoID leverages a novel statistical electromagnetic wave theory-based multipath model to detect moving subjects, extracts physically and statistically explainable features of their motion, and accurately differentiates human and various nonhuman movements through walls, even in complex environments. In addition, Wi-MoID is suitable for edge devices, requiring minimal computing resources and storage, and is environment independent, making it easy to deploy in new environments with minimal effort. We evaluate the performance of Wi-MoID in five distinct buildings with various moving subjects, including pets, vacuum robots, humans, and fans. The results show that it achieves 97.34% accuracy and a 1.75% false alarm rate in identifying human versus nonhuman motion, and 95.98% accuracy in unseen environments without model tuning, demonstrating its robustness for ubiquitous use.
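The abstract's first pipeline stage — using a statistical multipath model to tell motion-induced channel variation from noise — can be illustrated with a minimal sketch. The lag-1 autocorrelation of per-subcarrier CSI power is a common motion indicator in WiFi sensing; it is used here as a hypothetical stand-in for the paper's full electromagnetic-wave model, and the array shapes, function names, and threshold are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def motion_statistic(csi_power, lag=1):
    """Mean lag-1 autocorrelation of CSI power across subcarriers.

    csi_power: array of shape (T, S) -- T time samples, S subcarriers.
    Motion produces temporally correlated channel variation (statistic
    near 1); a static scene is dominated by uncorrelated receiver noise
    (statistic near 0).
    """
    x = csi_power - csi_power.mean(axis=0, keepdims=True)
    num = (x[:-lag] * x[lag:]).sum(axis=0)          # lagged covariance per subcarrier
    den = (x ** 2).sum(axis=0) + 1e-12              # variance per subcarrier
    return float(np.mean(num / den))

def detect_motion(csi_power, threshold=0.5):
    """Binary motion decision; the threshold is an illustrative choice."""
    return motion_statistic(csi_power) > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Static scene: i.i.d. noise -> statistic near 0.
    static = rng.normal(0.0, 1.0, (500, 30))
    # Moving subject: slow sinusoidal fading plus noise -> statistic near 1.
    t = np.linspace(0.0, 10.0, 500)[:, None]
    phases = rng.uniform(0.0, 2 * np.pi, (1, 30))
    moving = np.sin(2 * np.pi * 1.5 * t + phases) \
        + 0.1 * rng.normal(0.0, 1.0, (500, 30))
    print(motion_statistic(static), motion_statistic(moving))
```

In the full system this detection step would feed the feature-extraction and human/nonhuman classification stages that the abstract describes; those stages are not sketched here.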
ISSN: 2327-4662
DOI: 10.1109/JIOT.2023.3339544