A dataset of ambient sensors in a meeting room for activity recognition

Bibliographic Details
Published in: Scientific Data, 2024-05, Vol. 11 (1), p. 516-18, Article 516
Main Authors: Kim, Hyunju; Kim, Geon; Lee, Taehoon; Kim, Kisoo; Lee, Dongman
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: As IoT technology advances, using machine learning to detect user activities emerges as a promising strategy for delivering a variety of smart services. It is essential to have access to high-quality data that also respects privacy concerns, and data streams from ambient sensors in the surrounding environment meet this requirement. However, despite growing research interest, there is a noticeable lack of datasets from ambient sensors designed for public spaces, as opposed to those for private settings. To bridge this gap, we design the DOO-RE dataset within an actual meeting room environment, equipped with three types of ambient sensors: those triggered by actuators, users, and the environment itself. This dataset is compiled from the activities of over twenty students over a period of four months. DOO-RE provides reliable and purpose-oriented activity data in a public setting, with activity labels verified by multiple annotators through a process of cross-validation to guarantee data integrity. DOO-RE categorizes nine different types of activities and facilitates the study of both single and group activities. We are optimistic that DOO-RE will play a significant role in advancing human activity recognition technologies, enhancing smart automation systems, and enabling the rapid setup of smart spaces through ambient sensors.
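
The record describes the dataset only at a high level. As an illustration of how such annotator-labeled ambient-sensor streams might be consumed, the following is a minimal Python sketch that groups sensor events into labeled activity segments. The event fields (timestamp, sensor_id, sensor_type, value) and the (start, end, activity) label format are assumptions made for this example; the actual DOO-RE schema is defined by the dataset's own documentation and the published article.

    # Hypothetical sketch: grouping ambient-sensor events into annotated
    # activity segments. Field names and label format are assumptions for
    # illustration, not the published DOO-RE schema.
    from collections import defaultdict
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class SensorEvent:
        timestamp: float   # event time in seconds (assumed representation)
        sensor_id: str     # identifier of the ambient sensor
        sensor_type: str   # "actuator", "user", or "environment" (the three types named in the abstract)
        value: float       # sensor reading or trigger state

    def segment_by_activity(events: List[SensorEvent],
                            labels: List[Tuple[float, float, str]]):
        """Assign each event to the annotated activity interval it falls in.
        `labels` holds (start, end, activity_name) tuples, standing in for the
        annotator-verified activity labels described in the abstract."""
        segments = defaultdict(list)
        for ev in events:
            for start, end, name in labels:
                if start <= ev.timestamp < end:
                    segments[name].append(ev)
                    break
        return segments

A recognition model could then derive per-segment features, for example per-sensor event counts within each activity window; this is only one plausible preprocessing step, not the authors' own pipeline.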
ISSN: 2052-4463
DOI: 10.1038/s41597-024-03344-7