A Novel Approach Based on Time Cluster for Activity Recognition of Daily Living in Smart Homes


Bibliographic Details
Published in: Symmetry (Basel), 2017-10, Vol. 9 (10), p. 212
Main Authors: Liu, Yaqing; Ouyang, Dantong; Liu, Yong; Chen, Rong
Format: Article
Language: English
Online Access: Full text
Description
Abstract: With the ageing of the population, a growing number of elderly people encounter difficulties in their daily lives. To help them live more independently, smart homes are designed to assist elderly residents by recognizing their daily activities. Although various models and algorithms that use temporal and spatial features for activity recognition have been proposed, the rigid representation of these features degrades recognition accuracy. In this paper, a two-stage approach is proposed to recognize the activities of a single resident. First, the approximate duration, start time, and end time are extracted from the activity records as temporal features. Second, the activity records are clustered according to these temporal features. Then, within each cluster, classifiers recognize the daily activities from the spatial features. Finally, the proposed approach is evaluated on two public datasets and compared with a one-dimensional model. The results show that the proposed approach outperforms the one-dimensional model, achieving average accuracies of 80% and 89% on the two datasets, respectively.
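To make the cluster-then-classify structure concrete, the following is a minimal Python sketch of the pipeline the abstract describes. It assumes scikit-learn's KMeans and RandomForestClassifier as stand-ins for the paper's unspecified clustering and classification algorithms, and assumes a feature layout (start hour and duration as temporal features, per-sensor activation counts as spatial features); the function names and parameters are illustrative, not the authors' implementation.

# Sketch of the two-stage approach: cluster records by temporal features,
# then train one spatial-feature classifier per temporal cluster.
# KMeans/RandomForest and the feature layout are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def fit_two_stage(temporal, spatial, labels, n_clusters=4):
    """Stage 1: cluster activity records by temporal features
    (e.g., columns [start_hour, duration]).
    Stage 2: fit a separate classifier on the spatial features
    (e.g., per-sensor activation counts) of each cluster."""
    clusterer = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    cluster_ids = clusterer.fit_predict(temporal)
    classifiers = {}
    for c in np.unique(cluster_ids):
        mask = cluster_ids == c
        clf = RandomForestClassifier(random_state=0)
        clf.fit(spatial[mask], labels[mask])
        classifiers[c] = clf
    return clusterer, classifiers

def predict_two_stage(clusterer, classifiers, temporal, spatial):
    """Route each record to its temporal cluster, then label it with
    that cluster's spatial-feature classifier."""
    cluster_ids = clusterer.predict(temporal)
    preds = np.empty(len(temporal), dtype=object)
    for c, clf in classifiers.items():
        mask = cluster_ids == c
        if mask.any():
            preds[mask] = clf.predict(spatial[mask])
    return preds

At prediction time, a new record is first assigned to a temporal cluster and then labeled by that cluster's classifier, which is what lets the spatial-feature classifiers specialize on temporally similar activities rather than fitting one rigid global model.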
ISSN: 2073-8994
DOI: 10.3390/sym9100212