Segmentation and Recognition of Basic and Transitional Activities for Continuous Physical Human Activity
Published in: IEEE Access, 2019, Vol. 7, pp. 42565-42576
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Full text
Abstract: Human activity recognition (HAR) is an active research topic that aims to understand human behavior and supports a wide range of applications. However, transitions between activities are usually disregarded because of their low incidence and short duration compared with other activities; in fact, transitions can degrade the performance of a recognition system if not handled properly. In this paper, we propose and implement a systematic human activity recognition method that recognizes basic activities (BA) and transitional activities (TA) in a continuous sensor data stream. First, raw sensor data are segmented into fragments with a sliding window, and features are constructed from each window. Second, cluster analysis with K-Means aggregates the activity fragments into periods. Third, BA and TA are coarsely separated according to the shortest duration of a BA, and the hidden-BA phenomenon is then handled. Fourth, the fragments between adjacent BA are evaluated to decide whether they represent TA or a disturbance process. Finally, a random forest classifier is used to accurately recognize BA and TA. The proposed method is evaluated on the public SBHAR dataset. The results demonstrate that our method effectively recognizes the different activities and delivers high accuracy with all activities considered.
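The abstract outlines a five-step pipeline: sliding-window segmentation, K-Means aggregation of fragments into periods, duration-based BA/TA separation, disturbance filtering, and random forest recognition. The sketch below illustrates that flow in Python on synthetic data; it is not the authors' implementation, and the window length, stride, cluster count, minimum-BA-duration threshold, and the synthetic stream are all assumed placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a continuous 3-axis accelerometer stream (SBHAR-like).
rng = np.random.default_rng(0)
stream = rng.normal(size=(10_000, 3))      # samples x axes
labels = rng.integers(0, 3, size=10_000)   # per-sample activity labels (placeholder)

WIN, STEP = 128, 64                        # window length and stride (assumed values)

def segment(data, y, win=WIN, step=STEP):
    """Slide a fixed window over the stream; one feature vector per window."""
    feats, ys = [], []
    for start in range(0, len(data) - win + 1, step):
        w = data[start:start + win]
        # Simple per-axis statistics as window features (illustrative choice).
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0)]))
        # Majority label within the window stands in for the window label.
        ys.append(np.bincount(y[start:start + win]).argmax())
    return np.array(feats), np.array(ys)

X, y = segment(stream, labels)

# Step 2: K-Means groups window fragments so that runs of the same cluster
# can be merged into candidate activity periods.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

def merge_periods(cluster_ids):
    """Collapse consecutive windows with the same cluster id into periods."""
    periods, start = [], 0
    for i in range(1, len(cluster_ids) + 1):
        if i == len(cluster_ids) or cluster_ids[i] != cluster_ids[start]:
            periods.append((start, i, cluster_ids[start]))  # [start, end), id
            start = i
    return periods

periods = merge_periods(clusters)

# Step 3: a period shorter than the shortest plausible BA duration becomes a
# TA/disturbance candidate; the threshold here is an assumed placeholder.
MIN_BA_WINDOWS = 5
ba_periods = [p for p in periods if p[1] - p[0] >= MIN_BA_WINDOWS]
ta_candidates = [p for p in periods if p[1] - p[0] < MIN_BA_WINDOWS]

# Final step: a random forest recognizes the activity of each window.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
print(f"{len(ba_periods)} BA periods, {len(ta_candidates)} TA/disturbance candidates")
```

In practice the windows would come from the SBHAR accelerometer/gyroscope recordings rather than random noise, and the duration threshold would be derived from the shortest basic activity observed in the training data, as the abstract describes.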
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2905575