Novel features for intensive human activity recognition based on wearable and smartphone sensors
Saved in:
Published in: | Microsystem technologies : sensors, actuators, systems integration, 2020-06, Vol.26 (6), p.1889-1903 |
Main authors: | , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | In the modern era, human activity recognition (HAR) has been of great help in health monitoring and rehabilitation. Existing works mostly use one or more specific devices (with embedded sensors), including smartphones, for activity recognition, and most of the time the detected activities are coarse-grained, like "sit" or "walk", rather than detailed and intensive, like "sit carrying weight" or "walk carrying weight". However, the intensity of activities reflects valuable insight about a person's health and, more importantly, the physical exertion required to perform them. Consequently, in this paper, we propose an intense-activity recognition framework that combines features from the smartphone accelerometer (available in almost every smartphone) with features from a wearable heart-rate sensor. We introduce a set of novel heart-rate features that take into consideration finer variations of heart rate relative to an individual's resting heart rate. The proposed framework forms an ensemble model based on different classifiers to address the challenge of usage behavior, i.e., how the smartphone is carried. The stacked-generalization-based ensemble model predicts the intensity of activity. We have implemented the framework and tested it on a real dataset collected from four users. We have observed that our work identifies both static and dynamic intense activities with 96% accuracy, outperforming state-of-the-art techniques. |
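The abstract describes heart-rate features defined relative to an individual's resting heart rate. A minimal sketch of what such per-window features might look like is given below; the function name, the specific features, and the 10-bpm threshold are illustrative assumptions, not the paper's actual definitions.

```python
# Hypothetical sketch of resting-heart-rate-relative features, as the
# abstract describes features capturing finer heart-rate variation
# compared to an individual's resting heart rate.  All names and the
# 10-bpm threshold are assumptions for illustration only.

def heartrate_features(window, resting_hr):
    """Compute per-window heart-rate features relative to resting HR.

    window: list of heart-rate samples (bpm) from one activity window
    resting_hr: the individual's resting heart rate (bpm)
    """
    elevation = [hr - resting_hr for hr in window]
    n = len(elevation)
    return {
        # average rise above the resting baseline in this window
        "mean_elevation": sum(elevation) / n,
        # peak rise above the baseline
        "max_elevation": max(elevation),
        # fraction of the window spent markedly above resting HR
        "frac_elevated": sum(e > 10 for e in elevation) / n,
    }

# Example: samples during an intense activity vs. a resting HR of 65 bpm
feats = heartrate_features([80, 85, 90, 88, 82], resting_hr=65)
```

Features like these could then be concatenated with accelerometer features and fed to the base classifiers of a stacked ensemble, whose meta-classifier predicts the activity intensity.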
ISSN: | 0946-7076, 1432-1858 |
DOI: | 10.1007/s00542-019-04738-z |