Improving Human Activity Recognition Integrating LSTM with Different Data Sources: Features, Object Detection and Skeleton Tracking
Published in: IEEE Access, 2022, Vol. 10, p. 1-1
Main Authors: , ,
Format: Article
Language: English
Subjects:
Online Access: Full text
Abstract: Over the past few years, computer vision has advanced greatly. Deep neural networks, together with growing computing capabilities, have made it possible to solve problems of great interest to society. In this work, we focus on one such problem: the recognition of actions in live video. Although the problem has been framed in different ways in the literature, we concentrate on indoor residential environments, such as a house or a nursing home, where our system can recognize the actions a person or group of people is carrying out. Two common approaches to this problem are 3D convolutional networks and recurrent networks. We propose a model that combines several recurrent networks with data produced by three techniques: image feature extraction, object detection, and people's skeleton tracking. Integrating these three sources improves the detection of certain actions by exploiting the strengths of each method. In extensive experiments across several datasets, the proposed model improves action classification with respect to existing models.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3186465
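
The abstract describes fusing several recurrent networks, each fed by a different per-frame data source (image features, object detections, and people's skeletons). The paper's actual layer sizes, fusion strategy, and class count are not given in this record, so the following is only a minimal PyTorch sketch of that kind of multi-stream LSTM classifier; all dimensions, the concatenation-based late fusion, and the `MultiStreamLSTM` name are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class MultiStreamLSTM(nn.Module):
    """One LSTM branch per input stream (CNN image features, object
    detections, skeleton joints), fused by concatenation for action
    classification. All sizes are illustrative assumptions."""

    def __init__(self, feat_dim=2048, obj_dim=80, skel_dim=50,
                 hidden=256, num_classes=10):
        super().__init__()
        self.feat_lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.obj_lstm = nn.LSTM(obj_dim, hidden, batch_first=True)
        self.skel_lstm = nn.LSTM(skel_dim, hidden, batch_first=True)
        self.classifier = nn.Linear(3 * hidden, num_classes)

    def forward(self, feats, objs, skels):
        # Each input: (batch, time, dim); keep the last hidden state per stream.
        _, (h_f, _) = self.feat_lstm(feats)
        _, (h_o, _) = self.obj_lstm(objs)
        _, (h_s, _) = self.skel_lstm(skels)
        fused = torch.cat([h_f[-1], h_o[-1], h_s[-1]], dim=1)
        return self.classifier(fused)

# Example: 4 clips of 16 frames each, with per-frame descriptors.
model = MultiStreamLSTM()
feats = torch.randn(4, 16, 2048)    # CNN image features per frame
objs = torch.randn(4, 16, 80)       # object-detection descriptors per frame
skels = torch.randn(4, 16, 50)      # flattened skeleton joint coordinates
logits = model(feats, objs, skels)  # shape: (4, num_classes)
```

In practice the per-frame descriptors would come from a CNN backbone, an object detector, and a pose estimator; here random tensors merely stand in to show the expected shapes.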