A novel recurrent neural networks architecture for behavior analysis
Published in: International Arab Journal of Information Technology, 2021, Vol. 18 (2), p. 133-139
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: Behavior analysis is an important yet challenging task in computer vision, and the analysis of human behavior is a necessity in several sectors. With the rise in crime, video surveillance is needed to keep belongings safe and to detect events automatically by collecting information that assists security guards. Moreover, surveillance of human behavior has recently been used in the medical field to quickly detect physical and mental health problems in patients. The complexity and variety of human features in video sequences drive researchers to seek an effective representation, which is the most challenging part: it must be invariant to changes in viewpoint, robust to noise, and efficient, with low computation time. In this paper, we propose a new model for human behavior analysis that combines a transfer learning model with a Recurrent Neural Network (RNN). Our model extracts human features from frames using a pre-trained Convolutional Neural Network (CNN), Inception V3. The extracted features are then trained with an RNN using Gated Recurrent Units (GRU). The performance of the proposed architecture is evaluated on three human-action datasets, UCF Sport, UCF101, and KTH, and achieves good classification accuracy.
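The pipeline the abstract describes, per-frame feature extraction with a pre-trained Inception V3 followed by a GRU-based recurrent classifier, can be sketched as follows. This is a minimal illustration in Keras, not the authors' implementation: the sequence length, GRU width, dropout rate, and classifier head are assumptions, since the abstract does not specify them.

```python
# Minimal sketch of the CNN + GRU pipeline from the abstract.
# Hyperparameters (NUM_FRAMES, GRU size, dropout) are illustrative assumptions.
import numpy as np
from tensorflow.keras.applications.inception_v3 import InceptionV3, preprocess_input
from tensorflow.keras import layers, models

NUM_FRAMES = 40        # frames sampled per video (assumption)
NUM_CLASSES = 101      # e.g., the UCF101 action classes
FEATURE_DIM = 2048     # InceptionV3 average-pooled feature size

# Pre-trained Inception V3, frozen, used only as a frame-level feature extractor.
cnn = InceptionV3(weights="imagenet", include_top=False, pooling="avg")
cnn.trainable = False

def extract_features(frames):
    """frames: (num_frames, 299, 299, 3) -> (num_frames, 2048) feature matrix."""
    return cnn.predict(preprocess_input(frames.astype("float32")), verbose=0)

# GRU-based recurrent classifier over the sequence of frame features.
model = models.Sequential([
    layers.Input(shape=(NUM_FRAMES, FEATURE_DIM)),
    layers.GRU(256),                      # hidden size is an assumption
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Usage: features for one clip, batched for training.
clip = np.random.rand(NUM_FRAMES, 299, 299, 3)   # stand-in for decoded frames
x = extract_features(clip)[None, ...]            # shape (1, NUM_FRAMES, 2048)
y = np.array([0])
model.fit(x, y, epochs=1)
```

Freezing the CNN keeps training cheap: only the GRU and the dense head learn, which matches the transfer learning setup the abstract outlines.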
ISSN: 1683-3198
DOI: 10.34028/iajit/18/2/1