Recognizing Multiple Human Activities Using Deep Learning Framework

Detailed Description

Bibliographic Details
Published in: Revue d'Intelligence Artificielle, 2022-10, Vol. 36 (5), p. 791-799
Main Authors: Janardhanan, Jitha; Subbian, Umamaheswari
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: Multiple human activity detection is a growing trend in smart surveillance that poses several difficulties, including real-time analysis of large volumes of video data while keeping processing complexity low. Existing systems based on EfficientNET (scalable and efficient object detection) and the Deep Skip Connection Gated Recurrent Unit (DS-GRU) apply a compound scaling method that uniformly scales the resolution, depth, and width of the backbone, feature network, and box/class prediction networks at the same time. That approach performs only compound scaling and lets the scaled model detect objects in the COCO 2017 image dataset; it carries out object recognition alone and does not analyze multiple human activities or their class interactions. To overcome this limitation, the proposed system presents an Enhanced Bidirectional Gated Recurrent Unit with Long Short-Term Memory (BGRU-LSTM) classification algorithm adapted to the Human Activity Recognition (HAR) task. The proposed multiple-human-activity approach with a pose estimation technique achieves higher accuracy by combining an EfficientNET feature extraction model with the classifier. The experimental results were evaluated on a real-time surveillance video dataset captured in a home apartment.
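The abstract does not give the layer sizes or training details of the BGRU-LSTM classifier, but the overall architecture it names can be sketched. The following is a minimal, hedged numpy sketch of the idea only: a bidirectional GRU run over per-frame feature vectors (stand-ins for EfficientNET embeddings), whose concatenated states feed an LSTM and a softmax activity classifier. All dimensions, weights, and names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Standard GRU cell (illustrative random weights, no training)."""
    def __init__(self, d_in, d_h):
        s = 0.1
        self.Wz = rng.normal(0, s, (d_h, d_in + d_h)); self.bz = np.zeros(d_h)
        self.Wr = rng.normal(0, s, (d_h, d_in + d_h)); self.br = np.zeros(d_h)
        self.Wh = rng.normal(0, s, (d_h, d_in + d_h)); self.bh = np.zeros(d_h)

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh + self.bz)   # update gate
        r = sigmoid(self.Wr @ xh + self.br)   # reset gate
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]) + self.bh)
        return (1 - z) * h + z * h_tilde

class LSTMCell:
    """Standard LSTM cell with fused gate weights."""
    def __init__(self, d_in, d_h):
        self.W = rng.normal(0, 0.1, (4 * d_h, d_in + d_h))
        self.b = np.zeros(4 * d_h)
        self.d_h = d_h

    def step(self, x, h, c):
        g = self.W @ np.concatenate([x, h]) + self.b
        i, f, o = (sigmoid(g[k * self.d_h:(k + 1) * self.d_h]) for k in range(3))
        c = f * c + i * np.tanh(g[3 * self.d_h:])
        return o * np.tanh(c), c

def bgru_lstm_classify(frames, d_h, n_classes):
    """frames: (T, d_in) per-frame feature vectors -> class probabilities."""
    d_in = frames.shape[1]
    fwd, bwd = GRUCell(d_in, d_h), GRUCell(d_in, d_h)
    # Forward GRU pass over the frame sequence.
    h = np.zeros(d_h); hs_f = []
    for x in frames:
        h = fwd.step(x, h); hs_f.append(h)
    # Backward GRU pass, then re-align to forward time order.
    h = np.zeros(d_h); hs_b = []
    for x in frames[::-1]:
        h = bwd.step(x, h); hs_b.append(h)
    hs_b = hs_b[::-1]
    # LSTM over the concatenated bidirectional states.
    lstm = LSTMCell(2 * d_h, d_h)
    h, c = np.zeros(d_h), np.zeros(d_h)
    for hf, hb in zip(hs_f, hs_b):
        h, c = lstm.step(np.concatenate([hf, hb]), h, c)
    # Softmax classifier over the final LSTM state.
    logits = rng.normal(0, 0.1, (n_classes, d_h)) @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()

# 16 frames of 32-dim features (stand-in for EfficientNET embeddings), 5 activities.
probs = bgru_lstm_classify(rng.normal(size=(16, 32)), d_h=24, n_classes=5)
print(probs.shape)
```

With trained weights, `probs` would be the predicted distribution over activity classes for the clip; here the untrained sketch only demonstrates the data flow from frame features through the bidirectional GRU and LSTM to the classifier.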
ISSN: 0992-499X, 1958-5748
DOI: 10.18280/ria.360518