Robust fall detection in video surveillance based on weakly supervised learning
Saved in:
Published in: | Neural Networks 2023-06, Vol. 163, p. 286-297 |
Main authors: | , , , , , , , , |
Format: | Article |
Language: | English |
Keywords: | |
Online access: | Full text |
Abstract: | Fall event detection has been a research hotspot in recent years in the fields of medicine and health. Vision-based fall detection methods are currently considered the most promising because of their non-contact nature and easy deployment. However, existing vision-based methods mainly use supervised learning for model training and thus require considerable time and effort for data annotation. To address these limitations, this work proposes a detection method based on a weakly supervised dual-modal network. The method adopts a deep multiple instance learning framework to learn fall events from weak labels, so it does not require time-consuming fine-grained annotations. The final detection result for each video is obtained by integrating information from the two streams of the dual-modal network with the proposed dual-modal fusion strategy. Experimental results on two public benchmark datasets and a newly proposed dataset demonstrate the superiority of the method over current state-of-the-art approaches. |
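The deep multiple instance learning (MIL) formulation in the abstract treats each video as a bag of segments labeled only at the video level. The sketch below shows what such a weakly supervised objective can look like, assuming a hinge-style ranking loss over bag maxima as commonly used in weakly supervised video anomaly detection; the class name `SegmentScorer`, the feature dimension, and the loss function are illustrative assumptions, not the paper's actual architecture.

```python
# A minimal sketch of deep MIL with weak, video-level labels. All names
# (SegmentScorer, mil_ranking_loss) and dimensions are assumptions made
# for illustration, not the paper's published design.
import torch
import torch.nn as nn

class SegmentScorer(nn.Module):
    """Maps per-segment features to a fall score in [0, 1]."""
    def __init__(self, feat_dim: int = 2048):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 512), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(512, 1), nn.Sigmoid(),
        )

    def forward(self, segments: torch.Tensor) -> torch.Tensor:
        # segments: (num_segments, feat_dim) -> (num_segments,)
        return self.net(segments).squeeze(-1)

def mil_ranking_loss(pos_scores: torch.Tensor,
                     neg_scores: torch.Tensor,
                     margin: float = 1.0) -> torch.Tensor:
    """Hinge ranking loss on bag maxima: the highest-scoring segment of a
    fall video should outscore every segment of a non-fall video."""
    return torch.clamp(margin - pos_scores.max() + neg_scores.max(), min=0.0)

# Toy usage: one positive (fall) bag and one negative bag of 32 segments.
scorer = SegmentScorer()
pos_bag = torch.randn(32, 2048)   # segments from a video labeled "contains a fall"
neg_bag = torch.randn(32, 2048)   # segments from a video labeled "no fall"
loss = mil_ranking_loss(scorer(pos_bag), scorer(neg_bag))
loss.backward()
```

Under this kind of objective, only video-level labels ("contains a fall" vs. "no fall") are needed; per-segment localization emerges implicitly from the ranking constraint rather than from fine-grained annotation.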
Highlights: |
• The first weakly supervised learning framework for fall detection is proposed.
• A dual-modal network is proposed to achieve accurate and robust fall detection.
• A large, diverse, and complex real-world fall detection dataset is proposed. |
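The highlights and abstract refer to a dual-modal (two-stream) network whose per-stream outputs are combined by a fusion strategy; the record does not detail that strategy, so the following is only a hedged late-fusion baseline to convey the idea. The weight `alpha`, the threshold, and the assumption that the two streams are RGB and optical flow are all illustrative, not the authors' method.

```python
# A hedged illustration of late fusion for a two-stream detector.
# Weighted averaging of per-segment scores is a common baseline; the
# paper's actual dual-modal fusion strategy may differ.
import torch

def fuse_dual_modal(rgb_scores: torch.Tensor,
                    flow_scores: torch.Tensor,
                    alpha: float = 0.5,
                    threshold: float = 0.5) -> tuple[torch.Tensor, bool]:
    """Combine per-segment scores from two streams and flag the video as
    containing a fall if any fused segment score crosses the threshold."""
    fused = alpha * rgb_scores + (1.0 - alpha) * flow_scores
    return fused, bool(fused.max().item() > threshold)

# Toy usage with random scores for 32 segments from each stream.
fused, has_fall = fuse_dual_modal(torch.rand(32), torch.rand(32))
```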
ISSN: | 0893-6080 (print); 1879-2782 (electronic) |
DOI: | 10.1016/j.neunet.2023.03.042 |