Abnormal Event Detection Using Deep Contrastive Learning for Intelligent Video Surveillance System
Published in: IEEE Transactions on Industrial Informatics, 2022-08, Vol. 18 (8), pp. 5171-5179
Main authors: , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: The continuous development of urban and industrial environments has increased the demand for intelligent video surveillance. Deep learning has achieved remarkable performance for anomaly detection in surveillance videos. Previous approaches address anomaly detection with a single pretext task (image reconstruction or prediction) and detect anomalies through large reconstruction errors or poor predictions. However, they cannot fully exploit the discriminative semantics and temporal context information. Moreover, tackling anomaly detection with a single pretext task is suboptimal due to the nonalignment between the pretext task and anomaly detection. In this article, we propose a temporal-aware contrastive network (TAC-Net) to address the abovementioned problems of anomaly detection for intelligent video surveillance. TAC-Net is an unsupervised method that utilizes deep contrastive self-supervised learning to capture high-level semantic features and tackles anomaly detection with multiple self-supervised tasks. During the inference phase, the multiple task losses and contrastive similarity are utilized to calculate the anomaly score. Experimental results show that our method is superior to state-of-the-art approaches on three benchmarks, which demonstrates the validity and advancement of TAC-Net.
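The abstract states that the anomaly score is computed from the multiple self-supervised task losses together with the contrastive similarity, but it does not give the exact fusion rule. The sketch below is only a rough illustration of one plausible scoring scheme under assumptions that are not taken from the paper: the two hypothetical pretext tasks (reconstruction and prediction), the per-sequence min-max normalization, and the equal weighting are all assumed, not the published TAC-Net formulation.

```python
import numpy as np

def anomaly_score(task_losses, contrastive_sim, weights=None):
    """Fuse per-frame pretext-task losses and contrastive similarity
    into an anomaly score in [0, 1] (illustrative sketch, not TAC-Net).

    task_losses     : dict {task_name: np.ndarray of shape (T,)} with the
                      per-frame loss of each self-supervised task.
    contrastive_sim : np.ndarray of shape (T,), similarity of each frame's
                      embedding to its positive view; low similarity is
                      treated as evidence of an anomaly, so it is negated.
    weights         : optional dict of per-term weights (equal if None).
    """
    def minmax(x):
        # Per-sequence min-max normalization so the terms are comparable
        # (an assumption made here for the sketch).
        return (x - x.min()) / (x.max() - x.min() + 1e-8)

    terms = {name: minmax(loss) for name, loss in task_losses.items()}
    terms["contrastive"] = minmax(-contrastive_sim)  # low similarity -> high score

    if weights is None:
        weights = {name: 1.0 / len(terms) for name in terms}

    fused = sum(weights[name] * term for name, term in terms.items())
    return minmax(fused)  # higher value = more anomalous frame


# Toy usage: 100 frames with two hypothetical pretext tasks.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    losses = {
        "reconstruction": rng.random(100),
        "prediction": rng.random(100),
    }
    similarity = rng.random(100)
    print(anomaly_score(losses, similarity)[:5])
```

In such a scheme, frames whose pretext-task losses are high and whose contrastive similarity is low receive the largest scores; the actual weighting and normalization used by the authors may differ.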
ISSN: 1551-3203, 1941-0050
DOI: 10.1109/TII.2021.3122801