Motion segmentation-based surveillance video compression using adaptive particle swarm optimization
Published in: Neural Computing & Applications, 2020-08, Vol. 32 (15), pp. 11443-11457
Main authors: ,
Format: Article
Language: English
Online access: Full text
Abstract: Video surveillance is one of the most widely used and most active research applications of computer vision. Although much work has been done in the area of smart surveillance, there is still a need for an effective compression technique for compact archival and efficient transmission of the vast amount of surveillance video data. In this work, we propose a hybrid video compression approach based on foreground motion compensation for the above application. This method works effectively by combining the advantages of both block-based and object-based coding techniques while reducing the drawbacks of each. The proposed method first segments the foreground moving objects from the background using adaptive thresholding-based optical flow techniques. Next, it determines the contour of the segmented foreground regions using Freeman chain code. Subsequently, block-based motion estimation and compensation are computed using variants of particle swarm optimization. After that, motion-failure areas are detected using a change detection method, and finally, DCT and Huffman coding-based entropy encoding are applied to compactly represent the data. Experimental results and analyses on different surveillance video sequences using Wilcoxon's rank-sum test, PSNR and SSIM show that our method outperforms other recent and relevant existing techniques.
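To illustrate the block-based motion estimation step described in the abstract, the sketch below uses a basic global-best particle swarm optimization to search for a single block's motion vector by minimizing the sum of absolute differences (SAD). The window size, inertia and acceleration coefficients, and the SAD cost are generic assumptions for illustration, not the paper's adaptive PSO variant:

```python
import numpy as np

def sad(cur_block, ref, top, left):
    """Sum of absolute differences between the current block and the
    same-sized candidate block at (top, left) in the reference frame."""
    h, w = cur_block.shape
    cand = ref[top:top + h, left:left + w]
    return np.abs(cur_block.astype(int) - cand.astype(int)).sum()

def pso_motion_vector(cur_block, ref, top, left, search=4,
                      n_particles=20, iters=30, seed=0):
    """Estimate the (dy, dx) motion vector of one block with a basic
    global-best PSO over a +/-search window (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    h, w = cur_block.shape
    # Clamp the search window so every candidate block stays in the frame.
    lo = np.array([max(-search, -top), max(-search, -left)])
    hi = np.array([min(search, ref.shape[0] - h - top),
                   min(search, ref.shape[1] - w - left)])
    pos = rng.uniform(lo, hi, size=(n_particles, 2))
    pos[0] = 0.0                      # seed one particle at the zero vector
    vel = np.zeros_like(pos)

    def cost(p):                      # SAD at the rounded displacement
        dy, dx = np.rint(p).astype(int)
        return sad(cur_block, ref, top + dy, left + dx)

    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved] = pos[improved]
        pbest_cost[improved] = costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    dy, dx = np.rint(gbest).astype(int)
    return int(dy), int(dx)
```

The usual motivation for swarm-based block matching, as here, is that the swarm evaluates far fewer candidate displacements than an exhaustive full search over the whole window while still converging toward a low-SAD motion vector.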
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-019-04635-6