Weighted Low-Rank Decomposition for Robust Grayscale-Thermal Foreground Detection
Published in: IEEE Transactions on Circuits and Systems for Video Technology, April 2017, Vol. 27, No. 4, pp. 725-738
Main Authors:
Format: Article
Language: English
Abstract: This paper investigates how to fuse grayscale and thermal video data for detecting foreground objects in challenging scenarios. To this end, we propose an intuitive yet effective method called weighted low-rank decomposition (WELD), which adaptively pursues a cross-modality low-rank representation. Specifically, we form two data matrices by accumulating sequential frames from the grayscale and thermal videos, respectively. Within these two observation matrices, WELD detects moving foreground pixels as sparse outliers against the low-rank background structure and incorporates weight variables to make the models of the two modalities complementary to each other. Smoothness constraints on object motion are also introduced in WELD to further improve robustness to noise. For optimization, we propose an iterative algorithm that efficiently solves the low-rank models via three subproblems. Moreover, we employ an edge-preserving filtering-based method to substantially speed up WELD while preserving its accuracy. To provide a comprehensive evaluation benchmark for grayscale-thermal foreground detection, we create a new data set of 25 aligned grayscale-thermal video pairs with high diversity. Extensive experiments on both the newly created data set and the public OSU3 data set suggest that WELD achieves superior performance and comparable efficiency compared with other state-of-the-art approaches.
ISSN: 1051-8215, 1558-2205
DOI: 10.1109/TCSVT.2016.2556586
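
The abstract above describes a weighted low-rank plus sparse decomposition: sequential frames from each modality are stacked into an observation matrix, the background is modeled as a low-rank component, and moving foreground pixels are recovered as sparse outliers whose penalty is modulated by per-pixel weights. Below is a minimal NumPy sketch of that generic idea, not the paper's actual WELD optimizer: the function names (soft_threshold, svd_shrink, weighted_lowrank_sparse), the parameters lam, mu, and n_iters, and the fixed weight matrix are illustrative assumptions; WELD additionally learns the cross-modality weights jointly and imposes motion-smoothness constraints, which this sketch omits.

```python
import numpy as np

def soft_threshold(x, tau):
    # Element-wise shrinkage operator: proximal operator of the (weighted) L1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def svd_shrink(X, tau):
    # Singular-value soft-thresholding: proximal operator of the nuclear norm,
    # which yields a low-rank estimate of the background.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * soft_threshold(s, tau)) @ Vt

def weighted_lowrank_sparse(D, w, lam=0.05, mu=1.0, n_iters=50):
    """Decompose D (pixels x frames) into a low-rank background B and a sparse
    foreground F, with per-pixel weights w scaling the sparsity penalty.
    A generic alternating-minimization sketch, not the WELD algorithm."""
    B = np.zeros_like(D)
    F = np.zeros_like(D)
    for _ in range(n_iters):
        # Background step: low-rank approximation of the foreground-subtracted data.
        B = svd_shrink(D - F, mu)
        # Foreground step: weighted shrinkage of the background-subtracted data;
        # a larger weight suppresses foreground at that pixel (likely background).
        F = soft_threshold(D - B, lam * w)
    return B, F

# Toy usage on synthetic "grayscale" and "thermal" observation matrices
# (each row is a pixel, each column a frame); real inputs would come from
# flattening aligned video frames of the two modalities.
rng = np.random.default_rng(0)
D_gray = rng.normal(size=(256, 30))
D_thermal = rng.normal(size=(256, 30))
w = np.ones((256, 1))  # uniform placeholder weights; WELD adapts these across modalities
B_gray, F_gray = weighted_lowrank_sparse(D_gray, w)
B_thermal, F_thermal = weighted_lowrank_sparse(D_thermal, w)
```

In this sketch the nonzero entries of F play the role of the sparse foreground outliers, and thresholding against lam * w is where per-pixel weighting enters; in the paper the weights serve to couple the grayscale and thermal models so the modalities complement each other.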