Beyond Background-Aware Correlation Filters: Adaptive Context Modeling by Hand-Crafted and Deep RGB Features for Visual Tracking
Main authors: | , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Summary: | In recent years, background-aware correlation filters have attracted considerable research interest in visual target tracking. However, these methods cannot adequately model the target appearance because they rely on hand-crafted features. On the other hand, recent deep learning-based visual tracking methods provide competitive performance, but at a high computational cost. In this paper, an adaptive background-aware correlation filter-based tracker is proposed that effectively models the target appearance using either histogram of oriented gradients (HOG) or convolutional neural network (CNN) feature maps. The proposed method exploits a fast 2D non-maximum suppression (NMS) algorithm and a semantic information comparison to detect challenging situations. When the HOG-based response map is unreliable, or the context region has low semantic similarity with prior regions, the proposed method constructs the CNN context model to improve the target region estimation. Furthermore, a rejection option allows the proposed method to update the CNN context model only on valid regions. Comprehensive experimental results demonstrate that the proposed adaptive method clearly outperforms state-of-the-art methods in terms of accuracy and robustness on the OTB-50, OTB-100, TC-128, UAV-123, and VOT-2015 datasets. |
---|---|
DOI: | 10.48550/arxiv.2004.02932 |
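
The abstract above outlines an adaptive switching scheme: a fast 2D NMS check on the HOG response map and a semantic comparison of context regions decide when to fall back to the CNN context model, and a rejection option gates model updates. The Python sketch below illustrates only this high-level decision logic; the helper names (`response_reliability`, `semantic_similarity`, `choose_model`), the thresholds, and the peak-suppression heuristic are assumptions made for illustration, not the authors' implementation.

```python
# Hedged sketch of the adaptive switching logic described in the abstract.
# All function names, thresholds, and feature shapes are illustrative
# assumptions, not the paper's actual algorithm.
import numpy as np

def response_reliability(response, peak_window=5):
    """Rough reliability score: ratio of the global peak to the highest
    remaining value after suppressing a small window around the peak
    (a crude stand-in for the fast 2D non-maximum suppression step)."""
    r = response.copy()
    py, px = np.unravel_index(np.argmax(r), r.shape)
    peak = r[py, px]
    y0, y1 = max(0, py - peak_window), py + peak_window + 1
    x0, x1 = max(0, px - peak_window), px + peak_window + 1
    r[y0:y1, x0:x1] = -np.inf          # suppress the main peak's neighborhood
    second = r.max()
    return peak / (abs(second) + 1e-8)

def semantic_similarity(feat_now, feat_prior):
    """Cosine similarity between pooled deep features of the current and
    prior context regions (illustrative comparison only)."""
    a, b = feat_now.ravel(), feat_prior.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def choose_model(hog_response, feat_now, feat_prior,
                 rel_thresh=2.0, sim_thresh=0.5):
    """Return which context model to use for this frame and whether the
    estimated region should be accepted for updating the CNN context model
    (the 'rejection option' mentioned in the abstract)."""
    reliable = response_reliability(hog_response) >= rel_thresh
    similar = semantic_similarity(feat_now, feat_prior) >= sim_thresh
    use_cnn = (not reliable) or (not similar)
    accept_update = reliable and similar   # only valid regions update the model
    return ("cnn" if use_cnn else "hog"), accept_update

# Example with random data standing in for a real response map and features.
rng = np.random.default_rng(0)
resp = rng.random((64, 64))
resp[32, 32] = 5.0                         # strong, well-separated peak
f_now, f_prior = rng.random(512), rng.random(512)
print(choose_model(resp, f_now, f_prior))  # e.g. ('hog', True)
```

In this sketch, the same reliability and similarity tests that trigger the fallback to the CNN context model also decide whether the estimated region is treated as valid for updating that model.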