An online learning update modeling approach for aerial visual tracking


Bibliographic Details
Published in: Journal of Optics (New Delhi), 2024-02, Vol. 53 (1), p. 676-686
Main author: Wang, Limei
Format: Article
Language: English
Online access: Full text
Description
Abstract: Visual target tracking has been a popular research area over the past few decades because it is a crucial part of many computer vision applications, including video surveillance, human–computer interfaces, driver assistance systems, and robotics. Target tracking has been the focus of many studies, and a substantial body of literature has resulted. Even so, visual tracking remains a difficult problem despite the many algorithms that have been developed. A straightforward approach such as template matching can be effective when the target's appearance does not vary much and its velocity changes smoothly. In practice, however, the target's appearance can vary between indoor and outdoor environments and under camera and target motion, so advanced methodologies are still needed to address the remaining challenges in aerial video tracking. In this study, a visual target tracking method is proposed to deal with these difficulties in aerial videos. The proposed method is learning-based and takes advantage of prior knowledge supplied top-down. Using this combination, it employs parallel processing and a multi-instance learning model to handle multi-target detection and appearance variations. Experimental results show that the proposed method performs better than competing methods.
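As a point of reference for the template-matching baseline the abstract contrasts with, here is a minimal sketch using OpenCV's cv2.matchTemplate. The file names and region geometry are hypothetical placeholders; this illustrates the generic technique, not the article's method.

    import cv2

    # Current video frame and an appearance template of the target
    # ("frame.png" and "target.png" are hypothetical example files).
    frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
    template = cv2.imread("target.png", cv2.IMREAD_GRAYSCALE)

    # Slide the template over the frame and score each position with
    # normalized cross-correlation; the peak gives the best match.
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)

    h, w = template.shape
    top_left = max_loc
    bottom_right = (top_left[0] + w, top_left[1] + h)
    print(f"best match at {top_left}, score {max_val:.3f}")

The abstract also mentions a multi-instance learning (MIL) model. OpenCV ships a stock MIL tracker, sketched below under the assumption of a local video file ("aerial.mp4" is a placeholder); note this is the generic single-target MIL tracker, not the parallel multi-target pipeline proposed in the article.

    import cv2

    cap = cv2.VideoCapture("aerial.mp4")     # hypothetical input video
    ok, frame = cap.read()
    bbox = cv2.selectROI("init", frame)      # user-drawn initial bounding box

    # Initialize the multi-instance learning tracker on the first frame.
    tracker = cv2.TrackerMIL_create()
    tracker.init(frame, bbox)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, bbox = tracker.update(frame)  # re-localize the target
        if found:
            x, y, w, h = map(int, bbox)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:      # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()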
ISSN: 0972-8821, 0974-6900
DOI: 10.1007/s12596-023-01209-7