Event-based Tracking of Any Point with Motion-Robust Correlation Features
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: Tracking any point (TAP) recently shifted the motion estimation paradigm from focusing on individual salient points with local templates to tracking arbitrary points with global image contexts. However, while research has mostly focused on improving model accuracy in nominal settings, scenarios with difficult lighting conditions and high-speed motion remain out of reach due to the limitations of the sensor. This work addresses this challenge with the first event camera-based TAP method. It leverages the high temporal resolution and high dynamic range of event cameras for robust high-speed tracking, and the global image contexts of TAP methods to handle asynchronous and sparse event measurements. We further extend the TAP framework to handle event feature variations induced by motion, an open challenge in purely event-based tracking, with a novel feature alignment loss that ensures the learning of motion-robust features. Our method is trained with data from a new data generation pipeline and is systematically ablated across all design decisions. It shows strong cross-dataset generalization and performs 135% better than the baselines on the average Jaccard metric. Moreover, on an established feature tracking benchmark, it achieves a 19% improvement over the previous best event-only method and even surpasses the previous best events-and-frames method by 3.7%.
DOI: 10.48550/arxiv.2412.00133
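Note: The summary above reports gains on the average Jaccard metric, the standard point-tracking score popularized by the TAP-Vid benchmark. The sketch below illustrates how average Jaccard is commonly computed over a set of pixel-distance thresholds; it is an illustrative implementation, not the authors' evaluation code, and the function name, array shapes, and threshold values are assumptions.

```python
import numpy as np

def average_jaccard(pred_xy, pred_vis, gt_xy, gt_vis, thresholds=(1, 2, 4, 8, 16)):
    """Average Jaccard over pixel-distance thresholds (TAP-Vid style).

    pred_xy, gt_xy   : (N, T, 2) predicted / ground-truth point positions
    pred_vis, gt_vis : (N, T) boolean visibility flags (True = point visible)
    """
    dist = np.linalg.norm(pred_xy - gt_xy, axis=-1)  # (N, T) Euclidean errors
    scores = []
    for thr in thresholds:
        correct = pred_vis & (dist < thr)             # predicted visible and close enough
        tp = np.sum(gt_vis & correct)                 # visible points tracked correctly
        fp = np.sum(~gt_vis & pred_vis) \
           + np.sum(gt_vis & pred_vis & (dist >= thr))  # spurious or too-distant predictions
        fn = np.sum(gt_vis & ~correct)                # visible points missed
        scores.append(tp / max(tp + fp + fn, 1))      # Jaccard at this threshold
    return float(np.mean(scores))                     # average over thresholds
```

The reported benchmark numbers aggregate this score over the query points and sequences of each dataset; the exact aggregation in the paper may differ from this sketch.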