Matching motion trajectories using scale-space



Bibliographic details

Published in: Pattern Recognition, 1993, Vol. 26 (4), p. 595-610
Authors: Rangarajan, Krishnan; Allen, William; Shah, Mubarak
Format: Article
Language: English
Online access: Full text
Description
Abstract: The goal is to design a recognition system which can distinguish between two objects with the same shape but different motion, or between two objects with the same motion but a different shape. The input to the system is a set of two-dimensional (2D) trajectories from an object tracked through a sequence of n frames. The structure and three-dimensional (3D) trajectories of each object in the domain are stored in the model. The problem is to match the information in the model with the input set of 2D trajectories and determine whether they represent the same object. The simplest way to perform these steps is to match the input 2D trajectories with the 2D projections of the 3D model trajectories. First, a simple algorithm is presented which matches two single trajectories using only motion information. The 2D motion trajectories are converted into two one-dimensional (1D) signals based on their speed and direction components. The signals are then represented by scale-space images, both to simplify matching and because the scale-space representations are translation and rotation invariant. The matching algorithm is extended to include spatial information, and a second algorithm is proposed which matches multiple trajectories by combining motion and spatial match scores. Both algorithms are tested with real and synthetic data.
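The two preprocessing steps named in the abstract can be sketched in code: converting a 2D trajectory into 1D speed and direction signals, and smoothing a signal with Gaussians of increasing width to build a scale-space stack. This is a minimal illustrative sketch, not the authors' implementation; the function names and the choice of kernel radius are assumptions.

```python
import numpy as np

def motion_signals(points):
    """points: (n, 2) array of tracked (x, y) positions, one per frame.
    Returns speed and direction as 1D signals of length n - 1."""
    d = np.diff(points, axis=0)               # frame-to-frame displacement
    speed = np.hypot(d[:, 0], d[:, 1])        # magnitude of displacement
    direction = np.arctan2(d[:, 1], d[:, 0])  # heading angle in radians
    return speed, direction

def scale_space(signal, sigmas):
    """Smooth the signal with a Gaussian at each sigma, one row per scale.
    Coarser scales suppress fine detail, as in a scale-space image."""
    rows = []
    for s in sigmas:
        radius = int(3 * s) + 1               # truncate kernel at ~3 sigma
        t = np.arange(-radius, radius + 1)
        g = np.exp(-t**2 / (2 * s**2))
        g /= g.sum()                          # normalize to preserve mean
        rows.append(np.convolve(signal, g, mode="same"))
    return np.stack(rows)

# Example: a straight constant-velocity track gives constant speed
# and constant direction, so every scale-space row is flat.
pts = np.stack([np.arange(50.0), np.arange(50.0)], axis=1)
speed, direction = motion_signals(pts)
stack = scale_space(speed, sigmas=[1.0, 2.0])
```

A matcher in the paper's spirit would compare the scale-space representations of two such signals; because speed and direction are differences of positions, the representation is unchanged by translating the trajectory, which is the invariance the abstract relies on.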
ISSN:0031-3203
1873-5142
DOI:10.1016/0031-3203(93)90113-B