Articulated body motion tracking using illumination invariant optical flow

Bibliographic details
Published in: International Journal of Control, Automation, and Systems, 2010, 8(1), pp. 73-80
Main authors: Kim, Yeon-Ho; Yi, Soo-Yeong
Format: Article
Language: English
Online access: Full text
Description
Summary: We propose a model-based tracking method for articulated objects in monocular video sequences under varying illumination conditions. The tracking method uses estimates of optical flows constructed by projecting model textures into the camera images and comparing the projected textures with the recorded information. An articulated body is modelled in terms of 3D primitives, each possessing a specified texture on its surface. An important step in model-based tracking of 3D objects is the estimation of the pose of the object during the tracking process. The optimal pose is estimated by minimizing errors between the computed optical flow and the projected 2D velocities of the model textures. This estimation uses a least-squares method with kinematic constraints for the articulated object and a perspective camera model. We test our framework with an articulated robot and show results.
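
To illustrate the pose-estimation step described in the summary, the sketch below sets up the least-squares problem that matches measured optical flow to the 2D image velocities induced by the articulated model under a perspective camera. This is a minimal sketch under stated assumptions, not the authors' implementation: pose_fn (forward kinematics mapping joint parameters to 3D texture-point positions), X_local (texture points in link coordinates), and flow (optical flow measured at the projected texture points) are hypothetical placeholders, and the Jacobian is obtained by finite differences rather than by the derivation in the paper.

import numpy as np

def project(X, f=1.0):
    # Perspective projection of 3D points X, shape (N, 3), onto the image plane, shape (N, 2).
    return f * X[:, :2] / X[:, 2:3]

def numerical_jacobian(pose_fn, q, X_local, eps=1e-6):
    # Finite-difference Jacobian of the projected texture points with respect to the
    # joint parameters q (an analytic Jacobian would replace this in practice).
    base = project(pose_fn(q, X_local)).ravel()
    J = np.zeros((base.size, q.size))
    for k in range(q.size):
        dq = q.copy()
        dq[k] += eps
        J[:, k] = (project(pose_fn(dq, X_local)).ravel() - base) / eps
    return J

def pose_update(pose_fn, q, X_local, flow, dt):
    # One tracking step: solve, in the least-squares sense, for joint velocities
    # whose induced 2D texture velocities J @ qdot best match the measured flow.
    J = numerical_jacobian(pose_fn, q, X_local)      # shape (2N, n_joints)
    v = flow.ravel()                                 # stacked (u, v) flow values, shape (2N,)
    qdot, *_ = np.linalg.lstsq(J, v, rcond=None)
    return q + dt * qdot

Because the kinematic constraints of the articulated chain are encoded in pose_fn, the unknowns are the joint parameters of the chain rather than an unconstrained rigid pose for each link, which is consistent with the constrained least-squares formulation outlined in the summary.
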
ISSN: 1598-6446 (print), 2005-4092 (electronic)
DOI: 10.1007/s12555-010-0110-2