Hand tracking for behaviour understanding
Published in: Image and Vision Computing, 2002-10, Vol. 20 (12), p. 827-840
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: A real-time computer vision system is described for tracking hands, thus enabling behavioural events to be interpreted. Forearms are tracked to provide structural context, enabling mutual occlusion, which occurs when hands cross one another, to be handled robustly. No prior skin colour models are used. Instead, adaptive appearance models are learned on-line. A contour distance transform is used to control model adaptation and to fit 2D geometric models robustly. Hands can be tracked whether clothed or unclothed. Results are given for a ‘smart desk’ and an in-vehicle application. The ability to interpret behavioural events of interest when tracking a vehicle driver's hands is described.
ISSN: 0262-8856; 1872-8138
DOI: 10.1016/S0262-8856(02)00093-8
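The contour distance transform mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the toy binary mask below is hypothetical data (the paper derives its hand/forearm regions from on-line appearance models), and it shows only the generic operation, i.e. the Euclidean distance from each pixel to the nearest contour point, a quantity that could then gate model adaptation or score a 2D geometric model fit.

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

# Toy 8x8 binary mask standing in for a segmented hand region
# (hypothetical data, purely for illustration).
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True

# Contour pixels: mask pixels with at least one background
# 4-neighbour (the mask minus its erosion).
contour = mask & ~binary_erosion(mask)

# Distance transform of the contour: for every pixel, the
# Euclidean distance to the nearest contour point. Values are
# zero exactly on the contour and grow away from it.
dist = distance_transform_edt(~contour)

print(dist.round(2))
```

Small distances near the contour identify pixels whose appearance evidence is ambiguous (boundary pixels), which is one plausible reading of using the transform to "control model adaptation"; the same field also gives a smooth cost surface for fitting a 2D geometric model to the silhouette.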