Tracking the human arm using constraint fusion and multiple-cue localization


Published in: Machine vision and applications, 2003-03, Vol. 13 (5-6), p. 286-302
Main authors: Azoz, Y., Devi, L., Yeasin, M., Sharma, R.
Format: Article
Language: English
Online access: Full text
description The use of hand gestures provides an attractive means of interacting naturally with a computer-generated display. Using one or more video cameras, the hand movements can potentially be interpreted as meaningful gestures. One key problem in building such an interface without a restricted setup is the ability to localize and track the human arm robustly in video sequences. This paper proposes a multiple-cue localization scheme combined with a tracking framework to reliably track the dynamics of the human arm in unconstrained environments. The localization scheme integrates the multiple cues of motion, shape, and color to locate a set of key image features. Using constraint fusion, these features are tracked by a modified extended Kalman filter that exploits the articulated structure of the human arm. Moreover, an interaction scheme between tracking and localization improves the estimation process while reducing the computational requirements. The performance of the localization/tracking framework is validated through extensive experiments and simulations, including tracking with a calibrated stereo camera and with uncalibrated broadcast video.
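The description above outlines the core technique: key arm features are localized by fusing motion, shape, and color cues, then tracked with an extended Kalman filter whose measurement model exploits the arm's articulated structure. As a rough illustration only (not the authors' formulation, which handles the full 3D arm), the sketch below runs an EKF over a planar two-link arm, where a forward-kinematics measurement model plays the role of the articulation constraint; the link lengths, frame rate, and noise covariances are all assumed values:

```python
import numpy as np

# Illustrative EKF for a planar two-link arm.
# State: [theta1, theta2, dtheta1, dtheta2] (shoulder/elbow angle + angular velocity).
# Measurement: 2D positions of elbow and hand, computed by forward kinematics,
# so the articulated-arm constraint is built into the measurement model.

L1, L2 = 0.3, 0.25   # assumed link lengths (upper arm, forearm), metres
DT = 1.0 / 30.0      # assumed frame interval (30 fps video)

def fk(theta):
    """Elbow and hand positions for joint angles [theta1, theta2]."""
    t1, t12 = theta[0], theta[0] + theta[1]
    elbow = np.array([L1 * np.cos(t1), L1 * np.sin(t1)])
    hand = elbow + np.array([L2 * np.cos(t12), L2 * np.sin(t12)])
    return np.concatenate([elbow, hand])

def fk_jacobian(theta):
    """4x2 Jacobian of fk with respect to the two joint angles."""
    t1, t12 = theta[0], theta[0] + theta[1]
    s1, c1 = np.sin(t1), np.cos(t1)
    s12, c12 = np.sin(t12), np.cos(t12)
    return np.array([
        [-L1 * s1,             0.0],
        [ L1 * c1,             0.0],
        [-L1 * s1 - L2 * s12, -L2 * s12],
        [ L1 * c1 + L2 * c12,  L2 * c12],
    ])

def ekf_step(x, P, z, Q, R):
    """One predict/update cycle; z is the 4-vector [elbow_xy, hand_xy]."""
    # Predict with a constant-velocity model on the joint angles.
    F = np.eye(4)
    F[0, 2] = F[1, 3] = DT
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: linearize the forward-kinematics measurement model at the
    # predicted state, then apply the standard EKF correction.
    H = np.zeros((4, 4))
    H[:, :2] = fk_jacobian(x[:2])
    y = z - fk(x[:2])                 # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

Because the measurement model is the forward kinematics of the linkage, the filter can only estimate arm configurations that respect the articulated structure; this is a simplified stand-in for the constraint-fusion idea, in which localized image features correct a state that is constrained to the arm's kinematic chain.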
doi_str_mv 10.1007/s00138-002-0110-1
format Article
fulltext fulltext
identifier ISSN: 0932-8092
ispartof Machine vision and applications, 2003-03, Vol.13 (5-6), p.286-302
issn 0932-8092
eissn 1432-1769
language eng
recordid cdi_proquest_miscellaneous_28066452
source Springer Nature - Complete Springer Journals
title Tracking the human arm using constraint fusion and multiple-cue localization