Directing the Attention of a Wearable Camera by Pointing Gestures

Wearable visual sensors provide views of the environment which are rich in information about the wearer's location, interactions and intentions. In the wearable domain, hand gesture recognition is the natural replacement for keyboard input. We describe a framework combining a coarse-to-fine method for shape detection and a 3D tracking method that can identify pointing gestures and estimate their direction. The low computational complexity of both methods allows a real-time implementation that is applied to estimate the user's focus of attention and to control fast redirections of gaze of a wearable active camera. Experiments have demonstrated a level of robustness of this system in long and noisy image sequences.
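The abstract describes estimating a pointing direction in 3D and converting it into gaze redirections for an active camera. As a minimal sketch of that idea, assuming the pointing direction is approximated by the ray through two tracked hand points and a simple pan/tilt camera mount (the function names, the two-point simplification, and the coordinate convention are illustrative assumptions, not the paper's actual method):

```python
import math

def pointing_ray(hand_base, fingertip):
    """Unit direction vector from the hand base through the fingertip.

    A pointing gesture is approximated here by the ray joining two
    tracked 3D points on the hand (a deliberate simplification; the
    paper's 3D hand tracker is more elaborate).
    """
    d = [t - b for b, t in zip(hand_base, fingertip)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0:
        raise ValueError("hand base and fingertip coincide")
    return [c / norm for c in d]

def pan_tilt(direction):
    """Convert a unit direction (x right, y up, z forward) into
    pan/tilt angles in degrees for redirecting an active camera."""
    x, y, z = direction
    pan = math.degrees(math.atan2(x, z))                   # left/right
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down
    return pan, tilt
```

For example, a hand pointing straight ahead along the camera's forward axis yields zero pan and tilt, while pointing to the wearer's right yields a 90-degree pan command.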

Detailed description

Saved in:
Bibliographic details
Main authors: De Campos, Teofilo E., Mayol Cuevas, Walterio W., Murray, David W.
Format: Conference proceeding
Language: eng
Subjects:
Online access: Order full text
container_end_page 186
container_start_page 179
creator De Campos, Teofilo E.
Mayol Cuevas, Walterio W.
Murray, David W.
description Wearable visual sensors provide views of the environment which are rich in information about the wearer's location, interactions and intentions. In the wearable domain, hand gesture recognition is the natural replacement for keyboard input. We describe a framework combining a coarse-to-fine method for shape detection and a 3D tracking method that can identify pointing gestures and estimate their direction. The low computational complexity of both methods allows a real-time implementation that is applied to estimate the user's focus of attention and to control fast redirections of gaze of a wearable active camera. Experiments have demonstrated a level of robustness of this system in long and noisy image sequences.
doi_str_mv 10.1109/SIBGRAPI.2006.13
format Conference Proceeding
fulltext fulltext_linktorsrc
identifier ISSN: 1530-1834; EISSN: 2377-5416; ISBN: 9780769526867; ISBN: 0769526861; DOI: 10.1109/SIBGRAPI.2006.13
ispartof 2006 19th Brazilian Symposium on Computer Graphics and Image Processing, 2006, p.179-186
issn 1530-1834
2377-5416
language eng
recordid cdi_ieee_primary_4027066
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Cameras
Computer science
Focusing
Humans
Keyboards
Magnetic heads
Robot vision systems
Scholarships
Shape
Wearable sensors
title Directing the Attention of a Wearable Camera by Pointing Gestures
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-20T02%3A21%3A19IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Directing%20the%20Attention%20of%20aWearable%20Camera%20by%20Pointing%20Gestures&rft.btitle=2006%2019th%20Brazilian%20Symposium%20on%20Computer%20Graphics%20and%20Image%20Processing&rft.au=De%20Campos,%20Teofilo%20E.&rft.date=2006-10&rft.spage=179&rft.epage=186&rft.pages=179-186&rft.issn=1530-1834&rft.eissn=2377-5416&rft.isbn=9780769526867&rft.isbn_list=0769526861&rft_id=info:doi/10.1109/SIBGRAPI.2006.13&rft_dat=%3Cieee_6IE%3E4027066%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=4027066&rfr_iscdi=true