Smartwatch-based Early Gesture Detection & Trajectory Tracking for Interactive Gesture-Driven Applications
The paper explores the possibility of using wrist-worn devices (specifically, a smartwatch) to accurately track hand movements and gestures for a new class of immersive, interactive gesture-driven applications. These interactive applications need two special features: (a) the ability to identify...
Saved in:
Published in: | Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2018-03, Vol.2 (1), p.1-27, Article 39 |
---|---|
Main Authors: | Vu, Tran Huy ; Misra, Archan ; Roy, Quentin ; Wei, Kenny Choo Tsu ; Lee, Youngki |
Format: | Article |
Language: | eng |
Subjects: | Gestural input ; Human computer interaction (HCI) ; Human-centered computing ; Interaction techniques ; Ubiquitous and mobile computing ; Ubiquitous and mobile computing systems and tools |
Online Access: | Full Text |
container_end_page | 27 |
---|---|
container_issue | 1 |
container_start_page | 1 |
container_title | Proceedings of ACM on interactive, mobile, wearable and ubiquitous technologies |
container_volume | 2 |
creator | Vu, Tran Huy ; Misra, Archan ; Roy, Quentin ; Wei, Kenny Choo Tsu ; Lee, Youngki |
description | The paper explores the possibility of using wrist-worn devices (specifically, a smartwatch) to accurately track hand movements and gestures for a new class of immersive, interactive gesture-driven applications. These interactive applications need two special features: (a) the ability to identify gestures from a continuous stream of sensor data early (i.e., even before the gesture is complete), and (b) the ability to precisely track the hand's trajectory, even though the underlying inertial sensor data is noisy. We develop a new approach that tackles these requirements by first building an HMM-based gesture recognition framework that does not need an explicit segmentation step, and then using a per-gesture trajectory tracking solution that tracks hand movement only during these predefined gestures. Using an elaborate setup that allows us to realistically study the table-tennis-related hand movements of users, we show that our approach works: (a) it can achieve 95% stroke recognition accuracy; within the first 50% of a gesture, it achieves a recall of 92% for 10 novice users and 93% for 15 experienced users from a continuous sensor stream; and (b) it can track hand movement during such strokeplay with a median accuracy of 6.2 cm. |
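The early-detection idea in the description — scoring a continuous sensor stream against a per-gesture model so a stroke can be flagged before it completes — can be sketched with a normalized HMM forward pass. The 3-state left-to-right model, its parameters, the observation symbols, and the stream below are all hypothetical illustrations, not the paper's trained recognizer:

```python
import numpy as np

# Illustrative left-to-right HMM for one gesture class (NOT the paper's
# trained model): 3 hidden states (prep, swing, follow-through) and
# 2 discrete observation symbols (0 = low motion, 1 = high motion).
A = np.array([[0.6, 0.4, 0.0],    # state-transition probabilities
              [0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0]])
B = np.array([[0.9, 0.1],         # per-state emission probabilities
              [0.2, 0.8],
              [0.7, 0.3]])
pi = np.array([1.0, 0.0, 0.0])    # the gesture always starts in "prep"

def stream_forward(obs_stream, threshold=0.5):
    """Incremental (normalized) HMM forward pass over a sensor stream.

    Returns the index at which the posterior mass in the final state
    first exceeds `threshold`, or None if it never does -- i.e. the
    gesture can be flagged before the stream ends, with no explicit
    segmentation step.
    """
    alpha = pi * B[:, obs_stream[0]]
    alpha /= alpha.sum()
    for t, o in enumerate(obs_stream):
        if t > 0:
            alpha = (alpha @ A) * B[:, o]
            alpha /= alpha.sum()          # normalize to a posterior
        if alpha[-1] > threshold:         # mass in the final state
            return t
    return None

# Simulated symbol stream: quiet wrist, then a swing, then follow-through.
stream = [0, 0, 1, 1, 1, 0, 0, 0, 0, 0]
t_detect = stream_forward(stream)         # fires at t = 5 here,
                                          # five samples before the end
```

Because the forward posterior is updated per sample, the detector fires mid-stream rather than after a segmented gesture — the same property the paper exploits to recognize strokes within the first 50% of a gesture.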
doi_str_mv | 10.1145/3191771 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 2474-9567 |
ispartof | Proceedings of ACM on interactive, mobile, wearable and ubiquitous technologies, 2018-03, Vol.2 (1), p.1-27, Article 39 |
issn | 2474-9567 2474-9567 |
language | eng |
recordid | cdi_crossref_primary_10_1145_3191771 |
source | ACM Digital Library Complete |
subjects | Gestural input ; Human computer interaction (HCI) ; Human-centered computing ; Interaction techniques ; Ubiquitous and mobile computing ; Ubiquitous and mobile computing systems and tools |
title | Smartwatch-based Early Gesture Detection & Trajectory Tracking for Interactive Gesture-Driven Applications |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-20T20%3A36%3A52IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-acm_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Smartwatch-based%20Early%20Gesture%20Detection%208%20Trajectory%20Tracking%20for%20Interactive%20Gesture-Driven%20Applications&rft.jtitle=Proceedings%20of%20ACM%20on%20interactive,%20mobile,%20wearable%20and%20ubiquitous%20technologies&rft.au=Vu,%20Tran%20Huy&rft.date=2018-03-26&rft.volume=2&rft.issue=1&rft.spage=1&rft.epage=27&rft.pages=1-27&rft.artnum=39&rft.issn=2474-9567&rft.eissn=2474-9567&rft_id=info:doi/10.1145/3191771&rft_dat=%3Cacm_cross%3E3191771%3C/acm_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |