Towards Open-World Gesture Recognition

Static machine learning methods in gesture recognition assume that training and test data come from the same underlying distribution. However, in real-world applications involving gesture recognition on wrist-worn devices, data distribution may change over time. We formulate this problem of adapting recognition models to new tasks, where new data patterns emerge, as open-world gesture recognition (OWGR). We propose leveraging continual learning to make machine learning models adaptive to new tasks without degrading performance on previously learned tasks. However, the exploration of parameters for questions around when and how to train and deploy recognition models requires time-consuming user studies and is sometimes impractical. To address this challenge, we propose a design engineering approach that enables offline analysis on a collected large-scale dataset with various parameters and compares different continual learning methods. Finally, design guidelines are provided to enhance the development of an open-world wrist-worn gesture recognition process.

Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org 2024-01
Main Authors: Shen, Junxiao; De Lange, Matthias; Xu, Xuhai "Orson"; Zhou, Enmin; Tan, Ran; Suda, Naveen; Lazarewicz, Maciej; Kristensson, Per Ola; Karlson, Amy; Strasnick, Evan
Format: Article
Language: English (eng)
Subjects:
Online Access: Full text
creator Shen, Junxiao
De Lange, Matthias
Xu, Xuhai "Orson"
Zhou, Enmin
Tan, Ran
Suda, Naveen
Lazarewicz, Maciej
Kristensson, Per Ola
Karlson, Amy
Strasnick, Evan
description Static machine learning methods in gesture recognition assume that training and test data come from the same underlying distribution. However, in real-world applications involving gesture recognition on wrist-worn devices, data distribution may change over time. We formulate this problem of adapting recognition models to new tasks, where new data patterns emerge, as open-world gesture recognition (OWGR). We propose leveraging continual learning to make machine learning models adaptive to new tasks without degrading performance on previously learned tasks. However, the exploration of parameters for questions around when and how to train and deploy recognition models requires time-consuming user studies and is sometimes impractical. To address this challenge, we propose a design engineering approach that enables offline analysis on a collected large-scale dataset with various parameters and compares different continual learning methods. Finally, design guidelines are provided to enhance the development of an open-world wrist-worn gesture recognition process.
format Article
identifier EISSN: 2331-8422
ispartof arXiv.org, 2024-01
issn 2331-8422
language eng
recordid cdi_proquest_journals_2917676116
source Freely Accessible Journals
subjects Design engineering
Gesture recognition
Machine learning
Mathematical models
Parameters
Performance degradation
Wrist
title Towards Open-World Gesture Recognition
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-21T05%3A08%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Towards%20Open-World%20Gesture%20Recognition&rft.jtitle=arXiv.org&rft.au=Shen,%20Junxiao&rft.date=2024-01-20&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2917676116%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2917676116&rft_id=info:pmid/&rfr_iscdi=true