Hand-Gesture-Recognition Based Text Input Method for AR/VR Wearable Devices

Static and dynamic hand movements are a basic means of human-machine interaction. To recognize and classify these movements, they are first captured by cameras mounted on the augmented reality (AR) or virtual reality (VR) wearable device. The hand is then segmented from each frame, and the segmented gestures are passed to a hand-gesture-recognition algorithm based on a depthwise separable convolutional neural network, which is trained, tested, and deployed to run smoothly on mobile AR/VR devices while maintaining accuracy and balancing the computational load. A sequence of gestures is processed to identify and classify the intended gesture and to discard all intermittent gestures. With the proposed method, a user can write letters and numbers in the air simply by moving his or her hand. Gesture-based operations are performed, the trajectory of the hand is recorded as handwritten text, and that handwritten text is finally processed for text recognition.
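The abstract names a depthwise separable convolutional neural network as the gesture classifier but does not specify the framework or the exact architecture. The following is a minimal sketch, assuming PyTorch and a MobileNet-style stack of depthwise separable blocks; the layer widths, input resolution, and number of gesture classes (10) are illustrative assumptions, not the authors' reported model.

# Minimal sketch (assumption): a MobileNet-style gesture classifier built from
# depthwise separable convolutions, light enough for on-device AR/VR inference.
# Layer sizes and the number of gesture classes are illustrative only.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """Depthwise 3x3 conv (one filter per channel) followed by a 1x1 pointwise conv."""

    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn1 = nn.BatchNorm2d(in_ch)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.act(self.bn1(self.depthwise(x)))
        return self.act(self.bn2(self.pointwise(x)))


class GestureNet(nn.Module):
    """Classifies a segmented hand crop into one of `num_gestures` classes."""

    def __init__(self, num_gestures: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
            DepthwiseSeparableConv(32, 64),
            DepthwiseSeparableConv(64, 128, stride=2),
            DepthwiseSeparableConv(128, 128),
            DepthwiseSeparableConv(128, 256, stride=2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(256, num_gestures)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)


if __name__ == "__main__":
    # One 96x96 RGB crop of the segmented hand -> per-gesture scores.
    model = GestureNet(num_gestures=10)
    scores = model(torch.randn(1, 3, 96, 96))
    print(scores.shape)  # torch.Size([1, 10])

The depthwise/pointwise factorization replaces each full 3x3 convolution with a per-channel 3x3 convolution followed by a 1x1 projection, which cuts parameters and multiply-adds substantially; that is what makes this kind of classifier practical on mobile AR/VR hardware while keeping accuracy, as the abstract requires.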

Bibliographic Details

Main authors: Maitlo, Nizamuddin; Wang, Yanbo; Chen, Chao Ping; Mi, Lantian; Zhang, Wenbo
Format: Article
Language: English
Subjects: Computer Science - Human-Computer Interaction
DOI: 10.48550/arxiv.1907.12188
Date: 2019-07-28
Source: arXiv.org
Online access: https://arxiv.org/abs/1907.12188