SmartASL: "Point-of-Care" Comprehensive ASL Interpreter Using Wearables

Sign language builds an important bridge between d/Deaf and hard-of-hearing (DHH) people and hearing people. Regrettably, most hearing people face challenges in comprehending sign language, necessitating sign language translation. However, state-of-the-art wearable-based techniques mainly concentrate on recognizing manual markers (e.g., hand gestures), while frequently overlooking non-manual markers, such as negative head shaking, question markers, and mouthing. This oversight results in the loss of substantial grammatical and semantic information in sign language. To address this limitation, we introduce SmartASL, a novel proof-of-concept system that can 1) recognize both manual and non-manual markers simultaneously using a combination of earbuds and a wrist-worn IMU, and 2) translate the recognized American Sign Language (ASL) glosses into spoken language. Our experiments demonstrate the SmartASL system's significant potential to accurately recognize the manual and non-manual markers in ASL, effectively bridging the communication gaps between ASL signers and hearing people using commercially available devices.
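
The abstract describes a two-stage pipeline: the earbud and wrist-worn IMUs feed recognizers for non-manual and manual markers in parallel, and the recognized ASL gloss sequence is then translated into spoken language. The paper's implementation is not reproduced in this record; the Python sketch below only illustrates that data flow, with hypothetical label sets, window sizes, and stand-in classifiers in place of the trained models.

import numpy as np

# Hypothetical label sets for illustration only; the paper's actual
# vocabularies of manual glosses and non-manual markers are larger.
MANUAL_GLOSSES = ["HELLO", "YOU", "DEAF"]
NON_MANUAL = ["neutral", "head-shake", "brow-raise"]

def windows(stream: np.ndarray, size: int = 50, stride: int = 25) -> list:
    """Slice a (T, channels) IMU stream into overlapping fixed-size windows."""
    return [stream[i:i + size] for i in range(0, len(stream) - size + 1, stride)]

def classify(wrist_win: np.ndarray, head_win: np.ndarray) -> tuple:
    """Stand-in for the trained recognizers: derive labels from a summary
    statistic so the pipeline runs end to end without any learned model."""
    manual = MANUAL_GLOSSES[int(np.linalg.norm(wrist_win)) % len(MANUAL_GLOSSES)]
    non_manual = NON_MANUAL[int(np.linalg.norm(head_win)) % len(NON_MANUAL)]
    return manual, non_manual

def translate(tokens: list) -> str:
    """Toy gloss-to-English step. Non-manual markers carry the grammar:
    here a head-shake negates and a brow-raise marks a question."""
    glosses = [m for m, _ in tokens]
    # Collapse repeats across adjacent windows (a crude CTC-style merge).
    merged = [g for i, g in enumerate(glosses) if i == 0 or g != glosses[i - 1]]
    sentence = " ".join(g.lower() for g in merged)
    if any(n == "head-shake" for _, n in tokens):
        sentence = "not " + sentence
    if any(n == "brow-raise" for _, n in tokens):
        sentence += "?"
    return sentence

# Synthetic streams standing in for 6-axis wrist and earbud IMU data.
rng = np.random.default_rng(0)
wrist, head = rng.normal(size=(500, 6)), rng.normal(size=(500, 6))
tokens = [classify(w, h) for w, h in zip(windows(wrist), windows(head))]
print(translate(tokens))

The design point the sketch preserves is the one the abstract emphasizes: non-manual markers are recognized alongside manual glosses and contribute grammatical information (negation, question marking) during translation, rather than being discarded.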

Bibliographic Details

Published in: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2023-06, Vol. 7 (2), p. 1-21, Article 60
Main Authors: Jin, Yincheng; Zhang, Shibo; Gao, Yang; Xu, Xuhai; Choi, Seokmin; Li, Zhengxiong; Adler, Henry J.; Jin, Zhanpeng
Format: Article
Language: English
Subjects: Human-centered computing; Ubiquitous and mobile computing; Ubiquitous and mobile computing systems and tools
Publisher: ACM, New York, NY, USA
DOI: 10.1145/3596255
ISSN / EISSN: 2474-9567
Online Access: Full text