AIGuide: Augmented Reality Hand Guidance in a Visual Prosthetic

Locating and grasping objects is a critical task in people’s daily lives. For people with visual impairments, this task can be a daily struggle. The support of augmented reality frameworks in smartphones can overcome the limitations of current object detection applications designed for people with visual impairments. We present AIGuide, a self-contained smartphone application that leverages augmented reality technology to help users locate and pick up objects around them. We conducted a user study to investigate the effectiveness of AIGuide in a visual prosthetic for providing guidance; compare it to other assistive technology form factors; investigate the use of multimodal feedback; and provide feedback about the overall experience. We gathered performance data and participants’ reactions and analyzed videos to understand users’ interactions with the nonvisual smartphone user interface. Our results show that AIGuide is a promising technology to help people with visual impairments locate and acquire objects in their daily routine. The benefits of AIGuide may be enhanced with appropriate interaction design.

Detailed Description

Saved in:
Bibliographic Details
Published in: ACM transactions on accessible computing 2022-06, Vol.15 (2), p.1-32, Article 12
Main authors: Lee, Sooyeon, Aldas, Nelson Daniel Troncoso, Lee, Chonghan, Rosson, Mary Beth, Carroll, John M., Narayanan, Vijaykrishnan
Format: Article
Language: English
Subjects: Accessibility; Accessibility design and evaluation methods; Accessibility systems and tools; Empirical studies in accessibility; Human-centered computing
Online access: Full text
description Locating and grasping objects is a critical task in people’s daily lives. For people with visual impairments, this task can be a daily struggle. The support of augmented reality frameworks in smartphones can overcome the limitations of current object detection applications designed for people with visual impairments. We present AIGuide, a self-contained smartphone application that leverages augmented reality technology to help users locate and pick up objects around them. We conducted a user study to investigate the effectiveness of AIGuide in a visual prosthetic for providing guidance; compare it to other assistive technology form factors; investigate the use of multimodal feedback, and provide feedback about the overall experience. We gathered performance data and participants’ reactions and analyzed videos to understand users’ interactions with the nonvisual smartphone user interface. Our results show that AIGuide is a promising technology to help people with visual impairments locate and acquire objects in their daily routine. The benefits of AIGuide may be enhanced with appropriate interaction design.
doi 10.1145/3508501
format Article
identifier ISSN: 1936-7228
issn 1936-7228
eissn 1936-7236
language eng
source ACM Digital Library
subjects Accessibility
Accessibility design and evaluation methods
Accessibility systems and tools
Empirical studies in accessibility
Human-centered computing