Augmented reality display based on user behavior
•The study presents a user-behavior-driven augmented content display approach.
•A user behavior perception algorithm that infers the current state of the user by crosschecking his/her past behavior is presented.
•Five augmented content display patterns corresponding to the modeled user's behavior states are designed accordingly.
•The experimental results show that iDisplay can accurately infer user states and manage augmented content display efficiently.
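The highlights above describe an approach that infers the user's current state from sensor features and recent behavior, then selects a display pattern for that state. As a rough illustration only, and not the authors' iDisplay algorithm, the following Python sketch shows one way such a state-inference loop could be structured; the states, features, thresholds, and display patterns are all hypothetical.

```python
# Hypothetical sketch only: NOT the iDisplay algorithm from the article.
# Idea illustrated: classify a user state from sensor features, crosscheck it
# against recent history, and map the smoothed state to a display pattern.
# All feature names, thresholds, states, and patterns are invented.
from collections import Counter, deque

# Invented display patterns keyed by invented user states.
DISPLAY_PATTERN = {
    "idle":    "full_detail",    # user is still: show detailed content
    "walking": "summary_only",   # user is moving: show compact summaries
    "running": "hide_content",   # user is moving fast: suppress overlays
}

def classify_state(accel_var: float, step_rate: float) -> str:
    """Map raw sensor features to a coarse user state (toy thresholds)."""
    if step_rate > 2.5:
        return "running"
    if step_rate > 0.5 or accel_var > 0.2:
        return "walking"
    return "idle"

def infer_state(history: deque, accel_var: float, step_rate: float) -> str:
    """Crosscheck the instantaneous guess against recent history by majority
    vote, so a single noisy sample does not flip the display pattern."""
    history.append(classify_state(accel_var, step_rate))
    return Counter(history).most_common(1)[0][0]

if __name__ == "__main__":
    history = deque(maxlen=5)  # sliding window of recent states
    samples = [(0.05, 0.0), (0.30, 1.2), (0.28, 1.1), (0.31, 1.3), (0.02, 0.1)]
    for accel_var, step_rate in samples:
        state = infer_state(history, accel_var, step_rate)
        print(state, "->", DISPLAY_PATTERN[state])
```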
Saved in:
Published in: | Computer standards and interfaces 2018-01, Vol.55, p.171-181 |
---|---|
Main authors: | Tsai, Chung-Hsien ; Huang, Jiung-Yao |
Format: | Article |
Language: | eng |
Subjects: | Algorithms ; Augmented reality ; Behavior perception ; Commercialization ; Context-aware ; Experiments ; Feature extraction ; Sensors ; Smart sensors ; User behavior ; User interface ; Wearable computers |
Online access: | Full text |
container_end_page | 181 |
---|---|
container_issue | |
container_start_page | 171 |
container_title | Computer standards and interfaces |
container_volume | 55 |
creator | Tsai, Chung-Hsien ; Huang, Jiung-Yao |
description | •The study presents a user-behavior-driven augmented content display approach.
•A user behavior perception algorithm that infers the current state of the user by crosschecking his/her past behavior is presented.
•Five augmented content display patterns corresponding to the modeled user's behavior states are designed accordingly.
•The experimental results show that iDisplay can accurately infer user states and manage augmented content display efficiently.
The development and commercialization of smart glasses in recent years have made the exploration of one's surroundings with mobile augmented reality (MAR) browsers anytime and anywhere more practical. However, users often suffer from issues such as cognitive overload and inconvenient interactions when operating MAR browsers on smart glasses owing to the constraints of the screen resolution and size. To overcome these problems, this paper presents a user-behavior-driven augmented content display approach called iDisplay. First, user behaviors were modeled while the smart glasses were used. A user behavior perception algorithm that infers the current state of the user by crosschecking his/her past behavior and feature data extracted from the built-in sensors of the smart glasses was then developed. Five augmented content display patterns corresponding to the modeled user's behavior states were designed accordingly. To verify that iDisplay can adaptively manage the smart glasses display based on the perceived user states, a prototype system was built to conduct a series of experiments. The experimental results show that iDisplay can accurately infer user states and manage augmented content display accordingly. A user study also shows that iDisplay can successfully reduce the user's cognitive load and split attention when searching for specific point-of-interest information while moving. Furthermore, all subjects claimed that iDisplay causes less dizziness during the experiments than the native overview + detail augmented reality interface. |
doi_str_mv | 10.1016/j.csi.2017.08.003 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0920-5489 |
ispartof | Computer standards and interfaces, 2018-01, Vol.55, p.171-181 |
issn | 0920-5489 1872-7018 |
language | eng |
recordid | cdi_proquest_journals_2009175593 |
source | ScienceDirect Journals (5 years ago - present) |
subjects | Algorithms ; Augmented reality ; Behavior perception ; Commercialization ; Context-aware ; Experiments ; Feature extraction ; Sensors ; Smart sensors ; User behavior ; User interface ; Wearable computers |
title | Augmented reality display based on user behavior |