Integrating Intra and Extra Gestures into a Mobile and Multimodal Shopping Assistant

Accompanying the rise of mobile and pervasive computing, applications now need to adapt to their surrounding environments and provide users with information in the environment in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with external gestures performed with real world artifacts. The described approach extends reference resolution based on speech, handwriting and gesture to that of real world objects that users may hold in their hands. We discuss the varied interaction channels available to users that arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required in handling these extended multimodal interactions and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator is then used as the basis for a recent usability study that we describe on user interaction within mobile contexts.
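The abstract describes fusing references from on-device input (speech, handwriting, gesture) with gestures performed on real-world objects. As a rough illustration of how such multimodal references might be combined, the following is a minimal sketch; all names (ModalityEvent, resolve_reference, the modality labels and product ids) are hypothetical assumptions and not taken from the paper's Mobile ShopAssist implementation.

from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: fuse reference candidates coming from several
# input modalities and pick the most plausible referent.

@dataclass
class ModalityEvent:
    modality: str            # e.g. "speech", "handwriting", "intra_gesture", "extra_gesture"
    referent: Optional[str]  # product id the event points to, if any
    confidence: float        # recognizer confidence in [0, 1]
    timestamp: float         # seconds since the interaction started

def resolve_reference(events: list[ModalityEvent],
                      window: float = 2.0) -> Optional[str]:
    """Pick the most likely referent among events that occur close in time.

    Events outside the fusion window (relative to the most recent event)
    are ignored; confidences for the same referent are summed and the
    best-scoring referent wins.
    """
    if not events:
        return None
    latest = max(e.timestamp for e in events)
    scores: dict[str, float] = {}
    for e in events:
        if e.referent is None or latest - e.timestamp > window:
            continue
        scores[e.referent] = scores.get(e.referent, 0.0) + e.confidence
    return max(scores, key=scores.get) if scores else None

if __name__ == "__main__":
    # Deictic speech ("this camera") plus picking up a product from the
    # shelf (an extra-gesture) jointly disambiguate the reference.
    events = [
        ModalityEvent("speech", None, 0.9, 10.0),                # "this", no referent yet
        ModalityEvent("extra_gesture", "camera_42", 0.8, 10.4),  # object taken from shelf
        ModalityEvent("intra_gesture", "camera_17", 0.3, 5.0),   # old tap, outside window
    ]
    print(resolve_reference(events))  # -> "camera_42"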

Bibliographic details
Main authors: Wasinger, Rainer; Krüger, Antonio; Jacobs, Oliver
Format: Conference proceeding
Language: English
Subjects:
Online access: Full text
container_start_page 297
container_end_page 314
creator Wasinger, Rainer
Krüger, Antonio
Jacobs, Oliver
description Accompanying the rise of mobile and pervasive computing, applications now need to adapt to their surrounding environments and provide users with information in the environment in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with external gestures performed with real world artifacts. The described approach extends reference resolution based on speech, handwriting and gesture to that of real world objects that users may hold in their hands. We discuss the varied interaction channels available to users that arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required in handling these extended multimodal interactions and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator is then used as the basis for a recent usability study that we describe on user interaction within mobile contexts.
doi_str_mv 10.1007/11428572_18
format Conference Proceeding
contributor Gellersen, Hans -W.
Schmidt, Albrecht
Want, Roy
publisher Berlin, Heidelberg: Springer Berlin Heidelberg
isbn 9783540260080
3540260080
eisbn 3540320342
9783540320340
rights Springer-Verlag Berlin Heidelberg 2005
doi 10.1007/11428572_18
identifier ISSN: 0302-9743
ispartof Lecture notes in computer science, 2005, p.297-314
issn 0302-9743
1611-3349
language eng
recordid cdi_pascalfrancis_primary_17027231
source Springer Books
subjects Applied sciences
Computer science
control theory
systems
Computer systems and distributed systems. User interface
Exact sciences and technology
Intelligent User Interface
Modality Combination
Multimodal Interaction
Multimodal User
Software
Speech Utterance
title Integrating Intra and Extra Gestures into a Mobile and Multimodal Shopping Assistant