Tactile Echoes: Multisensory Augmented Reality for the Hand

Touch interactions are central to many human activities, but there are few technologies for computationally augmenting free-hand interactions with real environments. Here, we describe Tactile Echoes, a finger-wearable system for augmenting touch interactions with physical objects. This system captures and processes touch-elicited vibrations in real time in order to enliven tactile experiences. In this article, we process these signals via a parametric signal processing network in order to generate responsive tactile and auditory feedback. Just as acoustic echoes are produced through the delayed replication and modification of sounds, so are Tactile Echoes produced through transformations of vibrotactile inputs in the skin. The echoes also reflect the contact interactions and touched objects involved. A transient tap produces discrete echoes, while a continuous slide yields sustained feedback. We also demonstrate computational and spatial tracking methods that allow these effects to be selectively assigned to different objects or actions. A large variety of distinct multisensory effects can be designed via ten processing parameters. We investigated how Tactile Echoes are perceived in several perceptual experiments using multidimensional scaling methods. This allowed us to deduce low-dimensional, semantically grounded perceptual descriptions. We present several virtual and augmented reality applications of Tactile Echoes. In a user study, we found that these effects made interactions more responsive and engaging. Our findings show how to endow a large variety of touch interactions with expressive multisensory effects.
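
The core transformation described in the abstract is analogous to an audio echo: each touch-elicited vibration is replicated with delays, attenuation, and filtering before being rendered back to the fingertip and as sound. The sketch below illustrates that idea on a sampled vibration signal; the function name, sample rate, and all parameter values are illustrative assumptions, not the paper's actual ten-parameter processing network.

```python
# Minimal sketch (assumed, not the authors' implementation): a delay line that
# turns a sensed fingertip vibration into "echoes" -- delayed, attenuated,
# lightly smoothed copies of the input signal.
import numpy as np

def tactile_echoes(x, fs=1000, delay_s=0.08, feedback=0.5, damping=0.3, n_echoes=5):
    """Return the input vibration x followed by n_echoes decaying echoes.

    All parameter names and default values are illustrative assumptions:
    fs       -- sample rate of the vibration signal, in Hz
    delay_s  -- delay between successive echoes, in seconds
    feedback -- amplitude ratio between one echo and the next
    damping  -- amount of one-sample smoothing (crude low-pass) per echo
    """
    delay = int(delay_s * fs)
    y = np.zeros(len(x) + n_echoes * delay)
    y[:len(x)] += x                                   # direct, unprocessed vibration
    echo = np.asarray(x, dtype=float).copy()
    for k in range(1, n_echoes + 1):
        echo = feedback * echo                        # each echo is quieter...
        echo = (1 - damping) * echo + damping * np.concatenate(([0.0], echo[:-1]))
        y[k * delay : k * delay + len(echo)] += echo  # ...and arrives delay_s later
    return y

# Example: a brief "tap" transient produces a train of discrete, decaying echoes.
fs = 1000
t = np.arange(0, 0.02, 1.0 / fs)
tap = np.exp(-200 * t) * np.sin(2 * np.pi * 150 * t)
print(tactile_echoes(tap, fs=fs).shape)
```

Feeding a sustained sliding vibration through the same network would yield continuous, overlapping feedback rather than discrete repetitions, consistent with the tap-versus-slide behavior described above.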

Bibliographic Details
Published in: IEEE Transactions on Haptics, 2021-10, Vol. 14 (4), p. 835-848
Main Authors: Kawazoe, Anzu; Reardon, Gregory; Woo, Erin; Luca, Massimiliano Di; Visell, Yon
Format: Article
Language: English
Subjects: Acoustics; Augmented Reality; Feedback; Feedback, Sensory; Fingers; Hand; Haptic interfaces; haptic rendering; Humans; Multidimensional methods; multisensory feedback; Process parameters; Signal processing; Skin; Tactile augmented reality; Tactile sensors; Touch; Touch Perception; Vibrations; Virtual reality; wearable haptics
Online Access: Order full text
DOI: 10.1109/TOH.2021.3084117
ISSN: 1939-1412
EISSN: 2329-4051
PMID: 34038369
Source: IEEE Electronic Library (IEL)
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T03%3A12%3A40IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Tactile%20Echoes:%20Multisensory%20Augmented%20Reality%20for%20the%20Hand&rft.jtitle=IEEE%20transactions%20on%20haptics&rft.au=Kawazoe,%20Anzu&rft.date=2021-10-01&rft.volume=14&rft.issue=4&rft.spage=835&rft.epage=848&rft.pages=835-848&rft.issn=1939-1412&rft.eissn=2329-4051&rft.coden=ITHEBX&rft_id=info:doi/10.1109/TOH.2021.3084117&rft_dat=%3Cproquest_RIE%3E2533311156%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2610990670&rft_id=info:pmid/34038369&rft_ieee_id=9442315&rfr_iscdi=true