GAZE-BASED SOUND SELECTION

Various systems and methods for implementing gaze-based sound selection are described herein. A system for gaze-based sound selection includes a gaze detection circuit to determine a gaze direction of a user, the gaze direction being toward an object; an audio capture mechanism to obtain audio data from the object, the audio capture mechanism selectively configured based on the gaze direction; an audio transformation circuit to transform the audio data into output data; and a presentation mechanism to present the output data to the user.
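
The abstract describes the system only at the block level (gaze detection, steered audio capture, transformation, presentation). The Python sketch below is one way such a pipeline could be wired together; it assumes a delay-and-sum beamformer as the "selectively configured" audio capture and simple amplification as the transformation. All class and function names are hypothetical illustrations, not drawn from the patent.

# Illustrative sketch only: the component names and the delay-and-sum beamformer
# are assumptions, not the patent's disclosed implementation.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
SAMPLE_RATE = 16_000    # Hz

class GazeDetector:
    """Stands in for the 'gaze detection circuit': returns a unit vector toward the gazed-at object."""
    def direction(self) -> np.ndarray:
        # A real system would derive this from eye-tracking hardware; here it is fixed.
        return np.array([0.0, 1.0, 0.0])

class SteerableMicArray:
    """Stands in for the 'audio capture mechanism', selectively configured by the gaze direction."""
    def __init__(self, mic_positions: np.ndarray):
        self.mic_positions = mic_positions  # shape (num_mics, 3), in meters

    def capture(self, mic_signals: np.ndarray, gaze_dir: np.ndarray) -> np.ndarray:
        """Delay-and-sum beamforming toward the gaze direction (far-field plane-wave assumption)."""
        delays = self.mic_positions @ gaze_dir / SPEED_OF_SOUND           # per-mic arrival delay, seconds
        shifts = np.round((delays - delays.min()) * SAMPLE_RATE).astype(int)
        aligned = [np.roll(sig, -s) for sig, s in zip(mic_signals, shifts)]
        return np.mean(aligned, axis=0)

def transform(audio: np.ndarray, gain_db: float = 12.0) -> np.ndarray:
    """Stands in for the 'audio transformation circuit'; here, simple amplification."""
    return audio * (10.0 ** (gain_db / 20.0))

def present(audio: np.ndarray) -> None:
    """Stands in for the 'presentation mechanism'; a real device would render this to the user."""
    print(f"presenting {audio.size} samples, peak level {np.abs(audio).max():.3f}")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mics = np.array([[-0.05, 0.0, 0.0], [0.0, 0.0, 0.0], [0.05, 0.0, 0.0]])  # 3-mic linear array
    signals = rng.standard_normal((3, SAMPLE_RATE))                          # 1 s of test audio per mic
    gaze = GazeDetector().direction()
    focused = SteerableMicArray(mics).capture(signals, gaze)
    present(transform(focused))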

Bibliographic details

Creators: Ota Jeffrey; Ivers Emily N; Kim Kahyun; Miossec-Backer Jeremy; Sorenson Paul F; Essaian Alexander
Format: Patent
Language: English
Publication number: US2017277257A1
Publication date: 2017-09-28
Source: esp@cenet
Subjects:
ACOUSTICS
CALCULATING
COMPUTING
COUNTING
DEAF-AID SETS
ELECTRIC COMMUNICATION TECHNIQUE
ELECTRIC DIGITAL DATA PROCESSING
ELECTRICITY
IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS
MUSICAL INSTRUMENTS
OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
OPTICS
PHYSICS
PUBLIC ADDRESS SYSTEMS
SPEECH ANALYSIS OR SYNTHESIS
SPEECH OR AUDIO CODING OR DECODING
SPEECH OR VOICE PROCESSING
SPEECH RECOGNITION