Anaphora resolution
A method, a structure, and a computer system for resolving an anaphora. The exemplary embodiments may include extracting individual context data from an individual expression and determining whether the individual expression includes an anaphora representation based on the individual context data. The exemplary embodiments may further include, based on determining that the individual expression includes the anaphora representation, extracting anaphora context data and identifying an object of one or more objects to which the anaphora representation refers based on comparing the individual context data and the anaphora context data to data detailing the one or more objects.
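The abstract describes a pipeline: extract context data from an individual expression, decide whether the expression contains an anaphora representation, and, if it does, compare the extracted context data against data detailing candidate objects to pick the referent. The following Python sketch only illustrates that flow under simplifying assumptions; the function names, the keyword-based anaphora check, and the overlap-based comparison are illustrative choices, not the patented implementation.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Illustrative sketch of the described flow; all names and heuristics here
# are assumptions for demonstration, not taken from the patent.

ANAPHORIC_TERMS = {"it", "this", "that", "they", "them", "one"}

@dataclass
class KnownObject:
    """Data detailing one candidate object the anaphora might refer to."""
    name: str
    attributes: set[str] = field(default_factory=set)

def extract_context(expression: str) -> set[str]:
    """Very rough 'individual context data': the words of the expression."""
    return {w.strip(".,!?").lower() for w in expression.split()}

def contains_anaphora(context: set[str]) -> bool:
    """Decide whether the expression includes an anaphora representation."""
    return bool(context & ANAPHORIC_TERMS)

def resolve(expression: str, objects: list[KnownObject]) -> KnownObject | None:
    """Pick the object whose detailing data best matches the context data."""
    context = extract_context(expression)
    if not contains_anaphora(context):
        return None
    # 'Anaphora context data': the words surrounding the anaphoric term.
    anaphora_context = context - ANAPHORIC_TERMS
    # Compare the context data against the data detailing each candidate.
    return max(objects,
               key=lambda o: len(anaphora_context & o.attributes),
               default=None)

if __name__ == "__main__":
    candidates = [
        KnownObject("printer", {"paper", "jam", "toner"}),
        KnownObject("laptop", {"battery", "screen", "keyboard"}),
    ]
    match = resolve("It is out of toner again.", candidates)
    print(match.name if match else "no referent found")  # -> printer
```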
Saved in:
Main authors: | WATANABE, Kenta; ISHIKAWA, Shunsuke; HASEGAWA, Tohru; TOMINAGA, Yasuyuki; UETSUKI, Hiroaki; ONO, Asako |
---|---|
Format: | Patent |
Language: | eng |
Subjects: | |
Online access: | Order full text |
creator | WATANABE, Kenta; ISHIKAWA, Shunsuke; HASEGAWA, Tohru; TOMINAGA, Yasuyuki; UETSUKI, Hiroaki; ONO, Asako |
---|---|
description | A method, a structure, and a computer system for resolving an anaphora. The exemplary embodiments may include extracting individual context data from an individual expression and determining whether the individual expression includes an anaphora representation based on the individual context data. The exemplary embodiments may further include, based on determining that the individual expression includes the anaphora representation, extracting anaphora context data and identifying an object of one or more objects to which the anaphora representation refers based on comparing the individual context data and the anaphora context data to data detailing the one or more objects. |
format | Patent |
fullrecord | Patent application AU2020400345A1, dated 2022-05-26; full record available via esp@cenet: https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20220526&DB=EPODOC&CC=AU&NR=2020400345A1 |
fulltext | fulltext_linktorsrc |
language | eng |
recordid | cdi_epo_espacenet_AU2020400345A1 |
source | esp@cenet |
subjects | ACOUSTICS; CALCULATING; COMPUTING; COUNTING; DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; MUSICAL INSTRUMENTS; PHYSICS; SPEECH ANALYSIS OR SYNTHESIS; SPEECH OR AUDIO CODING OR DECODING; SPEECH OR VOICE PROCESSING; SPEECH RECOGNITION; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR |
title | Anaphora resolution |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-27T12%3A16%3A56IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=WATANABE,%20Kenta&rft.date=2022-05-26&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EAU2020400345A1%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |