Answer Generation through Unified Memories over Multiple Passages

Machine reading comprehension methods that generate answers by referring to multiple passages for a question have gained much attention in AI and NLP communities. The current methods, however, do not investigate the relationships among multiple passages in the answer generation process, even though topics correlated among the passages may be answer candidates. Our method, called neural answer Generation through Unified Memories over Multiple Passages (GUM-MP), solves this problem as follows. First, it determines which tokens in the passages are matched to the question. In particular, it investigates matches between tokens in positive passages, which are assigned to the question, and those in negative passages, which are not related to the question. Next, it determines which tokens in the passage are matched to other passages assigned to the same question and, at the same time, investigates the topics in which they are matched. Finally, it encodes the token sequences with the above two matching results into unified memories in the passage encoders and learns the answer sequence by using an encoder-decoder with a multiple-pointer-generator mechanism. As a result, GUM-MP can generate answers by pointing to important tokens present across passages. Evaluations indicate that GUM-MP generates much more accurate results than the current models do.
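As a rough illustration of the matching steps described above, the alignment of passage tokens against a question can be sketched as attention-based matching. The following Python (PyTorch) fragment is a minimal sketch under stated assumptions: it uses scaled dot-product attention as the matching function and concatenation as the memory update. The paper's actual matching networks, its use of positive versus negative passages, and its topic-level passage-to-passage matching are not detailed in this record, so every name and shape below is an assumption.

import torch
import torch.nn.functional as F

def question_match_features(passage_enc: torch.Tensor,
                            question_enc: torch.Tensor) -> torch.Tensor:
    # passage_enc:  (batch, p_len, d) token encodings of one passage
    # question_enc: (batch, q_len, d) token encodings of the question
    # returns:      (batch, p_len, 2 * d) "unified memory"-style features
    d = question_enc.size(-1)
    # how strongly each passage token matches each question token
    scores = passage_enc @ question_enc.transpose(-1, -2) / d ** 0.5
    weights = F.softmax(scores, dim=-1)
    # a question-aligned summary for every passage token
    aligned = weights @ question_enc
    # pair each token encoding with its matching evidence
    return torch.cat([passage_enc, aligned], dim=-1)

The same alignment pattern could, in principle, be applied between a passage and the other passages assigned to the same question to obtain the cross-passage matching features the abstract describes.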

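The multiple-pointer-generator mechanism named in the abstract can likewise be pictured as a mixture of one generation distribution with one copy distribution per passage. This is a hedged sketch, not the paper's implementation; the mixture weights here are hypothetical stand-ins for whatever learned gating the model uses.

import torch

def mix_generator_and_pointers(vocab_dist: torch.Tensor,
                               passage_attns: list[torch.Tensor],
                               passage_token_ids: list[torch.Tensor],
                               mix_weights: torch.Tensor) -> torch.Tensor:
    # vocab_dist:        (batch, vocab) softmax output of the generator
    # passage_attns:     one (batch, p_len) attention vector per passage
    # passage_token_ids: one (batch, p_len) int64 id tensor per passage
    # mix_weights:       (batch, 1 + n_passages), rows summing to 1;
    #                    column 0 gates generation, column i + 1 gates
    #                    copying from passage i
    final = mix_weights[:, 0:1] * vocab_dist
    for i, (attn, ids) in enumerate(zip(passage_attns, passage_token_ids)):
        # scatter the passage's attention weights onto the shared vocabulary
        copy_dist = torch.zeros_like(vocab_dist).scatter_add(-1, ids, attn)
        final = final + mix_weights[:, i + 1:i + 2] * copy_dist
    return final

Because the mixture weights sum to one, the result remains a valid probability distribution, which is what lets the decoder "point" at important tokens spread across several passages.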
Bibliographic details
Main authors: Nakatsuji, Makoto; Okui, Sohei
Format: Article
Language: English
Subjects: Computer Science - Artificial Intelligence; Computer Science - Computation and Language; Computer Science - Learning
DOI: 10.48550/arxiv.2004.13829
Published: 2020-04-22
Source: arXiv.org
Online access: https://arxiv.org/abs/2004.13829