Contextualized Knowledge-aware Attentive Neural Network: Enhancing Answer Selection with Knowledge


Detailed Description

Saved in:
Bibliographic details
Published in: ACM transactions on information systems, 2022-01, Vol.40 (1), p.1-33, Article 2
Main authors: Deng, Yang; Xie, Yuexiang; Li, Yaliang; Yang, Min; Lam, Wai; Shen, Ying
Format: Article
Language: eng
Subjects:
Online access: Full text
description Answer selection, which is involved in many natural language processing applications, such as dialog systems and question answering (QA), is an important yet challenging task in practice, since conventional methods typically suffer from the issues of ignoring diverse real-world background knowledge. In this article, we extensively investigate approaches to enhancing the answer selection model with external knowledge from knowledge graph (KG). First, we present a context-knowledge interaction learning framework, Knowledge-aware Neural Network, which learns the QA sentence representations by considering a tight interaction with the external knowledge from KG and the textual information. Then, we develop two kinds of knowledge-aware attention mechanism to summarize both the context-based and knowledge-based interactions between questions and answers. To handle the diversity and complexity of KG information, we further propose a Contextualized Knowledge-aware Attentive Neural Network, which improves the knowledge representation learning with structure information via a customized Graph Convolutional Network and comprehensively learns context-based and knowledge-based sentence representation via the multi-view knowledge-aware attention mechanism. We evaluate our method on four widely used benchmark QA datasets, including WikiQA, TREC QA, InsuranceQA, and Yahoo QA. Results verify the benefits of incorporating external knowledge from KG and show the robust superiority and extensive applicability of our method.
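The knowledge-aware attention described in the abstract can be sketched in broad strokes. The snippet below is not the authors' actual model: the function names, shapes, additive score combination, and mean-pooling are illustrative assumptions. It only shows the general idea of combining a context-based and a knowledge-based question–answer interaction into a single attention distribution.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def knowledge_aware_attention(q_ctx, a_ctx, q_kg, a_kg):
    """Pool an answer into one vector by attending over question tokens,
    scoring each answer/question token pair from two views: contextual
    token embeddings (q_ctx, a_ctx) and entity embeddings linked from a
    knowledge graph (q_kg, a_kg).

    q_ctx: (len_q, d), a_ctx: (len_a, d)
    q_kg:  (len_q, d_kg), a_kg: (len_a, d_kg)
    """
    ctx_scores = a_ctx @ q_ctx.T        # context-based interaction, (len_a, len_q)
    kg_scores = a_kg @ q_kg.T           # knowledge-based interaction, (len_a, len_q)
    # Combine both views, then attend over question tokens per answer token.
    attn = softmax(ctx_scores + kg_scores, axis=1)
    aligned = attn @ q_ctx              # question-aligned answer tokens, (len_a, d)
    return aligned.mean(axis=0)         # mean-pool to a fixed-size vector
```

In the paper, the two interaction views feed richer attention variants (and a customized GCN refines the entity embeddings); here the additive combination simply stands in for that joint summarization.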
doi 10.1145/3457533
identifier ISSN: 1046-8188; EISSN: 1558-2868
source Access via ACM Digital Library
subjects Information systems
Question answering
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-27T18%3A14%3A18IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-acm_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Contextualized%20Knowledge-aware%20Attentive%20Neural%20Network:%20Enhancing%20Answer%20Selection%20with%20Knowledge&rft.jtitle=ACM%20transactions%20on%20information%20systems&rft.au=Deng,%20Yang&rft.date=2022-01-01&rft.volume=40&rft.issue=1&rft.spage=1&rft.epage=33&rft.pages=1-33&rft.artnum=2&rft.issn=1046-8188&rft.eissn=1558-2868&rft_id=info:doi/10.1145/3457533&rft_dat=%3Cacm_cross%3E3457533%3C/acm_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true