Toward Subgraph-Guided Knowledge Graph Question Generation With Graph Neural Networks

Knowledge graph (KG) question generation (QG) aims to generate natural language questions from KGs and target answers. Previous works mostly focus on a simple setting: generating questions from a single KG triple. In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2024-09, Vol. 35 (9), p. 12706-12717
Main authors: Chen, Yu; Wu, Lingfei; Zaki, Mohammed J.
Format: Article
Language: English
Subjects:
Online access: Order full text
Description: Knowledge graph (KG) question generation (QG) aims to generate natural language questions from KGs and target answers. Previous works mostly focus on a simple setting: generating questions from a single KG triple. In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers. In addition, most previous works build on either RNN- or Transformer-based models to encode a linearized KG subgraph, which entirely discards the explicit structure information of the subgraph. To address this issue, we propose to apply a bidirectional Graph2Seq model to encode the KG subgraph. Furthermore, we enhance our RNN decoder with a node-level copying mechanism that allows direct copying of node attributes from the KG subgraph to the output question. Both automatic and human evaluation results demonstrate that our model achieves new state-of-the-art scores, outperforming existing methods by a significant margin on two QG benchmarks. Experimental results also show that our QG model can consistently benefit the question-answering (QA) task as a means of data augmentation.
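The abstract describes two components: a bidirectional Graph2Seq encoder that aggregates messages along both edge directions of the KG subgraph, and an RNN decoder whose node-level copy mechanism mixes a vocabulary distribution with a pointer distribution over graph nodes. The sketch below is a minimal NumPy illustration of those two ideas only, not the authors' implementation; all function names, the toy graph, and the weight shapes are hypothetical.

```python
import numpy as np

def bidirectional_gnn_layer(H, A, W_self, W_fwd, W_bwd):
    """One bidirectional message-passing layer over a directed KG subgraph.

    H: (n, d) node embeddings; A: (n, n) adjacency with A[i, j] = 1 for edge i -> j.
    Each node aggregates mean messages from its successors (forward direction)
    and its predecessors (backward direction), then fuses both with its own state.
    """
    deg_out = np.clip(A.sum(axis=1, keepdims=True), 1, None)   # out-degrees
    deg_in = np.clip(A.sum(axis=0, keepdims=True).T, 1, None)  # in-degrees
    fwd = (A @ H) / deg_out    # mean over each node's successors
    bwd = (A.T @ H) / deg_in   # mean over each node's predecessors
    return np.tanh(H @ W_self + fwd @ W_fwd + bwd @ W_bwd)

def node_copy_mixture(dec_state, H_nodes, vocab_logits, w_gate):
    """Node-level copying: mix a softmax over the output vocabulary with a
    pointer (attention) distribution over graph nodes, weighted by a
    generate-vs-copy gate computed from the decoder state."""
    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()
    p_vocab = softmax(vocab_logits)             # generation distribution
    p_nodes = softmax(H_nodes @ dec_state)      # attention over graph nodes
    p_gen = 1.0 / (1.0 + np.exp(-(w_gate @ dec_state)))  # gate in (0, 1)
    return p_gen * p_vocab, (1.0 - p_gen) * p_nodes

rng = np.random.default_rng(0)
n, d, vocab = 4, 8, 10
H = rng.normal(size=(n, d))
A = np.zeros((n, n))
A[0, 1] = A[1, 2] = A[2, 3] = A[0, 3] = 1.0     # toy subgraph edges
Ws = [rng.normal(scale=0.1, size=(d, d)) for _ in range(3)]
H1 = bidirectional_gnn_layer(H, A, *Ws)
gen_part, copy_part = node_copy_mixture(H1[0], H1, rng.normal(size=vocab),
                                        rng.normal(size=d))
```

The two returned pieces form one probability distribution over "vocabulary words plus node attributes", which is how a copy-enhanced decoder can emit entity names that never appear in its vocabulary.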
DOI: 10.1109/TNNLS.2023.3264519
PMID: 37093721
CODEN: ITNNAL
Publisher: IEEE (United States)
Author ORCIDs: 0000-0003-4711-0234; 0000-0003-0966-8026; 0000-0002-3660-651X
ISSN: 2162-237X
EISSN: 2162-2388
Source: IEEE Electronic Library (IEL)
Subjects: Benchmark testing
Data models
Decoding
Deep learning
Graph neural networks
graph neural networks (GNNs)
Knowledge graphs
knowledge graphs (KGs)
natural language (NL) processing
question generation (QG)
Task analysis
Transformers
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-11T10%3A39%3A27IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Toward%20Subgraph-Guided%20Knowledge%20Graph%20Question%20Generation%20With%20Graph%20Neural%20Networks&rft.jtitle=IEEE%20transaction%20on%20neural%20networks%20and%20learning%20systems&rft.au=Chen,%20Yu&rft.date=2024-09-01&rft.volume=35&rft.issue=9&rft.spage=12706&rft.epage=12717&rft.pages=12706-12717&rft.issn=2162-237X&rft.eissn=2162-2388&rft.coden=ITNNAL&rft_id=info:doi/10.1109/TNNLS.2023.3264519&rft_dat=%3Cproquest_RIE%3E2805516143%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2805516143&rft_id=info:pmid/37093721&rft_ieee_id=10107656&rfr_iscdi=true