EntailE: Introducing Textual Entailment in Commonsense Knowledge Graph Completion
Published in: | arXiv.org 2024-02 |
---|---|
Main authors: | Su, Ying; Fang, Tianqing; Xiao, Huiru; Wang, Weiqi; Song, Yangqiu; Zhang, Tong; Chen, Lei |
Format: | Article |
Language: | English |
Subjects: | Free form; Graph theory; Graphs; Knowledge representation; Natural language processing; Nodes; Semantics |
Online access: | Full text |
creator | Su, Ying; Fang, Tianqing; Xiao, Huiru; Wang, Weiqi; Song, Yangqiu; Zhang, Tong; Chen, Lei |
---|---|
description | Commonsense knowledge graph completion is a new challenge for commonsense knowledge graph construction and application. In contrast to factual knowledge graphs such as Freebase and YAGO, commonsense knowledge graphs (CSKGs; e.g., ConceptNet) utilize free-form text to represent named entities, short phrases, and events as their nodes. Such a loose structure results in large and sparse CSKGs, which makes the semantic understanding of these nodes more critical for learning rich commonsense knowledge graph embeddings. While current methods leverage semantic similarities to increase the graph density, the semantic plausibility of the nodes and their relations is under-explored. Previous works adopt conceptual abstraction to improve the consistency of modeling (event) plausibility, but they are not scalable enough and still suffer from data sparsity. In this paper, we propose to adopt textual entailment to find implicit entailment relations between CSKG nodes, to effectively densify the subgraph connecting nodes within the same conceptual class, which indicates a similar level of plausibility. Each node in the CSKG finds its top entailed nodes using a transformer finetuned on natural language inference (NLI) tasks, which sufficiently captures textual entailment signals. The entailment relations between these nodes are further utilized to: 1) build new connections between source triplets and entailed nodes to densify the sparse CSKGs; 2) enrich the generalization ability of node representations by comparing the node embeddings with a contrastive loss. Experiments on two standard CSKGs demonstrate that our proposed framework EntailE can improve the performance of CSKG completion tasks under both transductive and inductive settings. (A minimal illustrative sketch of the NLI-based entailment scoring step follows this record.) |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-02 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2927724149 |
source | Free E-Journals |
subjects | Free form; Graph theory; Graphs; Knowledge representation; Natural language processing; Nodes; Semantics |
title | EntailE: Introducing Textual Entailment in Commonsense Knowledge Graph Completion |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-01T04%3A11%3A31IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=EntailE:%20Introducing%20Textual%20Entailment%20in%20Commonsense%20Knowledge%20Graph%20Completion&rft.jtitle=arXiv.org&rft.au=Su,%20Ying&rft.date=2024-02-15&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2927724149%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2927724149&rft_id=info:pmid/&rfr_iscdi=true |
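The abstract above describes scoring, for each CSKG node, how strongly its free-form text entails candidate node texts with a transformer finetuned on NLI, then keeping the top entailed nodes to densify the graph. The following is a minimal sketch of that scoring step only, not the authors' released code: the model name (roberta-large-mnli), the toy node texts, and the cutoff k are assumptions made for illustration; the triplet-densification and contrastive-loss steps are not shown.

```python
# Minimal sketch (not the authors' code) of NLI-based entailment scoring:
# for one CSKG node, score how strongly its text entails each candidate
# node text and keep the top-k entailed candidates.
# Model name, candidate texts, and k are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "roberta-large-mnli"  # any transformer finetuned on NLI would do
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME).eval()

# roberta-large-mnli label order: 0=contradiction, 1=neutral, 2=entailment
ENTAILMENT_INDEX = 2


def entailment_score(premise: str, hypothesis: str) -> float:
    """Return the model's probability that `premise` entails `hypothesis`."""
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, ENTAILMENT_INDEX].item()


def top_entailed_nodes(source_text: str, candidate_texts: list, k: int = 3):
    """Rank candidate node texts by entailment score and keep the top k."""
    scored = [
        (cand, entailment_score(source_text, cand))
        for cand in candidate_texts
        if cand != source_text
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]


# Toy usage with free-form ConceptNet-style node texts.
nodes = ["buy a ticket", "purchase a ticket", "spend money", "ride a horse"]
print(top_entailed_nodes("buy a concert ticket", nodes, k=2))
```

In the framework described by the abstract, the surviving top-k pairs would then be used to attach entailed nodes to the source triplets and to form positive pairs for the contrastive objective; those steps are omitted from this sketch.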