DSKG: A Deep Sequential Model for Knowledge Graph Completion

Knowledge graph (KG) completion aims to fill the missing facts in a KG, where a fact is represented as a triple in the form of $(subject, relation, object)$. Current KG completion models require two-thirds of a triple to be provided (e.g., $subject$ and $relation$) in order to predict the remaining element. In this paper, we propose a new model, which uses a KG-specific multi-layer recurrent neural network (RNN) to model triples in a KG as sequences. It outperformed several state-of-the-art KG completion models on the conventional entity prediction task for many evaluation metrics, based on two benchmark datasets and a more difficult dataset. Furthermore, our model is enabled by the sequential characteristic and thus capable of predicting whole triples given only one entity. Our experiments demonstrated that our model achieved promising performance on this new triple prediction task.
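The core idea of the abstract — treating a triple $(subject, relation, object)$ as a short sequence and scoring candidate objects with an RNN — can be sketched as follows. This is a minimal illustration, not the authors' DSKG architecture; the embedding sizes, weight shapes, and function names are all invented for the example.

```python
import numpy as np

# Illustrative sketch: treat a triple (subject, relation, object) as a
# length-3 sequence. Feed subject and relation embeddings through a
# vanilla RNN, then score every entity as a candidate object.
rng = np.random.default_rng(0)
n_entities, n_relations, dim = 5, 2, 8

E = rng.normal(size=(n_entities, dim))    # entity embeddings
R = rng.normal(size=(n_relations, dim))   # relation embeddings
W_in = rng.normal(size=(dim, dim)) * 0.1  # input weights
W_h = rng.normal(size=(dim, dim)) * 0.1   # recurrent weights

def rnn_step(h, x):
    """One vanilla RNN step: h' = tanh(W_h h + W_in x)."""
    return np.tanh(W_h @ h + W_in @ x)

def score_objects(subj, rel):
    """Run the (subject, relation) prefix through the RNN and score
    every entity as the object via dot product with the hidden state."""
    h = np.zeros(dim)
    h = rnn_step(h, E[subj])
    h = rnn_step(h, R[rel])
    return E @ h  # one score per candidate object

scores = score_objects(subj=0, rel=1)
ranking = np.argsort(-scores)  # candidate objects, best-first
```

Because the model consumes the triple as a sequence, the same machinery can, in principle, be rolled forward from a single entity to generate a relation and then an object — the "triple prediction" setting the abstract describes.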

Bibliographic Details
Main Authors: Guo, Lingbing; Zhang, Qingheng; Ge, Weiyi; Hu, Wei; Qu, Yuzhong
Format: Article
Language: English
DOI: 10.48550/arxiv.1810.12582
Date: 2018-10-30
Source: arXiv.org
Subjects: Computer Science - Learning; Statistics - Machine Learning
URL: https://arxiv.org/abs/1810.12582