Graph attention neural network method based on shortest path
The invention discloses a graph attention neural network method based on shortest-path attention. The shortest path from a source node to a target node is the path whose edge weights sum to the minimum. Using the shortest path as the attention weight coefficient makes the result highly interpretable. Moreover, because the shortest paths are computed in advance, outside the network, and used directly as attention coefficients, no costly attention computation is needed during training, which greatly shortens training time. Compared with several baseline methods on the Cora dataset, the method obtains better results.
Main authors: | YAN GUANGHUI; LI PENG; YANG ZHIFEI; DING YUHANG; BAO CHENGQI; SHI JIANQIANG; LUO HAO; CHEN GUANGWU; CHANG WENWEN; XING DONGFENG; QU LILI |
---|---|
Format: | Patent |
Language: | chi ; eng |
Subjects: | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; PHYSICS |
Online access: | Order full text
creator | YAN GUANGHUI; LI PENG; YANG ZHIFEI; DING YUHANG; BAO CHENGQI; SHI JIANQIANG; LUO HAO; CHEN GUANGWU; CHANG WENWEN; XING DONGFENG; QU LILI
description | The invention discloses a graph attention neural network method based on shortest-path attention. The shortest path from a source node to a target node is the path whose edge weights sum to the minimum. Using the shortest path as the attention weight coefficient makes the result highly interpretable. Moreover, because the shortest paths are computed in advance, outside the network, and used directly as attention coefficients, no costly attention computation is needed during training, which greatly shortens training time. Compared with several baseline methods on the Cora dataset, the method obtains better results. |
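The core idea in the abstract — precompute shortest-path distances outside training and reuse them as fixed attention coefficients when aggregating neighbors — can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; the softmax over negative distances and all function names are assumptions.

```python
# Hedged sketch: fixed shortest-path attention coefficients, precomputed
# once rather than learned during training.
import heapq
import math

def dijkstra(adj, src):
    """Single-source shortest-path distances on a weighted adjacency dict."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, math.inf):
            continue  # stale queue entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def path_attention(adj, node, neighbors):
    """Softmax over negative shortest-path distances: closer neighbors
    receive larger attention weights (the softmax choice is an assumption)."""
    dist = dijkstra(adj, node)
    scores = [math.exp(-dist[v]) for v in neighbors]
    total = sum(scores)
    return [s / total for s in scores]

# Toy weighted graph: edges 0-1 (w=1.0), 0-2 (w=2.0), 1-2 (w=0.5).
adj = {0: [(1, 1.0), (2, 2.0)],
       1: [(0, 1.0), (2, 0.5)],
       2: [(0, 2.0), (1, 0.5)]}
# Attention weights for node 0 over its neighbors 1 and 2: the shortest
# path to node 2 goes through node 1 (1.0 + 0.5 = 1.5 < 2.0).
alpha = path_attention(adj, 0, [1, 2])
```

Because `alpha` depends only on the graph, it can be computed once before training and reused at every epoch, which is the source of the claimed training-time savings.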
format | Patent |
creationdate | 2021-10-22
fulltext | fulltext_linktorsrc |
language | chi ; eng |
recordid | cdi_epo_espacenet_CN113537473A |
source | esp@cenet |
subjects | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; PHYSICS
title | Graph attention neural network method based on shortest path |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-14T15%3A44%3A58IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=YAN%20GUANGHUI&rft.date=2021-10-22&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN113537473A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |