Knowledge enhancement-based text generation model and training method thereof

The invention provides a knowledge enhancement-based text generation model for generating a text sequence related to a given input text sequence. The model comprises an encoding module and a decoding module. The encoding module comprises an encoding unit that encodes each word of the input text sequence into a hidden vector and derives a first semantic vector for each word from its hidden vector; a knowledge graph attention unit that obtains a knowledge graph vector for each word from the knowledge graph and concatenates it with that word's first semantic vector to form a second semantic vector; a first variational sampling unit that performs variational sampling on the second semantic vectors to obtain a first hidden space vector; and a regression sampling transformation unit that performs autoregressive sampling on the first hidden space vector to obtain a se...
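
The abstract only names the encoder-side units, so the following is a minimal sketch of how that pipeline (word encoding, knowledge graph attention with concatenation, variational sampling, and autoregressive sampling over the latent vectors) could be wired together. It assumes PyTorch, a GRU word encoder, dot-product attention over pre-embedded knowledge graph entry vectors, Gaussian reparameterisation for the variational step, and illustrative dimensions; none of these choices come from the patent itself.

import torch
import torch.nn as nn

class KnowledgeEnhancedEncoder(nn.Module):
    """Encoder sketch: word encoding -> KG attention -> variational sampling -> autoregressive sampling."""
    def __init__(self, vocab_size=10000, d_model=256, d_kg=128, d_latent=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Encoding unit: turn each word into a hidden vector (first semantic vector).
        self.word_encoder = nn.GRU(d_model, d_model, batch_first=True)
        # Knowledge graph attention unit: project word states to queries over KG entry vectors.
        self.kg_query = nn.Linear(d_model, d_kg)
        # First variational sampling unit: Gaussian parameters over the concatenated
        # (second) semantic vector, sampled with the reparameterisation trick.
        self.to_mu = nn.Linear(d_model + d_kg, d_latent)
        self.to_logvar = nn.Linear(d_model + d_kg, d_latent)
        # Regression sampling transformation unit: autoregressive pass over the
        # first hidden space vectors.
        self.ar_sampler = nn.GRU(d_latent, d_latent, batch_first=True)

    def forward(self, token_ids, kg_vectors):
        # token_ids: (batch, seq_len); kg_vectors: (batch, n_kg_entries, d_kg)
        h, _ = self.word_encoder(self.embed(token_ids))            # first semantic vectors
        attn = torch.softmax(self.kg_query(h) @ kg_vectors.transpose(1, 2), dim=-1)
        kg_per_word = attn @ kg_vectors                            # one KG vector per word
        second = torch.cat([h, kg_per_word], dim=-1)               # second semantic vectors
        mu, logvar = self.to_mu(second), self.to_logvar(second)
        z1 = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # first hidden space vectors
        z2, _ = self.ar_sampler(z1)                                # autoregressive transform
        return z2, mu, logvar

# Example usage with random inputs.
encoder = KnowledgeEnhancedEncoder()
tokens = torch.randint(0, 10000, (2, 12))           # batch of 2 sequences, 12 tokens each
kg_entries = torch.randn(2, 5, 128)                 # 5 knowledge graph entry vectors per example
z2, mu, logvar = encoder(tokens, kg_entries)
print(z2.shape)                                      # torch.Size([2, 12, 64])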

Bibliographic details
Main authors: JIANG HUAICHEN, LI DONGDONG, ZHANG YIYANG
Format: Patent
Language: Chinese; English
Subjects: CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; PHYSICS
Online access: Order full text
creator JIANG HUAICHEN
LI DONGDONG
ZHANG YIYANG
description The invention provides a text generation model based on knowledge enhancement, which is used for generating a text sequence related to an input text sequence according to the input text sequence, the model comprises a coding module and a decoding module, the coding module comprises a coding unit used for coding each word in the input text sequence into a hidden vector, obtaining a first semantic vector of the word according to each word hidden vector; the knowledge graph attention unit is used for obtaining a knowledge graph vector corresponding to each word based on the knowledge graph, and splicing the knowledge graph vector of each word and the first semantic vector of the word to obtain a second semantic vector of each word; the first variation sampling unit is used for performing variation sampling on the second semantic vector to obtain a first hidden space vector; the regression sampling transformation unit is used for carrying out autoregression sampling on the first hidden space vector to obtain a se
format Patent
language chi ; eng
recordid cdi_epo_espacenet_CN115345169A
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
ELECTRIC DIGITAL DATA PROCESSING
PHYSICS
title Knowledge enhancement-based text generation model and training method thereof
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-28T15%3A41%3A38IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=JIANG%20HUAICHEN&rft.date=2022-11-15&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN115345169A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true