Text abstract generation method based on fact consistency enhancement

The invention relates to the technical field of natural language processing and discloses a text abstract generation method based on fact consistency enhancement. It addresses the problem that the prior art ignores the differing importance of individual fact triples and thus their unequal contributions to the final abstract, and it improves the credibility of the generated text abstract. A Transformer architecture is adopted to construct a sequence-to-sequence text abstract generation model, and a fact attention module is introduced between the feed-forward network module and the cross-attention module of the decoder. The fact attention module calculates the influence of each fact triple on the word being generated, based on the attention vector of each fact triple and the word vector of the generated word output by the cross-attention module, and updates the word vector of the generated word according to that influence; the attention vector of each fact triple is itself computed through a self-attention mechanism.
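
The abstract describes the architecture only at a high level: a standard Transformer decoder block's masked self-attention and cross-attention are followed by the new fact attention module and then the feed-forward network, with the attention vectors of the fact triples produced by a separate self-attention step over the triples. The PyTorch snippet below is a minimal sketch of one such decoder layer under those assumptions; the class name, the use of standard multi-head attention for both the triple encoding and the word-vector update, the residual/normalization layout, and all dimensions are illustrative choices, not details taken from the patent.

# Hypothetical sketch of a decoder block with a fact-attention sub-layer, following
# the high-level description in the abstract: self-attention -> cross-attention ->
# fact attention -> feed-forward network. All names, dimensions, and design details
# are illustrative assumptions, not specifics from the patent.
import torch
import torch.nn as nn

class FactAwareDecoderLayer(nn.Module):
    def __init__(self, d_model: int = 512, n_heads: int = 8, d_ff: int = 2048):
        super().__init__()
        # Standard Transformer decoder sub-layers.
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Self-attention over the encoded fact triples, yielding one attention
        # vector per triple (as the abstract describes).
        self.fact_self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Fact attention: generated-word vectors attend over the triple vectors,
        # so each triple's influence on the current word weights the update.
        self.fact_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(4)])

    def forward(self, tgt, memory, fact_emb, tgt_mask=None):
        # tgt:      (batch, tgt_len, d_model)  states of the words generated so far
        # memory:   (batch, src_len, d_model)  encoder output for the source document
        # fact_emb: (batch, n_facts, d_model)  one embedding per extracted fact triple
        x = self.norms[0](tgt + self.self_attn(tgt, tgt, tgt, attn_mask=tgt_mask)[0])
        x = self.norms[1](x + self.cross_attn(x, memory, memory)[0])
        # Attention vectors of the fact triples, computed by self-attention over the triples.
        fact_vec = self.fact_self_attn(fact_emb, fact_emb, fact_emb)[0]
        # Update the word vectors of the generated words with the triples' weighted influence.
        x = self.norms[2](x + self.fact_attn(x, fact_vec, fact_vec)[0])
        # The feed-forward sub-layer follows the fact-attention module.
        return self.norms[3](x + self.ffn(x))

# Example call with random tensors (shapes are illustrative only).
layer = FactAwareDecoderLayer()
out = layer(torch.randn(2, 12, 512), torch.randn(2, 40, 512), torch.randn(2, 5, 512))
print(out.shape)  # torch.Size([2, 12, 512])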

Bibliographic Details
Main authors: LU GUOMING, QIN KE, ZHAO TAIYIN, REN ZHENHUA, LIU JINGYI, LUO GUANGCHUN
Format: Patent
Language: Chinese; English
Online access: order full text
Patent number: CN117251562A
Date: 2023-12-19
Full record: https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20231219&DB=EPODOC&CC=CN&NR=117251562A
Record ID: cdi_epo_espacenet_CN117251562A
Source: esp@cenet
Subjects: CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; PHYSICS
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-25T05%3A51%3A24IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=LU%20GUOMING&rft.date=2023-12-19&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN117251562A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true