Language model training method and device, electronic equipment and storage medium

The invention provides a language model training method and device, electronic equipment, and a computer-readable storage medium. The method trains a plurality of N-gram models on a preset training corpus set; determines, by an expectation-maximization (EM) algorithm, the optimal weight coefficient of each N-gram model when the models process a preset target corpus set; and performs interpolation over the plurality of N-gram models according to those optimal weight coefficients to obtain a language model. Because the optimal weight coefficient of each N-gram model on the target corpus set is determined by the EM algorithm, interpolating the N-gram models with these coefficients yields the language model with the best overall processing result.
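The abstract describes the standard expectation-maximization procedure for tuning linear-interpolation weights of language models: on a held-out target corpus, the E-step computes each component model's posterior responsibility for every token, and the M-step sets each weight to the average responsibility. The sketch below is a minimal illustration of that general procedure, not the patent's implementation; the function name em_interpolation_weights, the probs matrix representation, and the random toy data are all assumptions made for illustration.

    # Minimal sketch (assumed, not the patent's exact method) of EM-estimated
    # linear interpolation weights for several N-gram models. Each component
    # model is represented only by the probability it assigns to every token
    # of a held-out target corpus, collected in a (tokens x models) matrix.

    import numpy as np

    def em_interpolation_weights(probs, n_iter=100, tol=1e-8):
        """Estimate interpolation weights by EM.

        probs: (T, K) array; probs[t, k] is the probability that component
               model k assigns to token t of the held-out target corpus.
        Returns a length-K weight vector summing to 1 that locally maximizes
        the interpolated log-likelihood sum_t log(sum_k w[k] * probs[t, k]).
        """
        T, K = probs.shape
        w = np.full(K, 1.0 / K)           # start from uniform weights
        prev_ll = -np.inf
        for _ in range(n_iter):
            mix = probs @ w               # (T,) interpolated token probabilities
            ll = np.log(mix).sum()        # current held-out log-likelihood
            # E-step: posterior responsibility of model k for token t
            resp = probs * w / mix[:, None]
            # M-step: new weight = average responsibility across tokens
            w = resp.mean(axis=0)
            if ll - prev_ll < tol:        # likelihood gain negligible: stop
                break
            prev_ll = ll
        return w

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Hypothetical per-token probabilities from 3 toy N-gram models.
        probs = rng.dirichlet([2.0, 1.0, 0.5], size=1000)
        print(em_interpolation_weights(probs))

Starting from uniform weights, each EM iteration is guaranteed not to decrease the held-out log-likelihood, so the loop can stop as soon as the improvement falls below a small tolerance.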

Bibliographic Details
Main authors: CHEN YINGWEN; JIAN RENXIAN; LIN CHANGZHOU
Format: Patent
Language: Chinese; English
Subjects: CALCULATING; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; HANDLING RECORD CARRIERS; PHYSICS; PRESENTATION OF DATA; RECOGNITION OF DATA; RECORD CARRIERS
Publication number: CN113761901A
Publication date: 2021-12-07
Source: esp@cenet
Online access: https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20211207&DB=EPODOC&CC=CN&NR=113761901A