Method, device and system of multi-learning subject parallel training model
The invention relates to a method, a device, and a system for training a model with multiple learning subjects in parallel. The method includes the following steps: several learning subjects on a single machine each read samples; a learning subject fetches the current parameter values from the training model; it trains on the samples it read using those parameter values, obtaining new parameter values; and it writes the new parameter values back into the training model, which stores only a single copy of the parameter values. Because the model keeps just one copy of the parameters, every learning subject sees its latest state: whenever any learning subject updates the model, the learning subjects that subsequently read it see the latest update. This greatly reduces the inconsistent views of the model state that arise when the model is not shared, so the model converges faster during training.
Saved in:
Main authors: | GUO ZHIMAO; ZOU YONGQIANG; XIAO LEI; JIN XING; LI YI; XUE WEI |
---|---|
Format: | Patent |
Language: | eng |
Subjects: | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; ELECTRIC COMMUNICATION TECHNIQUE; ELECTRIC DIGITAL DATA PROCESSING; ELECTRICITY; PHYSICS; TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION |
Online access: | Order full text |
creator | GUO ZHIMAO; ZOU YONGQIANG; XIAO LEI; JIN XING; LI YI; XUE WEI |
description | The invention relates to a method, a device, and a system for training a model with multiple learning subjects in parallel. The method includes the following steps: several learning subjects on a single machine each read samples; a learning subject fetches the current parameter values from the training model; it trains on the samples it read using those parameter values, obtaining new parameter values; and it writes the new parameter values back into the training model, which stores only a single copy of the parameter values. Because the model keeps just one copy of the parameters, every learning subject sees its latest state: whenever any learning subject updates the model, the learning subjects that subsequently read it see the latest update. This greatly reduces the inconsistent views of the model state that arise when the model is not shared, so the model converges faster during training. |
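The scheme the abstract describes (several workers on one machine, each reading the single shared copy of the parameters, training on its samples, and writing updates straight back) can be sketched as lock-free parallel SGD with threads. This is a minimal illustration, not the patent's implementation; the linear model, learning rate, and worker count are all assumptions for the example.

```python
import threading
import random

# Shared model: a single copy of the parameter vector, visible to all workers.
# An update written by one worker is seen by any worker that reads afterwards.
params = [0.0, 0.0]  # weight, bias of a 1-D linear model y = w*x + b

def make_samples(n, w=2.0, b=-1.0, seed=0):
    """Generate (x, y) samples from a known ground-truth line."""
    rng = random.Random(seed)
    return [(x, w * x + b) for x in (rng.uniform(-1, 1) for _ in range(n))]

def worker(samples, lr=0.05):
    """One learning subject: read current params, train, write new params back."""
    for x, y in samples:
        w, b = params[0], params[1]      # acquire current parameter values
        err = (w * x + b) - y            # train on the sample read
        params[0] = w - lr * err * x     # update the single shared copy:
        params[1] = b - lr * err         # later readers see this new state

samples = make_samples(4000)
chunk = len(samples) // 4
threads = [threading.Thread(target=worker,
                            args=(samples[i * chunk:(i + 1) * chunk],))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(params)  # should end up close to the true (2.0, -1.0)
```

Because all workers share one parameter copy, a stale read can occasionally lose an update, but each write is still an approximate descent step, which is why training still converges, and faster than if each worker kept its own diverging copy of the model.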
format | Patent |
fulltext | fulltext_linktorsrc |
language | eng |
recordid | cdi_epo_espacenet_CN104980518A |
source | esp@cenet |
subjects | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; ELECTRIC COMMUNICATION TECHNIQUE; ELECTRIC DIGITAL DATA PROCESSING; ELECTRICITY; PHYSICS; TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION |
title | Method, device and system of multi-learning subject parallel training model |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-10T22%3A26%3A11IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=GUO%20ZHIMAO&rft.date=2015-10-14&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN104980518A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |