Model training method and device, equipment and storage medium

The invention provides a model training method, device, equipment, and storage medium. It relates to the field of artificial intelligence, in particular machine vision and deep learning, and can be applied to scenarios such as target detection and neural network optimization.

Detailed description

Main authors: LYU WENYU, XU SHANGLIANG, ZHAO YIAN, WANG GUANZHONG, DANG QINGQING
Format: Patent
Language: chi ; eng
Online access: order full text
creator LYU WENYU
XU SHANGLIANG
ZHAO YIAN
WANG GUANZHONG
DANG QINGQING
description The invention provides a model training method, device, equipment, and storage medium, relating to the field of artificial intelligence, in particular machine vision and deep learning, with applications in scenarios such as target detection and neural network optimization. In a specific implementation, the method comprises: obtaining a level sequence number for each parameter transmission level in the decoding module of a neural network to be trained; generating a first transmission path set from the level sequence numbers such that, within each path, the sequence-number gap between adjacent levels does not exceed two levels; selecting, according to a preset path selection rule, a target transmission path from the first transmission path set to serve as the transmission path of the decoding module; and training the neural network with the training data according to
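The steps in the abstract can be sketched in Python. This is only an illustration of the described procedure, not the patented implementation: the abstract does not specify how paths start or end, nor what the "preset path selection rule" is, so the function names, the assumption that paths run from the first to the last level, and the shortest-path selection rule below are all hypothetical stand-ins.

```python
def generate_path_set(level_ids):
    """Enumerate candidate transmission paths through the decoder levels.

    A path is assumed to start at the first level and end at the last,
    and the sequence-number gap between adjacent levels in a path never
    exceeds two levels (i.e. each step advances by 1 or 2).
    """
    first, last = level_ids[0], level_ids[-1]
    paths = []

    def extend(path):
        cur = path[-1]
        if cur == last:
            paths.append(path)
            return
        for step in (1, 2):  # gap between adjacent levels: at most two
            nxt = cur + step
            if nxt <= last:
                extend(path + [nxt])

    extend([first])
    return paths


def select_target_path(paths):
    # Placeholder for the "preset path selection rule", which the
    # abstract leaves unspecified; here we pick a path with the
    # fewest levels purely for illustration.
    return min(paths, key=len)


# Example: a decoding module with four parameter transmission levels.
levels = [0, 1, 2, 3]
paths = generate_path_set(levels)    # [0,1,2,3], [0,1,3], [0,2,3]
target = select_target_path(paths)   # a 3-level path, e.g. [0,1,3]
```

The selected `target` path would then fix which levels exchange parameters during training of the network.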
format Patent
fulltext fulltext_linktorsrc
language chi ; eng
recordid cdi_epo_espacenet_CN116468071A
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
ELECTRIC DIGITAL DATA PROCESSING
PHYSICS
title Model training method and device, equipment and storage medium