Lightweight classification model training method and device, medium and equipment
The invention provides a lightweight classification model training method and device, a medium and equipment. The method comprises the following steps: collecting first text data with classification labels and second text data without classification labels; training a large-scale language model on the first text data until it converges; inputting the second text data into the converged large-scale language model to obtain classification labels for the second text data, and labeling the second text data with those labels; and training a lightweight classification model on the edge device using the labeled second text data. The embodiments of the invention use an existing large-scale language model to address the difficulty of achieving accurate text classification when only a little labeled data is available in small-scale scenarios, while the lightweight classification model retains the advantage of real-time response.
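A minimal sketch of the pipeline the abstract describes, assuming a pseudo-labeling (teacher/student) setup: `label_with_llm()` is a hypothetical stand-in for the fine-tuned large-scale language model, and TF-IDF plus logistic regression stands in for whatever lightweight model actually runs on the edge device; none of these names come from the patent itself.

```python
# Sketch of the pipeline described in the abstract: a converged large-scale
# language model labels the unlabeled "second text data", and a lightweight
# model is then trained on that pseudo-labeled data for the edge device.
# label_with_llm() is a hypothetical placeholder, not an API from the patent.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# First text data: a small hand-labeled set (toy examples for illustration),
# which in the patent is used to fine-tune the large-scale language model.
labeled_texts = ["great battery life", "screen cracked after a week"]
labels = ["positive", "negative"]

# Second text data: unlabeled texts collected from the target scene.
unlabeled_texts = ["arrived quickly and works fine", "stopped charging on day two"]

def label_with_llm(texts):
    """Stand-in for the training-converged large-scale language model.

    In practice this would query an LLM fine-tuned on the labeled data and
    return one predicted class per input text; here it returns fixed labels
    so the sketch runs end to end.
    """
    return ["positive", "negative"]

# Label the second text data with the LLM's predictions.
pseudo_labels = label_with_llm(unlabeled_texts)

# Train the lightweight classification model on the labeled second text data.
# TF-IDF + logistic regression stands in for whatever small model is deployed.
lightweight_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
lightweight_model.fit(unlabeled_texts, pseudo_labels)

# The small model now classifies new text in real time on the edge device.
print(lightweight_model.predict(["battery died immediately"]))
```

In a real deployment the pseudo-labeled set would be far larger, and the hand-labeled first text data could be mixed into the final training set as well; the sketch only fixes the order of the steps.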
Saved in:
Main authors: | ZHANG ZHENG, CHANG BINGXIN, HUANG DENGRONG, YUE AIZHEN |
---|---|
Format: | Patent |
Language: | chi ; eng |
Subjects: | CALCULATING ; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS ; COMPUTING ; COUNTING ; ELECTRIC DIGITAL DATA PROCESSING ; PHYSICS |
Online access: | Order full text |
creator | ZHANG ZHENG ; CHANG BINGXIN ; HUANG DENGRONG ; YUE AIZHEN |
description | The invention provides a lightweight classification model training method and device, a medium and equipment. The method comprises the following steps: collecting first text data with classification labels and second text data without classification labels; training a large-scale language model on the first text data until it converges; inputting the second text data into the converged large-scale language model to obtain classification labels for the second text data, and labeling the second text data with those labels; and training a lightweight classification model on the edge device using the labeled second text data. The embodiments of the invention use an existing large-scale language model to address the difficulty of achieving accurate text classification when only a little labeled data is available in small-scale scenarios, while the lightweight classification model retains the advantage of real-time response. |
format | Patent |
fullrecord | (raw esp@cenet XML record omitted; publication date 2023-10-31; full text at https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20231031&DB=EPODOC&CC=CN&NR=116975624A) |
fulltext | fulltext_linktorsrc |
language | chi ; eng |
recordid | cdi_epo_espacenet_CN116975624A |
source | esp@cenet |
subjects | CALCULATING ; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS ; COMPUTING ; COUNTING ; ELECTRIC DIGITAL DATA PROCESSING ; PHYSICS |
title | Lightweight classification model training method and device, medium and equipment |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-24T20%3A08%3A08IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=ZHANG%20ZHENG&rft.date=2023-10-31&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN116975624A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |