NATURAL LANGUAGE TRAINING AND/OR AUGMENTATION WITH LARGE LANGUAGE MODELS

The techniques described herein enhance the operation of natural language generation systems through training and/or augmentation by a large language model. In a first example, the large language model can execute training operations by processing a training dataset to produce a natural language output. The natural language generation system can analyze the training dataset and that output to generate its own natural language output, mimicking the output of the large language model. The large language model can then evaluate the output of the natural language generation system to iteratively adjust and improve the quality of its natural language outputs. In a second example, the large language model can augment a small language model in executing natural language tasks: the large language model retrieves external information and uses it to generate an augmentation input that provides context and a language framework to the small language model, enhancing its overall outputs.
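To make the first example concrete, here is a minimal Python sketch of the teacher/student loop described above, assuming the large model can both generate reference outputs and score the smaller system's attempts. Every callable name (teacher_generate, student_generate, teacher_score, apply_feedback) is a hypothetical placeholder; the abstract does not specify any particular API, model, or training procedure.

```python
from typing import Callable, List

def distill_with_llm_feedback(
    train_inputs: List[str],
    teacher_generate: Callable[[str], str],           # large LM: input -> reference output
    student_generate: Callable[[str], str],           # NLG system being trained
    teacher_score: Callable[[str, str, str], float],  # large LM rates a student output
    apply_feedback: Callable[[List[str], List[str], List[str], List[float]], None],
    rounds: int = 3,
) -> None:
    """Iteratively train an NLG system to mimic a large language model,
    using the large model's own evaluations as the training signal."""
    # Step 1: the large language model processes the training dataset
    # to produce natural language outputs used as references.
    references = [teacher_generate(x) for x in train_inputs]

    for _ in range(rounds):
        # Step 2: the NLG system generates outputs that attempt to
        # mimic the large model's references for the same inputs.
        outputs = [student_generate(x) for x in train_inputs]

        # Step 3: the large model evaluates the NLG system's outputs,
        # here as one quality score per (input, output, reference) triple.
        scores = [
            teacher_score(x, y, r)
            for x, y, r in zip(train_inputs, outputs, references)
        ]

        # Step 4: the scores drive an iterative adjustment of the NLG
        # system (a gradient step, reranking, prompt revision, etc.).
        apply_feedback(train_inputs, references, outputs, scores)
```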

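The second example, retrieval-based augmentation of a small language model, could be sketched along the same lines. Again the callables (retrieve, large_lm, small_lm) and the prompt wording are illustrative assumptions, not details taken from the patent.

```python
from typing import Callable, List

def answer_with_augmentation(
    task: str,
    retrieve: Callable[[str], List[str]],  # fetches external information for the task
    large_lm: Callable[[str], str],        # large LM: builds the augmentation input
    small_lm: Callable[[str], str],        # small LM: executes the task itself
) -> str:
    # Step 1: retrieve external information relevant to the task.
    documents = retrieve(task)

    # Step 2: the large model turns the retrieved material into an
    # augmentation input: supporting context plus a language framework
    # (e.g. a suggested response structure) for the small model.
    augmentation_input = large_lm(
        "Condense the documents below into context for the task, and "
        "suggest a structure for the response.\n"
        f"Task: {task}\n"
        "Documents:\n" + "\n".join(documents)
    )

    # Step 3: the small model executes the task with the augmentation
    # input prepended to its prompt, improving its overall output.
    return small_lm(f"{augmentation_input}\n\nTask: {task}")
```
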
Bibliographic Details
Main authors: WANG, Shuohang; ITER, Dan; ZENG, Nanshan; XU, Yichong; LIU, Yang; ZHU, Chenguang; SHARMA, Hiteshi
Format: Patent
Published: 2024-10-17
Publication number: US2024346254A1
Language: English
Subjects: CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; PHYSICS
Source: esp@cenet
Online access: https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20241017&DB=EPODOC&CC=US&NR=2024346254A1