MULTI-TASK NEURAL NETWORKS WITH TASK-SPECIFIC PATHS
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
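The abstract describes routing each input through a task-specific path in a "super" network: every layer contains several modular sub-networks, a path designates a proper subset of them as active for a given task, and the result is fed to that task's output head. The PyTorch sketch below is one illustrative reading of that scheme, not the implementation claimed in the patent; the class name SuperNetwork, the layer widths, and the hard-coded paths are assumptions made for this example (the patent treats path selection as its own step, which is omitted here).

```python
# Illustrative sketch only: a "super network" whose layers hold several modular
# sub-networks, with a fixed task-specific path choosing which modules are active.
import torch
import torch.nn as nn


class SuperNetwork(nn.Module):
    def __init__(self, num_layers=3, modules_per_layer=4, width=64,
                 in_dim=32, task_out_dims=(10, 5)):
        super().__init__()
        # Each layer holds several small modular networks; a path activates a
        # proper subset of them for a given task.
        self.layers = nn.ModuleList([
            nn.ModuleList([
                nn.Sequential(nn.Linear(in_dim if l == 0 else width, width), nn.ReLU())
                for _ in range(modules_per_layer)
            ])
            for l in range(num_layers)
        ])
        # One output head per machine learning task.
        self.heads = nn.ModuleList([nn.Linear(width, d) for d in task_out_dims])
        # Task-specific paths: for each task, one list of active module indices per
        # layer. (How paths are selected or learned is not modeled here.)
        self.paths = {
            0: [[0, 1], [1, 2], [0, 3]],   # proper subset of modules active for task 0
            1: [[2, 3], [0, 3], [1, 2]],   # a different subset for task 1
        }

    def forward(self, x, task_id):
        path = self.paths[task_id]
        for layer, active in zip(self.layers, path):
            # Only the modules designated as active by the path process the input;
            # their outputs are summed to form the layer output.
            x = torch.stack([layer[i](x) for i in active]).sum(dim=0)
        # Route the result through the output head for the identified task.
        return self.heads[task_id](x)


net = SuperNetwork()
batch = torch.randn(8, 32)
logits_task0 = net(batch, task_id=0)   # shape: (8, 10)
logits_task1 = net(batch, task_id=1)   # shape: (8, 5)
```

Because only the modules on a task's path are active in each layer, different tasks can share modules where that helps while keeping separate output heads, which is what lets a single set of parameters serve several machine learning tasks.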
Saved in:
Main Authors: | Ha, David; Rusu, Andrei-Alexandru; Pritzel, Alexander; Banarse, Dylan Sunil; Zwols, Yori; Blundell, Charles; Wierstra, Daniel Pieter; Fernando, Chrisantha Thomas |
---|---|
Format: | Patent |
Language: | eng |
Subjects: | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; PHYSICS |
Online Access: | Order full text |
creator | Ha, David; Rusu, Andrei-Alexandru; Pritzel, Alexander; Banarse, Dylan Sunil; Zwols, Yori; Blundell, Charles; Wierstra, Daniel Pieter; Fernando, Chrisantha Thomas |
description | Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task. |
format | Patent |
fullrecord | US2024046106A1, published 2024-02-08; full source record available via esp@cenet: https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20240208&DB=EPODOC&CC=US&NR=2024046106A1 |
fulltext | fulltext_linktorsrc |
language | eng |
recordid | cdi_epo_espacenet_US2024046106A1 |
source | esp@cenet |
subjects | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; PHYSICS |
title | MULTI-TASK NEURAL NETWORKS WITH TASK-SPECIFIC PATHS |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-22T03%3A06%3A28IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=Ha,%20David&rft.date=2024-02-08&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EUS2024046106A1%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |