METHOD AND SYSTEM FOR RELATIONAL GENERAL CONTINUAL LEARNING WITH MULTIPLE MEMORIES IN ARTIFICIAL NEURAL NETWORKS

A computer-implemented method comprising the step of formulating a continual learning algorithm with both element similarity as well as relational similarity between the stable and plastic model in a dual-memory setup with rehearsal. While the method of the current invention comprises the step of using only two memories to simplify the analysis of the impact of relational similarity, said method can be trivially extended to more than two memories. Specifically, the plastic model learns on the data stream as well as on memory samples, while the stable model maintains an exponentially moving average of the plastic model, resulting in a more generalizable model. Simultaneously, to mitigate forgetting and to enable forward transfer, the stable model distills instance-wise and relational knowledge to the plastic model on memory samples. Instance-wise knowledge distillation maintains element similarities, while relational similarity loss maintains relational similarities. The memory samples are maintained in a small constant-sized memory buffer which is updated using reservoir sampling. The method of the current invention was tested under multiple evaluation protocols, showing the efficacy of relational similarity for continual learning with dual-memory setup and rehearsal.
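As a rough illustrative sketch (not taken from the patent text), the dual-memory update described above could be expressed as follows. The exponential moving average keeps the stable model a slow copy of the plastic model, while two distillation terms on memory samples preserve element similarities (instance-wise) and pairwise relational structure. All function names, the decay value, and the use of plain float lists in place of network weights and features are assumptions made for the sketch.

```python
import math

def ema_update(stable, plastic, decay=0.999):
    """Stable weights as an exponential moving average of plastic weights:
    s <- decay * s + (1 - decay) * p. Weights are plain lists of floats
    here, standing in for network parameters."""
    return [decay * s + (1.0 - decay) * p for s, p in zip(stable, plastic)]

def mse(a, b):
    # Instance-wise distillation: element similarity between stable
    # (teacher) and plastic (student) outputs for the same sample.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def pairwise_distances(feats):
    # Relational structure of a batch: distance between every pair
    # of feature vectors.
    n = len(feats)
    return [math.dist(feats[i], feats[j])
            for i in range(n) for j in range(i + 1, n)]

def relational_loss(stable_feats, plastic_feats):
    # Relational distillation: match the plastic model's pairwise
    # distance structure to the stable model's.
    return mse(pairwise_distances(stable_feats),
               pairwise_distances(plastic_feats))
```

In a training loop, one would compute the task loss on stream and memory samples, add the two distillation losses on memory samples, step the plastic model, and then call `ema_update` to refresh the stable model.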

Full Description

Bibliographic Details
Main Authors: VARMA, Arnav, ZONOOZ, Bahram, ARANI, Elahe
Format: Patent
Language: eng ; fre ; ger
Subjects:
Online Access: Order full text
creator VARMA, Arnav
ZONOOZ, Bahram
ARANI, Elahe
description A computer-implemented method comprising the step of formulating a continual learning algorithm with both element similarity as well as relational similarity between the stable and plastic model in a dual-memory setup with rehearsal. While the method of the current invention comprises the step of using only two memories to simplify the analysis of impact of relational similarity, said method can be trivially extended to more than two memories. Specifically, the plastic model learns on the data stream as well as on memory samples, while the stable model maintains an exponentially moving average of the plastic model, resulting in a more generalizable model. Simultaneously, to mitigate forgetting and to enable forward transfer, the stable model distills instance-wise and relational knowledge to the plastic model on memory samples. Instance-wise knowledge distillation maintains element similarities, while relational similarity loss maintains relational similarities. The memory samples are maintained in a small constant-sized memory buffer which is updated using reservoir sampling. The method of the current invention was tested under multiple evaluation protocols, showing the efficacy of relational similarity for continual learning with dual-memory setup and rehearsal.
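The constant-sized memory buffer with reservoir sampling mentioned in the description can be sketched minimally as below. Reservoir sampling keeps every item seen so far in the buffer with equal probability, without storing the stream. The class and method names are illustrative assumptions, not identifiers from the patent.

```python
import random

class ReservoirBuffer:
    """Constant-sized rehearsal memory updated with reservoir sampling."""

    def __init__(self, capacity, seed=None):
        self.capacity = capacity
        self.samples = []
        self.n_seen = 0  # number of stream items observed so far
        self.rng = random.Random(seed)

    def add(self, item):
        self.n_seen += 1
        if len(self.samples) < self.capacity:
            # Buffer not yet full: always keep the item.
            self.samples.append(item)
        else:
            # Keep the new item with probability capacity / n_seen,
            # replacing a uniformly chosen stored item.
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.samples[j] = item

    def sample(self, k):
        # Draw a rehearsal mini-batch from the buffer.
        return self.rng.sample(self.samples, min(k, len(self.samples)))
```

A buffer like this would be fed from the data stream during training, and sampled from whenever the distillation losses are computed on memory samples.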
format Patent
fulltext fulltext_linktorsrc
language eng ; fre ; ger
recordid cdi_epo_espacenet_EP4345688A1
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
PHYSICS
title METHOD AND SYSTEM FOR RELATIONAL GENERAL CONTINUAL LEARNING WITH MULTIPLE MEMORIES IN ARTIFICIAL NEURAL NETWORKS