Synaptic Metaplasticity in Binarized Neural Networks

While deep neural networks have surpassed human performance in multiple situations, they are prone to catastrophic forgetting: upon training a new task, they rapidly forget previously learned ones. Neuroscience studies, based on idealized tasks, suggest that in the brain, synapses overcome this issue by adjusting their plasticity depending on their past history. However, such "metaplastic" behaviours do not transfer directly to mitigate catastrophic forgetting in deep neural networks. In this work, we interpret the hidden weights used by binarized neural networks, a low-precision version of deep neural networks, as metaplastic variables, and modify their training technique to alleviate forgetting. Building on this idea, we propose and demonstrate experimentally, in situations of multitask and stream learning, a training technique that reduces catastrophic forgetting without needing previously presented data, nor formal boundaries between datasets and with performance approaching more mainstream techniques with task boundaries. We support our approach with a theoretical analysis on a tractable task. This work bridges computational neuroscience and deep learning, and presents significant assets for future embedded and neuromorphic systems, especially when using novel nanodevices featuring physics analogous to metaplasticity.
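
The abstract only sketches the method at a high level. As a purely illustrative aid, the NumPy snippet below shows one plausible way a metaplastic hidden-weight update for a binarized network could look: the magnitude of the real-valued hidden weight is treated as a consolidation variable, and updates that would push a hidden weight back toward zero (toward a sign flip) are attenuated. The function name metaplastic_update, the hyperparameter m and the attenuation factor 1 - tanh^2(m*|w|) are assumptions made for this sketch, not the authors' published update rule.

import numpy as np

def metaplastic_update(hidden_w, grad, lr=0.01, m=1.3):
    # hidden_w: real-valued hidden weights; the binary weights used in the
    #           forward pass would be sign(hidden_w).
    # grad:     gradient of the loss with respect to the binary weights.
    # m:        metaplasticity strength; m = 0 recovers plain SGD.
    update = -lr * grad
    # An update "weakens" a weight when it points back toward zero,
    # i.e. when it has the opposite sign of the hidden weight.
    weakening = (np.sign(update) != np.sign(hidden_w)) & (hidden_w != 0)
    # Attenuation grows with |hidden_w|: strongly consolidated weights
    # become hard to push back toward a sign flip.
    f_meta = 1.0 - np.tanh(m * np.abs(hidden_w)) ** 2
    scale = np.where(weakening, f_meta, 1.0)
    return hidden_w + scale * update

# Toy usage: both gradients push the weights toward zero, but the
# well-consolidated first weight barely moves while the second moves freely.
w = np.array([2.0, 0.05])
g = np.array([1.0, 1.0])
print(metaplastic_update(w, g))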

Bibliographic details
Main authors: Laborieux, Axel; Ernoult, Maxence; Hirtzlin, Tifenn; Querlioz, Damien
Format: Article
Language: eng
Subjects: Computer Science - Learning; Computer Science - Neural and Evolutionary Computing; Statistics - Machine Learning
Online access: Order full text
DOI: 10.48550/arxiv.2003.03533
Date: 2020-03-07
Source: arXiv.org
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-21T18%3A48%3A37IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Synaptic%20Metaplasticity%20in%20Binarized%20Neural%20Networks&rft.au=Laborieux,%20Axel&rft.date=2020-03-07&rft_id=info:doi/10.48550/arxiv.2003.03533&rft_dat=%3Carxiv_GOX%3E2003_03533%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true