ATLAS: Universal Function Approximator for Memory Retention

Artificial neural networks (ANNs), despite their universal function approximation capability and practical success, are subject to catastrophic forgetting. Catastrophic forgetting refers to the abrupt unlearning of a previous task when a new task is learned. It is an emergent phenomenon that hinders continual learning. Existing universal function approximation theorems for ANNs guarantee function approximation ability, but do not predict catastrophic forgetting. This paper presents a novel universal approximation theorem for multi-variable functions using only single-variable functions and exponential functions. Furthermore, we present ATLAS: a novel ANN architecture based on the new theorem. It is shown that ATLAS is a universal function approximator capable of some memory retention and continual learning. The memory of ATLAS is imperfect, with some off-target effects during continual learning, but it is well-behaved and predictable. An efficient implementation of ATLAS is provided. Experiments are conducted to evaluate both the function approximation and memory retention capabilities of ATLAS.
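The abstract does not state the theorem formally. As a hedged illustration of how single-variable functions and exponentials can jointly express multivariate behaviour, one plausible form is the following sketch; the symbols c_j and g_{ij} are illustrative assumptions, not notation taken from the paper:

\[
  f(x_1, \dots, x_n) \;\approx\; \sum_{j=1}^{m} c_j \,
  \exp\!\Bigl( \sum_{i=1}^{n} g_{ij}(x_i) \Bigr),
\]

where each g_{ij} is a function of a single variable. The exponential turns sums into products, since \(\exp(g(x) + h(y)) = \exp(g(x))\,\exp(h(y))\), so under this assumed form even multiplicative interactions require no genuinely multivariate component: for positive inputs, a monomial such as \(x^2 y^3\) is expressible as \(\exp(2\ln x + 3\ln y)\).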

Bibliographic Details

Main authors: van Deventer, Heinrich; Bosman, Anna
Format: Article
Language: English
Subjects: Computer Science - Learning; Computer Science - Neural and Evolutionary Computing
DOI: 10.48550/arxiv.2208.05388
Published: 2022-08-10
Source: arXiv.org
Online access: https://arxiv.org/abs/2208.05388