Continual Lifelong Learning in Natural Language Processing: A Survey

Continual learning (CL) aims to enable information systems to learn from a continuous data stream across time. However, it is difficult for existing deep learning architectures to learn a new task without largely forgetting previously acquired knowledge. Furthermore, CL is particularly challenging for language learning, as natural language is ambiguous: it is discrete, compositional, and its meaning is context-dependent. In this work, we look at the problem of CL through the lens of various NLP tasks. Our survey discusses major challenges in CL and current methods applied in neural network models. We also provide a critical review of the existing CL evaluation methods and datasets in NLP. Finally, we present our outlook on future research directions.
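The forgetting problem the abstract refers to can be illustrated with a toy sketch (not taken from the survey; the two-task setup, data, and function names below are invented for illustration): a linear classifier trained sequentially on two tasks with plain gradient descent, and no continual-learning mechanism, loses accuracy on the first task after training on the second.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(direction):
    # Binary labels given by the sign of a projection onto `direction`.
    X = rng.normal(size=(400, 2))
    y = (X @ direction > 0).astype(float)
    return X, y

def train(w, X, y, lr=0.5, epochs=300):
    # Plain logistic-regression gradient descent; no CL mechanism.
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float((((X @ w) > 0).astype(float) == y).mean())

# Task A, then an orthogonal task B, learned one after the other.
Xa, ya = make_task(np.array([1.0, 0.0]))
Xb, yb = make_task(np.array([0.0, 1.0]))

w = train(np.zeros(2), Xa, ya)
acc_a_before = accuracy(w, Xa, ya)  # high: task A has been learned

w = train(w, Xb, yb)                # continue training on task B only
acc_a_after = accuracy(w, Xa, ya)   # degraded: task A is forgotten
```

The drop from `acc_a_before` to `acc_a_after` is the catastrophic forgetting that the CL methods surveyed in the paper (regularization, rehearsal, architectural approaches) are designed to mitigate.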

Detailed Description

Bibliographic Details
Published in: arXiv.org 2020-12
Main authors: Biesialska, Magdalena; Biesialska, Katarzyna; Costa-jussà, Marta R
Format: Article
Language: eng
Subjects:
Online access: Full text
container_title arXiv.org
creator Biesialska, Magdalena
Biesialska, Katarzyna
Costa-jussà, Marta R
description Continual learning (CL) aims to enable information systems to learn from a continuous data stream across time. However, it is difficult for existing deep learning architectures to learn a new task without largely forgetting previously acquired knowledge. Furthermore, CL is particularly challenging for language learning, as natural language is ambiguous: it is discrete, compositional, and its meaning is context-dependent. In this work, we look at the problem of CL through the lens of various NLP tasks. Our survey discusses major challenges in CL and current methods applied in neural network models. We also provide a critical review of the existing CL evaluation methods and datasets in NLP. Finally, we present our outlook on future research directions.
doi_str_mv 10.48550/arxiv.2012.09823
format Article
fullrecord (raw ProQuest/arXiv record XML omitted; unique fields retained below)
publisher Ithaca: Cornell University Library, arXiv.org
startdate 2020-12-17
rights 2020. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the "License").
published version https://doi.org/10.18653/v1/2020.coling-main.574
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2020-12
issn 2331-8422
language eng
recordid cdi_arxiv_primary_2012_09823
source arXiv.org; Free E-Journals
subjects Computer Science - Artificial Intelligence
Computer Science - Computation and Language
Computer Science - Learning
Computer Science - Neural and Evolutionary Computing
Data transmission
Information systems
Knowledge acquisition
Lifelong learning
Natural language
Natural language processing
Neural networks
title Continual Lifelong Learning in Natural Language Processing: A Survey
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-16T20%3A04%3A05IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_arxiv&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Continual%20Lifelong%20Learning%20in%20Natural%20Language%20Processing:%20A%20Survey&rft.jtitle=arXiv.org&rft.au=Biesialska,%20Magdalena&rft.date=2020-12-17&rft.eissn=2331-8422&rft_id=info:doi/10.48550/arxiv.2012.09823&rft_dat=%3Cproquest_arxiv%3E2471088116%3C/proquest_arxiv%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2471088116&rft_id=info:pmid/&rfr_iscdi=true