Continual Learning in Linear Classification on Separable Data
Format: Article
Language: English
Abstract: We analyze continual learning on a sequence of separable linear
classification tasks with binary labels. We show theoretically that learning
with weak regularization reduces to solving a sequential max-margin problem,
corresponding to a special case of the Projection Onto Convex Sets (POCS)
framework. We then develop upper bounds on the forgetting and other quantities
of interest under various settings with recurring tasks, including cyclic and
random orderings of tasks. We discuss several practical implications for popular
training practices like regularization scheduling and weighting. We point out
several theoretical differences between our continual classification setting
and a recently studied continual regression setting.
DOI: 10.48550/arxiv.2306.03534
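
To make the abstract's POCS reduction concrete, below is a minimal Python sketch, assuming the sequential max-margin scheme amounts to projecting the previous task's weight vector onto the margin-feasible set of the current task (the natural POCS reading of the abstract). The helper name project_onto_margin_set and the synthetic task data are hypothetical illustrations, not taken from the paper.

```python
# Minimal sketch of sequential max-margin as Projection Onto Convex Sets:
# each task t defines a convex set K_t = {w : y_i * <w, x_i> >= 1 for all i},
# and the learner projects the previous weights onto K_t.
import numpy as np
import cvxpy as cp


def project_onto_margin_set(w_prev, X, y):
    """Euclidean projection of w_prev onto {w : y_i * <w, x_i> >= 1 for all i}."""
    w = cp.Variable(w_prev.shape[0])
    objective = cp.Minimize(cp.sum_squares(w - w_prev))
    constraints = [cp.multiply(y, X @ w) >= 1]  # per-sample margin constraints
    cp.Problem(objective, constraints).solve()
    return w.value


# Hypothetical sequence of separable binary tasks (not from the paper):
# labels come from a random linear teacher, so each task is separable.
rng = np.random.default_rng(0)
d = 5
tasks = []
for _ in range(3):
    X = rng.standard_normal((20, d))
    y = np.sign(X @ rng.standard_normal(d))
    tasks.append((X, y))

w = np.zeros(d)  # start from the zero initialization
for t, (X, y) in enumerate(tasks):
    w = project_onto_margin_set(w, X, y)
    # Track how earlier tasks fare after each projection; a minimum margin
    # that drops below 1 indicates forgetting of that task's constraints.
    for s, (Xs, ys) in enumerate(tasks[: t + 1]):
        min_margin = np.min(ys * (Xs @ w))
        print(f"after task {t}: min margin on task {s} = {min_margin:.3f}")
```

Running the loop over a cyclic or random ordering of the same tasks, as the abstract's recurring-task settings suggest, lets one observe empirically how the margin violations on earlier tasks evolve.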