Prompt-Based Exemplar Super-Compression and Regeneration for Class-Incremental Learning
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: Replay-based methods in class-incremental learning (CIL) have attained remarkable success. Despite their effectiveness, the inherent memory restriction means only a limited number of exemplars, with poor diversity, can be saved. In this paper, we introduce PESCR, a novel approach that substantially increases the quantity and enhances the diversity of exemplars based on a pre-trained general-purpose diffusion model, without fine-tuning it on the target datasets or storing it in the memory buffer. Images are compressed into visual and textual prompts, which are saved instead of the original images, decreasing memory consumption by a factor of 24. In subsequent phases, diverse exemplars are regenerated by the diffusion model. We further propose partial compression and diffusion-based data augmentation to minimize the domain gap between generated exemplars and real images. Comprehensive experiments demonstrate that PESCR significantly improves CIL performance across multiple benchmarks, e.g., by 3.2% over the previous state of the art on ImageNet-100.
DOI: 10.48550/arxiv.2311.18266
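The record carries no implementation details beyond the abstract, but the pipeline it describes (compress each exemplar into a visual and a textual prompt, store only the prompts, then regenerate diverse exemplars later with an off-the-shelf diffusion model) can be sketched. The snippet below is a minimal, hypothetical illustration: the choice of Canny edge maps as the visual prompt, the `compress`/`regenerate` helpers, the prompt template, and the ControlNet-guided Stable Diffusion checkpoints from Hugging Face `diffusers` are all assumptions, not the paper's confirmed method.

```python
# Hypothetical sketch of the compress-then-regenerate idea from the abstract.
# Assumptions (not from the paper): Canny edges as the visual prompt,
# "a photo of a {class}" as the textual prompt, and ControlNet + Stable
# Diffusion as the general-purpose, frozen (never fine-tuned) generator.
import cv2
import numpy as np
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

def compress(image: Image.Image, class_name: str):
    """Reduce an exemplar to a (visual prompt, textual prompt) pair.

    A one-channel edge map plus a short string is far smaller than the
    RGB original, which is where the claimed memory saving would come from.
    """
    edges = cv2.Canny(np.array(image.convert("L")), 100, 200)
    visual_prompt = Image.fromarray(edges).convert("RGB")
    textual_prompt = f"a photo of a {class_name}"
    return visual_prompt, textual_prompt

# A general-purpose pre-trained pipeline, used as-is: no fine-tuning on the
# target dataset and no copy of the model kept in the memory buffer.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

def regenerate(visual_prompt, textual_prompt, n_exemplars=4, seed=0):
    """Regenerate several diverse exemplars from one stored prompt pair."""
    generator = torch.Generator("cuda").manual_seed(seed)
    out = pipe(
        [textual_prompt] * n_exemplars,  # one prompt per regenerated image
        image=visual_prompt,             # edge map conditions the layout
        num_inference_steps=30,
        generator=generator,
    )
    return out.images
```

Varying the seed across incremental phases would yield a different batch of exemplars each time, which is one plausible reading of how the diversity gain the abstract claims could arise.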