Generative Dataset Distillation: Balancing Global Structure and Local Details
Format: Article
Language: English
Online access: Order full text
Abstract: In this paper, we propose a new dataset distillation method that balances global structure and local details when distilling the information from a large dataset into a generative model. Dataset distillation has been proposed to reduce the size of the dataset required to train models. Conventional dataset distillation methods suffer from long redeployment times and poor cross-architecture performance. Moreover, previous methods focused too heavily on high-level semantic attributes shared between the synthetic and original datasets while ignoring local features such as texture and shape. Based on these observations, we propose a new method for distilling the original image dataset into a generative model. Our method uses a conditional generative adversarial network to generate the distilled dataset, balancing global structure and local details throughout the distillation process and continuously optimizing the generator to produce a more information-dense dataset.
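The abstract describes the approach only at a high level. As an illustration, the following is a minimal PyTorch sketch of the general idea: a class-conditional generator optimized with a weighted sum of a global (feature-level) term and a local (patch-statistics) term. Everything here (ConditionalGenerator, global_structure_loss, local_detail_loss, the weight LAMBDA_LOCAL, and the specific objectives) is a hypothetical stand-in for exposition, not the authors' published architecture or losses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical hyperparameters; the abstract does not specify any of these.
Z_DIM, NUM_CLASSES, IMG_SIZE, LAMBDA_LOCAL = 100, 10, 32, 0.5

class ConditionalGenerator(nn.Module):
    """Class-conditional generator G(z, y) -> synthetic image in [-1, 1]."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(NUM_CLASSES, Z_DIM)
        self.net = nn.Sequential(
            nn.Linear(2 * Z_DIM, 512), nn.ReLU(inplace=True),
            nn.Linear(512, 3 * IMG_SIZE * IMG_SIZE), nn.Tanh(),
        )

    def forward(self, z, y):
        h = torch.cat([z, self.embed(y)], dim=1)  # condition noise on class label
        return self.net(h).view(-1, 3, IMG_SIZE, IMG_SIZE)

def global_structure_loss(feats_syn, feats_real):
    """Global term (assumed): match the mean feature embeddings of the
    synthetic and real batches."""
    return F.mse_loss(feats_syn.mean(dim=0), feats_real.mean(dim=0))

def local_detail_loss(imgs_syn, imgs_real, patch=8):
    """Local term (assumed): match first- and second-order statistics of
    non-overlapping image patches, a crude proxy for texture/shape detail."""
    def patch_stats(x):
        p = F.unfold(x, kernel_size=patch, stride=patch)  # B x (C*p*p) x L
        p = p.transpose(0, 1).reshape(p.size(1), -1)      # (C*p*p) x (B*L)
        return p.mean(dim=1), p.std(dim=1)
    mu_s, sd_s = patch_stats(imgs_syn)
    mu_r, sd_r = patch_stats(imgs_real)
    return F.mse_loss(mu_s, mu_r) + F.mse_loss(sd_s, sd_r)

# One optimization step against a stand-in real batch. `encoder` is any
# feature extractor; a trivial linear one keeps the sketch self-contained.
G = ConditionalGenerator()
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * IMG_SIZE * IMG_SIZE, 128))
opt = torch.optim.Adam(G.parameters(), lr=2e-4)

imgs_real = torch.rand(64, 3, IMG_SIZE, IMG_SIZE) * 2 - 1  # placeholder data
labels = torch.randint(0, NUM_CLASSES, (64,))
imgs_syn = G(torch.randn(64, Z_DIM), labels)

loss = (global_structure_loss(encoder(imgs_syn), encoder(imgs_real))
        + LAMBDA_LOCAL * local_detail_loss(imgs_syn, imgs_real))
opt.zero_grad()
loss.backward()
opt.step()
```

The only point of the sketch is the shape of the objective: a single weighted sum that trades the global, semantic-level term against the local, texture-level term, with the generator updated continuously against both.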
DOI: 10.48550/arxiv.2404.17732