Exploring the potential of prototype-based soft-labels data distillation for imbalanced data classification
In: 24th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC), pp. 173-180, 2022. IEEE.
Format: Article
Language: English
Abstract: Dataset distillation aims at synthesizing a small number of artificially generated data items which, when used as training data, reproduce or approximate a machine learning (ML) model as if it were trained on the entire original dataset. Consequently, data distillation methods are usually tied to a specific ML algorithm. While recent literature deals mainly with the distillation of large collections of images in the context of neural network models, tabular data distillation is much less represented and mainly focused on a theoretical perspective. The current paper explores the potential of a simple distillation technique previously proposed in the context of less-than-one-shot learning. The main goal is to push further the performance of prototype-based soft-labels distillation in terms of classification accuracy, by integrating optimization steps into the distillation process. The analysis is performed on real-world data sets with various degrees of imbalance. Experimental studies trace the capability of the method to distill the data, but also the opportunity to act as an augmentation method, i.e. to generate new data that can increase model accuracy when used in conjunction with, rather than instead of, the original data.
DOI: 10.48550/arxiv.2403.17130
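To give a concrete picture of the approach described in the abstract, the following is a minimal, illustrative Python sketch of prototype-based soft-label distillation on an imbalanced toy dataset. It assumes k-means-initialised prototypes, a distance-weighted soft-label voting classifier over those prototypes, and plain gradient descent over the soft labels only; these choices are assumptions made for illustration, not the exact optimization procedure used in the paper.

```python
# Minimal sketch of prototype-based soft-label distillation, under assumptions:
# prototypes come from k-means, the distilled classifier is a distance-weighted
# soft-label voter over the prototypes, and only the soft labels are optimised
# by plain gradient descent on a squared-error loss. None of these choices is
# claimed to be the paper's exact procedure.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification


def proto_weights(prototypes, X, temperature=1.0):
    """Similarity weights between query points and prototypes (rows sum to 1)."""
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    d2 -= d2.min(axis=1, keepdims=True)           # per-row shift, numerically safe
    w = np.exp(-d2 / temperature)
    return w / w.sum(axis=1, keepdims=True)


def soft_label_predict(prototypes, soft_labels, X):
    """Class-probability estimates from distance-weighted soft-label voting."""
    return proto_weights(prototypes, X) @ soft_labels


def distill(X, y, n_prototypes=10, n_classes=2, steps=500, lr=0.5):
    """Distill (X, y) into a few prototypes carrying learned soft labels."""
    prototypes = KMeans(n_clusters=n_prototypes, n_init=10).fit(X).cluster_centers_
    logits = np.zeros((n_prototypes, n_classes))  # free parameters: soft-label logits
    onehot = np.eye(n_classes)[y]
    w = proto_weights(prototypes, X)              # fixed: only the labels are optimised
    for _ in range(steps):
        soft_labels = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
        probs = w @ soft_labels
        grad_soft = w.T @ (probs - onehot) / len(X)          # d(loss)/d(soft labels)
        grad_logits = soft_labels * (
            grad_soft - (grad_soft * soft_labels).sum(axis=1, keepdims=True)
        )                                                    # back through the softmax
        logits -= lr * grad_logits
    return prototypes, np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)


if __name__ == "__main__":
    # Imbalanced toy problem standing in for the real-world tabular data sets.
    X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)
    protos, labels = distill(X, y)
    acc = (soft_label_predict(protos, labels, X).argmax(axis=1) == y).mean()
    print(f"training accuracy using only the 10 distilled prototypes: {acc:.3f}")
```

As the abstract notes, the same prototypes can also be appended to, rather than substituted for, the original rows, in which case the technique acts as a data augmentation method.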