Generative Dataset Distillation: Balancing Global Structure and Local Details
Saved in:
Main authors: | Li, Longzhen; Li, Guang; Togo, Ren; Maeda, Keisuke; Ogawa, Takahiro; Haseyama, Miki |
---|---|
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text |
creator | Li, Longzhen; Li, Guang; Togo, Ren; Maeda, Keisuke; Ogawa, Takahiro; Haseyama, Miki |
description | In this paper, we propose a new dataset distillation method that balances global structure and local details when distilling the information from a large dataset into a generative model. Dataset distillation has been proposed to reduce the size of the dataset required for training models. Conventional dataset distillation methods suffer from long redeployment times and poor cross-architecture performance. Moreover, previous methods have focused too heavily on matching high-level semantic attributes between the synthetic and original datasets while ignoring local features such as texture and shape. Based on this understanding, we propose a new method for distilling the original image dataset into a generative model. Our method uses a conditional generative adversarial network to generate the distilled dataset, and it balances global structure and local details throughout the distillation process, continuously optimizing the generator to produce a more information-dense dataset. |
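The abstract describes generating the distilled dataset with a conditional generative adversarial network. As a minimal illustration of the conditioning step only (concatenating a one-hot class label to the noise input), here is a NumPy toy sketch; the function name, layer sizes, and random weights are invented for this example and do not reflect the authors' architecture or training objective:

```python
import numpy as np

def conditional_generator(noise, labels, n_classes, w1, w2):
    """Map (noise, class label) -> synthetic sample, conditioning the
    generator by concatenating a one-hot label vector to the noise."""
    one_hot = np.eye(n_classes)[labels]           # (batch, n_classes)
    x = np.concatenate([noise, one_hot], axis=1)  # (batch, z_dim + n_classes)
    h = np.tanh(x @ w1)                           # hidden layer
    return np.tanh(h @ w2)                        # fake "images" in (-1, 1)

rng = np.random.default_rng(0)
z_dim, n_classes, hidden, img_dim = 16, 10, 32, 64
w1 = rng.standard_normal((z_dim + n_classes, hidden)) * 0.1
w2 = rng.standard_normal((hidden, img_dim)) * 0.1

z = rng.standard_normal((4, z_dim))
y = np.array([0, 1, 2, 3])  # one requested class per sample
imgs = conditional_generator(z, y, n_classes, w1, w2)
print(imgs.shape)  # (4, 64)
```

In an actual distillation setup, the generator's weights would be trained so that models fit on its class-conditional outputs match models fit on the original dataset; the sketch above only shows how the class condition enters the forward pass.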
doi_str_mv | 10.48550/arxiv.2404.17732 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2404.17732 |
language | eng |
recordid | cdi_arxiv_primary_2404_17732 |
source | arXiv.org |
subjects | Computer Science - Artificial Intelligence; Computer Science - Computer Vision and Pattern Recognition; Computer Science - Learning |
title | Generative Dataset Distillation: Balancing Global Structure and Local Details |