Adaptive Non-Uniform Timestep Sampling for Diffusion Model Training
| Main authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
| Summary: | As a highly expressive generative model, diffusion models have demonstrated exceptional success across various domains, including image generation, natural language processing, and combinatorial optimization. However, as data distributions grow more complex, training these models to convergence becomes increasingly computationally intensive. While diffusion models are typically trained using uniform timestep sampling, our research shows that the variance in stochastic gradients varies significantly across timesteps, with high-variance timesteps becoming bottlenecks that hinder faster convergence. To address this issue, we introduce a non-uniform timestep sampling method that prioritizes these more critical timesteps. Our method tracks the impact of gradient updates on the objective for each timestep, adaptively selecting those most likely to minimize the objective effectively. Experimental results demonstrate that this approach not only accelerates the training process but also leads to improved performance at convergence. Furthermore, our method shows robust performance across various datasets, scheduling strategies, and diffusion architectures, outperforming previously proposed timestep sampling and weighting heuristics that lack this degree of robustness. |
| DOI: | 10.48550/arxiv.2411.09998 |
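
The abstract describes the method only at a high level: track how much each timestep's gradient updates reduce the training objective, then sample timesteps non-uniformly in proportion to that impact. The sketch below illustrates this idea, not the paper's actual algorithm; the class name, the EMA-of-loss impact proxy, and the uniform smoothing mixture are all assumptions introduced here for illustration.

```python
import numpy as np


class AdaptiveTimestepSampler:
    """Hypothetical sketch of non-uniform timestep sampling for diffusion training.

    Not the paper's algorithm: it keeps an exponential moving average of each
    timestep's observed training loss as a cheap proxy for "impact on the
    objective" and samples timesteps proportionally to that score, mixed with a
    uniform distribution so every timestep keeps nonzero probability.
    """

    def __init__(self, num_timesteps: int, ema_decay: float = 0.9,
                 uniform_mix: float = 0.1, seed: int = 0):
        self.scores = np.ones(num_timesteps)  # optimistic initial impact estimates
        self.ema_decay = ema_decay
        self.uniform_mix = uniform_mix
        self.rng = np.random.default_rng(seed)

    def probabilities(self) -> np.ndarray:
        # Mix the score-proportional distribution with a uniform one.
        proportional = self.scores / self.scores.sum()
        uniform = np.full_like(proportional, 1.0 / len(proportional))
        return (1.0 - self.uniform_mix) * proportional + self.uniform_mix * uniform

    def sample(self, batch_size: int) -> np.ndarray:
        # Draw a batch of timesteps from the current non-uniform distribution.
        return self.rng.choice(len(self.scores), size=batch_size,
                               p=self.probabilities())

    def update(self, timesteps: np.ndarray, losses: np.ndarray) -> None:
        # Fold freshly observed per-timestep losses into the running scores.
        for t, loss in zip(timesteps, losses):
            self.scores[t] = (self.ema_decay * self.scores[t]
                              + (1.0 - self.ema_decay) * loss)


if __name__ == "__main__":
    sampler = AdaptiveTimestepSampler(num_timesteps=1000)
    for step in range(100):
        ts = sampler.sample(batch_size=32)
        # Stand-in for a real diffusion loss: higher near the middle timesteps.
        fake_losses = np.exp(-((ts - 500) / 200.0) ** 2) + 0.01
        sampler.update(ts, fake_losses)
    print("Top-5 most-sampled timesteps:", np.argsort(sampler.probabilities())[-5:])
```

In an actual training loop, the fake loss would be replaced by whatever per-timestep impact measure the paper defines, and the sampled losses may additionally need importance weights (e.g. proportional to the inverse sampling probability) if the training objective is to remain an unbiased estimate of the uniform-timestep loss.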