Trans-Dimensional Generative Modeling via Jump Diffusion Models
Format: | Article |
Language: | English |
Online Access: | Order full text |
Abstract: | We propose a new class of generative models that naturally handle data of
varying dimensionality by jointly modeling the state and dimension of each
datapoint. The generative process is formulated as a jump diffusion process
that makes jumps between different dimensional spaces. We first define a
dimension destroying forward noising process, before deriving the dimension
creating time-reversed generative process along with a novel evidence lower
bound training objective for learning to approximate it. Simulating our learned
approximation to the time-reversed generative process then provides an
effective way of sampling data of varying dimensionality by jointly generating
state values and dimensions. We demonstrate our approach on molecular and video
datasets of varying dimensionality, reporting better compatibility with
test-time diffusion guidance imputation tasks and improved interpolation
capabilities versus fixed dimensional models that generate state values and
dimensions separately. |
DOI: | 10.48550/arxiv.2305.16261 |
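
To make the abstract's description of the forward process more concrete, here is a minimal, illustrative sketch of a dimension-destroying noising process of the kind described: between jumps the surviving coordinates are noised by a simple Brownian diffusion, and at randomly occurring jump times one coordinate is deleted, so the dimension shrinks over time. The function name `forward_noise`, the constant `jump_rate` and `sigma`, the Euler time step, and the uniform choice of which coordinate to delete are all assumptions made for illustration; this is not the paper's actual parameterization or implementation.

```python
# Toy sketch (not the authors' code) of a dimension-destroying forward noising
# process: Brownian diffusion on the current state plus Poisson-like jumps that
# each remove one coordinate. Rates, schedules, and the deletion rule are assumed.
import numpy as np

def forward_noise(x, n_steps=1000, dt=1e-3, jump_rate=2.0, sigma=1.0, rng=None):
    """Simulate a toy dimension-destroying forward process.

    x         : 1-D array, the initial datapoint (its dimension may vary per sample).
    jump_rate : assumed constant rate of dimension-deleting jumps.
    sigma     : assumed constant diffusion coefficient.
    Returns the final noised state and the trajectory of dimensions over time.
    """
    rng = rng or np.random.default_rng()
    x = np.array(x, dtype=float)
    dims = [x.size]
    for _ in range(n_steps):
        # Diffusion part: simple Brownian noising of the surviving coordinates.
        x = x + sigma * np.sqrt(dt) * rng.standard_normal(x.size)
        # Jump part: with probability ~ jump_rate * dt, delete one coordinate,
        # moving the state to a space of one lower dimension.
        if x.size > 1 and rng.random() < jump_rate * dt:
            drop = rng.integers(x.size)
            x = np.delete(x, drop)
        dims.append(x.size)
    return x, dims

# Example: a 5-dimensional datapoint is gradually noised and shrunk.
x_T, dims = forward_noise(np.array([0.3, -1.2, 0.7, 2.0, -0.5]))
print(x_T.size, dims[-1])
```

Per the abstract, the learned generative model approximates the time reversal of such a process, jointly creating dimensions and denoising state values; the sketch above covers only the forward, dimension-destroying direction.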