Restoration based Generative Models
Format: | Article |
Language: | English |
Abstract: | Denoising diffusion models (DDMs) have recently attracted growing
attention for their impressive synthesis quality. DDMs are built on a
diffusion process that gradually pushes data toward the noise distribution,
and the models learn to reverse it by denoising. In this paper, we establish
an interpretation of DDMs in terms of image restoration (IR). Integrating the
IR literature allows us to use an alternative training objective and diverse
forward processes that are not confined to the diffusion process. By imposing
prior knowledge on the loss function, grounded in MAP-based estimation, we
eliminate the need for the expensive sampling of DDMs. We also propose a
multi-scale training scheme that takes advantage of the flexibility of the
forward process and improves performance over the standard diffusion process.
Experimental results demonstrate that our model improves the quality and
efficiency of both training and inference. Furthermore, we show the
applicability of our model to inverse problems. We believe that our framework
paves the way for designing a new type of flexible, general generative
model. |
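
To make the contrast in the abstract concrete, below is a minimal, hypothetical sketch in PyTorch: a standard DDM objective that predicts injected Gaussian noise, next to a restoration-style objective that recovers clean data from a degraded input in a single pass, with an explicit prior term standing in for MAP-based regularization. The degradation operator, prior, and weighting are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def ddm_denoising_loss(model, x0, t, alpha_bar):
    """Standard DDM objective: predict the Gaussian noise injected at step t."""
    noise = torch.randn_like(x0)
    a = alpha_bar[t].view(-1, 1, 1, 1)          # cumulative noise-schedule term
    x_t = a.sqrt() * x0 + (1 - a).sqrt() * noise
    return F.mse_loss(model(x_t, t), noise)

def restoration_map_loss(model, x0, degrade, prior, prior_weight=0.1):
    """Restoration-style objective (illustrative): recover x0 from a degraded
    input y with a data-fidelity term plus a prior term, avoiding iterative
    sampling at inference. `degrade` and `prior` are hypothetical callables."""
    y = degrade(x0)                  # flexible forward process, not necessarily diffusion
    x_hat = model(y)                 # single restoration pass
    data_term = F.l1_loss(x_hat, x0)
    return data_term + prior_weight * prior(x_hat)
```

In this reading, the forward process is just a degradation operator, so it can be varied (e.g., applied at multiple scales) without being tied to the Gaussian diffusion schedule.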
DOI: | 10.48550/arxiv.2303.05456 |