Local MALA-within-Gibbs for Bayesian image deblurring with total variation prior
Saved in:
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: We consider Bayesian inference for image deblurring with total variation (TV)
prior. Since the posterior is analytically intractable, we resort to Markov
chain Monte Carlo (MCMC) methods. However, since most MCMC methods
significantly deteriorate in high dimensions, they are not suitable to handle
high resolution imaging problems. In this paper, we show how low-dimensional
sampling can still be facilitated by exploiting the sparse conditional
structure of the posterior. To this end, we make use of the local structures of
the blurring operator and the TV prior by partitioning the image into
rectangular blocks and employing a blocked Gibbs sampler with proposals
stemming from the Metropolis-Hastings adjusted Langevin Algorithm (MALA). We
prove that this MALA-within-Gibbs (MLwG) sampling algorithm has
dimension-independent block acceptance rates and a dimension-independent
convergence rate. In order to apply the MALA proposals, we approximate the TV
by a smoothed version, and show that the introduced approximation error is
evenly distributed and dimension-independent. Since the posterior is a Gibbs
density, we can use the Hammersley-Clifford Theorem to identify the posterior
conditionals which only depend locally on the neighboring blocks. We outline
computational strategies to evaluate the conditionals, which are the target
densities in the Gibbs updates, locally and in parallel. In two numerical
experiments, we validate the dimension-independent properties of the MLwG
algorithm and demonstrate its superior performance over MALA.
DOI: 10.48550/arxiv.2409.09810
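
As a rough illustration of the blocked sampling scheme described in the abstract, the sketch below runs a MALA-within-Gibbs sweep for a simple deblurring model with a smoothed TV prior. All concrete choices (Gaussian blur width, noise level, step size `h`, block size, smoothing parameter `eps`, regularization weight `lam`) are illustrative assumptions, not values from the paper, and, unlike the paper's algorithm, the block conditionals here are evaluated through the full posterior rather than locally and in parallel.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur(x, sigma_blur=2.0):
    # Circular Gaussian blur; with mode='wrap' and a symmetric kernel, A is self-adjoint (A^T = A).
    return gaussian_filter(x, sigma_blur, mode="wrap")

def smoothed_tv_and_grad(x, eps):
    # Smoothed TV: sum_ij sqrt(|forward differences|^2 + eps^2), periodic boundary.
    dx = np.roll(x, -1, axis=1) - x
    dy = np.roll(x, -1, axis=0) - x
    norm = np.sqrt(dx**2 + dy**2 + eps**2)
    px, py = dx / norm, dy / norm
    # Adjoint of the periodic forward difference applied to the dual field.
    grad = (np.roll(px, 1, axis=1) - px) + (np.roll(py, 1, axis=0) - py)
    return norm.sum(), grad

def neg_log_post_and_grad(x, y, sigma_noise, lam, eps):
    # U(x) = ||Ax - y||^2 / (2 sigma^2) + lam * TV_eps(x)
    r = blur(x) - y
    tv, g_tv = smoothed_tv_and_grad(x, eps)
    U = 0.5 * (r**2).sum() / sigma_noise**2 + lam * tv
    grad = blur(r) / sigma_noise**2 + lam * g_tv
    return U, grad

def mala_within_gibbs(y, n_sweeps=200, block=16, h=1e-4,
                      sigma_noise=0.01, lam=10.0, eps=1e-3, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    n = y.shape[0]
    x = y.copy()                                   # start the chain at the blurred data
    accept = np.zeros((n // block, n // block))
    for _ in range(n_sweeps):
        # Sweep over rectangular blocks; each block gets one MALA step
        # targeting its conditional given all other blocks.
        for bi in range(0, n, block):
            for bj in range(0, n, block):
                sl = (slice(bi, bi + block), slice(bj, bj + block))
                U, g = neg_log_post_and_grad(x, y, sigma_noise, lam, eps)
                gk = g[sl]                         # gradient restricted to the block
                prop_k = x[sl] - 0.5 * h * gk + np.sqrt(h) * rng.standard_normal(gk.shape)
                x_prop = x.copy()
                x_prop[sl] = prop_k
                U_p, g_p = neg_log_post_and_grad(x_prop, y, sigma_noise, lam, eps)
                # Gaussian proposal log-densities for the block (constants cancel).
                fwd = -((prop_k - x[sl] + 0.5 * h * gk)**2).sum() / (2 * h)
                rev = -((x[sl] - prop_k + 0.5 * h * g_p[sl])**2).sum() / (2 * h)
                if np.log(rng.uniform()) < (U - U_p) + (rev - fwd):
                    x = x_prop
                    accept[bi // block, bj // block] += 1
    return x, accept / n_sweeps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = np.zeros((64, 64)); truth[16:48, 16:48] = 1.0       # piecewise-constant test image
    y = blur(truth) + 0.01 * rng.standard_normal(truth.shape)   # blurred, noisy data
    x_end, rates = mala_within_gibbs(y, n_sweeps=50, rng=rng)
    print("mean block acceptance rate:", rates.mean())
```

Because only one block changes per update, the accept/reject step is exact for the block conditional; the locality exploited in the paper would further restrict the evaluation of U to the neighboring blocks identified via the Hammersley-Clifford theorem.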