Hybrid Projection Methods for Large-scale Inverse Problems with Mixed Gaussian Priors
Main authors: , ,
Format: Article
Language: English
Subject headings:
Online access: Order full text
Abstract: When solving ill-posed inverse problems, a good choice of the prior is
critical for the computation of a reasonable solution. A common approach is to
include a Gaussian prior, which is defined by a mean vector and a symmetric and
positive definite covariance matrix, and to use iterative projection methods to
solve the corresponding regularized problem. However, a main challenge for many
of these iterative methods is that the prior covariance matrix must be known
and fixed (up to a constant) before starting the solution process. In this
paper, we develop hybrid projection methods for inverse problems with mixed
Gaussian priors where the prior covariance matrix is a convex combination of
matrices and the mixing parameter and the regularization parameter do not need
to be known in advance. Such scenarios may arise when data is used to generate
a sample prior covariance matrix (e.g., in data assimilation) or when different
priors are needed to capture different qualities of the solution. The proposed
hybrid methods are based on a mixed Golub-Kahan process, which is an extension
of the generalized Golub-Kahan bidiagonalization, and a distinctive feature of
the proposed approach is that both the regularization parameter and the
weighting parameter for the covariance matrix can be estimated automatically
during the iterative process. Furthermore, for problems where training data are
available, various data-driven covariance matrices (including those based on
learned covariance kernels) can be easily incorporated. Numerical examples from
tomographic reconstruction demonstrate the potential of these methods.
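To make the problem setting concrete, the sketch below illustrates the regularized problem with a mixed Gaussian prior, where the prior covariance is the convex combination Q = gamma*Q1 + (1 - gamma)*Q2. This is a minimal, small-scale illustration and not the paper's algorithm: the function name `mixed_prior_solution`, the synthetic data, and the fixed values of the mixing weight `gamma` and regularization parameter `lam` are assumptions made here, whereas the hybrid projection methods described in the abstract work matrix-free via a mixed Golub-Kahan process and estimate both parameters automatically during the iterations.

```python
import numpy as np

def mixed_prior_solution(A, b, Q1, Q2, gamma, lam, mu=None, R=None):
    """Illustrative direct solve of the Tikhonov-type problem
        min_x ||A x - b||_{R^{-1}}^2 + lam * ||x - mu||_{Q^{-1}}^2,
    with mixed prior covariance Q = gamma*Q1 + (1 - gamma)*Q2, 0 <= gamma <= 1.

    Small-scale sketch only: it forms Q^{-1} explicitly, which large-scale
    hybrid projection methods deliberately avoid.
    """
    m, n = A.shape
    mu = np.zeros(n) if mu is None else mu
    R = np.eye(m) if R is None else R

    Q = gamma * Q1 + (1.0 - gamma) * Q2            # mixed prior covariance
    Rinv = np.linalg.inv(R)
    Qinv = np.linalg.inv(Q)

    # Regularized normal equations:
    # (A^T R^{-1} A + lam Q^{-1}) x = A^T R^{-1} b + lam Q^{-1} mu
    lhs = A.T @ Rinv @ A + lam * Qinv
    rhs = A.T @ Rinv @ b + lam * Qinv @ mu
    return np.linalg.solve(lhs, rhs)


if __name__ == "__main__":
    # Tiny synthetic example (hypothetical data), mixing a white-noise
    # prior with a smooth exponential covariance kernel.
    rng = np.random.default_rng(0)
    n, m = 50, 80
    A = rng.standard_normal((m, n))
    x_true = np.sin(np.linspace(0, np.pi, n))
    b = A @ x_true + 0.01 * rng.standard_normal(m)

    Q1 = np.eye(n)
    dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    Q2 = np.exp(-dist / 5.0)

    x_hat = mixed_prior_solution(A, b, Q1, Q2, gamma=0.3, lam=1e-2)
    print("relative error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Here `gamma` and `lam` are simply given; choosing them well is exactly the difficulty the paper's hybrid approach addresses by estimating both during the iterative process.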
DOI: 10.48550/arxiv.2003.13766