Stein transport for Bayesian inference
Format: | Article |
---|---|
Language: | English |
Summary: | We introduce $\textit{Stein transport}$, a novel methodology for Bayesian
inference designed to efficiently push an ensemble of particles along a
predefined curve of tempered probability distributions. The driving vector
field is chosen from a reproducing kernel Hilbert space and can be derived
either through a suitable kernel ridge regression formulation or as an
infinitesimal optimal transport map in the Stein geometry. The update equations
of Stein transport resemble those of Stein variational gradient descent (SVGD),
but introduce a time-varying score function as well as specific weights
attached to the particles. While SVGD relies on convergence in the long-time
limit, Stein transport reaches its posterior approximation at finite time
$t=1$. Studying the mean-field limit, we discuss the errors incurred by
regularisation and finite-particle effects, and we connect Stein transport to
birth-death dynamics and Fisher-Rao gradient flows. In a series of experiments,
we show that in comparison to SVGD, Stein transport not only often reaches more
accurate posterior approximations with a significantly reduced computational
budget, but that it also effectively mitigates the variance collapse phenomenon
commonly observed in SVGD. |
DOI: | 10.48550/arxiv.2409.01464 |
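For orientation, the display below recalls two pieces of background that the summary refers to: the standard SVGD particle update (Liu and Wang, 2016), against which Stein transport is compared, and one common likelihood-tempering curve that would supply a time-varying score. These are a sketch of well-known baselines only; the specific predefined curve, particle weights, kernel ridge regression formulation, and regularisation used by Stein transport are not given in this record and are not reproduced here.

$$
x_i \;\leftarrow\; x_i + \epsilon\,\hat{\phi}(x_i),
\qquad
\hat{\phi}(x) \;=\; \frac{1}{n}\sum_{j=1}^{n}\Big[\,k(x_j, x)\,\nabla_{x_j}\log \pi(x_j) \;+\; \nabla_{x_j} k(x_j, x)\,\Big],
$$

where $k$ is the reproducing kernel and $\pi$ the fixed target posterior; SVGD iterates this update until (long-time) convergence. A standard tempered curve interpolating between a reference distribution $\pi_0$ and the posterior $\pi_1$ over $t \in [0,1]$ is the geometric path

$$
\pi_t(x) \;\propto\; \pi_0(x)^{\,1-t}\,\pi_1(x)^{\,t},
\qquad
\nabla_x \log \pi_t(x) \;=\; (1-t)\,\nabla_x \log \pi_0(x) \;+\; t\,\nabla_x \log \pi_1(x),
$$

which reaches the posterior at $t=1$. Whether Stein transport uses exactly this geometric path is an assumption made here for illustration; the summary only states that the curve of tempered distributions is predefined.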