Proximal Interacting Particle Langevin Algorithms
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract:

We introduce a class of algorithms, termed Proximal Interacting Particle Langevin Algorithms (PIPLA), for inference and learning in latent variable models whose joint probability density is non-differentiable. Leveraging proximal Markov chain Monte Carlo (MCMC) techniques and the recently introduced interacting particle Langevin algorithm (IPLA), we propose several variants within the novel proximal IPLA family, tailored to the problem of estimating parameters in a non-differentiable statistical model. We prove non-asymptotic bounds for the parameter estimates produced by multiple algorithms in the strongly log-concave setting and provide comprehensive numerical experiments on various models to demonstrate the effectiveness of the proposed methods. In particular, we demonstrate the utility of the proposed family of algorithms on a toy hierarchical example where our assumptions can be checked, as well as on the problems of sparse Bayesian logistic regression, sparse Bayesian neural networks, and sparse matrix completion. Our theory and experiments together show that the PIPLA family can be the de facto choice for parameter estimation problems in non-differentiable latent variable models.
DOI: 10.48550/arxiv.2406.14292
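
For readers wanting a concrete picture of the approach sketched in the abstract, the following is a minimal, hypothetical Python sketch of one proximal IPLA-style update. The toy model (a Gaussian hierarchy with an L1 penalty on the latents), the Moreau-Yosida treatment of the non-smooth term, and all names here (`soft_threshold`, `grad_theta_f`, `grad_x_f`) are illustrative assumptions, not the paper's exact algorithms or notation; the paper proposes several variants within the family.

```python
import numpy as np

# Minimal, hypothetical sketch of a proximal interacting particle Langevin
# update. The toy model and the Moreau-Yosida treatment of the non-smooth
# term are illustrative assumptions, not the paper's exact algorithm.

rng = np.random.default_rng(0)

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Assumed toy model: U(theta, x) = f(theta, x) + g(x), with smooth
# f(theta, x) = 0.5*||y - x||^2 + 0.5*||x - theta||^2 and non-smooth
# g(x) = alpha * ||x||_1 mimicking a sparsity-inducing latent prior.
alpha = 0.5
y = np.array([1.0, -2.0, 0.0])

def grad_theta_f(theta, X):
    # Gradient of f w.r.t. theta, averaged over the N particle rows of X.
    return np.mean(theta - X, axis=0)

def grad_x_f(theta, X):
    # Gradient of f w.r.t. each particle (row-wise over X).
    return (X - y) + (X - theta)

N, d = 50, y.size        # number of particles, latent dimension
gamma, lam = 1e-2, 1e-1  # step size, Moreau-Yosida smoothing parameter
theta = np.zeros(d)
X = rng.standard_normal((N, d))

for _ in range(5000):
    # Parameter update: particle-averaged gradient, O(1/N)-scaled noise.
    theta = theta - gamma * grad_theta_f(theta, X) \
        + np.sqrt(2.0 * gamma / N) * rng.standard_normal(d)
    # Particle update: smooth gradient plus the Moreau-Yosida envelope
    # gradient (x - prox_{lam*g}(x)) / lam standing in for grad g.
    my_grad = (X - soft_threshold(X, lam * alpha)) / lam
    X = X - gamma * (grad_x_f(theta, X) + my_grad) \
        + np.sqrt(2.0 * gamma) * rng.standard_normal((N, d))

print("parameter estimate:", theta)
```

The contrasting noise scalings (sqrt(2*gamma/N) for the parameter versus sqrt(2*gamma) per particle) mirror the interacting particle Langevin construction, where averaging gradients over N particles concentrates the parameter chain around a marginal maximum likelihood estimate; the proximal ingredient shown here is just one way such an algorithm might handle the non-differentiable term.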