Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation
Main Authors: , , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Summary: Gradients have been exploited in proposal distributions to accelerate the
convergence of Markov chain Monte Carlo algorithms on discrete distributions.
However, these methods require a natural differentiable extension of the target
discrete distribution, which often does not exist or does not provide effective
gradient guidance. In this paper, we develop a gradient-like proposal for any
discrete distribution without this strong requirement. Built upon a
locally-balanced proposal, our method efficiently approximates the discrete
likelihood ratio via Newton's series expansion to enable a large and efficient
exploration in discrete spaces. We show that our method can also be viewed as a
multilinear extension, thus inheriting its desired properties. We prove that
our method has a guaranteed convergence rate with or without the
Metropolis-Hastings step. Furthermore, our method outperforms a number of
popular alternatives in several different experiments, including the facility
location problem, extractive text summarization, and image retrieval.
DOI: 10.48550/arxiv.2302.13929
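The core mechanism the abstract builds on, a locally-balanced proposal over single-site flips weighted by the likelihood ratio, can be illustrated with a minimal sketch. The example below is an assumption-laden toy, not the paper's method: it uses a hypothetical Ising-like target on binary vectors, for which the flip likelihood ratios happen to be available in closed form, and it uses the common weight function g(t) = √t. The paper's contribution (approximating these ratios via a Newton series when no closed form or differentiable extension exists) is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Ising-like target on binary vectors: pi(x) ∝ exp(x^T W x + b^T x).
d = 8
W = rng.normal(scale=0.1, size=(d, d))
W = (W + W.T) / 2                      # symmetrize
b = rng.normal(scale=0.1, size=d)

def flip_deltas(x):
    # Log-likelihood-ratio log pi(flip_i(x)) - log pi(x) for every bit i,
    # in closed form for the quadratic energy. This plays the role of the
    # quantity the paper approximates with a Newton series.
    s = 1 - 2 * x                      # +1 if bit goes 0 -> 1, -1 if 1 -> 0
    return s * (2 * (x @ W) + np.diag(W) * s + b)

def lb_mh_step(x):
    # Locally-balanced proposal with g(t) = sqrt(t):
    # q(y|x) ∝ sqrt(pi(y)/pi(x)) over single-bit-flip neighbours.
    dE = flip_deltas(x)
    w = np.exp(dE / 2)                 # sqrt of each likelihood ratio
    p = w / w.sum()
    i = rng.choice(d, p=p)
    y = x.copy()
    y[i] = 1 - y[i]
    # Metropolis-Hastings correction for the asymmetric proposal.
    w_y = np.exp(flip_deltas(y) / 2)
    log_acc = dE[i] + np.log(w_y[i] / w_y.sum()) - np.log(w[i] / w.sum())
    return y if np.log(rng.random()) < log_acc else x
```

Because the proposal looks at the likelihood ratios of all neighbours before choosing one, it concentrates moves on high-probability flips; the MH step then corrects for the asymmetry so the chain targets pi exactly.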