Mixture Martingales Revisited with Applications to Sequential Tests and Confidence Intervals
Journal of Machine Learning Research, Microtome Publishing, 2021
Format: | Article |
Language: | English |
Online access: | Order full text |
Abstract: | This paper presents new deviation inequalities that are valid uniformly in time under adaptive sampling in a multi-armed bandit model. The deviations are measured using the Kullback-Leibler divergence in a given one-dimensional exponential family, and may take into account several arms at a time. They are obtained by constructing for each arm a mixture martingale based on a hierarchical prior, and by multiplying those martingales. Our deviation inequalities allow us to analyze stopping rules based on generalized likelihood ratios for a large class of sequential identification problems, and to construct tight confidence intervals for some functions of the means of the arms. |
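The KL-based confidence intervals mentioned in the abstract can be illustrated with a minimal sketch: for a Bernoulli arm, an upper confidence bound is obtained by inverting the Kullback-Leibler divergence around the empirical mean. The threshold below is a generic log(1/delta)-style choice chosen for illustration only; it is not the paper's exact uniform-in-time threshold, and the function names are hypothetical.

```python
import math

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q), with clipping for stability."""
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def kl_upper_bound(p_hat, n, threshold, tol=1e-9):
    """Largest q >= p_hat with n * kl(p_hat, q) <= threshold, found by bisection.

    kl_bernoulli(p_hat, q) is increasing in q for q >= p_hat, so bisection applies.
    """
    lo, hi = p_hat, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if n * kl_bernoulli(p_hat, mid) <= threshold:
            lo = mid
        else:
            hi = mid
    return lo

# Illustrative setup: 100 samples with empirical mean 0.3, confidence level 1 - delta.
# The threshold form below is an assumption for the sketch, not the paper's constant.
n, p_hat, delta = 100, 0.3, 0.05
threshold = math.log(1 / delta) + math.log(math.log(math.e * n))
ub = kl_upper_bound(p_hat, n, threshold)
```

KL-based intervals of this shape adapt to the variance of the arm: near the boundaries of [0, 1] the divergence grows quickly, so the resulting bounds are tighter than Hoeffding-style intervals there.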
DOI: | 10.48550/arxiv.1811.11419 |