Empirical study on variational inference methods for topic models
Saved in:
Published in: Journal of Experimental & Theoretical Artificial Intelligence, 2018-01, Vol. 30 (1), pp. 129-142
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: In topic modelling, the main computational problem is to approximate the posterior distribution given an observed collection. Commonly, we must resort to variational methods for approximation; however, it is not known which variational variant is the best choice under a given setting. In this paper, we focus on four topic modelling inference methods, namely mean-field variational Bayes, collapsed variational Bayes, hybrid variational-Gibbs and expectation propagation, and aim to compare them systematically. We analyse them from two perspectives, i.e. the approximate posterior distribution and the type of α-divergence, and then empirically compare them on various datasets by two popular metrics. The empirical results largely match our analysis and indicate that CVB0 may be the best variational variant for topic models.
ISSN: 0952-813X, 1362-3079
DOI: 10.1080/0952813X.2017.1409277