Variational Bhattacharyya divergence for hidden Markov models
Main authors:
Format: Conference proceedings
Language: English
Subjects:
Online access: Order full text
Summary: Many applications require the use of divergence measures between probability distributions. Several of these, such as the Kullback-Leibler (KL) divergence and the Bhattacharyya divergence, are tractable for simple distributions such as Gaussians, but are intractable for more complex distributions such as hidden Markov models (HMMs) used in speech recognizers. For tasks related to classification error, the Bhattacharyya divergence is of special importance, due to its relationship with the Bayes error. Here we derive novel variational approximations to the Bhattacharyya divergence for HMMs. Remarkably, the variational Bhattacharyya divergence can be computed in a simple closed-form expression for a given sequence length. One of the approximations can even be integrated over all possible sequence lengths in a closed-form expression. We apply the variational Bhattacharyya divergence for HMMs to word confusability, the problem of estimating the probability of mistaking one spoken word for another.
ISSN: 1520-6149, 2379-190X
DOI: 10.1109/ICASSP.2008.4518670
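
For background on the tractable case mentioned in the summary: the Bhattacharyya divergence between densities p and q is B(p, q) = -ln ∫ sqrt(p(x) q(x)) dx, and for two Gaussians it has a well-known closed form. The sketch below illustrates only that simple Gaussian case and the Bayes-error bound referred to in the summary; it is not the paper's variational HMM approximation, and the function name and example parameters are our own.

```python
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    """Closed-form Bhattacharyya divergence between two multivariate Gaussians.

    B = 1/8 (mu1-mu2)^T S^{-1} (mu1-mu2)
        + 1/2 ln( det S / sqrt(det cov1 * det cov2) ),  with S = (cov1 + cov2)/2.
    """
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov1, cov2 = np.atleast_2d(cov1), np.atleast_2d(cov2)
    S = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    # Mahalanobis-like mean term.
    term_mean = 0.125 * diff @ np.linalg.solve(S, diff)
    # Covariance term via log-determinants for numerical stability.
    _, logdet_S = np.linalg.slogdet(S)
    _, logdet_1 = np.linalg.slogdet(cov1)
    _, logdet_2 = np.linalg.slogdet(cov2)
    term_cov = 0.5 * (logdet_S - 0.5 * (logdet_1 + logdet_2))
    return term_mean + term_cov

# Hypothetical example: two 2-D Gaussians.
mu_a, cov_a = [0.0, 0.0], np.eye(2)
mu_b, cov_b = [1.0, 0.5], 1.5 * np.eye(2)
B = bhattacharyya_gaussian(mu_a, cov_a, mu_b, cov_b)

# The Bhattacharyya coefficient exp(-B) bounds the Bayes error of an
# equal-prior two-class problem: P_err <= 0.5 * exp(-B).
print(B, 0.5 * np.exp(-B))
```

For HMMs the integral over observation sequences has no such closed form, which is where the paper's variational approximations come in.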