Nonlinear Mappings for Generative Kernels on Latent Variable Models
Format: Conference paper
Language: English
Abstract: Generative kernels have emerged in recent years as an effective method for combining discriminative and generative approaches. In this paper, we focus on kernels defined on generative models with latent variables (e.g. the states of a Hidden Markov Model). The basic idea underlying these kernels is to compare objects, via an inner product, in a feature space whose dimensions are related to the latent variables of the model. Here we propose to enhance these kernels through a nonlinear normalization of the space, namely a nonlinear mapping of the space dimensions that exploits their discriminative characteristics. We investigate three possible nonlinear mappings for two HMM-based generative kernels and test them on different sequence classification problems, with very promising results.
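To illustrate the general idea, the following is a minimal sketch, not the paper's exact kernels or its three mappings: each sequence is mapped to a feature vector with one dimension per HMM hidden state (here, the expected state occupancy computed by forward-backward), a componentwise nonlinear mapping is applied to those dimensions (a square root is used purely as an illustrative choice of nonlinear normalization), and the kernel is the inner product of the mapped vectors. The toy HMM parameters and the `generative_kernel` helper are assumptions for the sketch.

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """State posteriors gamma[t, i] = P(state_t = i | obs) for a discrete-output HMM."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

def latent_features(obs, pi, A, B):
    """One feature dimension per hidden state: expected time spent in that state."""
    return forward_backward(obs, pi, A, B).sum(axis=0)

def generative_kernel(x, y, pi, A, B, mapping=np.sqrt):
    """Inner product in the latent-variable feature space after a nonlinear mapping (illustrative)."""
    return float(mapping(latent_features(x, pi, A, B))
                 @ mapping(latent_features(y, pi, A, B)))

# Toy 2-state HMM with 2 observation symbols (illustrative parameters only).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
seq1 = [0, 0, 1, 0]
seq2 = [1, 1, 0, 1]
print(generative_kernel(seq1, seq2, pi, A, B))
```

In this sketch, swapping `mapping` for another componentwise function (e.g. a logarithm) changes the nonlinear normalization of the feature space while leaving the underlying latent-variable features untouched, which is the kind of variation the paper studies.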
ISSN: 1051-4651, 2831-7475
DOI: 10.1109/ICPR.2010.523