Methods for the combination of kernel matrices within a support vector framework
Published in: Machine Learning, 2010-01, Vol. 78 (1-2), p. 137-174
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: The problem of combining different sources of information arises in several situations, for instance, the classification of data with asymmetric similarity matrices or the construction of an optimal classifier from a collection of kernels. Often, each source of information can be expressed as a similarity matrix. In this paper we propose a new class of methods to produce, for classification purposes, a single kernel matrix from a collection of kernel (similarity) matrices. The constructed kernel matrix is then used to train a Support Vector Machine (SVM). The key ideas behind the kernel construction are twofold: the quantification, relative to the classification labels, of the difference of information among the similarities; and the extension of the concept of a linear combination of similarity matrices to a functional combination of similarity matrices. The proposed methods have been successfully evaluated and compared with other powerful classifiers and kernel combination techniques on a variety of artificial and real classification problems.
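For orientation, the pipeline the abstract describes (combine several kernel/similarity matrices into a single kernel matrix, then train an SVM on it) can be sketched as follows. This is a minimal, hypothetical example using a plain weighted combination and scikit-learn's precomputed-kernel SVM, not the paper's label-dependent functional combination; the toy data, bandwidths, and equal weights are assumptions made purely for illustration.

```python
# Hypothetical illustration only: a plain convex combination of kernel
# matrices fed to an SVM with a precomputed kernel. The paper's methods
# use label-dependent, functional combinations; this sketch shows only
# the surrounding pipeline.
import numpy as np
from sklearn.svm import SVC


def rbf_kernel_matrix(X, gamma):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)


def combine_kernels(kernels, weights):
    """Weighted sum of kernel matrices (weights assumed non-negative)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * K for wi, K in zip(w, kernels))


# Toy data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Two similarity sources: RBF kernels with different bandwidths.
K = combine_kernels([rbf_kernel_matrix(X, 0.1), rbf_kernel_matrix(X, 1.0)],
                    weights=[0.5, 0.5])

# Train an SVM directly on the combined (precomputed) kernel matrix.
clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```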
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-009-5135-5