A Novel Indefinite Kernel Dimensionality Reduction Algorithm: Weighted Generalized Indefinite Kernel Discriminant Analysis
Published in: Neural Processing Letters, 2014-12, Vol. 40 (3), p. 301-313
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Kernel methods are becoming increasingly popular for many real-world learning problems, yet these methods for data analysis are frequently considered to be restricted to positive definite kernels. In practice, however, indefinite kernels arise and demand application in pattern analysis. In this paper, we present several formal extensions of kernel discriminant analysis (KDA) methods that can be used with indefinite kernels. In particular, they include indefinite KDA (IKDA) based on the generalized singular value decomposition (IKDA/GSVD), pseudo-inverse IKDA, null space IKDA and range space IKDA. As with LDA-based algorithms, IKDA-based algorithms fail to account for the different contributions of each pair of classes to the discrimination. To remedy this problem, weighted schemes are incorporated into the IKDA extensions, and the resulting methods are called weighted generalized IKDA algorithms. Experiments on two real-world data sets are performed to test and evaluate the effectiveness of the proposed algorithms and the effect of weights on indefinite kernel functions. The results show that the effect of the weighted schemes is highly significant.
ISSN: 1370-4621; 1573-773X
DOI: 10.1007/s11063-013-9330-9
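To make the abstract's central idea more concrete, the following is a minimal sketch of a weighted kernel discriminant analysis step that tolerates an indefinite kernel matrix. It is not the paper's method: the GSVD, pseudo-inverse, null space and range space formulations are not reproduced here, and the pairwise class-weighting function, the distance surrogate between class means, and the regularized generalized eigensolver are all assumptions made for illustration only.

```python
import numpy as np
from scipy.linalg import eig


def weighted_kda(K, y, weight=lambda d: 1.0 / (d ** 2 + 1e-12), reg=1e-6):
    """Illustrative weighted KDA on a (possibly indefinite) kernel matrix.

    K      : (n, n) kernel matrix over the training set; may be indefinite,
             e.g. from a sigmoid kernel (assumption for this sketch).
    y      : (n,) integer class labels.
    weight : pairwise class-weighting function of a distance between class
             means (hypothetical choice; the paper's schemes may differ).
    Returns alpha, an (n, c-1) matrix of kernel-expansion coefficients.
    """
    n = K.shape[0]
    classes = np.unique(y)
    priors = {c: np.mean(y == c) for c in classes}
    # per-class means of the kernel columns (empirical feature-map means)
    means = {c: K[:, y == c].mean(axis=1) for c in classes}

    # weighted pairwise between-class scatter in the kernel-expansion space
    B = np.zeros((n, n))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = means[ci] - means[cj]
            dist = np.sqrt(np.abs(diff @ diff))  # crude distance surrogate
            B += priors[ci] * priors[cj] * weight(dist) * np.outer(diff, diff)

    # within-class scatter of the kernel columns, regularized for stability
    W = np.zeros((n, n))
    for c in classes:
        Kc = K[:, y == c]
        centered = Kc - means[c][:, None]
        W += centered @ centered.T
    W += reg * np.eye(n)

    # generalized eigenproblem B a = lambda W a; eigenpairs may be complex
    # when K is indefinite, so keep the real parts of the leading vectors
    evals, evecs = eig(B, W)
    order = np.argsort(-evals.real)
    n_dims = len(classes) - 1
    return evecs[:, order[:n_dims]].real


# Usage sketch: project test points with K_test @ alpha, where
# K_test[i, j] = k(x_test_i, x_train_j) uses the same (indefinite) kernel.
```

The weighting function here simply downweights well-separated class pairs so that close, easily confused pairs dominate the scatter, which is the general motivation behind weighted discriminant schemes; the specific weights, and how the paper handles the null and range spaces of the scatter matrices, should be taken from the article itself.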