High order discriminant analysis based on Riemannian optimization
Published in: Knowledge-Based Systems, 2020-05, Vol. 195, p. 105630, Article 105630
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Linear discriminant analysis is a well-known supervised learning algorithm in machine learning, but most discriminant-based algorithms fail to discover nonlinear structures during dimensionality reduction. To address this problem, we propose a novel method for dimensionality reduction of high-dimensional datasets, named manifold-based high order discriminant analysis (MHODA). By transforming the optimization problem from a constrained Euclidean space to a restricted search space on a Riemannian manifold and employing the underlying geometry of nonlinear structures, it exploits the fact that a matrix manifold is of low dimension embedded in the ambient space. More concretely, we update the projection matrices by optimizing over the Stiefel manifold and exploit the second-order geometry of the trust-region method. Moreover, to validate the efficiency and accuracy of the proposed algorithm, we conduct clustering and classification experiments on six benchmark datasets. The numerical results demonstrate that MHODA is superior to most state-of-the-art methods. A minimal sketch of Stiefel-manifold optimization is given after the record below.
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2020.105630
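The abstract states that MHODA updates projection matrices by optimizing over the Stiefel manifold and exploits the second-order geometry of a trust-region method. The record does not reproduce the paper's actual objective or algorithm, so the following is only a minimal sketch of the general idea: it minimizes a generic LDA-style scatter cost over the Stiefel manifold using plain first-order Riemannian gradient descent with a QR retraction, rather than the trust-region solver used in the paper. The function names, cost, and step size are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def stiefel_tangent_projection(W, G):
    """Project a Euclidean gradient G onto the tangent space of the
    Stiefel manifold {W : W^T W = I} at the point W."""
    sym = (W.T @ G + G.T @ W) / 2.0
    return G - W @ sym

def qr_retraction(W, step):
    """Map W + step back onto the Stiefel manifold via a thin QR
    decomposition (a standard first-order retraction)."""
    Q, R = np.linalg.qr(W + step)
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0          # guard against zero diagonal entries
    return Q * d             # fix the sign ambiguity of QR

def discriminant_on_stiefel(Sb, Sw, p, n_iter=300, lr=1e-2, seed=0):
    """Minimize trace(W^T Sw W) - trace(W^T Sb W) over n-by-p matrices
    with orthonormal columns, i.e. an LDA-style objective on the
    Stiefel manifold, using plain Riemannian gradient descent.

    Sb, Sw : (n, n) between-class and within-class scatter matrices.
    Returns an orthonormal projection matrix W of shape (n, p).
    """
    rng = np.random.default_rng(seed)
    n = Sb.shape[0]
    W, _ = np.linalg.qr(rng.standard_normal((n, p)))   # feasible start
    for _ in range(n_iter):
        egrad = 2.0 * (Sw - Sb) @ W                    # Euclidean gradient
        rgrad = stiefel_tangent_projection(W, egrad)   # Riemannian gradient
        W = qr_retraction(W, -lr * rgrad)              # descend and retract
    return W

if __name__ == "__main__":
    # Tiny synthetic example: random symmetric scatter matrices.
    rng = np.random.default_rng(1)
    A, B = rng.standard_normal((2, 10, 10))
    Sb, Sw = A @ A.T, B @ B.T + 10 * np.eye(10)
    W = discriminant_on_stiefel(Sb, Sw, p=3)
    print(np.allclose(W.T @ W, np.eye(3), atol=1e-8))  # True: orthonormal
```

The tangent-space projection and QR retraction above are the standard building blocks of Stiefel-manifold optimization; a second-order trust-region method, as described in the abstract, would additionally require the Riemannian Hessian (or an approximation of it) but operates on the same manifold machinery.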