Generalized Eigenvalue Proximal Support Vector Machine for Functional Data Classification
Saved in:
Published in: Symmetry (Basel) 2021-05, Vol. 13 (5), p. 833, Article 833
Main authors: ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Functional data analysis has become a research hotspot in the field of data mining. Traditional data mining methods regard functional data as a discrete, finite observation sequence, ignoring its continuity. This paper addresses functional data classification and proposes a functional generalized eigenvalue proximal support vector machine (FGEPSVM). Specifically, we find two nonparallel hyperplanes in function space: a positive functional hyperplane, which is closest to the positive functional data and furthest from the negative functional data, and a negative functional hyperplane with the opposite properties. By introducing an orthonormal basis, the problem in function space is transformed into one in vector space. Notably, higher-order derivative information is exploited in two ways: the derivatives are used alone, or a weighted linear combination of the original function and its derivatives is used. Using more of the data's information in this way can be expected to improve classification accuracy. Experiments on artificial datasets and benchmark datasets show the effectiveness of our FGEPSVM for functional data classification.
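The vector-space subproblem that a GEPSVM-style method reduces to can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper names, the regularization parameter `delta`, and the use of `scipy.linalg.eigh` are assumptions. Each proximal plane is the generalized eigenvector for the smallest eigenvalue of a pair of regularized scatter matrices built from the two classes.

```python
import numpy as np
from scipy.linalg import eigh

def gepsvm_plane(A, B, delta=1e-4):
    """Find a plane w^T x + b = 0 close to the rows of A and far from
    the rows of B (hypothetical helper, GEPSVM-style formulation).

    Minimizes (z^T G z) / (z^T H z) over z = (w, b), i.e. takes the
    smallest generalized eigenpair of G z = lambda * H z.
    """
    Ae = np.hstack([A, np.ones((len(A), 1))])  # augment with bias column
    Be = np.hstack([B, np.ones((len(B), 1))])
    I = np.eye(Ae.shape[1])
    G = Ae.T @ Ae + delta * I  # Tikhonov term keeps both matrices definite
    H = Be.T @ Be + delta * I
    vals, vecs = eigh(G, H)    # generalized eigenvalues in ascending order
    z = vecs[:, 0]             # eigenvector of the smallest eigenvalue
    return z[:-1], z[-1]       # (w, b)

def classify(x, planes):
    """Assign x to the class whose proximal plane is nearer."""
    dists = [abs(x @ w + b) / np.linalg.norm(w) for w, b in planes]
    return int(np.argmin(dists))
```

The positive-class plane uses the pair (A, B) and the negative-class plane the pair (B, A). In the functional setting described in the abstract, the rows of A and B would be the orthonormal-basis coefficients of the sample curves, possibly augmented with or replaced by coefficients of their derivatives.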
ISSN: 2073-8994
DOI: 10.3390/sym13050833