Unsupervised feature selection based on minimum-redundant subspace learning with self-weighted adaptive graph
Published in: Digital Signal Processing, 2024-12, Vol. 155, p. 104738, Article 104738
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Unsupervised feature selection for subspace learning is an effective dimensionality reduction strategy whose essence lies in representing the original space with a lower-dimensional subspace in the absence of label information. However, existing unsupervised feature selection methods fail to adaptively assign importance rankings to different features, thereby ignoring the diversity in feature importance. In addition, they pay more attention to exploring the intrinsic structure than to eliminating the unreliability introduced when noise features interfere with that structure. To address this, we introduce a novel unsupervised feature selection approach, unsupervised feature selection based on minimum-redundant subspace learning with a self-weighted adaptive graph (SWAGFS), which integrates adaptive self-weighted graph learning, minimum redundancy, and sparsity constraints into a comprehensive framework. Specifically, adaptive self-weighted graph learning is introduced to automatically and adaptively learn the importance of individual features, enabling flexible feature selection. Additionally, L2,1-norm and minimum-redundancy constraints are incorporated to ensure sparsity and minimize redundancy in the feature selection matrix. This strategy can substantially strengthen the reliability of the intrinsic structure in the presence of noisy features. Experimental results on twelve datasets demonstrate the effectiveness and superiority of SWAGFS in feature selection tasks, offering a more compact and representative feature subset.
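For readers unfamiliar with the L2,1-norm sparsity constraint mentioned in the abstract, the following minimal Python/NumPy sketch shows how such a regularizer is typically computed and how features are commonly ranked by the row norms of a learned feature-selection matrix W. The function names, the toy matrix, and the ranking rule are illustrative assumptions in the spirit of L2,1-regularized feature selection, not code from the paper.

```python
import numpy as np

def l21_norm(W):
    """L2,1-norm: the sum of the L2 norms of the rows of W.
    Penalizing it drives entire rows toward zero, which is what
    makes it a sparsity-inducing regularizer for feature selection."""
    return np.sum(np.linalg.norm(W, axis=1))

def rank_features(W, top_k):
    """Rank features by the L2 norm of their corresponding rows in W,
    a common scoring rule in L2,1-regularized methods (assumed here)."""
    scores = np.linalg.norm(W, axis=1)   # one importance score per feature
    order = np.argsort(scores)[::-1]     # descending importance
    return order[:top_k], scores

# Toy usage: d = 5 features projected into a k = 2 dimensional subspace.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))
selected, scores = rank_features(W, top_k=3)
print("L2,1-norm of W:", l21_norm(W))
print("Selected feature indices:", selected)
```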
ISSN: 1051-2004
DOI: 10.1016/j.dsp.2024.104738