Unsupervised feature selection by self-paced learning regularization


Bibliographic details
Published in: Pattern Recognition Letters, 2020-04, Vol. 132, p. 4-11
Authors: Zheng, Wei; Zhu, Xiaofeng; Wen, Guoqiu; Zhu, Yonghua; Yu, Hao; Gan, Jiangzhang
Format: Article
Language: English
Online access: Full text
Description
Abstract:
Highlights:
• This paper uses a self-representation method to construct the feature selection model.
• Self-paced learning is added into feature selection to account for outliers.
• This paper proposes a novel optimization algorithm to solve the objective function.

Previous feature selection methods treat all samples equally when selecting important features. However, samples are often diverse: outliers should receive small or even zero weights, while important samples should receive large weights. In this paper, we add a self-paced regularization term to a sparse feature selection model to reduce the impact of outliers on feature selection. Specifically, the proposed method automatically selects a subset of the most important samples to build an initial feature selection model, whose generalization ability is then improved by gradually involving other important samples until a robust and generalized feature selection model has been established or all samples have been used. Experimental results on eight real datasets show that the proposed method outperforms the comparison methods.
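
The abstract describes the procedure only at a high level. Below is a minimal sketch (not the authors' released code) of how a self-paced, self-representation-based feature selection loop of this kind can be organized, assuming a hard self-paced regularizer (binary sample weights), an l2,1-penalized reconstruction X ≈ XW solved by iteratively reweighted least squares, and feature ranking by the row norms of W. The parameter names (beta, lam, mu) and the median-based initialization of the age parameter are illustrative choices, not the paper's notation.

# Sketch of self-paced, self-representation-based unsupervised feature selection.
# Assumptions: hard self-paced weights v_i in {0, 1}, l2,1-penalized reconstruction
# X ~= X W, and feature ranking by row norms of W. Not the authors' implementation.
import numpy as np

def l21_self_representation(X, v, beta, n_iter=30):
    """Approximately solve min_W sum_i v_i ||x_i - x_i W||^2 + beta * ||W||_{2,1}
    by iteratively reweighted least squares (a common surrogate for the l2,1 norm)."""
    n, d = X.shape
    Xv = X * np.sqrt(v)[:, None]           # sample-weighted data matrix
    D = np.eye(d)                          # diagonal reweighting matrix for ||W||_{2,1}
    W = np.zeros((d, d))
    for _ in range(n_iter):
        # ridge-like closed-form update: (Xv^T Xv + beta * D) W = Xv^T Xv
        A = Xv.T @ Xv + beta * D
        W = np.linalg.solve(A, Xv.T @ Xv)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + 1e-8
        D = np.diag(1.0 / (2.0 * row_norms))
    return W

def self_paced_feature_selection(X, beta=1.0, lam=None, mu=1.3, n_rounds=10):
    """Alternate between (1) fitting W on the currently 'easy' samples and
    (2) re-admitting samples whose reconstruction loss falls below lam,
    growing lam each round so harder samples enter gradually."""
    n, d = X.shape
    v = np.ones(n)                         # first fit uses all samples to get losses
    for _ in range(n_rounds):
        W = l21_self_representation(X, v, beta)
        loss = ((X - X @ W) ** 2).sum(axis=1)   # per-sample reconstruction loss
        if lam is None:
            lam = np.median(loss)          # heuristic initialization of the age parameter
        v = (loss < lam).astype(float)     # hard self-paced weights: easy samples -> 1
        lam *= mu                          # relax the threshold each round
        if v.all():                        # all samples admitted: stop early
            break
    scores = np.sqrt((W ** 2).sum(axis=1)) # rank features by row norms of W
    return np.argsort(-scores), W, v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    ranking, W, v = self_paced_feature_selection(X)
    print("Top 5 features:", ranking[:5])

The growth factor mu plays the role of the pace: each round the loss threshold is relaxed, so progressively harder samples are admitted. This mirrors the abstract's description of starting from a subset of the most important samples and expanding until a robust model is obtained or all samples have been used.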
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2018.06.029