Sparse Support Vector Machine with Lp Penalty for Feature Selection



Bibliographic Details
Published in: Journal of computer science and technology, 2017, Vol. 32 (1), p. 68-77
Authors: Yao, Lan; Zeng, Feng; Li, Dong-Hui; Chen, Zhi-Gang
Format: Article
Language: English
Online access: Full text
Abstract: We study strategies for feature selection with the sparse support vector machine (SVM). Recently, the so-called Lp-SVM (0 < p < 1) has attracted much attention because it can encourage better sparsity than the widely used L1-SVM. However, Lp-SVM is a non-convex and non-Lipschitz optimization problem, and solving it numerically is challenging. In this paper, we reformulate the Lp-SVM as an optimization model with a linear objective function and smooth constraints (LOSC-SVM), so that it can be solved by numerical methods for smooth constrained optimization. Our numerical experiments on artificial datasets show that LOSC-SVM (0 < p < 1) can improve performance in both feature selection and classification by choosing a suitable parameter p. We also apply it to some real-life datasets, and the experimental results show that it is superior to L1-SVM.
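The paper's LOSC-SVM solver is not reproduced here. As context for the comparison it draws, the following is a minimal sketch of the L1-SVM baseline: hinge loss plus an L1 penalty, minimized by plain subgradient descent on a tiny synthetic dataset. The data, hyperparameters, and function name are all illustrative assumptions, not taken from the paper; the point is only that the L1 penalty drives the weight on an uninformative feature toward zero, which is the feature-selection effect the Lp penalty (0 < p < 1) is meant to strengthen.

```python
import random

def train_l1_svm(X, y, lam=0.1, lr=0.01, epochs=200):
    """Toy L1-SVM (illustrative, not the paper's method):
    minimize (1/n) * sum_i max(0, 1 - y_i * w.x_i) + lam * ||w||_1
    by batch subgradient descent. Bias term omitted for brevity."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(epochs):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            margin = yi * sum(wj * xj for wj, xj in zip(w, xi))
            if margin < 1:  # hinge loss is active for this sample
                for j in range(d):
                    grad[j] -= yi * xi[j] / n
        for j in range(d):
            # subgradient of lam * |w_j| (0 at w_j = 0)
            grad[j] += lam * (1 if w[j] > 0 else -1 if w[j] < 0 else 0)
            w[j] -= lr * grad[j]
    return w

# Synthetic data: feature 0 carries the label, feature 1 is pure noise.
random.seed(0)
X, y = [], []
for _ in range(100):
    label = random.choice([-1, 1])
    X.append([label * 1.0 + random.gauss(0, 0.3), random.gauss(0, 1.0)])
    y.append(label)

w = train_l1_svm(X, y)
```

After training, the weight on the noise feature is shrunk close to zero while the informative feature keeps a large weight, so thresholding |w_j| yields a feature selection. The abstract's claim is that replacing ||w||_1 with ||w||_p^p for 0 < p < 1 sparsifies more aggressively, at the cost of a non-convex, non-Lipschitz problem that their LOSC reformulation makes tractable.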
ISSN: 1000-9000, 1860-4749
DOI: 10.1007/s11390-017-1706-2