Nonconvex Regularizations for Feature Selection in Ranking With Sparse SVM


Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, June 2014, Vol. 25, No. 6, pp. 1118-1130
Authors: Laporte, Léa; Flamary, Rémi; Canu, Stéphane; Déjean, Sébastien; Mothe, Josiane
Format: Article
Language: English
Description
Abstract: Feature selection in learning to rank has recently emerged as a crucial issue. Whereas several preprocessing approaches have been proposed, only a few have focused on integrating feature selection into the learning process. In this paper, we propose a general framework for feature selection in learning to rank using support vector machines with a sparse regularization term. We investigate both classical convex regularizations, such as ℓ1 or weighted ℓ1, and nonconvex regularization terms, such as the log penalty, the minimax concave penalty, or the ℓp pseudo-norm with p < 1.
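To make the regularizers named in the abstract concrete, the LaTeX sketch below writes out the standard forms of these penalties for a linear ranking model w. The pairwise hinge-loss objective, the preference-pair set P, and the hyperparameters λ, γ, ε, and β_k are generic illustrative assumptions, not quoted from the paper itself.

% Generic sparse pairwise ranking-SVM objective (assumed form, not taken from the paper):
% hinge loss over preference pairs P plus a sparsity-inducing penalty \Omega.
\[
\min_{w \in \mathbb{R}^d} \; \sum_{(i,j) \in P} \max\bigl(0,\, 1 - w^\top (x_i - x_j)\bigr) \;+\; \Omega(w)
\]
% Standard forms of the penalties listed in the abstract (each carries its own weight \lambda):
\begin{align*}
\Omega_{\ell_1}(w) &= \lambda \sum_{k=1}^{d} |w_k|, &
\Omega_{\mathrm{w}\ell_1}(w) &= \lambda \sum_{k=1}^{d} \beta_k |w_k|, \quad \beta_k > 0,\\
\Omega_{\log}(w) &= \lambda \sum_{k=1}^{d} \log\!\left(1 + \frac{|w_k|}{\varepsilon}\right), &
\Omega_{\ell_p}(w) &= \lambda \sum_{k=1}^{d} |w_k|^{p}, \quad 0 < p < 1,\\
\Omega_{\mathrm{MCP}}(w) &= \sum_{k=1}^{d}
\begin{cases}
\lambda |w_k| - \dfrac{w_k^2}{2\gamma}, & |w_k| \le \gamma\lambda,\\[4pt]
\dfrac{\gamma\lambda^2}{2}, & |w_k| > \gamma\lambda.
\end{cases}
\end{align*}
% The log, \ell_p, and MCP penalties are nonconvex: they penalize large weights less than
% \ell_1 while still driving small weights to exactly zero, which is what performs the
% feature selection.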
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2013.2286696