A Safe Feature Screening Rule for Rank Lasso
Published in: IEEE Signal Processing Letters, 2022, Vol. 29, pp. 1062-1066
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: To deal with outliers or heavy-tailed random errors in common high-dimensional data sets, robust regressions are a preferable choice, and Rank Lasso is a notable model among them. However, the large feature size of such data sets increases the computational cost of solving Rank Lasso. In this paper, we build a safe feature screening rule for Rank Lasso, which can effectively and safely identify inactive features in data sets and reduce the computation time of the model. The advantage of our screening rule is that it can be expressed as a closed-form function of the given data and is easily implemented. The proposed screening rule is evaluated on simulated and real data sets, which show that it can safely discard inactive features at a small computational cost and reduce the time for solving Rank Lasso.
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2022.3167918