A novel self-weighted Lasso and its safe screening rule
Published in: Applied Intelligence (Dordrecht, Netherlands), 2022-09, Vol. 52 (12), pp. 14465-14477
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Lasso is a popular method for high-dimensional applications in machine learning. In this paper, we propose a novel variant of Lasso, named self-weighted Lasso (SWL). Self-weighted means that the penalty weights are obtained from the correlations between the features and the output, so they depend only on the data itself. SWL inherits the notable properties of Lasso, performing feature selection and continuous shrinkage simultaneously. Moreover, SWL ensures that its solution is consistent for feature selection, which improves on the performance of Lasso. Training SWL on large-scale data nevertheless remains challenging. To improve the efficiency of SWL, especially for large-scale, high-dimensional problems, we propose an efficient acceleration strategy belonging to the family of state-of-the-art safe screening methods, which significantly reduces training time without sacrificing accuracy. Experimental results on twelve benchmark datasets and a practical dataset verify that SWL performs better than other state-of-the-art regression algorithms and that the proposed safe screening rule is highly efficient.
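The abstract describes two ingredients: penalty weights computed from feature/output correlations, and a safe screening rule that discards provably inactive features before training. The sketch below is a minimal illustration, not the authors' implementation: it assumes weights inversely proportional to the absolute Pearson correlation (the paper only states that the weights derive from feature/output correlations), reduces the weighted penalty to a plain Lasso by rescaling columns, and substitutes the classical static SAFE test of El Ghaoui et al. for the paper's own screening rule. The names self_weights and swl_fit are hypothetical.

import numpy as np
from sklearn.linear_model import Lasso

def self_weights(X, y, eps=1e-8):
    # Assumed form: weight from the absolute Pearson correlation between each
    # feature and y, so strongly correlated features receive a smaller penalty.
    corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return 1.0 / (np.abs(corr) + eps)

def swl_fit(X, y, lam):
    # Objective: (1/2)||y - X b||^2 + lam * sum_j w_j |b_j|
    n, p = X.shape
    w = self_weights(X, y)
    X_t = X / w  # rescaling columns turns the weighted penalty into a plain L1 penalty

    # Static SAFE test (El Ghaoui et al.), standing in for the paper's rule:
    # features failing the test are provably inactive and can be dropped.
    lam_max = np.max(np.abs(X_t.T @ y))
    thresh = lam - np.linalg.norm(X_t, axis=0) * np.linalg.norm(y) * (lam_max - lam) / lam_max
    keep = np.abs(X_t.T @ y) >= thresh

    beta = np.zeros(p)
    if keep.any():
        # sklearn's Lasso minimizes (1/(2n))||y - X b||^2 + alpha*||b||_1,
        # so alpha = lam / n matches the objective above.
        model = Lasso(alpha=lam / n, fit_intercept=False)
        model.fit(X_t[:, keep], y)
        beta[keep] = model.coef_ / w[keep]  # undo the column rescaling
    return beta

# Usage on synthetic data: two informative features among fifty.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
beta = swl_fit(X, y, lam=0.2 * np.max(np.abs(X.T @ y)))
print("selected features:", np.nonzero(beta)[0])

The rescaling step relies on the identity that minimizing (1/2)||y - Xb||^2 + lam * sum_j w_j |b_j| over b is equivalent to a standard Lasso in g_j = w_j b_j with columns x_j / w_j, so any off-the-shelf Lasso solver, and any standard safe screening test, can be reused unchanged.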
ISSN: 0924-669X, 1573-7497
DOI: 10.1007/s10489-022-03316-7