PSNA: A pathwise semismooth Newton algorithm for sparse recovery with optimal local convergence and oracle properties


Detailed Description

Bibliographic Details
Published in: Signal Processing 2022-05, Vol. 194, p. 108432, Article 108432
Authors: Huang, Jian; Jiao, Yuling; Lu, Xiliang; Shi, Yueyong; Yang, Qinglong; Yang, Yuanyuan
Format: Article
Language: English
Online Access: Full text
Description
Abstract: We propose a pathwise semismooth Newton algorithm (PSNA) for sparse recovery in high-dimensional linear models. PSNA is derived from a formulation of the KKT conditions for the Lasso and Enet based on Newton derivatives. It solves the semismooth KKT equations efficiently by actively and continuously seeking the support of the regression coefficients along the solution path with warm starts. At each knot in the path, PSNA converges locally superlinearly for the Enet criterion and achieves the best possible convergence rate for the Lasso criterion, i.e., PSNA converges in just one step at the cost of two matrix-vector multiplications per iteration. Under certain regularity conditions on the design matrix and on the minimum magnitude of the nonzero elements of the target regression coefficients, we show that PSNA hits a solution with the same signs as the target regression coefficients and achieves a sharp estimation error bound in finitely many steps with high probability. Extensive simulation studies support our theoretical results and indicate that PSNA is competitive with or outperforms state-of-the-art Lasso solvers in terms of efficiency and accuracy.
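To illustrate the kind of iteration the abstract describes, here is a minimal sketch of a single-knot semismooth Newton step for the Lasso, based on the standard soft-thresholding fixed-point form of the KKT conditions. This is an illustrative reconstruction, not the authors' PSNA implementation: the function name `ssn_lasso`, the stopping rule, and the direct `numpy.linalg.solve` on the active-set normal equations are all choices made here for clarity (a pathwise solver would additionally warm-start this routine across a grid of `lam` values).

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding operator S_t(z)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ssn_lasso(X, y, lam, beta0=None, max_iter=50, tol=1e-10):
    """Semismooth-Newton-style active-set iteration for
        min_b  0.5/n * ||y - X b||^2 + lam * ||b||_1
    at a single penalty level lam (one "knot" of a path).

    Uses the fixed-point characterization of the KKT conditions:
        b = S_lam( b + X^T (y - X b) / n ).
    The active set is read off from the Newton derivative of this map,
    and a Newton step solves the restricted normal equations exactly.
    """
    n, p = X.shape
    beta = np.zeros(p) if beta0 is None else beta0.astype(float).copy()
    for _ in range(max_iter):
        # Fixed-point residual variable; its magnitude determines the support.
        z = beta + X.T @ (y - X @ beta) / n
        active = np.abs(z) > lam
        beta_new = np.zeros(p)
        if active.any():
            XA = X[:, active]
            signs = np.sign(z[active])
            # Newton step on the active set: restricted normal equations
            # with the lam * sign offset from the l1 subgradient.
            G = XA.T @ XA / n
            rhs = XA.T @ y / n - lam * signs
            beta_new[active] = np.linalg.solve(G, rhs)
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta
```

With an orthogonal design the solution reduces to soft-thresholding of the least-squares fit, which gives a quick sanity check: for `X = np.eye(4)`, `y = [3, 0.5, -2, 0]`, and `lam = 0.25` (so the effective threshold is `n * lam = 1`), the iteration recovers `[2, 0, -1, 0]` in one Newton step, matching the one-step local convergence the abstract claims for the Lasso criterion. Note the active-set Gram matrix can be singular when the active set exceeds `n`; a robust implementation would regularize or use the Enet form.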
ISSN:0165-1684
1872-7557
DOI:10.1016/j.sigpro.2021.108432