An Accelerated Maximally Split ADMM for a Class of Generalized Ridge Regression
Published in: IEEE Transactions on Neural Networks and Learning Systems, Feb. 2023, Vol. 34, No. 2, pp. 958-972
Main Authors: , ,
Format: Article
Language: English
Subjects:
Online Access: order full text
Abstract: Ridge regression (RR) is widely used in machine learning but faces computational challenges in big-data applications. To meet these challenges, this article develops a highly parallel new algorithm, the accelerated maximally split alternating direction method of multipliers (A-MS-ADMM), for a class of generalized RR (GRR) that allows a different regularization factor for each regression coefficient. Linear convergence of the new algorithm is established, along with its convergence ratio. Optimal algorithm parameters are derived for the GRR with a particular set of regularization factors, and a parameter-selection scheme for the GRR with general regularization factors is also discussed. The new algorithm is then applied to the training of single-layer feedforward neural networks. Experimental results on real-world benchmark datasets for regression and classification, together with comparisons against existing methods, demonstrate the fast convergence, low computational complexity, and high parallelism of the new algorithm.
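The per-coefficient regularization that defines the GRR class above is concrete enough to illustrate. The sketch below solves the GRR objective two ways in NumPy: directly via the normal equations, and with a textbook two-block ADMM that splits the loss from the penalty. This is a generic ADMM shown only for orientation, not the paper's maximally split A-MS-ADMM; the function names, the penalty parameter `rho`, and the iteration count are assumptions for the sketch.

```python
import numpy as np

def grr_closed_form(X, y, lam):
    """Closed-form GRR: minimize ||X w - y||^2 + sum_j lam_j * w_j^2
    via the normal equations (X^T X + diag(lam)) w = X^T y."""
    A = X.T @ X + np.diag(np.asarray(lam, dtype=float))
    return np.linalg.solve(A, X.T @ y)

def grr_admm(X, y, lam, rho=1.0, iters=200):
    """Generic two-block ADMM for the same GRR objective, splitting
    w = z so the loss and the per-coefficient penalty separate.
    NOTE: this is the standard textbook scheme, not the paper's
    maximally split A-MS-ADMM, which splits much further for parallelism."""
    n = X.shape[1]
    lam = np.asarray(lam, dtype=float)
    M = 2.0 * X.T @ X + rho * np.eye(n)  # w-update system matrix
    Xty2 = 2.0 * X.T @ y
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(iters):
        w = np.linalg.solve(M, Xty2 + rho * (z - u))  # loss block
        z = rho * (w + u) / (2.0 * lam + rho)         # penalty block, elementwise
        u = u + w - z                                 # dual update
    return z

# Hypothetical usage: heavier penalty on the last two coefficients.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0]) + 0.1 * rng.standard_normal(200)
lam = [0.1, 0.1, 0.1, 10.0, 10.0]
print(grr_closed_form(X, y, lam))
print(grr_admm(X, y, lam))  # both approach the same minimizer
```

With the heavier factors on the last two coefficients, both solvers shrink those coefficients toward zero while leaving the rest nearly unpenalized; this per-coefficient control is what the GRR class offers over ordinary RR's single scalar factor.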
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2021.3104840