Partial least squares, steepest descent, and conjugate gradient for regularized predictive modeling

Bibliographic details
Published in: AIChE Journal 2023-04, Vol. 69 (4), p. n/a
Main authors: Qin, S. Joe; Liu, Yiren; Tang, Shiqin
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: In this article, we explore the connection of partial least squares (PLS) to other regularized regression algorithms, including the Lasso and ridge regression, and consider a steepest descent alternative to the PLS algorithm. First, the PLS latent variable analysis is emphasized and formulated as a standalone procedure. The PLS connections to the conjugate gradient method, Krylov space, and the Cayley–Hamilton theorem for the matrix pseudo-inverse are explored based on known results in the literature. Comparisons of PLS with the Lasso and ridge regression are given in terms of the different resolutions along the regularization paths, leading to an explanation of why PLS sometimes does not outperform the Lasso and ridge regression. In an attempt to increase the resolution along the regularization paths, a steepest descent PLS is formulated as a regularized regression alternative to PLS and is compared to other regularized algorithms via simulations and an industrial case study.
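As a rough illustration of the resolution argument in the abstract (a minimal sketch, not code from the article): PLS is tuned over a small integer grid of latent-variable counts, while ridge regression and the Lasso sweep a continuous penalty, so their regularization paths can be sampled far more finely. The synthetic data and every parameter choice below are assumptions for illustration only.

```python
# Hypothetical sketch: compare the number of candidate models ("resolution")
# along the regularization paths of PLS, ridge regression, and the Lasso.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0                              # sparse true coefficients (assumed)
y = X @ beta + 0.5 * rng.standard_normal(n)

# PLS path: one candidate model per latent-variable count (coarse integer grid).
pls = [cross_val_score(PLSRegression(n_components=k), X, y, cv=5).mean()
       for k in range(1, 11)]

# Ridge/Lasso paths: one candidate model per penalty value (fine continuous grid).
alphas = np.logspace(-3, 2, 200)
ridge = [cross_val_score(Ridge(alpha=a), X, y, cv=5).mean() for a in alphas]
lasso = [cross_val_score(Lasso(alpha=a, max_iter=10_000), X, y, cv=5).mean()
         for a in alphas]

print(f"PLS:   best CV R^2 {max(pls):.3f} over {len(pls)} candidates")
print(f"Ridge: best CV R^2 {max(ridge):.3f} over {len(alphas)} candidates")
print(f"Lasso: best CV R^2 {max(lasso):.3f} over {len(alphas)} candidates")
```

Because the PLS path contains only as many points as there are latent variables, its best cross-validated model sits on a much coarser grid than the ridge or Lasso optimum, which is one way to read the abstract's explanation of why PLS sometimes does not outperform those methods.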
ISSN: 0001-1541 (print); 1547-5905 (online)
DOI: 10.1002/aic.17992