Partial least squares, steepest descent, and conjugate gradient for regularized predictive modeling
In this article, we explore the connection of partial least squares (PLS) to other regularized regression algorithms including the Lasso and ridge regression, and consider a steepest descent alternative to the PLS algorithm. First, the PLS latent variable analysis is emphasized and formulated as a standalone procedure. The PLS connections to the conjugate gradient, Krylov space, and the Cayley–Hamilton theorem for matrix pseudo-inverse are explored based on known results in the literature. Comparison of PLS with the Lasso and ridge regression are given in terms of the different resolutions along the regularization paths, leading to an explanation of why PLS sometimes does not outperform the Lasso and ridge regression. As an attempt to increase resolutions along the regularization paths, a steepest descent PLS is formulated as a regularized regression alternative to PLS and is compared to other regularized algorithms via simulations and an industrial case study.
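The abstract turns on two well-known facts that are easy to demonstrate: after k steps, conjugate gradient applied to the normal equations produces a coefficient vector in the k-dimensional Krylov space (the same space a k-component PLS model lives in), while steepest descent with early stopping generates many more, closely spaced models along its regularization path. The NumPy sketch below illustrates both ideas; it is not the authors' implementation, and the synthetic data, step counts, and function names are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data, purely for illustration (not from the paper).
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
y = X @ beta_true + 0.5 * rng.normal(size=n)

A = X.T @ X  # normal-equations matrix X'X
b = X.T @ y  # X'y, the negative gradient of 0.5*||y - X beta||^2 at beta = 0

def steepest_descent_path(A, b, n_steps=50):
    """Gradient descent on 0.5*||y - X beta||^2 via the normal equations.

    Each iterate is one point on a fine-grained regularization path:
    early stopping plays the role of the regularization parameter.
    """
    beta = np.zeros_like(b)
    path = []
    for _ in range(n_steps):
        r = b - A @ beta               # negative gradient (residual)
        alpha = (r @ r) / (r @ A @ r)  # exact line-search step for a quadratic
        beta = beta + alpha * r
        path.append(beta.copy())
    return path

def conjugate_gradient_path(A, b, n_steps=5):
    """Standard CG on A beta = b with beta_0 = 0.

    After k steps the iterate lies in the k-dimensional Krylov space
    span{b, Ab, ..., A^(k-1) b}, which is the known correspondence
    with a k-component PLS fit.
    """
    beta = np.zeros_like(b)
    r = b.copy()
    d = r.copy()
    path = []
    for _ in range(n_steps):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        beta = beta + alpha * d
        r_new = r - alpha * Ad
        d = r_new + ((r_new @ r_new) / (r @ r)) * d  # Fletcher-Reeves update
        r = r_new
        path.append(beta.copy())
    return path

sd_path = steepest_descent_path(A, b)
cg_path = conjugate_gradient_path(A, b)
print("steepest-descent path:", len(sd_path), "models;",
      "CG/PLS-style path:", len(cg_path), "models")
```

Running the sketch makes the abstract's resolution argument concrete: the CG/PLS-style path offers at most p coarse stopping points, while steepest descent offers one model per iteration, so an early-stopping rule can land between the PLS components.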
Published in: AIChE journal, 2023-04, Vol. 69 (4)
Main authors: Qin, S. Joe; Liu, Yiren; Tang, Shiqin
Format: Article
Language: English
Subjects: Algorithms; conjugate gradient; Conjugate gradient method; latent variable analysis; Least squares; partial least squares analysis; partial least squares regression; Prediction models; Regression; Regularization; regularized regression; steepest descent
Online access: Full text
container_end_page | n/a |
container_issue | 4 |
container_start_page | |
container_title | AIChE journal |
container_volume | 69 |
creator | Qin, S. Joe; Liu, Yiren; Tang, Shiqin |
description | In this article, we explore the connection of partial least squares (PLS) to other regularized regression algorithms including the Lasso and ridge regression, and consider a steepest descent alternative to the PLS algorithm. First, the PLS latent variable analysis is emphasized and formulated as a standalone procedure. The PLS connections to the conjugate gradient, Krylov space, and the Cayley–Hamilton theorem for matrix pseudo‐inverse are explored based on known results in the literature. Comparison of PLS with the Lasso and ridge regression are given in terms of the different resolutions along the regularization paths, leading to an explanation of why PLS sometimes does not outperform the Lasso and ridge regression. As an attempt to increase resolutions along the regularization paths, a steepest descent PLS is formulated as a regularized regression alternative to PLS and is compared to other regularized algorithms via simulations and an industrial case study. |
doi_str_mv | 10.1002/aic.17992 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0001-1541 |
ispartof | AIChE journal, 2023-04, Vol.69 (4), p.n/a |
issn | 0001-1541 1547-5905 |
language | eng |
recordid | cdi_proquest_journals_2788712541 |
source | Wiley Online Library All Journals |
subjects | Algorithms; conjugate gradient; Conjugate gradient method; latent variable analysis; Least squares; partial least squares analysis; partial least squares regression; Prediction models; Regression; Regularization; regularized regression; steepest descent |
title | Partial least squares, steepest descent, and conjugate gradient for regularized predictive modeling |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-15T01%3A17%3A08IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Partial%20least%20squares,%20steepest%20descent,%20and%20conjugate%20gradient%20for%20regularized%20predictive%20modeling&rft.jtitle=AIChE%20journal&rft.au=Qin,%20S.%20Joe&rft.date=2023-04&rft.volume=69&rft.issue=4&rft.epage=n/a&rft.issn=0001-1541&rft.eissn=1547-5905&rft_id=info:doi/10.1002/aic.17992&rft_dat=%3Cproquest_cross%3E2788712541%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2788712541&rft_id=info:pmid/&rfr_iscdi=true |