Nonmonotone Quasi–Newton-based conjugate gradient methods with application to signal processing
Founded upon a sparse estimate of the Hessian obtained by a recent diagonal quasi-Newton update, a conjugacy condition is given, and a class of conjugate gradient methods is developed as modifications of the Hestenes–Stiefel method. Under the given sparse approximation, the curvature condition is guaranteed regardless of the line search technique. Convergence analysis is conducted without any convexity assumption, based on a nonmonotone Armijo line search in which a forgetting factor is embedded to increase the probability of exploiting the most recent available information. Practical advantages of the method are demonstrated computationally on a set of CUTEr test functions as well as on well-known signal processing problems such as sparse recovery and nonnegative matrix factorization.
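The methods described in the abstract modify the Hestenes–Stiefel conjugate gradient update. As a point of reference only, the following is a minimal sketch of the classical Hestenes–Stiefel direction; it does not reproduce the paper's quasi-Newton-based conjugacy condition or its modified parameter, and the function name and the steepest-descent fallback are illustrative assumptions.

```python
import numpy as np

def hs_direction(g_new, g_old, d_old):
    """Classical Hestenes-Stiefel search direction (reference sketch only)."""
    y = g_new - g_old                    # gradient difference y_k = g_{k+1} - g_k
    denom = float(d_old @ y)
    if abs(denom) < 1e-12:               # guard against a vanishing denominator
        return -g_new                    # fall back to steepest descent
    beta_hs = float(g_new @ y) / denom   # Hestenes-Stiefel conjugacy parameter
    return -g_new + beta_hs * d_old
```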
Saved in:
Published in: | Numerical algorithms 2023-08, Vol.93 (4), p.1527-1541 |
---|---|
Main authors: | Aminifard, Zohre; Babaie–Kafaki, Saman; Dargahi, Fatemeh |
Format: | Article |
Language: | eng |
Subjects: | Algebra; Algorithms; Computer Science; Conjugate gradient method; Convexity; Mathematical analysis; Methods; Numeric Computing; Numerical Analysis; Optimization; Original Paper; Signal processing; Theory of Computation |
Online access: | Full text |
container_end_page | 1541 |
---|---|
container_issue | 4 |
container_start_page | 1527 |
container_title | Numerical algorithms |
container_volume | 93 |
creator | Aminifard, Zohre; Babaie–Kafaki, Saman; Dargahi, Fatemeh |
description | Founded upon a sparse estimate of the Hessian obtained by a recent diagonal quasi-Newton update, a conjugacy condition is given, and a class of conjugate gradient methods is developed as modifications of the Hestenes–Stiefel method. Under the given sparse approximation, the curvature condition is guaranteed regardless of the line search technique. Convergence analysis is conducted without any convexity assumption, based on a nonmonotone Armijo line search in which a forgetting factor is embedded to increase the probability of exploiting the most recent available information. Practical advantages of the method are demonstrated computationally on a set of CUTEr test functions as well as on well-known signal processing problems such as sparse recovery and nonnegative matrix factorization. |
doi_str_mv | 10.1007/s11075-022-01477-7 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 1017-1398 |
ispartof | Numerical algorithms, 2023-08, Vol.93 (4), p.1527-1541 |
issn | 1017-1398; 1572-9265 |
language | eng |
recordid | cdi_proquest_journals_2918606504 |
source | SpringerLink Journals - AutoHoldings |
subjects | Algebra; Algorithms; Computer Science; Conjugate gradient method; Convexity; Mathematical analysis; Methods; Numeric Computing; Numerical Analysis; Optimization; Original Paper; Signal processing; Theory of Computation |
title | Nonmonotone Quasi–Newton-based conjugate gradient methods with application to signal processing |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-06T23%3A03%3A46IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Nonmonotone%20Quasi%E2%80%93Newton-based%20conjugate%20gradient%20methods%20with%20application%20to%20signal%20processing&rft.jtitle=Numerical%20algorithms&rft.au=Aminifard,%20Zohre&rft.date=2023-08-01&rft.volume=93&rft.issue=4&rft.spage=1527&rft.epage=1541&rft.pages=1527-1541&rft.issn=1017-1398&rft.eissn=1572-9265&rft_id=info:doi/10.1007/s11075-022-01477-7&rft_dat=%3Cproquest_cross%3E2918606504%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2918606504&rft_id=info:pmid/&rfr_iscdi=true |
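The convergence analysis mentioned in the abstract rests on a nonmonotone Armijo line search with an embedded forgetting factor. The sketch below illustrates the general idea only: a trial step is accepted against a weighted average of recent objective values rather than the current one, with geometric weights favouring the most recent values. The specific weighting rule, the function name `nonmonotone_armijo`, and all parameter defaults are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, forget=0.7, delta=1e-4,
                       alpha0=1.0, rho=0.5, max_backtracks=30):
    """Backtracking Armijo search against a nonmonotone reference value.

    f_hist : recent objective values, oldest first (non-empty in normal use).
    forget : forgetting factor in (0, 1); the geometric weighting below,
             which favours the most recent values, is an illustrative
             choice and not the rule proposed in the paper.
    """
    if not f_hist:                         # degenerate case: plain monotone Armijo
        f_hist = [f(x)]
    # Reference value: weighted average of recent objective values.
    weights = np.array([forget ** (len(f_hist) - 1 - i) for i in range(len(f_hist))])
    ref = float(weights @ np.array(f_hist)) / float(weights.sum())

    slope = float(g @ d)                   # directional derivative; assumed negative
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= ref + delta * alpha * slope:
            return alpha                   # sufficient decrease w.r.t. the reference
        alpha *= rho                       # shrink the step and try again
    return alpha                           # return the last trial step
```

A typical call would pass the objective, the current iterate, a descent direction (for instance one produced by the Hestenes–Stiefel sketch above), the current gradient, and a short window of recent objective values.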