Non-smooth Non-convex Bregman Minimization: Unification and New Algorithms
We propose a unifying algorithm for non-smooth non-convex optimization. The algorithm approximates the objective function by a convex model function and finds an approximate (Bregman) proximal point of the convex model. This approximate minimizer of the model function yields a descent direction, along which the next iterate is found. Complemented with an Armijo-like line search strategy, we obtain a flexible algorithm for which we prove (subsequential) convergence to a stationary point under weak assumptions on the growth of the model function error. Special instances of the algorithm with a Euclidean distance function are, for example, gradient descent, forward–backward splitting, ProxDescent, without the common requirement of a “Lipschitz continuous gradient”. In addition, we consider a broad class of Bregman distance functions (generated by Legendre functions), replacing the Euclidean distance. The algorithm has a wide range of applications including many linear and nonlinear inverse problems in signal/image processing and machine learning.
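The abstract describes an algorithmic template: build a convex model of the objective at the current iterate, take an (approximate) Bregman proximal step on the model, and use the resulting point as a descent direction for an Armijo-like line search. The sketch below illustrates that template in the Euclidean special case, where the model step reduces to one forward–backward (proximal-gradient) step. It is a minimal illustration rather than the authors' reference implementation; all names and parameters (`model_step_forward_backward`, `rho`, `shrink`, ...) are chosen here for exposition, and the sufficient-decrease test uses a simple squared-step surrogate instead of the model-decrease measure analyzed in the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal map of t * ||.||_1, used in the l1-regularized example below."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def model_step_forward_backward(x, grad_g, prox_r, tau):
    """Exact minimizer of the forward-backward model
        y -> g(x) + <grad_g(x), y - x> + r(y) + ||y - x||^2 / (2 * tau),
    i.e. one proximal-gradient step (Euclidean distance case)."""
    return prox_r(x - tau * grad_g(x), tau)

def model_function_descent(f, grad_g, prox_r, x0, tau=1.0,
                           rho=1e-4, shrink=0.5, max_iter=500, tol=1e-8):
    """Model minimizer -> descent direction -> Armijo-like backtracking on f = g + r."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = model_step_forward_backward(x, grad_g, prox_r, tau)
        d = y - x                      # descent direction from the model minimizer
        if np.linalg.norm(d) <= tol:
            break
        t = 1.0
        # Backtrack until a sufficient decrease of the true objective is observed.
        while f(x + t * d) > f(x) - rho * t * np.linalg.norm(d) ** 2 and t > 1e-10:
            t *= shrink
        x = x + t * d
    return x
```

For instance, an l1-regularized least-squares objective f(x) = 0.5 * ||A x - b||^2 + lam * ||x||_1 can be passed as `f`, with `grad_g = lambda x: A.T @ (A @ x - b)` and `prox_r = lambda z, t: soft_threshold(z, lam * t)`. A second sketch, covering the Bregman-distance variant mentioned in the abstract, follows the record fields at the end of this page.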
Published in: | Journal of optimization theory and applications 2019-04, Vol.181 (1), p.244-278 |
---|---|
Main authors: | Ochs, Peter; Fadili, Jalal; Brox, Thomas |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 278 |
---|---|
container_issue | 1 |
container_start_page | 244 |
container_title | Journal of optimization theory and applications |
container_volume | 181 |
creator | Ochs, Peter; Fadili, Jalal; Brox, Thomas |
description | We propose a unifying algorithm for non-smooth non-convex optimization. The algorithm approximates the objective function by a convex model function and finds an approximate (Bregman) proximal point of the convex model. This approximate minimizer of the model function yields a descent direction, along which the next iterate is found. Complemented with an Armijo-like line search strategy, we obtain a flexible algorithm for which we prove (subsequential) convergence to a stationary point under weak assumptions on the growth of the model function error. Special instances of the algorithm with a Euclidean distance function are, for example, gradient descent, forward–backward splitting, ProxDescent, without the common requirement of a “Lipschitz continuous gradient”. In addition, we consider a broad class of Bregman distance functions (generated by Legendre functions), replacing the Euclidean distance. The algorithm has a wide range of applications including many linear and nonlinear inverse problems in signal/image processing and machine learning. |
doi_str_mv | 10.1007/s10957-018-01452-0 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0022-3239 |
ispartof | Journal of optimization theory and applications, 2019-04, Vol.181 (1), p.244-278 |
issn | 0022-3239; 1573-2878 |
language | eng |
recordid | cdi_proquest_journals_2150946680 |
source | Springer Nature - Complete Springer Journals |
subjects | Algorithms; Applications of Mathematics; Calculus of Variations and Optimal Control; Optimization; Computational geometry; Convexity; Engineering; Euclidean geometry; Image processing; Inverse problems; Legendre functions; Machine learning; Mathematical models; Mathematics; Mathematics and Statistics; Operations Research/Decision Theory; Optimization; Signal processing; Theory of Computation |
title | Non-smooth Non-convex Bregman Minimization: Unification and New Algorithms |
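The description field above also mentions replacing the Euclidean distance by Bregman distances generated by Legendre functions. As one concrete, purely illustrative choice (not taken from the paper), the Boltzmann–Shannon entropy h(x) = Σ_i x_i log x_i on the positive orthant generates a Kullback–Leibler-type Bregman distance, and the Bregman proximal step on a linear model then has the closed-form multiplicative (exponentiated-gradient) update sketched below.

```python
import numpy as np

def bregman_distance_entropy(y, x):
    """D_h(y, x) for h(x) = sum_i x_i * log(x_i) (Boltzmann-Shannon entropy),
    defined for x > 0 and y >= 0:
        D_h(y, x) = sum_i [ y_i * log(y_i / x_i) - y_i + x_i ]."""
    y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
    ratio = np.where(y > 0, y / x, 1.0)   # convention 0 * log 0 = 0
    return float(np.sum(y * np.log(ratio) - y + x))

def bregman_model_step_entropy(x, grad, tau):
    """Minimizer of  y -> <grad, y - x> + (1 / tau) * D_h(y, x)  for the entropy above.
    Setting the derivative with respect to y_i to zero gives
        log(y_i / x_i) = -tau * grad_i,  i.e.  y_i = x_i * exp(-tau * grad_i)."""
    return x * np.exp(-tau * grad)
```

Other Legendre generators (for example the Burg entropy, -Σ_i log x_i) lead to different closed-form updates; which generator is appropriate depends on the geometry of the problem and the chosen model function.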