Tight Convergence Rate in Subgradient Norm of the Proximal Point Algorithm
The proximal point algorithm has found many applications, and it plays a fundamental role in the understanding, design, and analysis of many first-order methods. In this paper, we derive the tight convergence rate in subgradient norm of the proximal point algorithm, which was conjectured by Taylor, Hendrickx and Glineur [SIAM J. Optim., 27 (2017), pp. 1283-1313]. This sort of convergence result in terms of the residual (sub)gradient norm is particularly interesting when considering dual methods, where the dual residual gradient norm corresponds to the primal distance to feasibility.
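For context, the iteration in question is x_{k+1} = prox_{λf}(x_k); the optimality condition of the proximal subproblem gives the residual subgradient (x_k - x_{k+1})/λ ∈ ∂f(x_{k+1}), and the norm of this vector is the quantity whose tight decay rate the paper establishes. Below is a minimal sketch, not taken from the paper, that runs the proximal point iteration on the illustrative choice f(x) = ||x||_1 (whose proximal operator is soft-thresholding) and prints the residual subgradient norm at each step; the step size and test problem are assumptions made purely for demonstration.

```python
# Minimal sketch (illustrative only): proximal point iteration on f(x) = ||x||_1,
# tracking the residual subgradient (x_k - x_{k+1}) / lam, which lies in the
# subdifferential of f at x_{k+1} by optimality of the prox subproblem.
import numpy as np


def prox_l1(v, lam):
    # Proximal operator of lam * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)


def proximal_point(x0, lam=0.3, iters=10):
    x = x0.astype(float)
    for k in range(iters):
        x_next = prox_l1(x, lam)
        # (x - x_next) / lam is a subgradient of f at x_next.
        g = (x - x_next) / lam
        print(f"iter {k:2d}: residual subgradient norm = {np.linalg.norm(g):.6f}")
        x = x_next
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    proximal_point(rng.standard_normal(5))
```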
Main authors: | Gu, Guoyong; Yang, Junfeng |
---|---|
Format: | Article |
Language: | eng |
Subjects: | Mathematics - Optimization and Control |
Online access: | Order full text |
creator | Gu, Guoyong ; Yang, Junfeng |
format | Article |
creationdate | 2023-01-09 |
rights | http://creativecommons.org/licenses/by/4.0 |
identifier | DOI: 10.48550/arxiv.2301.03175 |
language | eng |
recordid | cdi_arxiv_primary_2301_03175 |
source | arXiv.org |
subjects | Mathematics - Optimization and Control |
title | Tight Convergence Rate in Subgradient Norm of the Proximal Point Algorithm |
url | https://arxiv.org/abs/2301.03175 |