Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization



Bibliographic Details
Published in: Journal of optimization theory and applications, 2019-05, Vol. 181 (2), p. 541-566
Authors: Hien, Le Thi Khanh; Nguyen, Cuong V.; Xu, Huan; Lu, Canyi; Feng, Jiashi
Format: Article
Language: English
Online access: Full text
Description: We consider the problem of minimizing the sum of an average function of a large number of smooth convex components and a general, possibly non-differentiable, convex function. Although many methods have been proposed to solve this problem under the assumption that the sum is strongly convex, few methods support the non-strongly convex case. Adding a small quadratic regularization is a common device for tackling non-strongly convex problems; however, it may destroy the sparsity of solutions or degrade the performance of the algorithms. Avoiding this device, we propose an accelerated randomized mirror descent method for solving this problem without the strong convexity assumption. Our method extends the deterministic accelerated proximal gradient methods of Paul Tseng and can be applied even when proximal points are computed inexactly. We also propose a scheme for solving the problem when the component functions are non-smooth.
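The composite problem described in the abstract has the form min_x (1/n) Σ_i f_i(x) + g(x), with each f_i smooth and convex and g convex but possibly non-differentiable. The sketch below illustrates that problem class on a toy least-squares-plus-ℓ1 instance, using a deterministic accelerated proximal gradient loop in the spirit of Tseng's methods that the paper builds on. It is not the authors' algorithm: their method samples one component gradient per iteration and replaces the Euclidean proximal step with a mirror (Bregman) step, while all data and parameter choices here are illustrative.

```python
import numpy as np

# Toy instance of the composite problem from the abstract:
#   min_x  (1/n) * sum_i f_i(x) + g(x)
# with smooth components f_i(x) = 0.5 * (a_i . x - b_i)^2 and the
# non-differentiable regularizer g(x) = lam * ||x||_1.
rng = np.random.default_rng(0)
n, d, lam = 200, 20, 0.1
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def objective(x):
    return 0.5 * np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum()

def prox_l1(v, t):
    # Proximal operator of t * lam * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

# Accelerated proximal gradient loop (FISTA-style momentum).  This shows
# only the acceleration pattern; the paper's method instead randomizes the
# gradient computation and works with a general mirror map.
L = np.linalg.norm(A, 2) ** 2 / n   # Lipschitz constant of the smooth part
x, y, t = np.zeros(d), np.zeros(d), 1.0
for _ in range(300):
    grad = A.T @ (A @ y - b) / n            # full gradient of the average
    x_new = prox_l1(y - grad / L, 1.0 / L)  # forward-backward step
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
    x, t = x_new, t_new

print(f"objective at solution: {objective(x):.4f}")
```

Keeping g as an explicit proximal term, rather than folding in a small quadratic regularizer, is exactly what preserves the sparsity of the ℓ1 solution that the abstract warns can be lost.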
DOI: 10.1007/s10957-018-01469-5
ISSN: 0022-3239
EISSN: 1573-2878
Publisher: Springer US, New York
Subjects: Algorithms; Applications of Mathematics; Calculus of Variations and Optimal Control; Optimization; Computational geometry; Convexity; Descent; Engineering; Mathematics; Mathematics and Statistics; Operations Research/Decision Theory; Randomization; Regularization; Theory of Computation