Estimate Sequences for Variance-Reduced Stochastic Composite Optimization
In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. This point of view covers the stochastic gradient descent method and variants of the SAGA and SVRG approaches, and it has several advantages: (i) we provide a generic proof of convergence for the aforementioned methods; (ii) we show that the resulting SVRG variant is adaptive to strong convexity; (iii) we naturally obtain new algorithms with the same guarantees; (iv) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we show that this viewpoint is useful for obtaining new accelerated algorithms in the sense of Nesterov.
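Two pieces of background may help when reading the abstract. First, Nesterov's estimate sequence, the concept the title refers to, has a standard textbook definition (this is general background, not a result of the paper): a pair of sequences $\{\varphi_k\}_{k\ge 0}$ of functions and $\{\lambda_k\}_{k\ge 0}$ of scalars with $\lambda_k \to 0$ is an estimate sequence of $f$ if, for all $x$ and all $k \ge 0$,

$$\varphi_k(x) \le (1-\lambda_k)\, f(x) + \lambda_k\, \varphi_0(x).$$

If the iterates additionally satisfy $f(x_k) \le \min_x \varphi_k(x)$, then

$$f(x_k) - f^\star \le \lambda_k \big(\varphi_0(x^\star) - f^\star\big) \to 0,$$

so the rate at which $\lambda_k$ vanishes is the convergence rate of the method.

Second, the SVRG-style variance-reduced gradient estimator mentioned in the abstract can be sketched in a few lines. The following is a minimal illustrative sketch of standard SVRG for $f(x) = \frac{1}{n}\sum_i f_i(x)$, not the paper's algorithm; the function names, step size, and loop lengths are assumptions chosen for the example.

```python
import numpy as np

def svrg(grad_i, x0, n, step, n_epochs=20, m=None, rng=None):
    """Minimal SVRG sketch (illustrative, not the paper's method).

    grad_i(x, i) is a hypothetical helper returning the gradient of the
    i-th component f_i at x, for the finite sum f = (1/n) * sum_i f_i.
    """
    rng = np.random.default_rng() if rng is None else rng
    m = n if m is None else m  # inner-loop length, commonly taken to be O(n)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_epochs):
        x_anchor = x.copy()
        # Full gradient at the anchor point, recomputed once per epoch.
        full_grad = sum(grad_i(x_anchor, i) for i in range(n)) / n
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced estimate: unbiased for the full gradient,
            # with variance shrinking as x and x_anchor near the optimum.
            g = grad_i(x, i) - grad_i(x_anchor, i) + full_grad
            x = x - step * g
    return x

# Toy usage: least squares with f_i(x) = 0.5 * (A[i] @ x - b[i]) ** 2.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
b = A @ np.ones(5)
x_hat = svrg(lambda x, i: A[i] * (A[i] @ x - b[i]), np.zeros(5),
             n=100, step=0.01, n_epochs=30)
```

The anchor-point correction `- grad_i(x_anchor, i) + full_grad` is what distinguishes SVRG from plain stochastic gradient descent: the estimator stays unbiased while its variance vanishes near the solution, which is what permits constant step sizes and linear convergence on strongly convex problems.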
Saved in:
Published in: | arXiv.org, 2019-05 |
---|---|
Main authors: | Kulunchakov, Andrei ; Mairal, Julien |
Format: | Article |
Language: | eng |
Subjects: | Algorithms ; Convexity ; Optimization |
Identifier: | EISSN: 2331-8422 |
Online access: | Full text |