Variable sample-size optimistic mirror descent algorithm for stochastic mixed variational inequalities
In this paper, we propose a variable sample-size optimistic mirror descent algorithm under the Bregman distance for a class of stochastic mixed variational inequalities. Different from conventional variable sample-size extragradient algorithms, which evaluate the expected mapping twice at each iteration…
Saved in:
Published in: | Journal of global optimization 2024-05, Vol.89 (1), p.143-170 |
---|---|
Main authors: | Yang, Zhen-Ping ; Zhao, Yong ; Lin, Gui-Hua |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 170 |
---|---|
container_issue | 1 |
container_start_page | 143 |
container_title | Journal of global optimization |
container_volume | 89 |
creator | Yang, Zhen-Ping ; Zhao, Yong ; Lin, Gui-Hua
description | In this paper, we propose a variable sample-size optimistic mirror descent algorithm under the Bregman distance for a class of stochastic mixed variational inequalities. Different from conventional variable sample-size extragradient algorithms, which evaluate the expected mapping twice at each iteration, our algorithm requires only one evaluation of the expected mapping and hence can significantly reduce the computational load. In the monotone case, the proposed algorithm achieves an O(1/t) ergodic convergence rate in terms of the expected restricted gap function and, under the strongly generalized monotonicity condition, it attains a locally linear convergence rate of the Bregman distance between iterates and solutions when the sample size increases geometrically. Furthermore, we derive some results on stochastic local stability under the generalized monotonicity condition. Numerical experiments indicate that the proposed algorithm compares favorably with some existing methods. |
doi_str_mv | 10.1007/s10898-023-01346-0 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0925-5001 |
ispartof | Journal of global optimization, 2024-05, Vol.89 (1), p.143-170 |
issn | 0925-5001 ; 1573-2916
language | eng |
recordid | cdi_proquest_journals_3047006937 |
source | SpringerLink Journals - AutoHoldings |
subjects | Algorithms ; Approximation ; Computer Science ; Convergence ; Inequalities ; Mapping ; Mathematics ; Mathematics and Statistics ; Operations Research/Decision Theory ; Optimization ; Real Functions
title | Variable sample-size optimistic mirror descent algorithm for stochastic mixed variational inequalities |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T20%3A14%3A28IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Variable%20sample-size%20optimistic%20mirror%20descent%20algorithm%20for%20stochastic%20mixed%20variational%20inequalities&rft.jtitle=Journal%20of%20global%20optimization&rft.au=Yang,%20Zhen-Ping&rft.date=2024-05-01&rft.volume=89&rft.issue=1&rft.spage=143&rft.epage=170&rft.pages=143-170&rft.issn=0925-5001&rft.eissn=1573-2916&rft_id=info:doi/10.1007/s10898-023-01346-0&rft_dat=%3Cproquest_cross%3E3047006937%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3047006937&rft_id=info:pmid/&rfr_iscdi=true |
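The abstract describes a single-call optimistic scheme: each iteration reuses the previously computed stochastic gradient for a look-ahead step and makes only one fresh mini-batch evaluation of the expected mapping, with the batch size growing geometrically. Since only the abstract is available in this record, the following is a rough sketch of that general idea under the Euclidean Bregman distance, not the authors' exact method; the toy operator F, the step size `eta`, and the growth factor 1.05 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Toy strongly monotone affine operator F(x) = A x + b,
# with A = S + K, S symmetric positive definite, K skew-symmetric.
S = rng.standard_normal((n, n))
S = S @ S.T / n + np.eye(n)
K = rng.standard_normal((n, n))
K = (K - K.T) / 2
A = S + K
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, -b)  # unconstrained VI solution, F(x*) = 0

def F_sample(x, batch):
    """Stochastic oracle: average of `batch` noisy evaluations of F(x)."""
    noise = rng.standard_normal((batch, n)).mean(axis=0)
    return A @ x + b + noise

eta = 0.05                      # illustrative step size
x = np.zeros(n)
g_prev = F_sample(x, 1)         # stored gradient from the "previous" iteration
for t in range(1, 201):
    batch = int(np.ceil(1.05 ** t))   # geometrically increasing sample size
    w = x - eta * g_prev              # look-ahead reuses the STORED gradient
    g = F_sample(w, batch)            # the only fresh oracle call this iteration
    x = x - eta * g                   # mirror step (Euclidean Bregman distance)
    g_prev = g
```

A conventional extragradient method would call `F_sample` twice per iteration (once at `x` for the look-ahead, once at `w` for the update); the optimistic variant above halves the oracle cost by recycling `g_prev`. With a general Bregman distance the two updates become mirror/prox steps, and the convex term of the mixed VI would enter through a proximal operator; here that term is zero, so both reduce to plain gradient steps.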