Algorithms of Inertial Mirror Descent in Convex Problems of Stochastic Optimization
A minimization problem for the mathematical expectation of a convex loss function over a given convex compact set X ⊂ R^N is treated. It is assumed that the oracle sequentially returns stochastic subgradients of the loss function at the current points, with a uniformly bounded second moment. The aim is to modify the well-known mirror descent method proposed by A.S. Nemirovsky and D.B. Yudin in 1979, which extends the standard gradient method.
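For orientation, below is a minimal sketch of the classical (non-inertial) stochastic mirror descent scheme of Nemirovsky and Yudin, here on the probability simplex with the entropic mirror map. It is a baseline illustration only, not the inertial algorithm of the article; the oracle `noisy_grad`, the step-size rule, and the bound `L` are illustrative assumptions.

```python
import numpy as np

def mirror_descent_simplex(oracle, n, T, L=1.0):
    """Classical stochastic mirror descent on the probability simplex with the
    entropic mirror map (exponentiated gradient). This baseline uses the usual
    ~1/sqrt(t) steps and iterate averaging; it is NOT the article's IMD update."""
    x = np.full(n, 1.0 / n)          # start at the barycenter of the simplex
    x_avg = np.zeros(n)              # classical analysis bounds the averaged iterate
    for t in range(1, T + 1):
        g = oracle(x)                                   # stochastic subgradient at x
        gamma = np.sqrt(np.log(n)) / (L * np.sqrt(t))   # standard step-size choice
        w = x * np.exp(-gamma * (g - g.max()))          # entropic update (shift for stability)
        x = w / w.sum()                                 # renormalize back onto the simplex
        x_avg += (x - x_avg) / t                        # running average of iterates
    return x_avg

# Toy usage: minimize E<c + noise, x> over the simplex (optimum at the first vertex).
rng = np.random.default_rng(0)
c = np.linspace(0.0, 1.0, 20)
noisy_grad = lambda x: c + 0.1 * rng.standard_normal(c.size)
x_hat = mirror_descent_simplex(noisy_grad, n=20, T=2000, L=1.5)
```

According to the abstract, the inertial modification (IMD) removes the need for the final averaging step: the error bound is stated for the current point itself.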
Saved in:
Published in: | Automation and remote control, 2018, Vol.79 (1), p.78-88 |
---|---|
Author: | Nazin, A. V. |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full Text |
container_end_page | 88 |
---|---|
container_issue | 1 |
container_start_page | 78 |
container_title | Automation and remote control |
container_volume | 79 |
creator | Nazin, A. V. |
description | A minimization problem for the mathematical expectation of a convex loss function over a given convex compact set X ⊂ R^N is treated. It is assumed that the oracle sequentially returns stochastic subgradients of the loss function at the current points, with a uniformly bounded second moment. The aim is to modify the well-known mirror descent method proposed by A.S. Nemirovsky and D.B. Yudin in 1979, which extends the standard gradient method. First, the idea of the new method, called Inertial Mirror Descent (IMD), is demonstrated on a deterministic optimization problem in R^N with continuous time; in particular, in the Euclidean case it realizes the heavy-ball method, and the new method requires no additional averaging of points. A discrete IMD algorithm is then described, and an upper bound on the objective-function error (i.e., the difference between the current mean losses and their minimum) is proved. |
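The description notes that, in the Euclidean case, the continuous-time IMD realizes the heavy-ball method. As an illustration of that special case only (not the article's IMD on a general convex compact), a discretized Polyak heavy-ball iteration might look as follows; the step size `gamma`, the momentum `beta`, and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def heavy_ball(grad, x0, gamma=0.05, beta=0.9, iters=500):
    """Polyak's heavy-ball iteration
        x_{k+1} = x_k - gamma * grad(x_k) + beta * (x_k - x_{k-1}),
    an Euler-type discretization of the ODE  x'' + a*x' + grad f(x) = 0."""
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        x_next = x - gamma * grad(x) + beta * (x - x_prev)  # gradient step plus inertia
        x_prev, x = x, x_next
    return x

# Toy usage: minimize f(x) = 0.5 * ||A x - b||^2 (unconstrained Euclidean case).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_min = heavy_ball(lambda x: A.T @ (A @ x - b), x0=[0.0, 0.0])
```

The constrained, non-Euclidean setting of the article would replace the plain gradient step by a mirror step with respect to a suitable distance-generating function; the article's exact update rule is not reproduced here.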
doi_str_mv | 10.1134/S0005117918010071 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0005-1179 |
ispartof | Automation and remote control, 2018, Vol.79 (1), p.78-88 |
issn | 0005-1179 (print); 1608-3032 (electronic) |
language | eng |
recordid | cdi_proquest_journals_1993616611 |
source | SpringerNature Journals |
subjects | Computer-Aided Engineering (CAD, CAE) and Design; Calculus of Variations and Optimal Control; Optimization; Control; Descent; Mathematics; Mathematics and Statistics; Mechanical Engineering; Mechatronics; Robotics; Systems Theory; Topical Issue; Upper bounds |
title | Algorithms of Inertial Mirror Descent in Convex Problems of Stochastic Optimization |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-04T20%3A35%3A58IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Algorithms%20of%20Inertial%20Mirror%20Descent%20in%20Convex%20Problems%20of%20Stochastic%20Optimization&rft.jtitle=Automation%20and%20remote%20control&rft.au=Nazin,%20A.%20V.&rft.date=2018&rft.volume=79&rft.issue=1&rft.spage=78&rft.epage=88&rft.pages=78-88&rft.issn=0005-1179&rft.eissn=1608-3032&rft_id=info:doi/10.1134/S0005117918010071&rft_dat=%3Cproquest_cross%3E1993616611%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1993616611&rft_id=info:pmid/&rfr_iscdi=true |