The continuous stochastic gradient method: part II–application and numerics
Published in: | Computational optimization and applications, 2024-04, Vol. 87 (3), p. 977-1008 |
---|---|
Main authors: | Grieshammer, Max; Pflug, Lukas; Stingl, Michael; Uihlein, Andrian |
Format: | Article |
Language: | English |
Subjects: | Approximation; Convergence; Convex and Discrete Geometry; Management Science; Mathematics; Mathematics and Statistics; Numerical analysis; Operations Research; Operations Research/Decision Theory; Optimization; Statistics; Topology optimization |
Online access: | Full text |
description | In this contribution, we present a numerical analysis of the continuous stochastic gradient (CSG) method, including applications from topology optimization and convergence rates. In contrast to standard stochastic gradient optimization schemes, CSG does not discard old gradient samples from previous iterations. Instead, design-dependent integration weights are calculated to form a convex combination that approximates the true gradient at the current design. As the approximation error vanishes in the course of the iterations, CSG represents a hybrid approach, starting off like a purely stochastic method and behaving like a full gradient scheme in the limit. In this work, the efficiency of CSG is demonstrated for practically relevant applications from topology optimization. These settings are characterized by both a large number of optimization variables and an objective function whose evaluation requires the numerical computation of multiple integrals concatenated in a nonlinear fashion. Such problems could not be solved by any existing optimization method before. Lastly, with regard to convergence rates, first estimates are provided and confirmed with the help of numerical experiments. |
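The abstract's description of the method (old gradient samples are retained and recombined through design-dependent integration weights into a convex combination approximating the full gradient) is concrete enough to illustrate with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: the one-dimensional Voronoi-cell weights are a simplified stand-in for the paper's design-dependent weights, and the names `csg_minimize`, `grad_j`, and the toy objective are all hypothetical.

```python
import numpy as np

def csg_minimize(grad_j, u0, steps=300, lr=0.2, rng=None):
    """Illustrative CSG-style loop for J(u) = E_x[j(u, x)], x ~ U[0, 1).

    Stores every (parameter, gradient) sample and, in each iteration,
    recombines *all* stored gradients with convex integration weights
    instead of discarding old samples, as the abstract describes.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    u = np.asarray(u0, dtype=float)
    xs, gs = [], []                    # stored parameter samples / gradients
    for _ in range(steps):
        x = rng.random()               # draw a new integration point
        xs.append(x)
        gs.append(grad_j(u, x))        # gradient sample at the current design
        # Integration weights: length of each stored sample's Voronoi
        # cell in [0, 1). (A simplified stand-in for the paper's
        # design-dependent weights.)
        pts = np.asarray(xs)
        order = np.argsort(pts)
        mids = (pts[order][1:] + pts[order][:-1]) / 2.0
        edges = np.concatenate(([0.0], mids, [1.0]))
        w = np.empty_like(pts)
        w[order] = np.diff(edges)      # nonnegative, sums to 1
        # Convex combination of all stored gradients approximates the
        # full gradient at the current design; the older gradients were
        # computed at earlier designs, which is exactly the reuse the
        # abstract describes.
        g_hat = sum(wi * gi for wi, gi in zip(w, gs))
        u = u - lr * g_hat             # plain gradient step
    return u

# Toy problem: J(u) = E_x[(u - x)^2] with minimizer u* = 0.5.
u_opt = csg_minimize(grad_j=lambda u, x: 2.0 * (u - x), u0=np.array([0.0]))
print(u_opt)  # approaches [0.5] as the weighted approximation improves
```

The sketch also shows the hybrid behavior the abstract mentions: with a single stored sample the update is a plain stochastic gradient step, while with many samples the convex combination approaches the full gradient. Re-sorting all samples every iteration is O(n log n) per step and is done here only for clarity.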
DOI | 10.1007/s10589-023-00540-w |
publisher | New York: Springer US |
rights | The Author(s) 2023, corrected publication 2023. Published under a Creative Commons Attribution 4.0 license (http://creativecommons.org/licenses/by/4.0/). |
orcid | https://orcid.org/0000-0002-0650-3747 |
ISSN | 0926-6003 |
EISSN | 1573-2894 |
source | SpringerLink Journals |