Spectral Projected Subgradient Method for Nonsmooth Convex Optimization Problems

We consider constrained optimization problems with a nonsmooth objective function in the form of mathematical expectation. The Sample Average Approximation (SAA) is used to estimate the objective function and a variable sample size strategy is employed. The proposed algorithm combines an SAA subgradient with the spectral coefficient in order to provide a suitable direction which improves the performance of the first order method, as shown by numerical results. The step sizes are chosen from a predefined interval, and the almost sure convergence of the method is proved under standard assumptions in a stochastic environment. To enhance the performance of the proposed algorithm, we further specify the choice of the step size by introducing an Armijo-like procedure adapted to this framework. Considering the computational cost on machine learning problems, we conclude that the line search improves the performance significantly. Numerical experiments conducted on finite sum problems also reveal that the variable sample strategy outperforms the full sample approach.
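For illustration only, here is a minimal sketch of the kind of method the abstract describes: an SAA subgradient scaled by a safeguarded spectral (Barzilai-Borwein-type) coefficient, a step size from a predefined diminishing schedule, projection onto the feasible set, and a sample size that grows toward the full sample. This is not the authors' exact algorithm; every function name, safeguard, and parameter value below is an assumption.

```python
import numpy as np

def spectral_projected_subgradient(samples, subgrad, project, x0,
                                   n0=50, growth=1.05,
                                   lam_min=0.1, lam_max=2.0,
                                   max_iter=500, seed=0):
    """Illustrative sketch (not the paper's exact method): an SAA
    subgradient direction scaled by a safeguarded spectral
    (Barzilai-Borwein-type) coefficient, with a diminishing step
    size, projection onto the feasible set, and a sample size that
    grows geometrically toward the full sample."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    N = len(samples)
    n = float(n0)
    lam = 1.0
    x_prev = g_prev = None
    for k in range(max_iter):
        # SAA subgradient: average the sample subgradients over the current subsample
        idx = rng.choice(N, size=min(int(n), N), replace=False)
        g = np.mean([subgrad(x, samples[i]) for i in idx], axis=0)
        # spectral coefficient from the last step, safeguarded in [lam_min, lam_max]
        if x_prev is not None:
            s, y = x - x_prev, g - g_prev
            sy = float(s @ y)
            if sy > 1e-12:
                lam = min(max(float(s @ s) / sy, lam_min), lam_max)
        x_prev, g_prev = x.copy(), g.copy()
        alpha = 1.0 / (k + 1)          # step size from a predefined diminishing schedule
        x = project(x - alpha * lam * g)
        n = min(N, n * growth)         # variable sample size strategy
    return x

# Toy usage: minimize E|x - xi| over the box [-2, 2]; the minimizer
# is (the projection of) the median of the distribution.
rng = np.random.default_rng(1)
data = rng.normal(0.5, 1.0, size=1000)
x = spectral_projected_subgradient(
    data,
    subgrad=lambda x, xi: np.sign(x - xi),    # a subgradient of |x - xi|
    project=lambda z: np.clip(z, -2.0, 2.0),  # Euclidean projection onto the box
    x0=np.array([0.0]))
```

The spectral safeguard interval and the 1/(k+1) schedule stand in for the paper's predefined step-size interval and Armijo-like line search, which are more elaborate than this sketch.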

Detailed Description

Bibliographic Details
Published in: arXiv.org, 2022-08
Main authors: Krejic, Natasa; Jerinkic, Natasa Krklec; Ostojic, Tijana
Format: Article
Language: English
Online access: Full text
description We consider constrained optimization problems with a nonsmooth objective function in the form of mathematical expectation. The Sample Average Approximation (SAA) is used to estimate the objective function and variable sample size strategy is employed. The proposed algorithm combines an SAA subgradient with the spectral coefficient in order to provide a suitable direction which improves the performance of the first order method as shown by numerical results. The step sizes are chosen from the predefined interval and the almost sure convergence of the method is proved under the standard assumptions in stochastic environment. To enhance the performance of the proposed algorithm, we further specify the choice of the step size by introducing an Armijo-like procedure adapted to this framework. Considering the computational cost on machine learning problems, we conclude that the line search improves the performance significantly. Numerical experiments conducted on finite sums problems also reveal that the variable sample strategy outperforms the full sample approach.
publisher Ithaca: Cornell University Library, arXiv.org
date 2022-08-08
rights 2022. This work is published under http://creativecommons.org/licenses/by/4.0/ (the "License").
identifier EISSN: 2331-8422
ispartof arXiv.org, 2022-08
issn 2331-8422
language eng
recordid cdi_proquest_journals_2643123641
source Free E-Journals
subjects Algorithms
Computational geometry
Convexity
Machine learning
Mathematical analysis
Optimization
Performance enhancement
title Spectral Projected Subgradient Method for Nonsmooth Convex Optimization Problems
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-21T14%3A57%3A50IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Spectral%20Projected%20Subgradient%20Method%20for%20Nonsmooth%20Convex%20Optimization%20Problems&rft.jtitle=arXiv.org&rft.au=Krejic,%20Natasa&rft.date=2022-08-08&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2643123641%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2643123641&rft_id=info:pmid/&rfr_iscdi=true