Self-Concordant Analysis of Frank-Wolfe Algorithms

Projection-free optimization via variants of the Frank-Wolfe (FW), a.k.a. conditional gradient, method has become a cornerstone of optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than projections, and some sparsity needs to be preserved. In a number of applications, e.g. Poisson inverse problems or quantum state tomography, the loss is given by a self-concordant (SC) function with unbounded curvature, implying the absence of theoretical guarantees for existing FW methods. We use the theory of SC functions to provide a new adaptive step size for FW methods and prove a global convergence rate of O(1/k) after k iterations. If the problem admits a stronger local linear minimization oracle, we construct a novel FW method with a linear convergence rate for SC functions.
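The generic FW template the abstract refers to can be sketched as follows. This is an illustrative implementation using the classical open-loop step size 2/(k+2), not the paper's adaptive self-concordance-based step; the simplex least-squares objective is an assumed toy example, chosen because its linear minimization oracle is a one-line vertex lookup:

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=200):
    """Generic Frank-Wolfe loop.

    grad: gradient oracle of the objective f
    lmo:  linear minimization oracle, s = argmin_{s in C} <g, s>
    Uses the classical step size 2/(k+2); the paper replaces this
    with an adaptive step size derived from self-concordance constants.
    """
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        s = lmo(g)                # solve the linear subproblem over the feasible set
        gamma = 2.0 / (k + 2.0)   # illustrative open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy example: minimize f(x) = ||x - b||^2 over the probability simplex.
b = np.array([0.1, 0.5, 0.2])
grad = lambda x: 2.0 * (x - b)

def lmo(g):
    # On the simplex, the LMO returns the vertex with the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x0 = np.ones(3) / 3.0
x_star = frank_wolfe(grad, lmo, x0)
```

Because each iterate is a convex combination of at most k+1 simplex vertices, the method stays projection-free and preserves sparsity, which is the appeal the abstract highlights.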

Bibliographic Details
Main Authors: Dvurechensky, Pavel, Ostroukhov, Petr, Safin, Kamil, Shtern, Shimrit, Staudigl, Mathias
Format: Article
Language: English
Subjects:
Online Access: Order full text
creator Dvurechensky, Pavel
Ostroukhov, Petr
Safin, Kamil
Shtern, Shimrit
Staudigl, Mathias
description Projection-free optimization via variants of the Frank-Wolfe (FW), a.k.a. conditional gradient, method has become a cornerstone of optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than projections, and some sparsity needs to be preserved. In a number of applications, e.g. Poisson inverse problems or quantum state tomography, the loss is given by a self-concordant (SC) function with unbounded curvature, implying the absence of theoretical guarantees for existing FW methods. We use the theory of SC functions to provide a new adaptive step size for FW methods and prove a global convergence rate of O(1/k) after k iterations. If the problem admits a stronger local linear minimization oracle, we construct a novel FW method with a linear convergence rate for SC functions.
doi_str_mv 10.48550/arxiv.2002.04320
format Article
creationdate 2020-02-11
rights http://arxiv.org/licenses/nonexclusive-distrib/1.0 (open access, free to read)
identifier DOI: 10.48550/arxiv.2002.04320
language eng
recordid cdi_arxiv_primary_2002_04320
source arXiv.org
subjects Computer Science - Learning
Mathematics - Optimization and Control
Statistics - Computation
title Self-Concordant Analysis of Frank-Wolfe Algorithms
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-27T14%3A27%3A41IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Self-Concordant%20Analysis%20of%20Frank-Wolfe%20Algorithms&rft.au=Dvurechensky,%20Pavel&rft.date=2020-02-11&rft_id=info:doi/10.48550/arxiv.2002.04320&rft_dat=%3Carxiv_GOX%3E2002_04320%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true