An Equivalence Between Sparse Approximation and Support Vector Machines

This article shows a relationship between two different approximation techniques: support vector machines (SVM), proposed by V. Vapnik (1995), and a sparse approximation scheme that resembles the basis pursuit denoising algorithm (Chen, 1995; Chen, Donoho, & Saunders, 1995). SVM is a technique that can be derived from the structural risk minimization principle (Vapnik, 1982) and can be used to estimate the parameters of several different approximation schemes, including radial basis functions, algebraic and trigonometric polynomials, B-splines, and some forms of multilayer perceptrons. Basis pursuit denoising is a sparse approximation technique in which a function is reconstructed using a small number of basis functions chosen from a large set (the dictionary). We show that if the data are noiseless, the modified version of basis pursuit denoising proposed in this article is equivalent to SVM in the following sense: applied to the same data set, the two techniques give the same solution, which is obtained by solving the same quadratic programming problem. In the appendix, we present a derivation of the SVM technique in the framework of regularization theory, rather than statistical learning theory, establishing a connection between SVM, sparse approximation, and regularization theory.
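To make the comparison concrete, the two objectives the abstract refers to can be sketched as follows. These are the standard textbook forms of basis pursuit denoising and of ε-insensitive SVM regression, not the modified variant derived in the article; λ, C, and ε denote the usual trade-off and tolerance parameters.

```latex
% Basis pursuit denoising (Chen, Donoho, & Saunders): reconstruct f from a
% dictionary {varphi_i}; the L1 penalty drives most coefficients a_i to zero.
\min_{a \in \mathbb{R}^n} \;
  \Big\| f - \sum_{i=1}^{n} a_i \varphi_i \Big\|_{L_2}^2
  \;+\; \lambda \sum_{i=1}^{n} |a_i|

% SVM regression with Vapnik's epsilon-insensitive loss, written as a
% regularization problem over a kernel space H (the appendix's viewpoint):
\min_{f \in \mathcal{H}} \;
  \sum_{j=1}^{\ell} \big| y_j - f(x_j) \big|_{\varepsilon}
  \;+\; \lambda \, \| f \|_{\mathcal{H}}^2 ,
\qquad
  |u|_{\varepsilon} := \max\!\big(0,\; |u| - \varepsilon\big)
```

The article's result is that, for noiseless data and a dictionary built from the kernel of H, a suitably modified version of the first problem and the second problem reduce to one and the same quadratic program.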
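The sparsity that the abstract attributes to both methods can also be observed directly. A minimal sketch, assuming scikit-learn is available (its SVR class implements generic ε-insensitive SVM regression and is not the article's own code; the sinc target is an illustrative choice): only the support vectors receive nonzero coefficients in the kernel expansion.

```python
# Minimal sparsity demo (assumes scikit-learn; SVR is a generic
# epsilon-insensitive SVM regressor, used only to illustrate the claim).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3.0, 3.0, size=(100, 1)), axis=0)
y = np.sinc(x).ravel()  # a smooth, noiseless target

# Fit f(x) = sum_i a_i K(x, x_i) + b with an RBF kernel.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(x, y)

# Points strictly inside the epsilon-tube get a_i = 0, so the expansion
# uses only a small subset of the kernel "dictionary" {K(., x_i)}:
print(f"{len(svr.support_)} of {len(x)} dictionary elements (support vectors) used")
```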

Bibliographic Details

Published in: Neural computation, 1998-08, Vol. 10 (6), p. 1455-1480
Author: Girosi, Federico
Format: Article
Language: English
Publisher: MIT Press
DOI: 10.1162/089976698300017269
ISSN: 0899-7667
EISSN: 1530-888X
PMID: 9698353
Subjects:
Applied sciences
Artificial intelligence
Automata. Abstract machines. Turing machines
Computer science; control theory; systems
Connectionism. Neural networks
Exact sciences and technology
Letters
Theoretical computing
Online Access: Full text