Kernel-dependent support vector error bounds

Model selection in support vector machines is usually carried out by minimizing the quotient of the radius of the smallest enclosing sphere of the data and the observed margin on the training set. We provide a new criterion taking the distribution within that sphere into account by considering the eigenvalue distribution of the Gram matrix of the data. Experimental results on real world data show that this new criterion provides a good prediction of the shape of the curve relating generalization error to kernel width.
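The quotient referred to in the abstract is the classical radius-margin quantity R²/γ², where R is the radius of the smallest sphere enclosing the mapped training points and γ is the observed margin. As an illustration only (not code from the paper), here is a minimal Python sketch estimating both quantities with numpy and scikit-learn; the dataset, kernel width, and the centroid-based radius surrogate are placeholder choices.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel

# Toy data and an RBF kernel width chosen only for illustration.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
gamma_rbf = 0.5

clf = SVC(kernel="rbf", gamma=gamma_rbf, C=1e3)  # large C approximates a hard margin
clf.fit(X, y)

# Margin: for f(x) = w.phi(x) + b the geometric margin is 1/||w||, and
# ||w||^2 = sum_ij (alpha_i y_i)(alpha_j y_j) k(x_i, x_j); dual_coef_
# already stores alpha_i * y_i for the support vectors.
K_sv = rbf_kernel(clf.support_vectors_, clf.support_vectors_, gamma=gamma_rbf)
w_norm_sq = (clf.dual_coef_ @ K_sv @ clf.dual_coef_.T).item()
margin = 1.0 / np.sqrt(w_norm_sq)

# Radius: crude upper bound on the smallest enclosing sphere in feature
# space, using the distance of each mapped point to the feature-space mean
# (a simple surrogate for the exact smallest-sphere computation).
K = rbf_kernel(X, gamma=gamma_rbf)
dist_to_mean_sq = np.diag(K) - 2 * K.mean(axis=1) + K.mean()
R = np.sqrt(dist_to_mean_sq.max())

print("radius-margin quotient R^2 / margin^2:", R**2 / margin**2)
```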

Detailed description

Saved in:
Bibliographic details
Main authors: Scholkopf, B, Shawe-Taylor, J, Smola, A.J, Williamson, R.C
Format: Conference Proceeding
Language: English
Subjects:
Online access: Full text
container_end_page 108
container_start_page 103
creator Scholkopf, B
Shawe-Taylor, J
Smola, A.J
Williamson, R.C
description Model selection in support vector machines is usually carried out by minimizing the quotient of the radius of the smallest enclosing sphere of the data and the observed margin on the training set. We provide a new criterion taking the distribution within that sphere into account by considering the eigenvalue distribution of the Gram matrix of the data. Experimental results on real world data show that this new criterion provides a good prediction of the shape of the curve relating generalization error to kernel width.
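The new criterion is built on the eigenvalue distribution of the kernel Gram matrix. Purely as a hypothetical sketch (the paper's actual bound is not reproduced here), the following numpy/scikit-learn snippet shows how that spectrum shifts with the RBF kernel width on placeholder data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel

X, _ = make_classification(n_samples=200, n_features=5, random_state=0)

# Small gamma (wide kernel) concentrates the spectral mass in a few leading
# eigenvalues; large gamma (narrow kernel) pushes K toward the identity and
# flattens the spectrum.
for gamma_rbf in (0.01, 0.1, 1.0, 10.0):
    K = rbf_kernel(X, gamma=gamma_rbf)
    eigvals = np.linalg.eigvalsh(K)[::-1]           # descending order
    top5_share = eigvals[:5].sum() / eigvals.sum()  # spectral mass in top 5
    print(f"gamma={gamma_rbf:5.2f}  top-5 eigenvalue share={top5_share:.3f}")
```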
doi 10.1049/cp:19991092
format Conference Proceeding
fulltext fulltext
identifier ISSN: 0537-9989; ISBN: 0852967217; ISBN: 9780852967218
publisher London: IEE
ispartof 9th International Conference on Artificial Neural Networks: ICANN '99, 1999, p.103-108
issn 0537-9989
language eng
recordid cdi_proquest_miscellaneous_26983234
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Algebra
Error analysis in numerical methods
Learning in AI
Neural nets
title Kernel-dependent support vector error bounds