A nested heuristic for parameter tuning in Support Vector Machines

The default approach for tuning the parameters of a Support Vector Machine (SVM) is a grid search in the parameter space. Different metaheuristics have been recently proposed as a more efficient alternative, but they have only shown to be useful in models with a low number of parameters. Complex models, involving many parameters, can be seen as extensions of simpler and easy-to-tune models, yielding a nested sequence of models of increasing complexity. In this paper we propose an algorithm which successfully exploits this nested property, with two main advantages versus the state of the art. First, our framework is general enough to allow one to address, with the very same method, several popular SVM parameter models encountered in the literature. Second, as algorithmic requirements we only need either an SVM library or any routine for the minimization of convex quadratic functions under linear constraints. In the computational study, we address Multiple Kernel Learning tuning problems for which grid search clearly would be infeasible, while our classification accuracy is comparable to that of ad hoc model-dependent benchmark tuning methods.
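For readers of this record, the sketch below illustrates the contrast the abstract draws: an exhaustive grid search over all parameters versus a nested pass that tunes a simpler model first and reuses its result when tuning a richer one. This is an illustration only, not the authors' algorithm; the scikit-learn usage, the toy dataset, and the parameter grids are assumptions made for this sketch.

    # Illustrative sketch only: contrasts the default grid search with a
    # simplified "nested" tuning pass in the spirit of the abstract (tune a
    # simple model first, then reuse its best parameter when tuning a richer
    # model).  NOT the authors' algorithm; data and grids are assumptions.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=20, random_state=0)

    # Default approach: exhaustive grid search over the full (C, gamma) grid.
    grid = GridSearchCV(
        SVC(kernel="rbf"),
        param_grid={"C": np.logspace(-2, 3, 6), "gamma": np.logspace(-4, 1, 6)},
        cv=5,
    )
    grid.fit(X, y)
    print("grid search:", grid.best_params_, grid.best_score_)

    # Nested-style pass: tune the simpler linear SVM (only C), then keep that
    # C fixed while tuning gamma for the richer RBF model.
    best_C = max(
        np.logspace(-2, 3, 6),
        key=lambda c: cross_val_score(SVC(kernel="linear", C=c), X, y, cv=5).mean(),
    )
    best_gamma = max(
        np.logspace(-4, 1, 6),
        key=lambda g: cross_val_score(SVC(kernel="rbf", C=best_C, gamma=g), X, y, cv=5).mean(),
    )
    print("nested pass:", {"C": best_C, "gamma": best_gamma})

The cost of the nested pass grows roughly additively in the number of parameters, while the full grid grows multiplicatively, which matches the abstract's remark that grid search becomes infeasible for models with many parameters such as Multiple Kernel Learning.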

Detailed Description

Bibliographic Details
Published in: Computers & operations research, 2014-03, Vol. 43, p. 328-334
Main authors: Carrizosa, Emilio, Martín-Barragán, Belén, Romero Morales, Dolores
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 334
container_issue
container_start_page 328
container_title Computers & operations research
container_volume 43
creator Carrizosa, Emilio
Martín-Barragán, Belén
Romero Morales, Dolores
description The default approach for tuning the parameters of a Support Vector Machine (SVM) is a grid search in the parameter space. Different metaheuristics have been recently proposed as a more efficient alternative, but they have only shown to be useful in models with a low number of parameters. Complex models, involving many parameters, can be seen as extensions of simpler and easy-to-tune models, yielding a nested sequence of models of increasing complexity. In this paper we propose an algorithm which successfully exploits this nested property, with two main advantages versus the state of the art. First, our framework is general enough to allow one to address, with the very same method, several popular SVM parameter models encountered in the literature. Second, as algorithmic requirements we only need either an SVM library or any routine for the minimization of convex quadratic functions under linear constraints. In the computational study, we address Multiple Kernel Learning tuning problems for which grid search clearly would be infeasible, while our classification accuracy is comparable to that of ad hoc model-dependent benchmark tuning methods.
doi_str_mv 10.1016/j.cor.2013.10.002
format Article
identifier ISSN: 0305-0548
ispartof Computers & operations research, 2014-03, Vol.43, p.328-334
issn 0305-0548
1873-765X
0305-0548
language eng
recordid cdi_proquest_miscellaneous_1677982191
source Elsevier ScienceDirect Journals
subjects Algorithmics. Computability. Computer arithmetics
Algorithms
Applied sciences
Artificial intelligence
Computer science; control theory; systems
Computer simulation
Data processing. List processing. Character string processing
Exact sciences and technology
Heuristic
Learning and adaptive systems
Libraries
Mathematical functions
Mathematical models
Mathematical problems
Memory organisation. Data processing
Multiple kernel learning
Nested heuristic
Operations research
Parameter tuning
Searching
Software
State of the art
Studies
Supervised classification
Support Vector Machines
Theoretical computing
Tuning
Variable neighborhood search
title A nested heuristic for parameter tuning in Support Vector Machines