A convergent decomposition algorithm for support vector machines
Published in: Computational Optimization and Applications, November 2007, Vol. 38, No. 2, pp. 217-234
Authors: Lucidi, S.; Palagi, L.; Risi, A.; Sciandrone, M.
Format: Article
Language: English
Online access: Full text
Abstract: In this work we consider nonlinear minimization problems with a single linear equality constraint and box constraints. In particular, we are interested in solving problems where the number of variables is so large that traditional optimization methods cannot be applied directly. Many interesting real-world problems lead to large-scale constrained problems with this structure. For example, the special subclass of problems with a convex quadratic objective function plays a fundamental role in the training of Support Vector Machines, a machine learning technique. For this subclass of convex quadratic problems, several convergent decomposition methods, based on the solution of a sequence of smaller subproblems, have been proposed. In this paper we define a new globally convergent decomposition algorithm that differs from previous methods in the rule for choosing the subproblem variables and in the presence of a proximal-point modification in the objective function of the subproblems. In particular, the new rule for sequentially selecting the subproblems appears well suited to tackling large-scale problems, while the proximal-point term allows us to ensure global convergence of the algorithm in the general case of a nonconvex objective function. Furthermore, we report preliminary numerical results on support vector classification problems with up to 100,000 variables.
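To make the decomposition idea from the abstract concrete, the sketch below implements a generic SMO-style working-set method for the SVM dual QP, with a proximal term `tau` added to each two-variable subproblem. This is an illustrative assumption, not the paper's algorithm: the paper proposes a different rule for selecting subproblem variables, and the parameters `C`, `tau`, and `tol` here are arbitrary choices for demonstration.

```python
import numpy as np

def svm_dual_decomposition(X, y, C=1.0, tau=1e-8, tol=1e-5, max_iter=10_000):
    """Illustrative SMO-style decomposition for the SVM dual QP:
        min  0.5 * a'Qa - sum(a)   s.t.  y'a = 0,  0 <= a <= C,
    with Q_ij = y_i * y_j * K(x_i, x_j) (linear kernel here).
    Each iteration picks a "most violating pair" (i, j) and solves the
    two-variable subproblem, augmented by a proximal term
    (tau/2)*||step||^2, in closed form.  Not the paper's selection rule."""
    n = len(y)
    Q = np.outer(y, y) * (X @ X.T)      # dual Hessian, linear kernel
    a = np.zeros(n)
    g = -np.ones(n)                     # gradient Qa - e at a = 0
    for _ in range(max_iter):
        yg = -y * g                     # KKT violation scores
        I_up = ((y > 0) & (a < C)) | ((y < 0) & (a > 0))
        I_low = ((y < 0) & (a < C)) | ((y > 0) & (a > 0))
        i = np.where(I_up)[0][np.argmax(yg[I_up])]
        j = np.where(I_low)[0][np.argmin(yg[I_low])]
        if yg[i] - yg[j] < tol:         # approximate KKT conditions hold
            break
        # feasible direction: a_i += y_i*t, a_j -= y_j*t keeps y'a = 0;
        # the +2*tau comes from the proximal term in the subproblem
        quad = Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j] + 2.0 * tau
        t = (yg[i] - yg[j]) / max(quad, 1e-12)
        t = min(t,
                C - a[i] if y[i] > 0 else a[i],   # box bound at i
                a[j] if y[j] > 0 else C - a[j])   # box bound at j
        a[i] += y[i] * t
        a[j] -= y[j] * t
        g += (y[i] * t) * Q[:, i] - (y[j] * t) * Q[:, j]
    return a

# sanity check on a tiny linearly separable 2-D problem
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -2.0], [-1.0, -2.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
a = svm_dual_decomposition(X, y, C=10.0)
w = (a * y) @ X                         # primal weights (linear kernel)
```

In a full-scale solver the working set would be larger than two variables and chosen by the paper's sequential rule; the proximal term is what makes each subproblem strongly convex, which is the mechanism the abstract credits for global convergence in the nonconvex case.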
DOI: 10.1007/s10589-007-9044-x
Publisher: Springer Nature B.V (New York)
ISSN: 0926-6003; EISSN: 1573-2894
Source: SpringerLink Journals
Subjects: Algorithms; Artificial intelligence; Decomposition; Machine learning; Mathematical models; Optimization; Problem solving; Studies; Support vector machines