An incremental least squares algorithm for large scale linear classification

► The training of a linear classifier is cast as a linear least-squares problem. ► An incremental recursive algorithm that performs a finite number of steps is used. ► Memory access is minimal: each training sample is used only once. ► The approach is suitable for (real-time) large scale linear classification. ► Experiments show that the approach is competitive with state-of-the-art algorithms.

Detailed Description

In this work we consider the problem of training a linear classifier under the assumption that the number of training samples is huge (in particular, the data may exceed memory capacity). We propose a linear least-squares formulation of the problem together with an incremental recursive algorithm that only requires storing a square matrix whose dimension equals the number of features. The algorithm, which is very simple to implement, converges to the solution while using each training sample only once, so it effectively sidesteps memory limitations and is a viable method for large scale linear classification and for real-time applications, provided that the number of features is not too large (say, of the order of thousands). Extensive computational experiments show that the proposed algorithm is at least competitive with state-of-the-art algorithms for large scale linear classification.
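The record does not give the authors' exact update formulas, but the description matches the shape of classical one-pass recursive least squares: keep a d × d inverse of the regularized Gram matrix, visit each sample once, and update it with the Sherman–Morrison identity so memory stays O(d²) regardless of the number of samples. A minimal sketch under that assumption (the function name, regularization constant, and toy data are illustrative, not taken from the paper):

```python
import numpy as np

def rls_fit(X, y, reg=1e-3):
    """One-pass recursive least squares: each sample is visited once.

    Maintains the weight vector w and the inverse P of the regularized
    Gram matrix, updating both per sample via the Sherman-Morrison
    identity, so storage is O(d^2) independent of the sample count."""
    d = X.shape[1]
    w = np.zeros(d)
    P = np.eye(d) / reg          # P = (reg * I)^{-1} initially
    for x, t in zip(X, y):       # t is the +/-1 class label
        Px = P @ x
        k = Px / (1.0 + x @ Px)  # gain vector
        w = w + k * (t - x @ w)  # correct w by the prediction error
        P = P - np.outer(k, Px)  # rank-one downdate of the inverse
    return w

# Toy usage: two Gaussian blobs in 5 dimensions, labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+2.0, 1.0, (200, 5)),
               rng.normal(-2.0, 1.0, (200, 5))])
y = np.concatenate([np.ones(200), -np.ones(200)])
w = rls_fit(X, y)
accuracy = np.mean(np.sign(X @ w) == y)
```

Because only w and the d × d matrix P are kept in memory, the cost per sample is O(d²) and the data never needs to be held in RAM at once, which is why such a scheme suits data sets larger than memory as long as d stays in the low thousands.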

Bibliographic Details
Published in: European journal of operational research, 2013-02, Vol. 224 (3), pp. 560-565
Authors: Cassioli, A., Chiavaioli, A., Manes, C., Sciandrone, M.
Format: Article
Language: English
Online access: Full text
DOI: 10.1016/j.ejor.2012.09.004
ISSN: 0377-2217
EISSN: 1872-6860
Source: Elsevier ScienceDirect Journals Complete
Subjects: Algorithms; Classification; Computer memory; Incremental algorithms; Large scale optimization; Linear classification; Machine learning; Mathematical problems; Operations research; Studies; Training