Large-Scale Maximum Margin Discriminant Analysis Using Core Vector Machines
Large-margin methods, such as support vector machines (SVMs), have been very successful in classification problems. Recently, maximum margin discriminant analysis (MMDA) was proposed that extends the large-margin idea to feature extraction. It often outperforms traditional methods such as kernel principal component analysis (KPCA) and kernel Fisher discriminant analysis (KFD). However, as in the SVM, its time complexity is cubic in the number of training points m, and it is thus computationally inefficient on massive data sets. In this paper, we propose a (1 + ε)²-approximation algorithm for obtaining the MMDA features by extending the core vector machine. The resultant time complexity is only linear in m, while its space complexity is independent of m. Extensive comparisons with the original MMDA, KPCA, and KFD on a number of large data sets show that the proposed feature extractor can improve classification accuracy, and is also faster than these kernel-based methods by over an order of magnitude.
Saved in:
Published in: | IEEE Transactions on Neural Networks 2008-04, Vol.19 (4), p.610-624 |
---|---|
Main authors: | Wai-Hung Tsang, I.; Kocsor, A.; Kwok, J.T.-Y. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 624 |
---|---|
container_issue | 4 |
container_start_page | 610 |
container_title | IEEE Transactions on Neural Networks |
container_volume | 19 |
creator | Wai-Hung Tsang, I.; Kocsor, A.; Kwok, J.T.-Y. |
description | Large-margin methods, such as support vector machines (SVMs), have been very successful in classification problems. Recently, maximum margin discriminant analysis (MMDA) was proposed that extends the large-margin idea to feature extraction. It often outperforms traditional methods such as kernel principal component analysis (KPCA) and kernel Fisher discriminant analysis (KFD). However, as in the SVM, its time complexity is cubic in the number of training points m, and it is thus computationally inefficient on massive data sets. In this paper, we propose a (1 + ε)²-approximation algorithm for obtaining the MMDA features by extending the core vector machine. The resultant time complexity is only linear in m, while its space complexity is independent of m. Extensive comparisons with the original MMDA, KPCA, and KFD on a number of large data sets show that the proposed feature extractor can improve classification accuracy, and is also faster than these kernel-based methods by over an order of magnitude. |
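The linear-in-m complexity claimed in the abstract comes from the core vector machine's reformulation of the kernel problem as a minimum enclosing ball (MEB) computation solved approximately on a small core-set. A minimal sketch of the underlying Bădoiu–Clarkson-style (1 + ε)-approximate MEB iteration is shown below; this is illustrative only, not the authors' code, and the name `approx_meb` is hypothetical:

```python
import math

def approx_meb(points, eps=0.1):
    """(1 + eps)-approximate minimum enclosing ball, Badoiu-Clarkson style.

    Each of the ceil(1/eps^2) rounds scans all m points once (O(m)) and
    moves the running center a shrinking step toward the furthest point,
    so total time is linear in m while the number of rounds depends only
    on eps -- the property the core vector machine exploits.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    c = list(points[0])                       # start at an arbitrary point
    rounds = math.ceil(1.0 / eps ** 2)
    for i in range(1, rounds + 1):
        far = max(points, key=lambda p: dist(p, c))   # O(m) furthest-point scan
        # step size 1/(i+1) gives the classic approximation guarantee
        c = [ci + (fi - ci) / (i + 1) for ci, fi in zip(c, far)]
    r = max(dist(p, c) for p in points)       # radius covering every point
    return c, r
```

In the actual method, the ball lives in the kernel-induced feature space and the MMDA direction is recovered from the small core-set rather than from all m training points, which is what keeps the space complexity independent of m.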
doi_str_mv | 10.1109/TNN.2007.911746 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1045-9227 |
ispartof | IEEE Transactions on Neural Networks, 2008-04, Vol.19 (4), p.610-624 |
issn | 1045-9227 2162-237X 1941-0093 2162-2388 |
language | eng |
recordid | cdi_proquest_miscellaneous_70464979 |
source | IEEE Electronic Library (IEL) |
subjects | Algorithmics. Computability. Computer arithmetics; Algorithms; Applied sciences; Artificial intelligence; Classification; Complexity; Computer science; control theory; systems; Computer Simulation; Connectionism. Neural networks; core vector machines; Councils; Data mining; Data processing. List processing. Character string processing; Discriminant Analysis; Exact sciences and technology; Feature extraction; Humans; Information analysis; Kernel; Kernels; Large-scale systems; Mathematical analysis; Memory organisation. Data processing; Models, Statistical; Neural Networks (Computer); Principal Component Analysis; Scalability; Signal Processing, Computer-Assisted; Software; Studies; Support vector machine classification; Support vector machines; support vector machines (SVMs); Theoretical computing; Time Factors; Vectors (mathematics) |
title | Large-Scale Maximum Margin Discriminant Analysis Using Core Vector Machines |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-04T07%3A54%3A33IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Large-Scale%20Maximum%20Margin%20Discriminant%20Analysis%20Using%20Core%20Vector%20Machines&rft.jtitle=IEEE%20transaction%20on%20neural%20networks%20and%20learning%20systems&rft.au=Wai-Hung%20Tsang,%20I.&rft.date=2008-04-01&rft.volume=19&rft.issue=4&rft.spage=610&rft.epage=624&rft.pages=610-624&rft.issn=1045-9227&rft.eissn=1941-0093&rft.coden=ITNNEP&rft_id=info:doi/10.1109/TNN.2007.911746&rft_dat=%3Cproquest_RIE%3E875036564%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=912284503&rft_id=info:pmid/18390308&rft_ieee_id=4443875&rfr_iscdi=true |