Unsupervised feature selection based on kernel fisher discriminant analysis and regression learning

In this paper, we propose a new feature selection method called kernel Fisher discriminant analysis and regression learning based algorithm for unsupervised feature selection. Existing feature selection methods are based on either manifold learning or discriminative techniques, each of which has some shortcomings. Although some studies show the advantages of two-step methods that benefit from both manifold learning and discriminative techniques, a joint formulation has been shown to be more efficient. To this end, we construct a global discriminant objective term of a clustering framework based on the kernel method. We add a regression learning term to the objective function, which drives the optimization to select a low-dimensional representation of the original dataset. We use the L2,1-norm of the features to impose a sparse structure upon the features, which can result in more discriminative features. We propose an algorithm to solve the optimization problem introduced in this paper. We further discuss convergence, parameter sensitivity, computational complexity, and the clustering and classification accuracy of the proposed algorithm. To demonstrate the effectiveness of the proposed algorithm, we perform a set of experiments on different available datasets. The results obtained by the proposed algorithm are compared against state-of-the-art algorithms. These results show that our method outperforms the existing state-of-the-art methods in many cases on different datasets, but the improved performance comes at the cost of increased time complexity.
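
The abstract mentions that an L2,1-norm on the feature weights induces row sparsity, so that features with small weight rows can be discarded. The short Python sketch below only illustrates that general idea and is not the algorithm from the paper: the weight matrix W, the helper names, and the toy data are assumptions, and the sketch simply scores each feature by the L2 norm of its row in W and keeps the top k.

```python
import numpy as np

def l21_norm(W):
    """L2,1-norm: the sum of the Euclidean norms of the rows of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def select_features(X, W, k):
    """Keep the k features whose corresponding rows of W have the largest L2 norms."""
    scores = np.linalg.norm(W, axis=1)       # one importance score per feature
    top_k = np.argsort(scores)[::-1][:k]     # indices of the k highest-scoring features
    return X[:, top_k], top_k

# Toy usage with random data; the shapes of X and W are hypothetical.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))           # 100 samples, 20 features
W = rng.standard_normal((20, 5))             # assumed learned weights: 20 features x 5 components
print("L2,1-norm of W:", l21_norm(W))
X_selected, kept = select_features(X, W, k=8)
print("Selected feature indices:", kept)
```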

Bibliographic details
Published in: Machine learning, 2019-04, Vol. 108 (4), p. 659-686
Main authors: Shang, Ronghua; Meng, Yang; Liu, Chiyang; Jiao, Licheng; Esfahani, Amir M. Ghalamzan; Stolkin, Rustam
Format: Article
Language: English
Online access: Full text
DOI: 10.1007/s10994-018-5765-6
ISSN: 0885-6125
EISSN: 1573-0565
Source: SpringerLink (Online service)
Subjects:
Algorithms
Artificial Intelligence
Clustering
Complexity
Computer Science
Control
Datasets
Discriminant analysis
Machine learning
Manifolds (mathematics)
Mechatronics
Natural Language Processing (NLP)
Optimization
Parameter sensitivity
Regression analysis
Robotics
Simulation and Modeling
State of the art