Unsupervised feature selection by self-paced learning regularization

Highlights
• This paper uses a self-representation method to construct the feature selection model.
• Self-paced learning is added to feature selection to account for outliers.
• This paper proposes a novel optimization algorithm to solve the objective function.

Bibliographic Details
Published in: Pattern Recognition Letters, 2020-04, Vol. 132, pp. 4-11
Main authors: Zheng, Wei; Zhu, Xiaofeng; Wen, Guoqiu; Zhu, Yonghua; Yu, Hao; Gan, Jiangzhang
Format: Article
Language: English
Online access: Full text
Abstract:
Previous feature selection methods treat all samples equally when selecting important features. However, samples are often diverse: for example, outliers should receive small or even zero weights, while important samples should receive large weights. In this paper, we add a self-paced regularization to the sparse feature selection model to reduce the impact of outliers on feature selection. Specifically, the proposed method automatically selects a sample subset containing the most important samples to build an initial feature selection model, whose generalization ability is then improved by involving other important samples, until a robust and generalized feature selection model has been established or all the samples have been used. Experimental results on eight real datasets show that the proposed method outperforms the comparison methods.
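To make the sample-weighting idea concrete, below is a minimal, illustrative sketch of self-paced sample selection wrapped around a self-representation feature selection step. It is not the authors' exact algorithm: it uses a ridge (Frobenius-norm) surrogate in place of the paper's sparse regularizer, hard binary self-paced weights, and assumed parameter names (alpha, lam, mu).

import numpy as np

def self_representation_step(X, v, alpha=1.0):
    # Weighted ridge surrogate: min_W ||diag(sqrt(v)) (X - X W)||_F^2 + alpha ||W||_F^2
    # (the paper uses a sparse row regularizer; the ridge form keeps the sketch short).
    Xw = X * np.sqrt(v)[:, None]              # down-weight or drop samples via the weights v
    G = Xw.T @ Xw
    return np.linalg.solve(G + alpha * np.eye(X.shape[1]), G)

def self_paced_feature_selection(X, n_iter=10, mu=1.3, alpha=1.0):
    # X: (n_samples, n_features). Returns feature indices ranked by importance.
    n, d = X.shape
    v = np.ones(n)                            # start by treating every sample equally
    lam = None                                # self-paced "age" parameter (assumed schedule)
    for _ in range(n_iter):
        W = self_representation_step(X, v, alpha)
        losses = np.sum((X - X @ W) ** 2, axis=1)   # per-sample reconstruction loss
        if lam is None:
            lam = np.percentile(losses, 50)   # begin with the "easiest" half of the samples
        v = (losses < lam).astype(float)      # hard self-paced weights: 0 for hard samples/outliers
        lam *= mu                             # raise the threshold to admit harder samples
        if v.all():                           # stop once every sample has been included
            break
    scores = np.linalg.norm(W, axis=1)        # feature importance = row norms of W
    return np.argsort(scores)[::-1], scores

For example, calling self_paced_feature_selection(X) on a samples-by-features matrix X returns the feature indices from most to least important; in an unsupervised setting the top-ranked features would typically be evaluated by a downstream clustering or classification task.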
DOI: 10.1016/j.patrec.2018.06.029
ISSN: 0167-8655
EISSN: 1872-7344
Publisher: Elsevier B.V., Amsterdam
Source: Elsevier ScienceDirect Journals
Subjects: Feature selection; Outliers (statistics); Regularization; Robust statistic; Self-paced learning