A multi-objective algorithm for multi-label filter feature selection problem
Published in: | Applied intelligence (Dordrecht, Netherlands), 2020-11, Vol.50 (11), p.3748-3774 |
---|---|
Main authors: | Dong, Hongbin; Sun, Jing; Li, Tao; Ding, Rui; Sun, Xiaohang |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Full text |
container_end_page | 3774 |
---|---|
container_issue | 11 |
container_start_page | 3748 |
container_title | Applied intelligence (Dordrecht, Netherlands) |
container_volume | 50 |
creator | Dong, Hongbin; Sun, Jing; Li, Tao; Ding, Rui; Sun, Xiaohang |
description | Feature selection is an important data preprocessing step before classification. Multi-objective optimization algorithms have proved to be an effective way to solve feature selection problems. However, there are few studies on multi-objective feature selection methods for multi-label data. In this paper, a multi-objective multi-label filter feature selection algorithm based on two particle swarms (MOMFS) is proposed. We use mutual information to measure the relevance between features and the label set, and the redundancy between features, and take these two measures as the objectives. To prevent Particle Swarm Optimization (PSO) from falling into a local optimum and producing a false Pareto front, we employ two swarms to optimize the two objectives separately and propose an improved hybrid topology based on particle fitness values. Furthermore, an archive maintenance strategy is introduced to preserve the distribution of the archive. To study the effectiveness of the proposed algorithm, we select five multi-label evaluation criteria and perform experiments on seven multi-label data sets. MOMFS is compared with classic single-objective multi-label feature selection algorithms and with multi-objective filter and wrapper feature selection algorithms. The experimental results show that MOMFS effectively reduces the dimensionality of multi-label data and performs better than the other approaches on the five evaluation criteria. |
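The description above says that MOMFS scores a feature subset by two mutual-information-based objectives: relevance between the selected features and the label set (to be maximized) and redundancy among the selected features (to be minimized). The snippet below is a minimal sketch of how such objectives could be computed for a discretized feature matrix and a binary label matrix; it uses scikit-learn's `mutual_info_score` for illustration, and the function names `relevance` and `redundancy` as well as the averaging scheme are assumptions, not the paper's exact formulation.

```python
# Minimal sketch of mutual-information relevance/redundancy objectives for
# multi-label filter feature selection. Illustrative only; MOMFS's exact
# objective definitions may differ.
import numpy as np
from sklearn.metrics import mutual_info_score


def relevance(X_disc, Y, selected):
    """Mean mutual information between each selected (discretized) feature
    and each label column -- objective to be maximized."""
    scores = [
        mutual_info_score(X_disc[:, j], Y[:, k])
        for j in selected
        for k in range(Y.shape[1])
    ]
    return float(np.mean(scores)) if scores else 0.0


def redundancy(X_disc, selected):
    """Mean pairwise mutual information among selected features -- objective
    to be minimized."""
    pairs = [
        mutual_info_score(X_disc[:, i], X_disc[:, j])
        for idx, i in enumerate(selected)
        for j in selected[idx + 1:]
    ]
    return float(np.mean(pairs)) if pairs else 0.0


if __name__ == "__main__":
    # Random discretized data: 10 features in 5 bins, 3 binary labels.
    rng = np.random.default_rng(0)
    X_disc = rng.integers(0, 5, size=(200, 10))
    Y = rng.integers(0, 2, size=(200, 3))
    subset = [0, 2, 5, 7]
    print("relevance :", relevance(X_disc, Y, subset))   # objective 1
    print("redundancy:", redundancy(X_disc, subset))     # objective 2
```

In a two-swarm scheme like the one described, each swarm would drive one of these two values (one maximizing relevance, one minimizing redundancy), with the non-dominated subsets collected in the external archive.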
doi_str_mv | 10.1007/s10489-020-01785-2 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0924-669X |
ispartof | Applied intelligence (Dordrecht, Netherlands), 2020-11, Vol.50 (11), p.3748-3774 |
issn | 0924-669X; 1573-7497 |
language | eng |
recordid | cdi_proquest_journals_2608622003 |
source | SpringerNature Journals |
subjects | Algorithms; Archives & records; Artificial Intelligence; Computer Science; Criteria; Evaluation; Feature selection; Machines; Manufacturing; Mechanical Engineering; Multiple objective analysis; Optimization; Pareto optimization; Particle swarm optimization; Processes; Redundancy; Topology |
title | A multi-objective algorithm for multi-label filter feature selection problem |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-21T10%3A10%3A50IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20multi-objective%20algorithm%20for%20multi-label%20filter%20feature%20selection%20problem&rft.jtitle=Applied%20intelligence%20(Dordrecht,%20Netherlands)&rft.au=Dong,%20Hongbin&rft.date=2020-11-01&rft.volume=50&rft.issue=11&rft.spage=3748&rft.epage=3774&rft.pages=3748-3774&rft.issn=0924-669X&rft.eissn=1573-7497&rft_id=info:doi/10.1007/s10489-020-01785-2&rft_dat=%3Cproquest_cross%3E2608622003%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2608622003&rft_id=info:pmid/&rfr_iscdi=true |