The embedded feature selection method using ANT colony optimization with structured sparsity norms
Feature selection is important in many machine learning applications. Our results demonstrate that classification does not require all of a dataset's features: a well-chosen subset can achieve lower classification error, and hence higher accuracy. By selecting meaningful features and reducing the...
Saved in:
Published in: | Computing 2025, Vol.107 (1), p.29 |
---|---|
Main authors: | Nemati, Khadijeh, Sheikhani, Amir Hosein Refahi, Kordrostami, Sohrab, Roudposhti, Kamrad Khoshhal |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | 1 |
container_start_page | 29 |
container_title | Computing |
container_volume | 107 |
creator | Nemati, Khadijeh Sheikhani, Amir Hosein Refahi Kordrostami, Sohrab Roudposhti, Kamrad Khoshhal |
description | Feature selection is important in many machine learning applications. Our results demonstrate that classification does not require all of a dataset's features: a well-chosen subset can achieve lower classification error, and hence higher accuracy. By selecting meaningful features and reducing the dimensionality of the feature vector, we can significantly improve performance. The experimental results show that our proposed method, ANT–ANN–SSN, consistently outperforms existing methods across various datasets. For example, with 200 features, ANT–ANN–SSN achieved an accuracy of 77.85% on the RELATHE dataset and 77.68% on the PCMAC dataset (see Table 5). With 20 features, it reached 97.89% accuracy on ALLAML and 96.89% on PROSTATE-GE (see Table 6). Our approach, which employs an Ant Colony Optimization (ACO) algorithm alongside a two-layer perceptron classifier, treats feature selection as an optimization problem, using a new structured sparsity norm to evaluate feature subsets. |
doi_str_mv | 10.1007/s00607-024-01387-7 |
format | Article |
publisher | Vienna: Springer Vienna |
rights | The Author(s), under exclusive licence to Springer-Verlag GmbH Austria, part of Springer Nature 2024 |
fulltext | fulltext |
identifier | ISSN: 0010-485X |
ispartof | Computing, 2025, Vol.107 (1), p.29 |
issn | 0010-485X 1436-5057 |
language | eng |
recordid | cdi_proquest_journals_3145281837 |
source | SpringerLink Journals |
subjects | Algorithms Ant colony optimization Artificial Intelligence Artificial neural networks Classification Computer Appl. in Administrative Data Processing Computer Communication Networks Computer Science Datasets Feature selection Information Systems Applications (incl.Internet) Machine learning Regular Paper Software Engineering Sparsity |
title | The embedded feature selection method using ANT colony optimization with structured sparsity norms |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-04T07%3A24%3A43IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_sprin&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=The%20embedded%20feature%20selection%20method%20using%20ANT%20colony%20optimization%20with%20structured%20sparsity%20norms&rft.jtitle=Computing&rft.au=Nemati,%20Khadijeh&rft.date=2025&rft.volume=107&rft.issue=1&rft.spage=29&rft.pages=29-&rft.issn=0010-485X&rft.eissn=1436-5057&rft_id=info:doi/10.1007/s00607-024-01387-7&rft_dat=%3Cproquest_sprin%3E3145281837%3C/proquest_sprin%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3145281837&rft_id=info:pmid/&rfr_iscdi=true |
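The abstract above describes feature selection via Ant Colony Optimization, where candidate feature subsets are built stochastically and scored by a fitness function. As an illustration only, the following is a minimal pure-Python sketch of pheromone-guided subset selection. The function name `aco_feature_selection`, its parameters, and the correlation-based surrogate fitness are hypothetical stand-ins, not the paper's ANT–ANN–SSN method (which evaluates subsets with a two-layer perceptron and a structured sparsity norm).

```python
import random

def aco_feature_selection(X, y, n_select, n_ants=10, n_iter=20,
                          evaporation=0.1, seed=0):
    """Sketch of ACO-style feature selection.

    Each ant samples a feature subset with probability proportional to
    pheromone; the best subset found so far is reinforced each
    iteration. Fitness here is a stand-in: mean absolute correlation
    of the chosen features with the labels.
    """
    rng = random.Random(seed)
    n_features = len(X[0])

    def corr(j):
        # Absolute Pearson correlation of feature j with the labels.
        xs = [row[j] for row in X]
        mx, my = sum(xs) / len(xs), sum(y) / len(y)
        num = sum((a - mx) * (b - my) for a, b in zip(xs, y))
        dx = sum((a - mx) ** 2 for a in xs) ** 0.5
        dy = sum((b - my) ** 2 for b in y) ** 0.5
        return abs(num / (dx * dy)) if dx and dy else 0.0

    scores = [corr(j) for j in range(n_features)]
    pheromone = [1.0] * n_features
    best_subset, best_fit = None, -1.0

    for _ in range(n_iter):
        for _ in range(n_ants):
            # Sample n_select distinct features, pheromone-weighted.
            pool = list(range(n_features))
            weights = [pheromone[j] for j in pool]
            subset = []
            for _ in range(n_select):
                j = rng.choices(pool, weights=weights)[0]
                k = pool.index(j)
                pool.pop(k)
                weights.pop(k)
                subset.append(j)
            fit = sum(scores[j] for j in subset) / n_select
            if fit > best_fit:
                best_fit, best_subset = fit, sorted(subset)
        # Evaporate pheromone, then reinforce the best subset so far.
        pheromone = [(1 - evaporation) * p for p in pheromone]
        for j in best_subset:
            pheromone[j] += best_fit
    return best_subset, best_fit
```

On synthetic data where one feature perfectly tracks the label, the sketch reliably keeps that feature in the selected subset; the reinforcement step is what distinguishes this from repeated random subset search.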