The embedded feature selection method using ant colony optimization with structured sparsity norms


Bibliographic Details
Published in: Computing 2025, Vol. 107 (1), p. 29
Main authors: Nemati, Khadijeh; Sheikhani, Amir Hosein Refahi; Kordrostami, Sohrab; Roudposhti, Kamrad Khoshhal
Format: Article
Language: English
Abstract: Feature selection is important in many machine learning applications. Our results demonstrate that classification does not require all features of a dataset: a well-chosen subset can yield lower classification error and therefore higher accuracy. By selecting meaningful features and reducing the dimensionality of the feature vector, we can significantly improve performance. The experimental results show that our proposed method, ANT–ANN–SSN, consistently outperforms existing methods across various datasets. For example, with 200 features, ANT–ANN–SSN achieved an accuracy of 77.85% on the RELATHE dataset and 77.68% on the PCMAC dataset (see Table 5). With 20 features, it reached 97.89% accuracy on ALLAML and 96.89% on PROSTATE-GE (see Table 6). Our approach employs an Ant Colony Optimization (ACO) algorithm alongside a two-layer perceptron classifier, formulating feature selection as an optimization problem and using a new structured sparsity norm to evaluate feature subsets.
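
As a rough illustration of the design described in the abstract (an ACO search over feature subsets, each scored by a small two-layer perceptron), the Python sketch below shows a generic ACO wrapper for feature selection. It is not the authors' ANT–ANN–SSN implementation: the paper's structured sparsity norm and exact pheromone rules are not reproduced, and the dataset (scikit-learn's breast-cancer data), subset size, ant count, and pheromone parameters are illustrative assumptions.

# Minimal ACO feature-selection sketch with a two-layer perceptron (MLP)
# as the subset evaluator. Illustrative only: the structured sparsity norm
# of ANT-ANN-SSN is not reproduced, and all hyperparameters are assumed.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

n_ants, n_iters, subset_size = 8, 10, 10   # assumed hyperparameters
evaporation, deposit = 0.1, 1.0
pheromone = np.ones(n_features)            # one trail value per feature

def fitness(subset):
    """Cross-validated accuracy of a small two-layer perceptron on the subset."""
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    return cross_val_score(clf, X[:, subset], y, cv=3).mean()

best_subset, best_score = None, -np.inf
for _ in range(n_iters):
    solutions = []
    for _ in range(n_ants):
        # Each ant samples a feature subset with probability proportional
        # to the pheromone trails (no heuristic desirability term here).
        probs = pheromone / pheromone.sum()
        subset = rng.choice(n_features, size=subset_size, replace=False, p=probs)
        score = fitness(subset)
        solutions.append((subset, score))
        if score > best_score:
            best_subset, best_score = subset, score
    # Evaporate all trails, then let each ant reinforce the features it
    # used, weighted by the accuracy its subset achieved.
    pheromone *= 1.0 - evaporation
    for subset, score in solutions:
        pheromone[subset] += deposit * score

print("best subset:", sorted(best_subset.tolist()), "accuracy: %.3f" % best_score)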
ISSN: 0010-485X, 1436-5057
DOI: 10.1007/s00607-024-01387-7