Multi-strategy ensemble binary hunger games search for feature selection


Bibliographic Details
Published in: Knowledge-Based Systems, 2022-07, Vol. 248, p. 108787, Article 108787
Main authors: Ma, Benedict Jun; Liu, Shuai; Heidari, Ali Asghar
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Feature selection is a crucial preprocessing step in machine learning and data mining, aimed at reducing data dimensionality to improve the performance of learning models. In this paper, a powerful metaheuristic named hunger games search (HGS) is integrated with a multi-strategy (MS) framework, comprising chaos theory, greedy selection, and vertical crossover, to improve the balance between exploration and exploitation. The new MS-HGS algorithm is developed for global optimization, and its binary variant MS-bHGS is applied specifically to the feature selection problem. To evaluate and validate the proposed approach, MS-HGS is, on the one hand, compared with HGS and with HGS variants embedding each single strategy on 23 benchmark functions, and with seven state-of-the-art algorithms on the IEEE CEC 2017 test suite. On the other hand, MS-bHGS is employed for feature selection on 20 datasets from the UCI repository and compared with three groups of methods: traditional, recent, and enhanced. The experimental results confirm that MS-bHGS outperforms bHGS and most existing techniques in terms of classification accuracy, number of selected features, fitness values, and execution time. Overall, the paper's findings suggest that MS-HGS is a superior optimizer, and MS-bHGS can be considered a valuable wrapper-mode feature selection technique.
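The abstract describes MS-bHGS as a wrapper-mode binary feature selector. A minimal sketch of the two generic ingredients such binary metaheuristics share is given below: an S-shaped transfer function that maps a continuous search position to a 0/1 feature mask, and a wrapper objective that trades classification error against the fraction of selected features. The sigmoid transfer, the weight `alpha = 0.99`, and the toy `error_rate` are common conventions in this literature, not details taken from the paper; the authors' exact operators (chaos initialization, greedy selection, vertical crossover) are not reproduced here.

```python
import numpy as np

def binarize(position, rng):
    """Map a continuous position vector to a binary feature mask
    via an S-shaped (sigmoid) transfer function -- a common choice
    in binary metaheuristics; the paper's exact transfer may differ."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)

def wrapper_fitness(mask, error_rate, alpha=0.99):
    """Standard wrapper objective: weighted sum of classification
    error and the ratio of selected features (both to be minimized)."""
    n_selected = int(mask.sum())
    if n_selected == 0:
        return 1.0  # penalize empty feature subsets
    return alpha * error_rate + (1 - alpha) * n_selected / mask.size

# Toy usage with a fixed error rate standing in for a real classifier.
rng = np.random.default_rng(0)
position = rng.normal(size=10)       # one search agent, 10 candidate features
mask = binarize(position, rng)       # 0/1 subset indicator
fitness = wrapper_fitness(mask, error_rate=0.10)
```

In a full wrapper pipeline, `error_rate` would come from cross-validating a classifier (e.g. k-NN) on the features where `mask == 1`, and the optimizer would iterate positions to minimize `fitness`.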
ISSN:0950-7051
1872-7409
DOI:10.1016/j.knosys.2022.108787