An adaptively balanced grey wolf optimization algorithm for feature selection on high-dimensional classification

Bibliographic Details
Published in: Engineering Applications of Artificial Intelligence, 2022-09, Vol. 114, p. 105088, Article 105088
Authors: Wang, Jing; Lin, Dakun; Zhang, Yuanzi; Huang, Shiguo
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Feature selection, which aims to screen out redundant and irrelevant features from datasets, is integral to machine learning and data mining. Grey Wolf Optimization (GWO) is a recent swarm-intelligence meta-heuristic that applies to a wide range of optimization problems owing to its fast convergence and small number of parameters. However, because the wolf pack is always dominated by the three leading wolves (i.e., α, β, and δ), the GWO algorithm suffers from weak exploration throughout the optimization process and easily stagnates in local optima. In this paper, an Adaptively Balanced Grey Wolf Optimization (ABGWO) algorithm is proposed to seek out the optimal feature subset for high-dimensional classification. Specifically, to improve the exploration ability of GWO, a random wolf is introduced to cooperate with α, β, and δ, and a novel level-based strategy is adopted to select this random wolf. In addition, to dynamically modulate exploration and exploitation across the optimization stages, an adaptive coefficient is introduced to regulate the leadership of α, β, δ, and the randomly selected wolf. Finally, the improvements in exploration and exploitation are validated on 12 high-dimensional datasets provided by Arizona State University and the University of California, Irvine, and the superiority of ABGWO is further verified by comparing it with seven state-of-the-art feature selection approaches in terms of classification accuracy, feature subset size, and computational time.
Highlights:
• Introducing a random individual prevents a sharp decline in population diversity.
• Semi-random selection prevents a rapid decrease in convergence speed.
• A coefficient factor increases linearly to adaptively adjust exploration and exploitation.
• The improved GWO algorithm for high-dimensional feature selection significantly boosts classification performance.
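The abstract describes the mechanism only at a high level; this record does not include the paper's update equations. As a rough illustration, the sketch below blends the three GWO leaders with a randomly selected wolf under a linearly increasing coefficient. The blending rule, the middle-tier sampling standing in for the level-based strategy, the 0.5-threshold binarization, and the name `abgwo_sketch` are all assumptions made for illustration, not the authors' actual formulation.

```python
import numpy as np

def abgwo_sketch(fitness, dim, n_wolves=20, max_iter=100, seed=0):
    """Illustrative sketch only: blends the three GWO leaders with a
    randomly selected wolf under a linearly increasing coefficient.
    `fitness` maps a boolean feature mask of length `dim` to a score
    to be minimized."""
    rng = np.random.default_rng(seed)
    X = rng.random((n_wolves, dim))               # continuous positions in [0, 1]
    scores = np.array([fitness(m) for m in (X > 0.5)])

    for t in range(max_iter):
        order = np.argsort(scores)                # best (lowest) scores first
        alpha, beta, delta = X[order[:3]]
        # Stand-in for the paper's level-based strategy: sample the random
        # wolf from the middle tier of the sorted pack (an assumption).
        mid_tier = order[3:max(4, n_wolves // 2)]
        rand_wolf = X[rng.choice(mid_tier)]

        a = 2.0 * (1.0 - t / max_iter)            # standard GWO control parameter
        w = t / max_iter                          # adaptive coefficient, linearly increasing

        for i in range(n_wolves):
            candidates = []
            for leader in (alpha, beta, delta, rand_wolf):
                A = 2.0 * a * rng.random(dim) - a
                C = 2.0 * rng.random(dim)
                candidates.append(leader - A * np.abs(C * leader - X[i]))
            x1, x2, x3, x4 = candidates
            # Early iterations (w small) lean on the random wolf for
            # exploration; later ones lean on the leaders for exploitation.
            X[i] = w * (x1 + x2 + x3) / 3.0 + (1.0 - w) * x4

        X = np.clip(X, 0.0, 1.0)
        scores = np.array([fitness(m) for m in (X > 0.5)])

    best = int(np.argmin(scores))
    return X[best] > 0.5, scores[best]

# Toy check: 5 of 50 synthetic features are "relevant"; the fitness counts
# mismatches against that target plus a small subset-size penalty.
target = np.zeros(50, dtype=bool)
target[:5] = True
mask, score = abgwo_sketch(lambda m: np.sum(m != target) + 0.01 * m.sum(), dim=50)
```

Weighting the random wolf heavily in early iterations favors exploration, while shifting weight toward α, β, and δ later favors exploitation, which mirrors the adaptive balance the abstract describes.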
ISSN: 0952-1976, 1873-6769
DOI: 10.1016/j.engappai.2022.105088