Multiple strategies based Grey Wolf Optimizer for feature selection in performance evaluation of open-ended funds

Bibliographic Details
Published in: Swarm and Evolutionary Computation, 2024-04, Vol. 86, p. 101518, Article 101518
Authors: Chang, Dan; Rao, Congjun; Xiao, Xinping; Hu, Fuyan; Goh, Mark
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Methods for selecting features when evaluating fund performance rely heavily on traditional statistics, which can lead to excessive data dimensionality in a multi-dimensional setting. The Grey Wolf Optimizer (GWO), a swarm intelligence algorithm with a simple structure and few parameters, is widely used for feature selection. However, it is prone to local optima and to an imbalance between exploration and exploitation. This paper proposes a Multi-Strategy Grey Wolf Optimizer (MSGWO) to address these limitations and to identify the relevant features for evaluating fund performance. Random Opposition-based Learning is applied to improve population quality during the initialization phase. Moreover, the convergence factor is made nonlinear to coordinate the global exploration and local exploitation capabilities. Finally, a two-stage hybrid mutation operator modifies the position-updating mechanism, increasing population diversity and balancing the exploration and exploitation abilities of GWO. The proposed algorithm is compared against six related algorithms and verified with the Wilcoxon signed-rank test on 12 quarterly datasets (2020-2022) of Chinese open-ended funds. The results show that MSGWO reduces both the selected feature size and the classification error rate.
ISSN: 2210-6502
DOI: 10.1016/j.swevo.2024.101518
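
The abstract above names three strategies: Random Opposition-based Learning (ROBL) initialization, a nonlinear convergence factor, and a two-stage hybrid mutation operator. The following Python sketch illustrates only the first two inside a plain GWO wrapper for feature selection, using the commonly cited ROBL form x_opp = lb + ub - r * x and an illustrative quadratic decay of the convergence factor; the paper's exact schedules, its hybrid mutation operator, and its fitness function are not reproduced here, and the toy fitness below is hypothetical.

import numpy as np

def robl_population(pop_size, dim, lb, ub, rng):
    # Random Opposition-based Learning: pair each random wolf with a
    # random-opposite point x_opp = lb + ub - r * x, with r ~ U(0, 1).
    X = lb + (ub - lb) * rng.random((pop_size, dim))
    X_opp = lb + ub - rng.random((pop_size, dim)) * X
    return X, X_opp

def nonlinear_a(t, T):
    # Illustrative nonlinear convergence factor decaying from 2 to 0
    # (quadratic decay; MSGWO's exact schedule is defined in the paper).
    return 2.0 * (1.0 - (t / T) ** 2)

def to_mask(x):
    # Threshold the continuous wolf position into a binary feature mask.
    return x > 0.5

def gwo_feature_selection(fitness, dim, pop_size=20, iters=50, seed=0):
    # GWO wrapper feature selection with ROBL initialization and a
    # nonlinear convergence factor (hybrid mutation operator omitted).
    rng = np.random.default_rng(seed)
    lb, ub = 0.0, 1.0
    X, X_opp = robl_population(pop_size, dim, lb, ub, rng)
    fit = np.array([fitness(to_mask(x)) for x in X])
    fit_opp = np.array([fitness(to_mask(x)) for x in X_opp])
    better = fit_opp < fit                      # keep the fitter of each pair
    X[better], fit[better] = X_opp[better], fit_opp[better]
    for t in range(iters):
        order = np.argsort(fit)
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        a = nonlinear_a(t, iters)
        for i in range(pop_size):
            new_x = np.zeros(dim)
            for leader in (alpha, beta, delta):  # standard GWO position update
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                new_x += leader - A * np.abs(C * leader - X[i])
            new_x = np.clip(new_x / 3.0, lb, ub)
            f_new = fitness(to_mask(new_x))
            if f_new < fit[i]:                   # greedy replacement for simplicity
                X[i], fit[i] = new_x, f_new
    best = int(np.argmin(fit))
    return to_mask(X[best]), fit[best]

# Toy usage: recover a known "relevant" subset among 10 candidate features.
if __name__ == "__main__":
    target = np.array([1, 0, 1, 1, 0, 0, 0, 1, 0, 0], dtype=bool)
    toy_fitness = lambda mask: float(np.sum(mask != target)) + 0.01 * mask.sum()
    mask, score = gwo_feature_selection(toy_fitness, dim=target.size)
    print("selected features:", np.flatnonzero(mask), "fitness:", score)

In the paper's setting, the objective presumably trades off a classifier's error rate against the number of selected features, as is usual for wrapper feature selection and consistent with the abstract's report that both the feature size and the classification error rate are reduced.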