Multiobjective Harris Hawks Optimization With Associative Learning and Chaotic Local Search for Feature Selection

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, pp. 72973-72987
Authors: Zhang, Youhua; Zhang, Yuhe; Zhang, Cuijun; Zhou, Chong
Format: Article
Language: English
Description
Abstract: In classification problems, datasets often contain a large number of features, but not all of them are useful for classification; many irrelevant features can even degrade performance. Feature selection removes irrelevant features by minimizing the size of the selected feature subset and minimizing the classification error rate, so it can be regarded as a multi-objective optimization problem. Because of its simple structure and easy implementation, the Harris Hawks Optimization (HHO) algorithm is widely employed in optimization problems. In this paper, a multi-objective HHO is applied to the feature selection problem. To improve the search ability of the algorithm, associative learning, grey wolf optimization, and chaotic local search are introduced into it. An external repository is used to store the set of non-dominated solutions. Results of feature selection on sixteen University of California Irvine (UCI) datasets show that the proposed method can effectively remove redundant features and improve classification performance.
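
As a rough illustration of the two objectives described in the abstract, the Python sketch below (not taken from the paper) evaluates a binary feature mask by its subset-size ratio and cross-validated classification error, and applies the Pareto-dominance test that an external repository of non-dominated solutions relies on. The k-NN classifier, the wine dataset, and the random masks standing in for hawk positions are illustrative assumptions, not details specified by the authors.

# Minimal sketch of bi-objective feature-subset evaluation and the
# Pareto-dominance filter used to maintain an external repository.
# The classifier, dataset, and random masks are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def evaluate(mask, X, y):
    """Return (subset-size ratio, classification error) for a binary feature mask."""
    if not mask.any():                      # an empty subset is treated as worst-case
        return 1.0, 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=5).mean()
    return mask.sum() / mask.size, 1.0 - acc


def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


if __name__ == "__main__":
    data = load_wine()
    rng = np.random.default_rng(0)
    # Random binary masks stand in for the hawk positions an optimizer would produce.
    masks = rng.integers(0, 2, size=(20, data.data.shape[1])).astype(bool)
    objs = [evaluate(m, data.data, data.target) for m in masks]
    # External repository: keep only non-dominated (subset size, error) pairs.
    archive = [o for o in objs if not any(dominates(p, o) for p in objs if p != o)]
    print(sorted(archive))

In the paper's setting, the optimizer (multi-objective HHO with associative learning, grey wolf operators, and chaotic local search) would iteratively generate new masks and update such a repository; the sketch only shows how candidate subsets are scored and filtered.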
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3189476