Hybrid Bio-Optimized Algorithms for Hyperparameter Tuning in Machine Learning Models: A Software Defect Prediction Case Study

Bibliographic Details
Published in: Mathematics (Basel), 2024-08, Vol. 12 (16), p. 2521
Authors: Das, Madhusmita; Mohan, Biju R.; Ram Mohana Reddy Guddeti; Prasad, Nandini
Format: Article
Language: English
Online access: Full text
Abstract: Addressing real-time optimization problems becomes increasingly challenging as their complexity continues to escalate. Bio-optimization algorithms (BoAs) are well suited to such problems owing to their global search capability, adaptability, versatility, parallelism, and robustness. This article performs hyperparameter tuning of machine learning (ML) models by integrating them with BoAs. Aiming to maximize the accuracy of the hybrid bio-optimized defect prediction (HBoDP) model, the paper develops four novel hybrid BoAs: the gravitational force Lévy flight grasshopper optimization algorithm (GFLFGOA), the gravitational force Lévy flight grasshopper optimization algorithm–sparrow search algorithm (GFLFGOA-SSA), the gravitational force grasshopper optimization algorithm–sparrow search algorithm (GFGOA-SSA), and the Lévy flight grasshopper optimization algorithm–sparrow search algorithm (LFGOA-SSA). These algorithms combine the strong exploration capacity of the SSA with the faster convergence of the LFGOA and GFGOA. The performance of the GFLFGOA, GFLFGOA-SSA, GFGOA-SSA, and LFGOA-SSA is verified in two experiments. The first evaluates the algorithms on nine benchmark functions (BFs) in terms of mean, standard deviation (SD), and convergence rate. The second boosts the accuracy of the HBoDP model by fine-tuning the hyperparameters of the artificial neural network (ANN) and XGBoost (XGB) models. To justify the effectiveness and performance of these novel hybrid algorithms, they are compared with four base algorithms, namely the grasshopper optimization algorithm (GOA), the sparrow search algorithm (SSA), the gravitational force grasshopper optimization algorithm (GFGOA), and the Lévy flight grasshopper optimization algorithm (LFGOA). The findings demonstrate the effectiveness of the hybrid approach in improving both convergence rate and accuracy: the experiments show faster convergence on the BFs and improved software defect prediction accuracy on the NASA defect datasets compared with baseline methods.
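
To illustrate the general idea described in the abstract (a bio-inspired population search driving ML hyperparameter tuning), the following minimal Python sketch tunes three XGBoost hyperparameters with a Lévy-flight-guided search scored by cross-validated accuracy. This is not the authors' GFLFGOA-SSA implementation: the hyperparameter bounds, the iteration budget, the simple greedy update rule, and the synthetic dataset (standing in for the NASA defect data) are all assumptions made for the sake of a self-contained example.

```python
# Minimal sketch (assumptions flagged below): a Lévy-flight-guided population
# search over XGBoost hyperparameters, scored by cross-validated accuracy.
# It illustrates the general BoA idea only, not the paper's GFLFGOA-SSA.
import math
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for the NASA defect datasets used in the paper.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Search space: (learning_rate, max_depth, n_estimators) -- assumed bounds.
LOW = np.array([0.01, 2.0, 50.0])
HIGH = np.array([0.5, 10.0, 400.0])

def levy(beta=1.5, size=3):
    """Mantegna's algorithm for Lévy-distributed step lengths."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def fitness(pos):
    """Mean 5-fold CV accuracy of XGBoost with the encoded hyperparameters."""
    model = XGBClassifier(learning_rate=float(pos[0]),
                          max_depth=int(round(pos[1])),
                          n_estimators=int(round(pos[2])))
    return cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()

# Population of candidate hyperparameter vectors; each agent drifts toward
# the best-so-far position, perturbed by Lévy-flight steps for exploration.
pop = rng.uniform(LOW, HIGH, size=(8, 3))
scores = np.array([fitness(p) for p in pop])
best_i = int(scores.argmax())
best, best_score = pop[best_i].copy(), scores[best_i]

for _ in range(10):                      # iteration budget (assumed)
    for i in range(len(pop)):
        step = 0.1 * levy() * (pop[i] - best)
        cand = np.clip(pop[i] - step, LOW, HIGH)
        s = fitness(cand)
        if s > scores[i]:                # greedy per-agent acceptance
            pop[i], scores[i] = cand, s
        if s > best_score:               # track the global best
            best, best_score = cand.copy(), s

print("best (learning_rate, max_depth, n_estimators):", best)
print("cross-validated accuracy:", best_score)
```

The same loop structure applies to the ANN case mentioned in the abstract; only the fitness function (model construction and evaluation) changes, which is why population-based BoAs are convenient wrappers for hyperparameter tuning.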
ISSN: 2227-7390
DOI: 10.3390/math12162521