A new parallel galactic swarm optimization algorithm for training artificial neural networks
Published in: Journal of Intelligent & Fuzzy Systems, 2020-01, Vol. 38(5), pp. 6691-6701
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Metaheuristic algorithms are a family of algorithms that provide near-optimal solutions to NP-hard problems in a reasonable amount of time. Galactic Swarm Optimization (GSO) is a state-of-the-art metaheuristic algorithm inspired by the motion of stars and galaxies under the influence of gravity. In this paper, a new scalable algorithm is proposed to overcome the inherently sequential nature of GSO, allowing the modified algorithm to use the full computing capacity of the hardware efficiently. The modified algorithm also includes new features for training an Artificial Neural Network. The proposed Parallel GSO (PGSO) is compared with Stochastic Gradient Descent in terms of performance and accuracy, and its performance is evaluated by per-CPU utilization on multiple platforms. Experimental results show that PGSO outperforms GSO and other competitors such as PSO in a variety of challenging settings.
ISSN: 1064-1246, 1875-8967
DOI: 10.3233/JIFS-179747
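
The record above only summarizes the approach, so the following is a minimal, hypothetical sketch of how a GSO-style parallel trainer for a small neural network could be organized. It assumes the commonly described GSO structure (independent PSO subswarms, or "galaxies", followed by a PSO pass over their best solutions) and uses Python's ProcessPoolExecutor to run the subswarm phase concurrently. All function names, hyperparameters, and the toy regression task are illustrative assumptions; this is not the authors' PGSO implementation.

```python
# Illustrative sketch (not the paper's code): a parallel, GSO-style trainer for a
# tiny neural network. Assumes the usual GSO structure -- several PSO subswarms
# ("galaxies") explored independently, then a PSO pass over their best solutions.
# Subswarm phases share no state, so they are farmed out to worker processes.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

HIDDEN = 8  # hidden-layer width of the toy network (assumption, not from the paper)

def unpack(w, d_in):
    """Split a flat weight vector into the layers of a 1-hidden-layer MLP."""
    k = d_in * HIDDEN
    W1 = w[:k].reshape(d_in, HIDDEN)
    b1 = w[k:k + HIDDEN]
    W2 = w[k + HIDDEN:k + 2 * HIDDEN]
    b2 = w[-1]
    return W1, b1, W2, b2

def loss(w, X, y):
    """Mean-squared error of the MLP encoded by the flat vector w."""
    W1, b1, W2, b2 = unpack(w, X.shape[1])
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    return float(np.mean((pred - y) ** 2))

def pso(positions, X, y, iters, w=0.72, c1=1.5, c2=1.5, rng=None):
    """Plain PSO on a set of positions; returns (best position, best value)."""
    rng = rng or np.random.default_rng()
    vel = np.zeros_like(positions)
    pbest = positions.copy()
    pval = np.array([loss(p, X, y) for p in positions])
    g, gval = pbest[pval.argmin()].copy(), pval.min()
    for _ in range(iters):
        r1, r2 = rng.random(positions.shape), rng.random(positions.shape)
        vel = w * vel + c1 * r1 * (pbest - positions) + c2 * r2 * (g - positions)
        positions = positions + vel
        vals = np.array([loss(p, X, y) for p in positions])
        improved = vals < pval
        pbest[improved], pval[improved] = positions[improved], vals[improved]
        if pval.min() < gval:
            g, gval = pbest[pval.argmin()].copy(), pval.min()
    return g, gval

def run_subswarm(args):
    """Worker task: run PSO inside one 'galaxy' (top-level so it pickles)."""
    positions, X, y, iters, seed = args
    return pso(positions, X, y, iters, rng=np.random.default_rng(seed))

def parallel_gso_train(X, y, n_subswarms=4, swarm_size=20, epochs=5,
                       sub_iters=30, super_iters=30, seed=0):
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * HIDDEN + 2 * HIDDEN + 1      # flat weight-vector length
    swarms = rng.uniform(-1, 1, (n_subswarms, swarm_size, dim))
    best, best_val = None, np.inf
    with ProcessPoolExecutor() as pool:
        for epoch in range(epochs):
            # Phase 1: explore every galaxy concurrently in worker processes.
            tasks = [(swarms[i], X, y, sub_iters, seed + epoch * n_subswarms + i)
                     for i in range(n_subswarms)]
            results = list(pool.map(run_subswarm, tasks))
            # Phase 2: PSO over the galaxy bests (the "superswarm").
            super_pos = np.stack([g for g, _ in results])
            g, gval = pso(super_pos, X, y, super_iters, rng=rng)
            if gval < best_val:
                best, best_val = g, gval
            print(f"epoch {epoch}: loss={best_val:.4f}")
            # (For brevity the galaxies restart from their initial positions each
            # epoch with a fresh seed; a fuller version would carry subswarm state
            # forward across epochs, as GSO does.)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, (200, 3))
    y = np.sin(X).sum(axis=1)                       # toy regression target
    parallel_gso_train(X, y)
```

Because the phase-1 subswarms are independent of one another, distributing them across processes is the natural place to parallelize, which mirrors the scalability argument made in the abstract about overcoming GSO's sequential structure and using the hardware's full computing capacity.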