BAS-ADAM: an ADAM based approach to improve the performance of beetle antennae search optimizer

Bibliographic details
Published in: IEEE/CAA Journal of Automatica Sinica, 2020-03, Vol. 7 (2), p. 461-471
Main authors: Khan, Ameer Hamza; Cao, Xinwei; Li, Shuai; Katsikis, Vasilios N.; Liao, Liefa
Format: Article
Language: English
Description
Abstract: In this paper, we propose enhancements to the Beetle Antennae Search (BAS) algorithm, called BAS-ADAM, to smooth the convergence behavior and avoid trapping in local minima for highly non-convex objective functions. We achieve this by adaptively adjusting the step size in each iteration using the adaptive moment estimation (ADAM) update rule. The proposed algorithm also increases the convergence rate in narrow valleys. A key feature of the ADAM update rule is its ability to adjust the step size for each dimension separately instead of using the same step size for all dimensions. Since ADAM is traditionally used with gradient-based optimization algorithms, we first propose a gradient estimation model that does not require differentiating the objective function. As a result, the algorithm demonstrates excellent performance and a fast convergence rate when searching for the optima of non-convex functions. The efficiency of the proposed algorithm was tested on three different benchmark problems, including the training of a high-dimensional neural network. Its performance is compared with the particle swarm optimizer (PSO) and the original BAS algorithm.
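
The scheme described in the abstract can be sketched in a few lines. The following Python snippet is a minimal illustration under stated assumptions, not the authors' exact algorithm: the gradient is estimated from two antenna-like probes of the objective along a random direction (no differentiation of the objective required), and the step is then taken with the standard ADAM per-dimension update. The function name bas_adam and parameters such as the antenna length d and learning rate lr are illustrative assumptions.

import numpy as np

def bas_adam(f, x0, lr=0.1, d=0.01, beta1=0.9, beta2=0.999,
             eps=1e-8, iters=200):
    """Minimal sketch: BAS-style gradient estimation + ADAM update."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (mean) estimate
    v = np.zeros_like(x)  # second-moment (uncentered variance) estimate
    for t in range(1, iters + 1):
        b = np.random.randn(x.size)  # random antenna direction
        b /= np.linalg.norm(b)
        # Antenna-based gradient estimate: a finite difference of f along b,
        # used in place of an analytic gradient.
        g = (f(x + d * b) - f(x - d * b)) / (2 * d) * b
        m = beta1 * m + (1 - beta1) * g       # biased first moment
        v = beta2 * v + (1 - beta2) * g ** 2  # biased second moment
        m_hat = m / (1 - beta1 ** t)          # bias correction
        v_hat = v / (1 - beta2 ** t)
        # Per-dimension adaptive step size, the key ADAM feature noted above.
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Example: minimize the Rosenbrock function, a standard non-convex test
# problem with a narrow curved valley.
if __name__ == "__main__":
    rosenbrock = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    print(bas_adam(rosenbrock, x0=[-1.0, 1.5], iters=2000))

Because the per-dimension second-moment estimate rescales each coordinate of the step, the update can make progress along the floor of a narrow valley without oscillating across its steep walls, which is the convergence-rate benefit the abstract highlights.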
ISSN: 2329-9266; 2329-9274
DOI: 10.1109/JAS.2020.1003048