The Improved Stochastic Fractional Order Gradient Descent Algorithm

Full Description

Bibliographic Details
Published in: Fractal and Fractional, 2023-08, Vol. 7 (8), p. 631
Main authors: Yang, Yang; Mo, Lipo; Hu, Yusen; Long, Fei
Format: Article
Language: English
Online access: Full text
Description
Abstract: This paper proposes improved stochastic gradient descent (SGD) algorithms with a fractional-order gradient for the online optimization problem. For three scenarios, namely a standard learning rate, an adaptive-gradient learning rate, and a momentum learning rate, three new SGD algorithms are designed by combining each with a fractional-order gradient, and the corresponding regret functions are shown to converge at a sub-linear rate. The impact of the fractional order on convergence and monotonicity is then discussed, and it is proved that better performance can be obtained by adjusting the order of the fractional gradient. Finally, several practical examples are given to verify the superiority and validity of the proposed algorithms.
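
The exact update rule is not given in this record, so the following is only a minimal Python sketch of the standard-learning-rate case, assuming the commonly used first-order Caputo truncation in which the gradient is rescaled by |x_k - x_{k-1}|^(1-alpha) / Gamma(2-alpha); the function names, step size, and toy objective are illustrative and not taken from the paper.

```python
import numpy as np
from math import gamma

def fractional_sgd(grad_fn, x0, alpha=1.2, lr=0.05, n_steps=500, eps=1e-8, seed=0):
    """SGD with a fractional-order gradient and a fixed learning rate (illustrative sketch).

    The stochastic gradient is rescaled by |x_k - x_{k-1}|^(1 - alpha) / Gamma(2 - alpha),
    a first-order truncation of the Caputo fractional derivative with the previous
    iterate as the lower terminal; alpha = 1 recovers ordinary SGD.
    """
    rng = np.random.default_rng(seed)
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev - lr * grad_fn(x_prev, rng)          # one plain SGD step to seed x_{k-1}
    for _ in range(n_steps):
        g = grad_fn(x, rng)                          # stochastic gradient at x_k
        scale = (np.abs(x - x_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
        x_prev, x = x, x - lr * scale * g            # fractional-order update
    return x

# Toy online problem: noisy gradients of f(x) = 0.5 * ||x - 3||^2.
def noisy_grad(x, rng):
    return (x - 3.0) + 0.1 * rng.standard_normal(x.shape)

print(fractional_sgd(noisy_grad, x0=[0.0, 0.0]))     # settles near [3., 3.]
```

Setting alpha = 1 reduces the scale factor to 1, so the same routine can be used to compare the fractional-order update against ordinary SGD on the same noisy objective; the adaptive-gradient and momentum variants mentioned in the abstract would modify the step size or add a velocity term on top of this rescaled gradient.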
ISSN: 2504-3110
DOI: 10.3390/fractalfract7080631