AdaLo: Adaptive learning rate optimizer with loss for classification
Published in: Information Sciences, 2025-02, Vol. 690, p. 121607, Article 121607
Main authors: ,
Format: Article
Language: English
Online access: Full text
Abstract: Gradient-based algorithms are frequently used to optimize neural networks, and various methods have been developed to enhance their performance. Among them, the adaptive moment estimation (Adam) optimizer is well known for its effectiveness and ease of implementation. However, it suffers from poor generalization without a learning rate scheduler. It also carries a large computational burden because of its per-parameter learning rate terms, known as the second-order moments of the gradients. In this study, we propose a novel gradient descent algorithm called AdaLo, which stands for Adaptive Learning Rate Optimizer with Loss. AdaLo addresses both problems through its adaptive learning rate (ALR). First, the proposed ALR adjusts the learning rate based on the model's training progress, specifically the loss value; AdaLo's ALR therefore effectively replaces traditional learning rate schedulers. Second, the ALR is a scalar global learning rate, which reduces the computational burden. In addition, the stability of the proposed method is analyzed from the perspective of the learning rate. The advantages of AdaLo were demonstrated on non-convex functions. Simulation results indicated that the proposed optimizer outperformed Adam, AdaBelief, and diffGrad with regard to training error and test accuracy.
ISSN: 0020-0255
DOI: 10.1016/j.ins.2024.121607
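
The central idea described in the abstract, a single scalar learning rate modulated by the current loss value instead of per-parameter second-moment estimates, can be illustrated with a short sketch. The update rule below is hypothetical: the abstract does not state AdaLo's actual formula, so the step size is simply scaled by `loss / (loss + loss_scale)` on top of plain first-order momentum, and the names `loss_scaled_step`, `base_lr`, `beta`, and `loss_scale` are illustrative, not the paper's.

```python
import numpy as np

def loss_scaled_step(params, grads, momenta, loss,
                     base_lr=0.5, beta=0.9, loss_scale=1.0):
    """One optimizer step with a single, loss-driven scalar learning rate.

    Hypothetical sketch of the idea in the abstract: the global step size
    shrinks as the training loss falls (acting like a built-in scheduler),
    and only first-order momentum is tracked, so no per-parameter
    second-moment buffers are needed.
    """
    # Scalar adaptive learning rate: close to base_lr while the loss is
    # large, approaching zero as the loss approaches zero.
    alr = base_lr * (loss / (loss + loss_scale))

    new_params, new_momenta = [], []
    for p, g, m in zip(params, grads, momenta):
        m = beta * m + (1.0 - beta) * g   # first-order moment only
        new_momenta.append(m)
        new_params.append(p - alr * m)    # same scalar step for every tensor
    return new_params, new_momenta


# Toy usage: minimize f(w) = ||w||^2 / 2, whose gradient is w itself.
w = [np.array([3.0, -2.0])]
m = [np.zeros_like(w[0])]
for _ in range(500):
    grad = [w[0]]                          # gradient of the quadratic loss
    loss = 0.5 * float(np.dot(w[0], w[0]))
    w, m = loss_scaled_step(w, grad, m, loss)
print(w[0])  # a small vector near the minimum at [0, 0]
```

In this toy run the step size falls automatically as the quadratic loss decreases, which is the scheduler-like behavior the abstract attributes to the ALR; memory and compute stay at one momentum buffer per parameter plus a single scalar, rather than Adam's two buffers per parameter.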