Stochastic batch size for adaptive regularization in deep network optimization


Bibliographic Details
Published in: Pattern Recognition, 2022-09, Vol. 129, Article 108776
Authors: Nakamura, Kensuke; Soatto, Stefano; Hong, Byung-Woo
Format: Article
Language: English
Description
Abstract:
• Adaptive regularization for deep network optimization via a parameter-wise batch size.
• The stochastic batch size reflects local and global properties of each parameter.
• Beneficial for practical settings where the number of training examples is small.
We propose a first-order stochastic optimization algorithm that incorporates adaptive regularization for pattern recognition problems in a deep learning framework. The adaptive regularization is imposed by a stochastic process that determines the batch size for each model parameter at each optimization iteration. The stochastic batch size is set by the update probability of each parameter, which follows a distribution of gradient norms that accounts for their local and global properties in the neural network architecture, where the range of gradient norms may vary within and across layers. We empirically demonstrate the effectiveness of our algorithm on an image classification task using conventional network models applied to commonly used benchmark datasets. The quantitative evaluation indicates that our algorithm outperforms state-of-the-art optimization algorithms in generalization while being less sensitive to the choice of batch size, which often plays a critical role in optimization, thus achieving greater robustness to the selection of regularity.
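As a rough illustration of the mechanism the abstract describes, the sketch below masks each gradient element with a Bernoulli draw whose probability grows with the element's gradient magnitude relative to a layer-wise (local) and a network-wide (global) scale, so elements with small relative gradients are stochastically skipped. This is a minimal, assumed PyTorch sketch, not the authors' published update rule; the function name stochastic_batch_step and the even blend of the two scales are hypothetical choices for illustration.

import torch

def stochastic_batch_step(params, lr=0.1, eps=1e-12):
    """One SGD-like step in which each parameter element is updated only
    with a probability tied to its gradient magnitude (hypothetical sketch
    of parameter-wise stochastic regularization)."""
    grads = [p.grad for p in params if p.grad is not None]
    if not grads:
        return
    # Network-wide ("global") gradient scale.
    global_scale = torch.cat([g.abs().flatten() for g in grads]).mean()
    for p in params:
        if p.grad is None:
            continue
        g = p.grad
        # Per-tensor ("local", roughly layer-wise) gradient scale, since
        # gradient norms can vary within and across layers.
        local_scale = g.abs().mean()
        # Per-element update probability (assumed form): magnitude
        # normalized by an even blend of the local and global scales.
        prob = (g.abs() / (0.5 * (local_scale + global_scale) + eps)).clamp(0.0, 1.0)
        # Elements with small relative gradients are stochastically frozen,
        # regularizing updates much like a parameter-wise batch size.
        mask = torch.bernoulli(prob)
        p.data.add_(mask * g, alpha=-lr)

# Toy usage: one optimization step on a random mini-batch.
model = torch.nn.Linear(10, 2)
x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
torch.nn.functional.cross_entropy(model(x), y).backward()
stochastic_batch_step(list(model.parameters()), lr=0.05)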
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2022.108776