Stochastic Learning Equation using Monotone Increasing Resolution of Quantization
Format: Article
Language: English
Online access: Order full text
Abstract: In this paper, we propose a quantized learning equation with a monotonically increasing resolution of quantization, together with a stochastic analysis of the proposed algorithm. Under the white noise hypothesis for the quantization error, i.e., that the error is dense and uniformly distributed, we can regard the quantization error as i.i.d. white noise. Based on this, we show that the learning equation with monotonically increasing quantization resolution converges weakly, in the sense of distributions. The analysis in this paper shows that global optimization is possible on a domain satisfying a Lipschitz condition, rather than relying on local convergence properties such as Hessian constraints on the objective function.
DOI: 10.48550/arXiv.2112.13006
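For illustration only, the sketch below shows one way a learning update with monotonically increasing quantization resolution could look in practice: a plain gradient-descent step passed through a uniform quantizer whose resolution grows over the iterations, so the quantization error shrinks like annealed noise. The uniform quantizer, the geometric resolution schedule, and the toy objective are assumptions made for this example, not the exact equations of the paper.

```python
import numpy as np

def quantize(x, resolution):
    """Uniform quantizer with step size 1/resolution (assumed form)."""
    return np.round(x * resolution) / resolution

def quantized_gradient_descent(grad, w0, steps=2000, lr=0.05, r0=4.0, growth=1.01):
    """Gradient descent whose iterate is quantized with a monotonically
    increasing resolution r_t = r0 * growth**t, so the injected
    quantization 'noise' decays as training proceeds."""
    w = np.asarray(w0, dtype=float)
    r = r0
    for _ in range(steps):
        w = quantize(w - lr * grad(w), r)  # quantized learning update
        r *= growth                        # monotone increase of resolution
    return w

# Toy non-convex 1-D objective with several local minima (illustrative only).
f = lambda w: np.sin(3.0 * w) + 0.1 * w ** 2
df = lambda w: 3.0 * np.cos(3.0 * w) + 0.2 * w

print(quantized_gradient_descent(df, w0=2.0))
```

In this sketch the early, coarse quantization perturbs the iterates enough to escape shallow local minima, while the increasing resolution lets the update settle as the effective noise variance decreases; this mirrors, at a purely illustrative level, the annealing-like behavior the abstract attributes to monotonically increasing quantization resolution.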