Projection-Based Regularized Dual Averaging for Stochastic Optimization
Published in: IEEE Transactions on Signal Processing, 2019-05, Vol. 67 (10), p. 2720-2733
Main authors:
Format: Article
Language: English
Keywords:
Online access: Order full text
Abstract: We propose a novel stochastic-optimization framework based on the regularized dual averaging (RDA) method. The proposed approach differs from previous studies of RDA in three major aspects. First, the squared-distance loss function to a "random" closed convex set is employed for stability. Second, a sparsity-promoting metric (used implicitly by a certain proportionate-type adaptive filtering algorithm) and a quadratically weighted ℓ1 regularizer are used simultaneously. Third, the step-size and regularization parameters are both constant, owing to the smoothness of the loss function. These three differences yield an excellent sparsity-seeking property, high estimation accuracy, and insensitivity to the choice of the regularization parameter. Numerical examples show the remarkable advantages of the proposed method over existing methods (including AdaGrad and the adaptive proximal forward-backward splitting method) in applications to regression and classification with real and synthetic data.
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/TSP.2019.2908901
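
As context for the abstract, the sketch below illustrates plain ℓ1-regularized dual averaging, the baseline that the proposed projection-based variant builds on, applied to a streaming least-squares regression problem. This is a minimal illustration under stated assumptions, not the paper's algorithm: the √t-weighted proximal term, the least-squares loss, and all parameter values (lam, gamma, num_steps) are chosen for demonstration only, and the paper's random-set squared-distance loss, sparsity-promoting metric, and quadratically weighted ℓ1 regularizer are not implemented here.

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def l1_rda(stream, dim, lam, gamma, num_steps):
    """Plain l1-regularized dual averaging (RDA) on a stream of (a, d) pairs.

    At step t the iterate minimizes
        <gbar_t, w> + lam * ||w||_1 + (gamma / sqrt(t)) * ||w||_2^2 / 2,
    where gbar_t is the running average of the stochastic gradients, giving the
    closed form  w_{t+1} = -(sqrt(t) / gamma) * soft_threshold(gbar_t, lam).
    """
    w = np.zeros(dim)
    gbar = np.zeros(dim)              # running average of (sub)gradients
    for t, (a, d) in enumerate(stream, start=1):
        g = (a @ w - d) * a           # gradient of 0.5 * (a^T w - d)^2 at w
        gbar += (g - gbar) / t        # gbar_t = (1/t) * sum_{i<=t} g_i
        w = -(np.sqrt(t) / gamma) * soft_threshold(gbar, lam)
        if t >= num_steps:
            break
    return w

# Toy usage (illustrative values): recover a sparse vector from noisy
# random linear measurements.
rng = np.random.default_rng(0)
w_true = np.zeros(100)
w_true[:5] = rng.standard_normal(5)

def measurements():
    while True:
        a = rng.standard_normal(100)
        yield a, a @ w_true + 0.01 * rng.standard_normal()

w_hat = l1_rda(measurements(), dim=100, lam=0.05, gamma=5.0, num_steps=5000)
print("nonzero coefficients:", np.count_nonzero(np.abs(w_hat) > 1e-3))
```

Because the RDA update thresholds the averaged gradient directly, it tends to set many trailing coefficients to exactly zero on this toy stream; the abstract's projection-based variant with constant step-size and regularization parameters is designed to strengthen exactly this sparsity-seeking behavior.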