Hanke-Raus heuristic rule for iteratively regularized stochastic gradient descent
Format: Article
Language: English
Abstract: In this work, we present a novel variant of the stochastic gradient descent method, termed the iteratively regularized stochastic gradient descent (IRSGD) method, to solve nonlinear ill-posed problems in Hilbert spaces. Under standard assumptions, we demonstrate that the mean square iteration error of the method converges to zero for exact data. In the presence of noisy data, we first propose a heuristic parameter choice rule (HPCR) based on the method suggested by Hanke and Raus, and then apply the IRSGD method in combination with HPCR. Notably, HPCR selects the regularization parameter without requiring any a priori knowledge of the noise level. We show that, for noisy data, the method terminates in finitely many steps and has regularizing features. Further, we discuss the convergence rates of the method under HPCR as well as under the discrepancy principle, using well-known source and other related conditions. To the best of our knowledge, this is the first work that establishes both the regularization properties and convergence rates of a stochastic gradient method using a heuristic-type rule in the setting of infinite-dimensional Hilbert spaces. Finally, we provide numerical experiments to showcase the practical efficacy of the proposed method.
DOI: 10.48550/arxiv.2412.02397
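To make the abstract's ingredients concrete, the following is a minimal sketch of the general idea on a toy linear problem: a stochastic gradient step for one randomly sampled equation, a decaying regularization term added to each step (the "iterative regularization"), and a Hanke-Raus-type heuristic that picks the stopping index by minimizing a residual-based functional without knowing the noise level. All names, step sizes, and the specific functional `k * ||A x_k - y||^2` are illustrative assumptions, not the authors' actual scheme, which is stated for nonlinear operators in Hilbert spaces.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ill-conditioned linear problem A x = y, standing in for the
# nonlinear operator equation of the paper (illustrative only).
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])  # Hilbert matrix
x_true = np.ones(n)
y = A @ x_true + 1e-3 * rng.standard_normal(n)  # noisy data, noise level not used below

# IRSGD-style iteration (sketch): gradient of one randomly chosen
# equation plus a decaying Tikhonov-type term lam_k * x.
x = np.zeros(n)
eta = 0.5        # constant step size (assumed)
lam0 = 1e-2      # initial regularization weight (assumed)
K = 2000
psi = []         # Hanke-Raus-type functional; one common variant: k * ||A x_k - y||^2
iterates = []
for k in range(1, K + 1):
    i = rng.integers(n)                  # sample one row / one equation
    grad = A[i] * (A[i] @ x - y[i])      # stochastic gradient of 0.5*(a_i.x - y_i)^2
    lam_k = lam0 / k                     # iterative regularization: lam_k -> 0
    x = x - eta * (grad + lam_k * x)
    r = np.linalg.norm(A @ x - y)
    psi.append(k * r**2)
    iterates.append(x.copy())

# Heuristic choice: stop at the index minimizing psi, no noise level needed.
k_star = int(np.argmin(psi))
x_star = iterates[k_star]
print("chosen index:", k_star + 1, "error:", np.linalg.norm(x_star - x_true))
```

The point of the heuristic is visible in the trade-off inside `psi`: the residual factor favors running longer, while the factor `k` penalizes late iterates, whose noise amplification a purely residual-based criterion would miss.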