DRSOM: A Dimension Reduced Second-Order Method
Format: Article
Language: English
Abstract: In this paper, we propose a Dimension-Reduced Second-Order Method (DRSOM) for convex and nonconvex (unconstrained) optimization. Under a trust-region-like framework, our method preserves the convergence properties of second-order methods while using curvature information in only a few directions. Consequently, the computational overhead of our method remains comparable to that of first-order methods such as gradient descent. Theoretically, we show that the method has local quadratic convergence and a global convergence rate of $O(\epsilon^{-3/2})$ for satisfying the first-order and second-order optimality conditions, provided the subspace satisfies a commonly adopted approximate Hessian assumption. We further show that this assumption can be removed if we periodically perform a corrector step using a Krylov-like method in the final stage of the algorithm. The applicability and performance of DRSOM are demonstrated by various computational experiments, including $L_2 - L_p$ minimization, CUTEst problems, and sensor network localization.
DOI: 10.48550/arxiv.2208.00208
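
The abstract describes the key idea (second-order updates restricted to a low-dimensional subspace inside a trust-region-like framework) but not the subspace construction. Below is a minimal, self-contained Python sketch of one such dimension-reduced step, under the assumption that the subspace is spanned by the negative gradient and the previous step (a common choice for DRSOM-style methods). The function names (`drsom_like_step`, `solve_tr_2d`), the finite-difference Hessian-vector products, and the fixed trust-region radius are illustrative simplifications, not the authors' implementation.

```python
import numpy as np

def hvp(grad, x, v, eps=1e-6):
    """Finite-difference Hessian-vector product: H v ~ (g(x + eps v) - g(x)) / eps."""
    return (grad(x + eps * v) - grad(x)) / eps

def solve_tr_2d(Q, c, G, delta, iters=60):
    """Solve min c^T a + 0.5 a^T Q a  s.t.  a^T G a <= delta^2  for a in R^2."""
    def trial(lam):
        try:
            return np.linalg.solve(Q + lam * G, -c)
        except np.linalg.LinAlgError:
            return None

    def feasible(lam, a):
        return (a is not None
                and np.all(np.linalg.eigvalsh(Q + lam * G) > 0)
                and a @ G @ a <= delta ** 2)

    a = trial(0.0)
    if feasible(0.0, a):                 # interior (Newton) solution of the 2-D model
        return a
    lo, hi = 0.0, 1.0
    while not feasible(hi, trial(hi)):   # find a large enough Lagrange multiplier
        hi *= 2.0
    for _ in range(iters):               # bisect on the multiplier
        mid = 0.5 * (lo + hi)
        if feasible(mid, trial(mid)):
            hi = mid
        else:
            lo = mid
    return trial(hi)

def drsom_like_step(grad, x, d_prev, delta=1.0):
    """One dimension-reduced trust-region step over span{-grad, previous step}."""
    g = grad(x)
    V = np.stack([-g, d_prev], axis=1)   # n x 2 basis (assumed linearly independent)
    HV = np.column_stack([hvp(grad, x, V[:, i]) for i in range(2)])
    Q = V.T @ HV
    Q = 0.5 * (Q + Q.T)                  # symmetrize the 2 x 2 reduced Hessian
    c = V.T @ g                          # reduced gradient
    G = V.T @ V                          # Gram matrix: ||V a||^2 = a^T G a
    a = solve_tr_2d(Q, c, G, delta)
    step = V @ a
    return x + step, step

# Usage sketch: a strongly convex quadratic in R^50 solved via 2-D subproblems.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
A = A.T @ A + np.eye(50)
b = rng.standard_normal(50)
grad = lambda x: A @ x - b
x = np.zeros(50)
d = 1e-3 * rng.standard_normal(50)      # random bootstrap for the momentum direction
for _ in range(100):
    x, d = drsom_like_step(grad, x, d)
print(np.linalg.norm(grad(x)))          # gradient norm shrinks toward zero
```

Each iteration needs only two Hessian-vector products and a 2 x 2 subproblem, which illustrates the abstract's point that the per-iteration cost stays comparable to gradient descent even though curvature information is used.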