Fast Convex Optimization via Differential Equation with Hessian-Driven Damping and Tikhonov Regularization


Detailed Description

Bibliographic Details
Published in: Journal of Optimization Theory and Applications, 2024-10, Vol. 203 (1), p. 42-82
Main Authors: Zhong, Gangfan; Hu, Xiaozhe; Tang, Ming; Zhong, Liuqiang
Format: Article
Language: English
Online Access: Full text
Description
Abstract: In this paper, we consider a class of second-order ordinary differential equations with Hessian-driven damping and Tikhonov regularization, which arises from the minimization of a smooth convex function in Hilbert spaces. Inspired by Attouch et al. (J Differ Equ 261:5734-5783, 2016), we establish that the function value along the solution trajectory converges to the optimal value, and prove that the convergence rate can be as fast as o(1/t^2). By constructing a proper energy function, we prove that the trajectory strongly converges to the minimum-norm minimizer of the objective function. Moreover, we propose a gradient-based optimization algorithm based on numerical discretization, and demonstrate its effectiveness in numerical experiments.
ISSN: 0022-3239, 1573-2878
DOI: 10.1007/s10957-024-02462-x
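To make the abstract's setting concrete, the sketch below shows a generic gradient-based discretization of an ODE of this type. It is not the paper's actual scheme: the damping coefficient alpha, the Hessian coefficient beta, the step size h, and the vanishing Tikhonov coefficient eps(t) = c/(1+t)^2 are all illustrative choices, and the Hessian-driven term is approximated by a gradient difference, a common device for avoiding explicit Hessians.

```python
import numpy as np

def hessian_damped_tikhonov_descent(grad_f, x0, alpha=3.0, beta=1.0,
                                    h=0.05, c=1.0, n_iter=5000):
    """Illustrative sketch (not the paper's exact algorithm) of a
    gradient-based discretization of an ODE of the form

        x''(t) + (alpha/t) x'(t) + beta * Hess f(x(t)) x'(t)
               + grad f(x(t)) + eps(t) x(t) = 0,

    where eps(t) is a Tikhonov regularization coefficient that vanishes
    as t -> infinity (here the hypothetical choice eps(t) = c/(1+t)^2).
    The Hessian-driven term Hess f(x) x' is replaced by the gradient
    difference grad_f(x_k) - grad_f(x_{k-1}).
    """
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    g_prev = grad_f(x)
    for k in range(1, n_iter + 1):
        t = k * h
        g = grad_f(x)
        eps = c / (1.0 + t) ** 2           # vanishing Tikhonov coefficient
        momentum = 1.0 - alpha * h / t     # from the (alpha/t) viscous damping
        x_next = (x + momentum * (x - x_prev)
                  - h ** 2 * (g + eps * x)       # gradient + Tikhonov term
                  - beta * h * (g - g_prev))     # Hessian-driven correction
        x_prev, x, g_prev = x, x_next, g
    return x
```

For example, on f(x) = ||x||^2 / 2 (so grad_f is the identity), the iterates approach the minimum-norm minimizer, which here is the origin; the Tikhonov term is what biases the trajectory toward that particular minimizer when the solution set is not a singleton.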