Two efficient nonlinear conjugate gradient methods with restart procedures and their applications in image restoration


Full Description

Saved in:
Bibliographic Details
Published in: Nonlinear Dynamics 2023-03, Vol. 111 (6), p. 5469-5498
Main Authors: Jiang, Xian-Zhen; Zhu, Yi-Han; Jian, Jin-Bao
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: The nonlinear conjugate gradient method (CGM) is one of the most efficient iterative methods for large-scale optimization problems. In this paper, based on the Fletcher–Reeves and Dai–Yuan CGMs, two restart CGMs with different restart procedures are proposed for unconstrained optimization. Their restart conditions are designed according to their conjugate parameters so as to ensure that their search directions satisfy the sufficient descent condition. Under the usual assumptions, and with steplengths generated by the weak Wolfe line search, the proposed methods are proved to be globally convergent. To test the validity of the proposed methods, we choose four restart directions for each method and perform large-scale numerical experiments on unconstrained optimization and image restoration problems. Moreover, we report detailed numerical results and performance profiles, which show the encouraging efficiency and applicability of the proposed methods, even in comparison with current well-accepted methods.
ISSN: 0924-090X
eISSN: 1573-269X
DOI: 10.1007/s11071-022-08013-1
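The abstract sketches the ingredients of the method: a conjugate-gradient update (Fletcher–Reeves or Dai–Yuan), a restart rule that falls back to a safe direction when sufficient descent fails, and a weak Wolfe line search for the steplength. The paper's exact restart conditions and parameters are not given in the abstract; the following is a minimal illustrative sketch of a restarted Fletcher–Reeves CGM, where the specific restart test, the line-search constants `c1`/`c2`, and the steepest-descent restart direction are all assumptions, not the authors' method.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion search for a step t satisfying the weak Wolfe
    conditions:  f(x+t*d) <= f(x) + c1*t*g'd   and   g(x+t*d)'d >= c2*g'd.
    (Illustrative constants; the paper's choices may differ.)"""
    lo, hi = 0.0, np.inf
    t = 1.0
    fx = f(x)
    dg = grad(x) @ d                      # directional derivative at x
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * dg:
            hi = t                        # Armijo condition fails: shrink step
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * dg:
            lo = t                        # curvature condition fails: grow step
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t

def restarted_fr_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Fletcher–Reeves CG that restarts with the steepest-descent direction
    whenever the new direction is not a sufficient descent direction
    (a hypothetical restart test, stated only for illustration)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = weak_wolfe(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)   # Fletcher–Reeves parameter
        d_new = -g_new + beta_fr * d
        # Restart: require g'd <= -eps*||g||^2, else take -g (sufficient descent).
        if g_new @ d_new > -1e-10 * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x
```

For example, minimizing the convex quadratic f(x) = ||x - 1||^2 from the origin drives the gradient to zero and recovers the minimizer at the all-ones vector. The Dai–Yuan variant would differ only in the conjugate parameter, using g_new'g_new / (d'(g_new - g)) in place of the Fletcher–Reeves ratio.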