Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization

Detailed Description

Bibliographic Details
Published in: Numerical Algorithms 2023-06, Vol. 93 (2), p. 765-783
Author: Dong, Xiaoliang
Format: Article
Language: English
Online access: Full text
Description
Summary: In this paper, we propose a modified Polak-Ribière-Polyak conjugate gradient method. Unlike existing methods, a damping factor is introduced to monitor jamming and respond instantaneously to incorrect predictions, and the resulting damping term is embedded into the conjugate gradient parameter to self-adjust the weight between the self-correcting property and global convergence. A larger damping value keeps the magnitude of the conjugate gradient parameter below that of the Fletcher-Reeves method, which benefits the sufficient descent condition of the search directions; also, to a certain extent, a dramatically shrinking value makes the parameter instantaneously approach that of the Polak-Ribière-Polyak method, which triggers the restart mechanism. Under mild conditions, we show that the proposed methods converge globally. Numerical experiments support the theoretical results.
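The abstract does not state the damped formula explicitly, so the Python sketch below is illustrative only: it implements a standard Polak-Ribière-Polyak (PRP) conjugate gradient iteration with a hypothetical damping rule, namely scaling the PRP parameter by a factor mu and capping its magnitude by the Fletcher-Reeves (FR) value. The function name damped_prp_cg, the parameter mu, and the Armijo backtracking line search are all assumptions for illustration, not the paper's actual method.

```python
import numpy as np
from scipy.optimize import rosen, rosen_der  # test problem for the demo

def damped_prp_cg(f, grad, x0, mu=0.5, tol=1e-6, max_iter=1000):
    """Conjugate gradient with a damped PRP parameter (illustrative only).

    The damping rule below -- scaling the PRP value by ``mu`` and capping
    its magnitude by the Fletcher-Reeves value -- is a hypothetical
    stand-in for the paper's formula, which the abstract does not give.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (a common choice; the paper's
        # line-search conditions may differ).
        alpha, rho, c = 1.0, 0.5, 1e-4
        fx, slope = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_prp = g_new.dot(g_new - g) / g.dot(g)  # Polak-Ribiere-Polyak
        beta_fr = g_new.dot(g_new) / g.dot(g)       # Fletcher-Reeves (>= 0)
        # Hypothetical damping: as mu shrinks, beta approaches a scaled PRP
        # value; the cap keeps |beta| below the FR value, which helps
        # preserve the sufficient descent condition.
        beta = float(np.clip(mu * beta_prp, -beta_fr, beta_fr))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Minimize the Rosenbrock function from a standard starting point.
    print(damped_prp_cg(rosen, rosen_der, [-1.2, 1.0]))  # -> near [1. 1.]
```

The cap by the Fletcher-Reeves value mirrors the abstract's claim that the parameter's magnitude stays below the FR value; how the damping term is actually embedded in the paper's parameter is not reproduced here.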
ISSN: 1017-1398 (print); 1572-9265 (electronic)
DOI: 10.1007/s11075-022-01440-6