Superlinear Convergence of Conjugate Gradients

Bibliographic Details
Published in: SIAM Journal on Numerical Analysis, 2002, Vol. 39 (1), pp. 300-329
Authors: Bernhard Beckermann, Arno B. J. Kuijlaars
Format: Article
Language: English
Description
Abstract: We give a theoretical explanation for superlinear convergence behavior observed while solving large symmetric systems of equations using the conjugate gradient method or other Krylov subspace methods. We present a new bound on the relative error after n iterations. This bound is valid in an asymptotic sense when the size N of the system grows together with the number of iterations. The bound depends on the asymptotic eigenvalue distribution and on the ratio n/N. Under appropriate conditions we show that the bound is asymptotically sharp. Our findings are related to some recent results concerning asymptotics of discrete orthogonal polynomials. An important tool in our investigations is a constrained energy problem in logarithmic potential theory. The new asymptotic bounds for the rate of convergence are illustrated by discussing Toeplitz systems as well as a model problem stemming from the discretization of the Poisson equation.
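
A minimal numerical sketch may help make the abstract concrete. The Python snippet below is not from the paper; the size N = 200, the all-ones right-hand side, and the helper name cg_residuals are illustrative assumptions. It runs a textbook conjugate gradient iteration on the tridiagonal Toeplitz matrix arising from the standard second-difference discretization of the 1D Poisson equation, one of the model problems the abstract mentions, and prints the relative residual after each iteration n, the kind of convergence history whose dependence on the ratio n/N the paper's bound describes.

```python
import numpy as np

def cg_residuals(A, b, num_iters):
    """Plain conjugate gradient from x0 = 0; returns the relative
    residual norm ||b - A x_n|| / ||b|| after each iteration."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    b_norm = np.linalg.norm(b)
    history = []
    for _ in range(num_iters):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        history.append(np.sqrt(rs_new) / b_norm)
        p = r + (rs_new / rs_old) * p   # next A-conjugate direction
        rs_old = rs_new
    return history

# Hypothetical setup: tridiagonal Toeplitz matrix from the 1D Poisson
# equation (second differences), with an all-ones right-hand side.
N = 200
A = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
b = np.ones(N)

for n, res in enumerate(cg_residuals(A, b, 60), start=1):
    print(f"n = {n:3d}   relative residual = {res:.3e}")
```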
ISSN: 0036-1429 (print), 1095-7170 (electronic)
DOI: 10.1137/S0036142999363188