The unpreconditioned conjugate gradient method constructs the $i$th iterate $x^{(i)}$ as an element of $x^{(0)} + \operatorname{span}\{r^{(0)}, Ar^{(0)}, \ldots, A^{i-1}r^{(0)}\}$ so that $(x^{(i)} - \hat{x})^{T} A (x^{(i)} - \hat{x})$ is minimized, where $\hat{x}$ is the exact solution of $Ax = b$. This minimum is guaranteed to exist in general only if $A$ is symmetric positive definite. The preconditioned version of the method uses a different subspace for constructing the iterates, but it satisfies the same minimization property, although over this different subspace. It requires in addition that the preconditioner $M$ is symmetric and positive definite.
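To make the minimization property concrete, the following sketch (Python with NumPy; the test matrix, right-hand side, and iteration count are illustrative choices, not taken from the text) checks numerically that each unpreconditioned CG iterate coincides with the $A$-norm-optimal element of $x^{(0)} + \operatorname{span}\{r^{(0)}, \ldots, A^{i-1}r^{(0)}\}$, characterized by the Galerkin condition $V^{T}(b - Ax) = 0$ for a basis $V$ of that subspace.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n = 12
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)

# Plain CG from x0 = 0, keeping every iterate.
x = np.zeros(n)
r = b - A @ x
p = r.copy()
iterates = [x.copy()]
for _ in range(5):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x = x + alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new
    iterates.append(x.copy())

# The i-th iterate should equal the A-norm-optimal element of
# span{r0, A r0, ..., A^{i-1} r0} (here x0 = 0, so r0 = b), which
# satisfies the Galerkin condition V^T (b - A x) = 0.
r0 = b
for i in range(1, 6):
    V = np.column_stack([np.linalg.matrix_power(A, k) @ r0
                         for k in range(i)])
    y = np.linalg.solve(V.T @ A @ V, V.T @ r0)
    print(i, np.linalg.norm(iterates[i] - V @ y))  # ~0 up to rounding
\end{verbatim}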
The above minimization of the error is equivalent to the residuals $r^{(i)} = b - Ax^{(i)}$ being $M^{-1}$-orthogonal (that is, $r^{(i)^{T}} M^{-1} r^{(j)} = 0$ if $i \neq j$). Since for symmetric $A$ an orthogonal basis for the Krylov subspace $\operatorname{span}\{r^{(0)}, \ldots, A^{i-1}r^{(0)}\}$ can be constructed with only three-term recurrences, such a recurrence also suffices for generating the residuals. In the Conjugate Gradient method two coupled two-term recurrences are used: one that updates the residual using a search direction vector, and one that updates the search direction with a newly computed residual. This makes the Conjugate Gradient method quite attractive computationally.
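The two coupled two-term recurrences can be read off directly from an implementation. The following is a minimal preconditioned CG sketch (Python with NumPy; the function name, tolerance, and the M_solve interface for applying $M^{-1}$ are assumptions made for illustration, not the book's pseudocode):

\begin{verbatim}
import numpy as np

def pcg(A, b, M_solve, tol=1e-10, maxit=500):
    """Minimal preconditioned CG sketch.  M_solve(r) should
    apply the inverse of an SPD preconditioner M to r."""
    x = np.zeros_like(b)
    r = b - A @ x                 # residual
    z = M_solve(r)                # preconditioned residual M^{-1} r
    p = z.copy()                  # search direction
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap        # recurrence 1: residual updated
                                  # via the search direction
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = M_solve(r)
        rz_new = r @ z
        beta = rz_new / rz
        p = z + beta * p          # recurrence 2: search direction
                                  # updated via the new residual
        rz = rz_new
    return x

# Example with a Jacobi (diagonal) preconditioner:
rng = np.random.default_rng(0)
n = 50
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)
b = rng.standard_normal(n)
x = pcg(A, b, lambda r: r / np.diag(A))
print(np.linalg.norm(b - A @ x))
\end{verbatim}

The choice of $\beta$ as the ratio of successive inner products $r^{T}M^{-1}r$ is what enforces the $M^{-1}$-orthogonality of the residuals noted above.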
Krylov sequence: For a given matrix $A$ and vector $x$, the sequence of vectors $x, Ax, A^{2}x, \ldots$, or a finite initial part of this sequence.
Krylov subspace: The subspace spanned by a Krylov sequence.
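As a small illustration of these two definitions (Python with NumPy; the matrix and vector are arbitrary examples), the columns below form an initial part of a Krylov sequence, and their column space is the corresponding Krylov subspace:

\begin{verbatim}
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
x = np.array([1.0, 0.0, 0.0])

# First four terms of the Krylov sequence: x, Ax, A^2 x, A^3 x.
K = np.column_stack([np.linalg.matrix_power(A, i) @ x
                     for i in range(4)])
# Their span is the Krylov subspace; its dimension is rank(K).
print(np.linalg.matrix_rank(K))
\end{verbatim}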
There is a close relationship between the Conjugate Gradient method and the Lanczos method for determining eigensystems, since both are based on the construction of an orthogonal basis for the Krylov subspace, and a similarity transformation of the coefficient matrix to tridiagonal form. The coefficients computed during the CG iteration then arise from the $LU$ factorization of this tridiagonal matrix.
From the CG iteration one can reconstruct the Lanczos process, and vice versa;
see Paige and Saunders [168] and Golub and Van Loan [GoVL:matcomp, §10.2.6].
This relationship can be exploited to obtain relevant information about the eigensystem of the (preconditioned) matrix $A$;
see §5.1.
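As one illustration of this connection (Python with NumPy; the test matrix, its spectrum, and the step count are invented for the example), the scalar coefficients $\alpha_{j}$ and $\beta_{j}$ produced by CG determine a tridiagonal matrix whose eigenvalues, the Ritz values, approximate eigenvalues of $A$; the entry formulas in the comments are one standard form of the CG-Lanczos correspondence:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n = 200
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 100.0, n)) @ Q.T  # SPD, spectrum [1, 100]
b = rng.standard_normal(n)

# Run m CG steps, saving the scalar coefficients alpha_j, beta_j.
m = 30
x = np.zeros(n)
r = b.copy()
p = r.copy()
alphas, betas = [], []
for _ in range(m):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    alphas.append(alpha)
    x = x + alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    betas.append(beta)
    p = r_new + beta * p
    r = r_new

# Lanczos tridiagonal matrix recovered from the CG coefficients:
#   T[1,1]   = 1/alpha_1
#   T[j,j]   = 1/alpha_j + beta_{j-1}/alpha_{j-1}   (j >= 2)
#   T[j,j+1] = T[j+1,j] = sqrt(beta_j)/alpha_j
d = [1.0 / alphas[0]] + [1.0 / alphas[j] + betas[j - 1] / alphas[j - 1]
                         for j in range(1, m)]
e = [np.sqrt(betas[j]) / alphas[j] for j in range(m - 1)]
T = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)

ritz = np.linalg.eigvalsh(T)
print(ritz[0], ritz[-1])  # extreme Ritz values approach 1 and 100
\end{verbatim}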