General Framework of Preconditioning

Preconditioned methods are designed to handle
the case when the only operation we can perform with
the matrices $B$ and $A$ of the pencil $B - \mu A$
is multiplication of a vector by $B$ and $A$.
To accelerate the convergence, we introduce a
preconditioner $T$. It is also common to call the inverse
$T^{-1}$ the preconditioner; see, e.g., the previous section.
Applying the preconditioner $T$
to a vector $r$ usually involves solving a linear system $T^{-1} w = r$.
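As a concrete illustration of this convention, the sketch below (an assumed setup: a Jacobi preconditioner and a small dense SPD test matrix, in Python/NumPy) applies $T = D^{-1}$ with $D = \mathrm{diag}(A)$ to a vector by solving a linear system with $T^{-1}$:

```python
import numpy as np

# Illustration (assumed setup): a Jacobi preconditioner for a small SPD
# matrix A.  Here T = D^{-1} with D = diag(A), so applying T to a vector r
# means solving the linear system D w = r.
n = 8
A = np.diag(np.arange(2.0, n + 2.0)) + 0.1 * np.ones((n, n))  # SPD test matrix
D = np.diag(np.diag(A))                                       # T^{-1} = D

def apply_preconditioner(r):
    # Solve T^{-1} w = r; for Jacobi this is a trivial diagonal solve.
    return np.linalg.solve(D, r)

r = np.ones(n)
w = apply_preconditioner(r)                                   # w = T r
```

Any other symmetric positive definite approximation of $A$ whose linear systems are cheap to solve (incomplete factorizations, multigrid cycles) can play the role of $T^{-1}$ here.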

In many engineering applications, preconditioned iterative
solvers for linear systems with the matrix $A$ are already available,
and efficient preconditioners $T \approx A^{-1}$
are constructed.
We shall show that the same preconditioner can be used to solve
the eigenvalue problems
$Bx = \mu x$ and $Bx = \mu Ax$.
Moreover, existing
codes for the system $Ax = b$ can often be just slightly modified to
solve the partial eigenvalue problem with the same $A$.

We will assume that the preconditioner $T$ is
symmetric positive definite.
As $A$ is also symmetric positive definite, there exist
positive constants $\delta_0, \delta_1$
such that

$\delta_0\,(T^{-1}x, x) \le (Ax, x) \le \delta_1\,(T^{-1}x, x).$   (279)

The ratio $\delta_1 / \delta_0$
can be viewed as
the spectral condition number $\kappa(TA)$
of the preconditioned matrix $TA$
and measures how well $T^{-1}$
approximates, up to a scaling, the original matrix $A$.
A smaller ratio $\delta_1 / \delta_0$
ensures faster convergence.
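The optimal constants in (279) are the extreme eigenvalues of $TA$, since $Ax = \theta\, T^{-1}x$ is equivalent to $TAx = \theta x$. The sketch below (assumed small dense test data and a Jacobi preconditioner) computes them and the resulting condition number:

```python
import numpy as np

# Sketch (assumed small dense test data): the best constants delta_0, delta_1
# in (279) are the extreme eigenvalues of T A, because A x = theta T^{-1} x
# is equivalent to T A x = theta x; their ratio is kappa(TA).
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)             # SPD matrix of the pencil
Tinv = np.diag(np.diag(A))              # Jacobi choice: T^{-1} = diag(A)
T = np.linalg.inv(Tinv)

theta = np.linalg.eigvals(T @ A).real   # real and positive for SPD T and A
delta0, delta1 = theta.min(), theta.max()
kappa = delta1 / delta0                 # spectral condition number of T A
```

With these $\delta_0, \delta_1$, the two inequalities in (279) hold for every vector $x$, and they are sharp.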

Indefinite preconditioners for
symmetric eigenproblems are also possible, but not recommended.
Iterative methods for nonsymmetric problems,
e.g., based on minimization of the residual,
should generally be used when the
preconditioner is indefinite, which may increase
computational costs considerably.

We will not assume that
the preconditioner $T$ commutes with $A$ or $B$, despite
the fact that
such an assumption would greatly simplify the theory
of preconditioned methods.

We first define, following [268], a preconditioned
single-vector iterative solver for the pencil
$B - \mu A$
as a generalized polynomial method of the following kind:

$x^{(n)} = p_n(TB,\, TA)\, x^{(0)},$   (280)

where $p_n$ is a polynomial of the $n$th degree of two
independent
variables, $x^{(0)}$ is an initial guess, and $T$ is a fixed
preconditioner.
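To see how a simple recurrence fits the form (280), consider a fixed-shift, fixed-step iteration (the shift $\sigma$ and step size $\tau$ below are illustrative assumptions, not values prescribed by the text):

```python
import numpy as np

# Sketch of why simple recurrences fit the form (280).  With an assumed fixed
# shift sigma and step size tau, the recurrence
#     x <- x + tau * T (B x - sigma * A x)
# produces x^{(n)} = (I + tau (TB - sigma TA))^n x^{(0)}, i.e. it applies a
# polynomial of degree n in the two variables TB and TA to the initial guess.
rng = np.random.default_rng(3)
n = 10
A = np.diag(rng.uniform(1.0, 2.0, n))   # toy SPD matrices (assumption)
B = np.diag(rng.uniform(0.5, 1.5, n))
T = np.diag(1.0 / np.diag(A))           # Jacobi preconditioner
sigma, tau = 0.7, 0.3
x0 = rng.standard_normal(n)

x = x0.copy()
for _ in range(3):                      # three steps of the recurrence
    x = x + tau * (T @ (B @ x - sigma * (A @ x)))

# identical to applying the degree-3 matrix polynomial to x^{(0)}
P = np.eye(n) + tau * (T @ B - sigma * (T @ A))
x_poly = np.linalg.matrix_power(P, 3) @ x0
```

The recurrence never forms $P$ explicitly; it only multiplies vectors by $B$ and $A$ and applies the preconditioner, exactly the operations assumed available.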

We only need to choose a polynomial,
either a priori or during the process of iterations,
and use a recursive formula which leads to
an iterative scheme. For
an approximation $\mu^{(n)}$
to an eigenvalue of the pencil for a given eigenvector
approximation $x^{(n)}$, the Rayleigh quotient
(11.8) is typically used:

$\mu^{(n)} = \mu(x^{(n)}) = \dfrac{(x^{(n)}, Bx^{(n)})}{(x^{(n)}, Ax^{(n)})}.$

Thus, we have a complete description of a general preconditioned
eigensolver for the pencil $B - \mu A$, as shown below:

THE PRECONDITIONED EIGENSOLVER FOR $Bx = \mu Ax$

Start: select $x^{(0)}$.

Iterate $n$ steps to compute
$x^{(n)} = p_n(TB,\, TA)\, x^{(0)}$.

Compute
$\mu^{(n)} = (x^{(n)}, Bx^{(n)}) / (x^{(n)}, Ax^{(n)})$.
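A runnable sketch of one such eigensolver follows. It is an assumed instance, not a method prescribed above: preconditioned steepest ascent for the largest eigenvalue of $Bx = \mu Ax$ with a Jacobi preconditioner and random dense test matrices. Each sweep applies a first-degree polynomial in $TB$ and $TA$ to the current iterate, so the overall iteration has the form (280).

```python
import numpy as np
from scipy.linalg import eigh

# Assumed instance: preconditioned steepest ascent for the largest
# eigenvalue mu of B x = mu A x, with a Jacobi preconditioner and
# random dense test matrices.
rng = np.random.default_rng(1)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)             # SPD
N = rng.standard_normal((n, n))
B = N @ N.T + n * np.eye(n)             # symmetric (here also SPD)
T = np.diag(1.0 / np.diag(A))           # Jacobi preconditioner T ~ A^{-1}

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
for _ in range(300):
    mu = (x @ B @ x) / (x @ A @ x)      # Rayleigh quotient for the pencil
    w = T @ (B @ x - mu * (A @ x))      # preconditioned residual
    nw = np.linalg.norm(w)
    if nw < 1e-12:                      # converged to working precision
        break
    S = np.column_stack([x, w / nw])
    # Rayleigh-Ritz on span{x, w}: largest Ritz pair of the projected pencil
    vals, vecs = eigh(S.T @ B @ S, S.T @ A @ S)
    x = S @ vecs[:, -1]
    x /= np.linalg.norm(x)

mu = (x @ B @ x) / (x @ A @ x)          # final eigenvalue approximation
```

The $2 \times 2$ Rayleigh-Ritz step picks the best linear combination of $x$ and the preconditioned residual, which amounts to a locally optimal choice of the step size.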

With $A = I$ and $\mu(x) = (x, Bx)/(x, x)$, we can
obtain a similar algorithm for $Bx = \mu x$.
The polynomials $p_n$ can be chosen in a special way
to force convergence of $x^{(n)}$ to an eigenvector other than
the one corresponding to an extreme eigenvalue.

Similarly, we define general preconditioned
block-iterative methods,
where a group of vectors
$x^{(n)}_j$, $j = 1, \ldots, m$,
is computed simultaneously:

$x^{(n)}_j = p^{(j)}_n(TB,\, TA)\, x^{(0)}_j, \quad j = 1, \ldots, m,$   (281)

where $p^{(j)}_n$ is a polynomial of the $n$th
degree of two independent variables, and $x^{(0)}_j$
are initial guesses.
The polynomial does not have to be
(and in practice is usually not) the same for
different values of $j$.
If it is the same, then
(281) can be rewritten as the subspace
iteration:

$\mathcal{X}^{(n)} = p_n(TB,\, TA)\, \mathcal{X}^{(0)},$   (282)

where $\mathcal{X}^{(0)}$ and $\mathcal{X}^{(n)}$ are $m$-dimensional
subspaces, spanned by the vectors $x^{(0)}_j$ and $x^{(n)}_j$,
respectively. Such a method is easier to analyze theoretically
(see an example of preconditioned
subspace iterations based on the power method with shift
in [148,149]), but it may converge more slowly in practice,
as one polynomial cannot be tuned to the convergence
of individual vectors $x^{(n)}_j$.
It is common to use (281)
recursively by combining it with a procedure
for selecting individual vectors in $\mathcal{X}^{(n)}$
as initial vectors for the next recursive step.
The Rayleigh-Ritz method is the usual choice for such a procedure.
One step of the recursion is shown below:

THE BLOCK PRECONDITIONED EIGENSOLVER FOR $Bx = \mu Ax$

Start: select
$x^{(0)}_j$, $j = 1, \ldots, m$.

Iterate $n$ steps to compute
$x^{(n)}_j = p^{(j)}_n(TB,\, TA)\, x^{(0)}_j$, $j = 1, \ldots, m$.

Use the Rayleigh-Ritz method for the pencil $B - \mu A$
in the subspace
$\mathrm{span}\{x^{(n)}_1, \ldots, x^{(n)}_m\}$,
to compute the Ritz values $\mu^{(n)}_j$
and
the corresponding Ritz vectors $\tilde{x}^{(n)}_j$.
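The sketch below shows one recursive step of such a block scheme under assumed choices that the text does not prescribe: the same illustrative polynomial $p(TB, TA) = TB$ for every $j$, a Jacobi preconditioner, and random dense test matrices. After the block of vectors is advanced, the Rayleigh-Ritz method for the pencil $B - \mu A$ in their span produces the Ritz values and Ritz vectors.

```python
import numpy as np
from scipy.linalg import eigh

# One recursive step of a block preconditioned method (assumed choices:
# polynomial p(TB, TA) = TB for every column, Jacobi preconditioner,
# random dense test matrices), repeated a few times.
rng = np.random.default_rng(2)
n, m = 30, 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)             # SPD
N = rng.standard_normal((n, n))
B = N @ N.T + n * np.eye(n)             # symmetric (here also SPD)
T = np.diag(1.0 / np.diag(A))           # Jacobi preconditioner

X = rng.standard_normal((n, m))         # initial guesses x_j^(0)
for _ in range(50):
    X = T @ (B @ X)                     # x_j <- (TB) x_j for every j
    X, _ = np.linalg.qr(X)              # keep the basis well conditioned
    # Rayleigh-Ritz for the pencil B - mu*A in span{x_1, ..., x_m}
    mus, C = eigh(X.T @ B @ X, X.T @ A @ X)
    X = X @ C                           # Ritz vectors; mus holds Ritz values
```

In practice the columns would be advanced by different, adaptively chosen polynomials $p^{(j)}_n$; the Rayleigh-Ritz extraction step is unchanged.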

In the following sections, we
consider particular examples of preconditioned eigensolvers.