In many applications, the coefficient matrix is symmetric and positive
definite. The reason for this is usually that the partial differential
operator from which it is derived is self-adjoint, coercive, and bounded
(see Axelsson and Barker [AxBa:febook]).
It follows that for the coefficient
matrix $A$ the following relation holds for any matrix $B$ from a
similar differential equation:
\[
  c_1 \leq \frac{x^T A x}{x^T B x} \leq c_2 \qquad \mbox{for all $x$,}
\]
where $c_1$, $c_2$ do not depend on the matrix size. The importance of
this is that the use of $B$ as a preconditioner gives an iterative
method with a number of iterations that does not depend on the matrix size.
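This equivalence can be checked numerically. The sketch below (a hypothetical 1-D variable coefficient, dense eigenvalue computation; not from the text) estimates $c_1$, $c_2$ as the extreme generalized eigenvalues of $Ax = \mu Bx$ and shows that they stay in a fixed interval as the matrix size grows:

```python
import numpy as np

def fd_matrix(n, a):
    """Tridiagonal FD matrix for -(a(x) u')' on (0,1) with Dirichlet BCs."""
    h = 1.0 / (n + 1)
    x = np.arange(1, n + 1) * h
    am, ap = a(x - h / 2), a(x + h / 2)   # coefficient at edge midpoints
    return (np.diag(am + ap) - np.diag(ap[:-1], 1) - np.diag(am[1:], -1)) / h**2

for n in (20, 40, 80):
    A = fd_matrix(n, lambda x: 1 + 0.5 * np.sin(2 * np.pi * x))  # variable coefficient
    B = fd_matrix(n, lambda x: np.ones_like(x))                  # constant coefficient
    # extreme generalized eigenvalues of A x = mu B x bound x^T A x / x^T B x
    mu = np.sort(np.linalg.eigvals(np.linalg.solve(B, A)).real)
    print(n, round(mu[0], 3), round(mu[-1], 3))  # bounds stay within [0.5, 1.5]
```

Since the coefficient $1 + \frac12\sin(2\pi x)$ ranges over $[0.5, 1.5]$, the generalized eigenvalues are confined to that interval for every $n$.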

Thus we can precondition our original matrix by one derived from a
different PDE, if one can be found that has attractive properties as
preconditioner.
The most common choice is to take a matrix from a separable PDE. A system involving such a matrix can be solved with
various so-called ``fast solvers'', such as FFT methods, cyclic
reduction, or the generalized marching algorithm
(see Dorr [75],
Swarztrauber [195], Bank [25] and
Bank and Rose [27]).
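As an illustration of one such fast solver (a sketch of the FFT approach only, not of the other cited algorithms; scipy is assumed available): the standard five-point Laplacian on a square is diagonalized by the type-I discrete sine transform, so a system with $n^2$ unknowns can be solved in $O(n^2 \log n)$ operations.

```python
import numpy as np
from scipy.fft import dst, idst

def fast_poisson(F, h):
    """Solve (I (x) T + T (x) I) U = F, with T = tridiag(-1, 2, -1)/h^2,
    by diagonalizing T with the type-I discrete sine transform."""
    n = F.shape[0]
    k = np.arange(1, n + 1)
    lam = (2 - 2 * np.cos(k * np.pi / (n + 1))) / h**2   # eigenvalues of T
    # transform to the eigenbasis along both axes, divide, transform back
    Fh = dst(dst(F, type=1, axis=0, norm='ortho'), type=1, axis=1, norm='ortho')
    Uh = Fh / (lam[:, None] + lam[None, :])
    return idst(idst(Uh, type=1, axis=0, norm='ortho'), type=1, axis=1, norm='ortho')

# check against a dense solve on a small grid
n = 8; h = 1.0 / (n + 1)
T = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
A = np.kron(np.eye(n), T) + np.kron(T, np.eye(n))
F = np.random.default_rng(0).standard_normal((n, n))
U = fast_poisson(F, h)
print(np.allclose(A @ U.ravel(), F.ravel()))  # True
```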

As a simple example, any elliptic operator $L$ can be preconditioned
with the Poisson operator, giving the iterative method
\[
  -\Delta (u_{m+1} - u_m) = -(L u_m - f).
\]
Concus and Golub [59] consider a transformation of this
method that speeds up its convergence.
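A one-dimensional sketch of this iteration (the variable coefficient is hypothetical, and a dense solve stands in for a fast Poisson solver):

```python
import numpy as np

def fd_matrix(n, a):
    """Tridiagonal FD matrix for -(a(x) u')' on (0,1) with Dirichlet BCs."""
    h = 1.0 / (n + 1)
    x = np.arange(1, n + 1) * h
    am, ap = a(x - h / 2), a(x + h / 2)
    return (np.diag(am + ap) - np.diag(ap[:-1], 1) - np.diag(am[1:], -1)) / h**2

n = 50
L = fd_matrix(n, lambda x: 1 + 0.5 * np.sin(2 * np.pi * x))  # elliptic operator L
P = fd_matrix(n, lambda x: np.ones_like(x))                  # Poisson operator
f = np.ones(n)
u = np.zeros(n)
for m in range(30):
    # one step: solve a Poisson problem for the correction u_{m+1} - u_m
    u = u - np.linalg.solve(P, L @ u - f)
print(np.linalg.norm(L @ u - f) / np.linalg.norm(f))
```

Here the spectrum of $P^{-1}L$ lies in $[0.5, 1.5]$, so the error is cut at least in half per step, at every grid size.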
As another example, if the original matrix arises from
\[
  -(a(x,y)\,u_x)_x - (b(x,y)\,u_y)_y = f,
\]
the preconditioner can be formed from
\[
  -(\tilde{a}(x)\,u_x)_x - (\tilde{b}(y)\,u_y)_y = f.
\]
An extension to the non-self-adjoint case is considered
by Elman and Schultz [94].

Fast solvers are attractive in that the number of operations they
require is of the order of the number of variables (or slightly
higher). Coupled with the fact that the number of iterations in the
resulting preconditioned iterative methods is independent of the
matrix size, such methods are close to optimal. However, fast solvers
are usually only applicable if the physical domain is a rectangle or
other Cartesian product structure. (For a domain consisting of a
number of such pieces, domain decomposition methods can be used;
see §).