The fast multipole method has been called one of the ten most
significant numerical algorithms discovered in the 20th century
(along with algorithms such as the fast Fourier transform), and
won its inventors, V. Rokhlin and L. Greengard, the 2001 Steele
Prize, and also earned Greengard the ACM best dissertation
award. The algorithm allows the product of certain dense
matrices with a vector to be evaluated approximately (to a
specified precision) in O(N) operations, whereas direct
multiplication requires O(N^2) operations. For extremely large
problems, the gain in efficiency and memory can be very
significant, and enables the use of more powerful modeling
approaches that may have been discarded as computationally
infeasible in the past.

The fast Gauss transform (FGT) introduced by Greengard and Strain
is an important variant of the more general fast multipole method.
While the fast multipole method has been applied successfully in
many mathematics and physics domains, the fast Gauss transform is
widely used in computer vision and pattern recognition
applications.
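Concretely, the discrete Gauss transform evaluates, at each target point y, the weighted sum of Gaussians G(y) = sum_i q_i exp(-||y - x_i||^2 / h^2) centered at the source points x_i. A minimal sketch of the direct O(MN) evaluation that the FGT accelerates (function and parameter names are illustrative, not from the original):

```python
import math

def direct_gauss_transform(sources, targets, weights, h):
    """Direct O(M*N) evaluation of the discrete Gauss transform:
    G(y) = sum_i q_i * exp(-||y - x_i||^2 / h^2),
    where h is the Gaussian bandwidth."""
    results = []
    for y in targets:
        total = 0.0
        for x, q in zip(sources, weights):
            # Squared Euclidean distance between target y and source x.
            d2 = sum((yc - xc) ** 2 for yc, xc in zip(y, x))
            total += q * math.exp(-d2 / h ** 2)
        results.append(total)
    return results
```

The FGT reduces this quadratic cost to linear by grouping sources into boxes and replacing far-field interactions with truncated Hermite and Taylor expansions.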
Although the original FGT has been successfully applied to
problems in low-dimensional spaces, it suffers from two serious
defects in higher-dimensional spaces:

The exponential growth of complexity with dimensionality.

The use of the box data structure in the FGT is inefficient in higher
dimensions.
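The second defect is easy to quantify: a uniform grid with r subdivisions per axis in d dimensions contains r^d boxes, so the box data structure (and the work of visiting it) grows exponentially with the dimension. A tiny illustration (the choice r = 10 is an assumption for the example):

```python
# A uniform grid with r subdivisions per axis in d dimensions
# contains r**d boxes: the count grows exponentially with d.
def box_count(r, d):
    return r ** d

# With just 10 subdivisions per axis:
# d = 1  ->             10 boxes
# d = 3  ->          1,000 boxes
# d = 10 -> 10,000,000,000 boxes
```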

To overcome these difficulties of the FGT in higher dimensional
spaces, we propose a new multivariate Taylor expansion whose
number of terms grows only polynomially with the dimensionality.
To adaptively fit the density of the points, we use the k-center
algorithm (a.k.a. the farthest-point clustering algorithm) to
subdivide the space. The complexity of the k-center algorithm is
O(n log k).
We also give a relatively simple error bound for our improved
fast Gauss transform.
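The k-center (farthest-point) subdivision can be sketched with Gonzalez's greedy heuristic, which repeatedly adds the point farthest from the centers chosen so far; this direct version costs O(kn) distance evaluations, while the two-phase algorithm of Feder and Greene reaches O(n log k). A minimal sketch (names are illustrative):

```python
def farthest_point_clustering(points, k):
    """Gonzalez's greedy 2-approximation to the k-center problem.
    Repeatedly adds the point farthest from all chosen centers.
    Costs O(k*n) distance evaluations."""
    def dist2(a, b):
        return sum((ac - bc) ** 2 for ac, bc in zip(a, b))

    centers = [points[0]]  # arbitrary first center
    # Squared distance from each point to its nearest chosen center.
    d2 = [dist2(p, centers[0]) for p in points]
    for _ in range(1, k):
        idx = max(range(len(points)), key=d2.__getitem__)
        centers.append(points[idx])
        for i, p in enumerate(points):
            d2[i] = min(d2[i], dist2(p, points[idx]))
    return centers
```

Each point is then assigned to its nearest center, giving clusters whose radius adapts to the local density of the data.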