The uselessness of the Fast Gauss Transform for summing Gaussian radial basis function series

  • Authors:
  • John P. Boyd

  • Affiliations:
  • Department of Atmospheric, Oceanic and Space Science, University of Michigan, 2455 Hayward Avenue, Ann Arbor, MI 48109-2143, United States

  • Venue:
  • Journal of Computational Physics
  • Year:
  • 2010

Abstract

The Fast Gauss Transform is an algorithm for summing a series of Gaussians that is sometimes much faster than direct summation. Gaussian series in d dimensions have the form ∑_{j=1}^{N} λ_j exp(−[α/h]² ‖x − x_j‖²), where the x_j are the centers, h is the average separation between centers, and α is the relative inverse width parameter. We show that the speed-up of the Fast Gauss Transform is bounded by a factor Ω(α). When α ≪ 1, Ω can be large. However, when applied to Gaussian radial basis function (RBF) interpolation, it is difficult to use the Gaussian basis in this parameter range because the interpolation matrix is exponentially ill-conditioned: the condition number is κ ≈ (1/2) exp(π²/[4α²]) for a uniform, one-dimensional grid, and larger still in two dimensions or when the grid is irregular. Furthermore, the Gaussian RBF interpolant is ill-conditioned for most series in the sense that the interpolant is the small difference of terms with exponentially large coefficients. Fornberg and Piret developed a "QR-basis" that ameliorates this difficulty for approximations on the surface of a sphere, but because the recombined basis functions are perturbed spherical harmonics, not Gaussians, the Fast Gauss Transform is no longer applicable. Solving the interpolation matrix system by a preconditioned iteration is less sensitive to the condition number than a direct method, both because iterations are self-correcting and because the preconditioning reduces the spread of the eigenvalues. However, each iteration requires a matrix-vector multiply, which is fast only if this operation can be performed by some species of Fast Summation. When α ~ O(1), alas, we show that Ω is not large and the Fast Gauss Transform is not accelerative. Gaussian RBFs are unusual among RBF species in lacking long-range interactions, owing to the exponential decay of the Gaussians away from their centers; many other RBF species do have long-range interactions, and it is well established that these can be accelerated by fast multipole and treecode algorithms. We offer a less rigorous scale-analysis argument to explain why the underlying difficulty in accelerating short-range interactions is not peculiar to the Gaussian RBF basis or to the Fast Gauss Transform, but rather is likely to be a generic difficulty in accelerating the short-range interactions of almost any RBF basis with almost any Fast Summation.
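
The conditioning estimate quoted above is easy to probe numerically. The following is a minimal Python sketch, not taken from the paper; the grid size N = 40 and the sample values of α are illustrative choices. It assembles the Gaussian RBF interpolation matrix on a uniform one-dimensional grid and compares its 2-norm condition number with the asymptotic estimate κ ≈ (1/2) exp(π²/[4α²]); because N is finite and boundary effects are present, the two should agree only to leading order, with the measured κ growing exponentially as α shrinks.

```python
import numpy as np

def gaussian_interp_matrix(x, alpha, h):
    """A_ij = exp(-(alpha/h)^2 * (x_i - x_j)^2) for centers x with average spacing h."""
    diff = x[:, None] - x[None, :]
    return np.exp(-((alpha / h) ** 2) * diff ** 2)

N = 40                           # number of centers (illustrative choice)
x = np.linspace(0.0, 1.0, N)     # uniform one-dimensional grid of centers
h = x[1] - x[0]                  # separation between adjacent centers

for alpha in (1.0, 0.7, 0.5, 0.4):
    A = gaussian_interp_matrix(x, alpha, h)
    kappa = np.linalg.cond(A)                                  # measured 2-norm condition number
    estimate = 0.5 * np.exp(np.pi ** 2 / (4.0 * alpha ** 2))   # abstract's asymptotic estimate
    print(f"alpha = {alpha:4.2f}   cond(A) = {kappa:10.3e}   "
          f"(1/2)exp(pi^2/[4 alpha^2]) = {estimate:10.3e}")
```

Pushing α much below the sampled values drives κ past the reach of double precision, which is the practical face of the ill-conditioning that the abstract argues rules out the small-α regime where the Fast Gauss Transform would otherwise pay off.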