Fast Radial Basis Function Interpolation via Preconditioned Krylov Iteration

  • Authors:
  • Nail A. Gumerov; Ramani Duraiswami

  • Affiliations:
  • -;-

  • Venue:
  • SIAM Journal on Scientific Computing
  • Year:
  • 2007

Abstract

We consider a preconditioned Krylov subspace iterative algorithm presented by Faul, Goodsell, and Powell (IMA J. Numer. Anal., 25 (2005), pp. 1-24) for computing the coefficients of a radial basis function interpolant over $N$ data points. This preconditioned Krylov iteration has been demonstrated to be extremely robust to the distribution of the points, and the iteration converges rapidly. However, the iterative method has several steps whose computational and memory costs scale as $O(N^{2})$, both in the preliminary computations that build the preconditioner and in the matrix-vector product performed at each step of the iteration. We accelerate the iterative method to achieve an overall cost of $O(N\log N)$. The matrix-vector product is accelerated via the fast multipole method (FMM). The preconditioner requires the computation of a set of closest points to each point; we develop an $O(N\log N)$ algorithm for this step as well. Results are presented for multiquadric interpolation in $\mathbb{R}^{2}$ and biharmonic interpolation in $\mathbb{R}^{3}$. A novel FMM algorithm for the evaluation of sums involving multiquadric functions in $\mathbb{R}^{2}$ is presented as well.
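For context, the problem the abstract refers to can be written in standard RBF notation (this formulation and its symbols are standard and are not quoted from the paper): find coefficients $\lambda_j$ so that the interpolant matches the data at all $N$ sites.

```latex
% Standard RBF interpolation system (notation assumed, not quoted from the paper).
\[
  s(x) \;=\; \sum_{j=1}^{N} \lambda_j\, \phi\bigl(\lVert x - x_j \rVert\bigr) + p(x),
  \qquad s(x_i) = f_i, \quad i = 1, \dots, N,
\]
\[
  \begin{pmatrix} \Phi & P \\ P^{\mathsf T} & 0 \end{pmatrix}
  \begin{pmatrix} \lambda \\ c \end{pmatrix}
  =
  \begin{pmatrix} f \\ 0 \end{pmatrix},
  \qquad
  \Phi_{ij} = \phi\bigl(\lVert x_i - x_j \rVert\bigr),
\]
% phi(r) = sqrt(r^2 + c^2) for the multiquadric in R^2,
% phi(r) = r for the biharmonic spline in R^3;
% P collects the low-degree polynomial terms p(x).
```

Because $\Phi$ is dense, both building a preconditioner and applying $\Phi$ inside a Krylov iteration cost $O(N^{2})$ if done naively; the paper replaces the product with an FMM and the neighbor search with an $O(N\log N)$ algorithm. The Python sketch below illustrates that overall structure only, under stated assumptions: a dense kernel product stands in for the FMM, a k-d tree supplies the nearest-neighbor sets, a generic sparse approximate-cardinal-function construction stands in for the Faul-Goodsell-Powell preconditioner, and SciPy's GMRES stands in for the paper's specific Krylov iteration. The sample data, neighbor count `q`, and shape parameter `c` are made up for illustration, and the polynomial terms are omitted.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import lil_matrix, csr_matrix
from scipy.sparse.linalg import LinearOperator, gmres

# Illustrative sketch: multiquadric interpolation in R^2 with a local
# preconditioner and a Krylov solve.  The dense product below stands in
# for the FMM-accelerated matrix-vector product; the preconditioner is a
# generic approximate-cardinal-function construction, not the paper's.
rng = np.random.default_rng(0)
N, q, c = 1000, 30, 0.1                      # sites, neighbours, shape parameter (assumed)
X = rng.random((N, 2))                       # scattered data sites
f = np.sin(4.0 * X[:, 0]) * X[:, 1]          # synthetic data values

def phi(r2):
    """Multiquadric kernel as a function of squared distance."""
    return np.sqrt(r2 + c * c)

# Dense kernel matrix: its matrix-vector product is the O(N^2) step that
# the FMM reduces to O(N log N).
R2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
A = phi(R2)

# Nearest-neighbour sets in O(N log N) via a k-d tree (each query list
# starts with the point itself).
tree = cKDTree(X)
_, nbrs = tree.query(X, k=q)

# Sparse approximate inverse from local cardinal functions: for each site,
# solve a small q x q system on its neighbour set.
P = lil_matrix((N, N))
for i in range(N):
    J = nbrs[i]
    e = np.zeros(q)
    e[np.where(J == i)[0][0]] = 1.0
    P[i, J] = np.linalg.solve(A[np.ix_(J, J)], e)
P = csr_matrix(P)

A_op = LinearOperator((N, N), matvec=lambda v: A @ v)   # FMM stand-in
M_op = LinearOperator((N, N), matvec=lambda v: P @ v)   # preconditioner

lam, info = gmres(A_op, f, M=M_op)
print("GMRES info:", info, "residual:", np.linalg.norm(A @ lam - f))
```

Swapping the dense product for an FMM call and the generic local solves for the paper's preconditioner is what brings the overall cost down to the $O(N\log N)$ figure quoted in the abstract.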