This paper addresses the spatial interpolation of scattered data in d dimensions. The problem is approached using the theory of Spartan spatial random fields (SSRFs), focusing on a specific Gaussian SSRF, namely the fluctuation-gradient-curvature (FGC) model. A family of spatial interpolators (predictors) is formulated by maximizing the FGC-SSRF probability density function at each prediction point, conditioned on the data. An analytical expression for the general uniform bandwidth Spartan (GUBS) predictor is derived. The linear weights of this predictor involve weighted summations of kernel functions over the sample and prediction points. Approximations for the sums are obtained at the asymptotic limit of a dense sampling network, leading to simplified explicit expressions for the weights. An asymptotic locally adaptive Spartan (ALAS) predictor is defined by means of a kernel family that involves a tunable local parameter. The relevant equations are fully developed in d = 2. Using simulated data in two dimensions, we show that the ALAS prediction accuracy is comparable to that of ordinary kriging (OK), which is a spatially optimal linear predictor (SOLP). The numerical complexity of the ALAS predictor increases linearly with the sample size, in contrast with the cubic dependence of OK. For large data sets, the ALAS predictor is shown to be orders of magnitude faster than OK, at the cost of a slightly higher prediction dispersion. The performances of the ALAS predictor and OK are also compared on a data set of rainfall measurements using cross-validation measures.
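The complexity contrast stated above (linear scaling for a kernel-weighted predictor versus cubic scaling for OK) can be illustrated with a minimal sketch. The code below is an assumption-laden stand-in, not the paper's method: the Gaussian kernel, the fixed bandwidth `h`, and the exponential covariance model are all placeholders for the Spartan kernel family and covariance structure derived in the paper. It shows why the kernel predictor costs O(N) per prediction point, while OK requires solving an (N+1)×(N+1) linear system.

```python
import numpy as np

def kernel_predict(xs, zs, x0, h=0.8):
    """Kernel-weighted linear predictor (illustrative stand-in for ALAS).

    The prediction is a normalized kernel-weighted sum of the sample
    values, so the cost is O(N) per prediction point. The Gaussian
    kernel and the fixed bandwidth h are assumptions, not the paper's
    locally adaptive Spartan kernels.
    """
    d = np.linalg.norm(xs - x0, axis=1)   # distances to prediction point
    w = np.exp(-(d / h) ** 2)             # kernel weights
    w /= w.sum()                          # normalize (weights sum to 1)
    return w @ zs

def ordinary_kriging_predict(xs, zs, x0, cov):
    """Ordinary kriging: solve the (N+1) x (N+1) OK system, O(N^3) cost."""
    n = len(zs)
    D = np.linalg.norm(xs[:, None, :] - xs[None, :, :], axis=2)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = cov(D)                    # sample-to-sample covariances
    A[:n, n] = 1.0                        # Lagrange-multiplier column
    A[n, :n] = 1.0                        # unbiasedness constraint row
    b = np.append(cov(np.linalg.norm(xs - x0, axis=1)), 1.0)
    lam = np.linalg.solve(A, b)           # the O(N^3) step
    return lam[:n] @ zs                   # kriging weights times data

# Synthetic scattered data in d = 2 (smooth noise-free test field).
rng = np.random.default_rng(0)
xs = rng.uniform(0.0, 10.0, size=(200, 2))
zs = np.sin(xs[:, 0]) + np.cos(xs[:, 1])
x0 = np.array([5.0, 5.0])
exp_cov = lambda r: np.exp(-r / 2.0)      # assumed covariance model

z_kernel = kernel_predict(xs, zs, x0)
z_ok = ordinary_kriging_predict(xs, zs, x0, exp_cov)
```

For dense sampling both predictors land close to the true field value at `x0`; the practical difference is that the kernel predictor touches each sample once, whereas OK factorizes a dense covariance matrix, which is what drives the orders-of-magnitude runtime gap reported for large data sets.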