Information-based complexity
Optimal importance sampling for the approximation of integrals
Journal of Complexity
Hinrichs (2009) [3] recently studied multivariate integration defined over reproducing kernel Hilbert spaces in the randomized setting and for the normalized error criterion. In particular, he showed that such problems are strongly polynomially tractable if the reproducing kernels are pointwise nonnegative and integrable. More specifically, let n^ran(ε, INT_d) denote the minimal number of randomized function samples needed to compute an ε-approximation for the d-variate case of multivariate integration. Hinrichs proved that n^ran(ε, INT_d) ≤ ⌈(π/2)·(1/ε)²⌉ for all ε ∈ (0,1) and d ∈ ℕ. In this paper we prove that the exponent 2 of ε⁻¹ is sharp for tensor product Hilbert spaces whose univariate reproducing kernel is decomposable and for which univariate integration is non-trivial on the two parts of the decomposition. More specifically, we have n^ran(ε, INT_d) ≥ ⌈(1/8)·(1/ε)²⌉ for all ε ∈ (0,1) and d ≥ (2 ln ε⁻¹ − ln 2) / ln α⁻¹, where α ∈ [1/2, 1) depends on the particular space. We stress that these estimates hold independently of the smoothness of the functions in the Hilbert space. Hence, even for spaces of very smooth functions, the exponent of strong polynomial tractability must be 2. Our lower bounds hold not only for multivariate integration but for all linear tensor product functionals defined over a Hilbert space with a decomposable reproducing kernel, provided the univariate functional is non-trivial on the two spaces corresponding to the decomposable parts. We also present lower bounds for reproducing kernels that are not decomposable but have a decomposable part. In this case, however, it is not clear whether the lower bounds are sharp.
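As an illustration of why the exponent 2 of ε⁻¹ is the natural benchmark, the following minimal sketch (not from the paper; the integrand, the sampling density, and all names are illustrative assumptions) estimates a one-dimensional integral by importance sampling. The standard deviation of such a Monte Carlo estimator decays like n^(-1/2), so reaching error ε requires on the order of ε⁻² samples, matching the upper and lower bounds quoted above.

```python
import math
import random

def importance_sampling_integral(f, sample_p, p, n, seed=0):
    """Estimate the integral of f by averaging f(X)/p(X) over n
    i.i.d. draws X from the density p (classical importance sampling).
    The estimator is unbiased, and its error decays like n^(-1/2)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(n):
        x = sample_p(rng)
        total += f(x) / p(x)
    return total / n

# Illustrative example: integral of 3x^2 over (0,1), which equals 1.
# We sample from the density p(x) = 2x on (0,1); by inverse-CDF
# sampling, X = sqrt(U) for U uniform on (0,1).
f = lambda x: 3.0 * x * x
p = lambda x: 2.0 * x
sample_p = lambda rng: math.sqrt(rng.random())

est = importance_sampling_integral(f, sample_p, p, 200_000)
```

With n = 200,000 samples the estimator's standard deviation here is roughly 0.0008, consistent with the n^(-1/2) rate: halving the error requires quadrupling n, i.e. n grows like ε⁻².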