Random approximations to some measures of accuracy in nonparametric curve estimation
Journal of Multivariate Analysis
Approximation and Estimation Bounds for Artificial Neural Networks
Machine Learning - Special issue on computational learning theory
Asymptotic normality of a combined regression estimator
Journal of Multivariate Analysis
A well-conditioned estimator for large-dimensional covariance matrices
Journal of Multivariate Analysis
Sample covariance shrinkage for high dimensional dependent data
Journal of Multivariate Analysis
Nonparametric density estimators on R^K may fail to be consistent when the sample size n does not grow fast enough relative to the reduction in smoothing. For example, a Gaussian kernel estimator with bandwidths proportional to a sequence h_n is not consistent if n h_n^K fails to diverge to infinity. The paper studies shrinkage estimators in this setting and shows that a nonparametric density estimator can still be used meaningfully (in a sense made precise in the paper) in high dimensions, even when it is not asymptotically consistent. Owing to the "curse of dimensionality", this framework is relevant to many practical problems. In this context, unlike in other studies, the reason to shrink towards a possibly misspecified low-dimensional parametric estimator is not to improve the bias but to reduce the estimation error.
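The shrinkage idea described above can be sketched as a convex combination of a nonparametric kernel estimate and a (possibly misspecified) parametric fit. The sketch below uses a fixed weight `lam` purely for illustration; the paper's actual estimator and its data-driven choice of weight are not reproduced here, and all names (`shrinkage_density`, `lam`) are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde, multivariate_normal

rng = np.random.default_rng(0)
K = 5    # dimension; kernel estimators degrade here ("curse of dimensionality")
n = 200  # modest sample size relative to K
X = rng.standard_normal((n, K))  # sample from the (here, standard normal) density

# Nonparametric part: Gaussian kernel density estimator.
# Consistency would require n * h_n^K -> infinity as the bandwidth h_n shrinks,
# which is hard to satisfy for large K at practical sample sizes.
kde = gaussian_kde(X.T)

# Parametric part: a low-dimensional (possibly misspecified) Gaussian fit.
mu = X.mean(axis=0)
Sigma = np.cov(X.T)
param = multivariate_normal(mean=mu, cov=Sigma)

def shrinkage_density(x, lam=0.5):
    """Convex combination of parametric and nonparametric estimates.

    lam is a hypothetical fixed shrinkage weight; shrinking toward the
    parametric fit reduces estimation error rather than bias."""
    x = np.atleast_2d(x)
    return lam * param.pdf(x) + (1.0 - lam) * kde(x.T)

print(shrinkage_density(np.zeros(K)))
```

In this sketch the parametric component anchors the estimate where the kernel estimator is too variable, at the cost of inheriting the parametric model's bias when it is misspecified.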