Minimax nonparametric classification. II. Model selection for adaptation

  • Authors:
  • Yuhong Yang

  • Affiliations:
  • Department of Statistics, Iowa State University, Ames, IA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1999

Abstract

For Part I, see ibid., vol. 45, no. 7, pp. 2271-2284 (1999). We study nonparametric estimation of a conditional probability for classification based on a collection of finite-dimensional models. For the sake of flexibility, different types of models, linear or nonlinear, are allowed as long as each satisfies a dimensionality assumption. We show that with a suitable model selection criterion, the penalized maximum-likelihood estimator has a risk bounded by an index of resolvability expressing a good tradeoff among approximation error, estimation error, and model complexity. The bound requires no assumption on the target conditional probability and can be used to demonstrate the adaptivity of estimators based on model selection. Examples are given with both splines and neural nets, and problems of high-dimensional estimation are considered. The resulting adaptive estimator is shown to behave optimally or near optimally over Sobolev classes (with unknown orders of interaction and smoothness) and classes of functions with integrable Fourier transforms of the gradient. In terms of rates of convergence, the performance is the same as if one knew in advance which of these classes contains the true conditional probability. The corresponding classifier also converges optimally or nearly optimally simultaneously over these classes.
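
As a schematic illustration of the ideas above (the notation here is hypothetical, not the paper's own), the selection criterion and the index of resolvability can be sketched as follows. Given models $\mathcal{F}_k$ with dimension $m_k$ and descriptive complexity $C_k$, a penalized maximum-likelihood criterion selects

  $\hat{k} = \arg\min_{k} \big\{ -\sum_{i=1}^{n} \log \hat{p}_k(Y_i \mid X_i) + \lambda\, m_k + C_k \big\}$,

where $\hat{p}_k$ denotes the maximum-likelihood estimate of the conditional probability within $\mathcal{F}_k$. The index of resolvability for the true conditional probability $p^*$ then takes the generic form

  $R_n(p^*) = \min_{k} \big\{ \inf_{p \in \mathcal{F}_k} D(p^* \,\|\, p) + m_k/n + C_k/n \big\}$,

capturing the tradeoff among approximation error (the divergence term), estimation error ($m_k/n$), and model complexity ($C_k/n$); the risk bound described in the abstract states that the selected estimator's risk is within a constant factor of $R_n(p^*)$, uniformly over target conditional probabilities.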