Functional classification with margin conditions

  • Authors:
  • Magalie Fromont; Christine Tuleau

  • Affiliations:
  • Université Rennes 2, Rennes, France; Université Paris-Sud, Orsay, France

  • Venue:
  • COLT'06 Proceedings of the 19th annual conference on Learning Theory
  • Year:
  • 2006

Abstract

Let $(X,Y)$ be an $\mathcal{X} \times \{0,1\}$-valued random pair and consider a sample $(X_1,Y_1),\dots,(X_n,Y_n)$ drawn from the distribution of $(X,Y)$. We aim at constructing from this sample a classifier, that is, a function which predicts the value of $Y$ from the observation of $X$. The special case where $\mathcal{X}$ is a functional space is of particular interest due to the so-called curse of dimensionality. In a recent paper, Biau et al. [1] propose to filter the $X_i$'s in the Fourier basis and to apply the classical $k$-Nearest Neighbor rule to the first $d$ coefficients of the expansion. The selection of both $k$ and $d$ is made automatically via a penalized criterion. We extend this study, and note that the penalty used by Biau et al. is too heavy when we consider the minimax point of view under some margin-type assumptions. We prove that using a penalty of smaller order, or equal to zero, is preferable both in theory and in practice. Our experimental study furthermore shows that the introduction of a small-order penalty stabilizes the selection process, while preserving rather good performance.
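The procedure described in the abstract (project each curve onto the first $d$ Fourier coefficients, classify with $k$-NN, and pick $(k,d)$ by a penalized empirical criterion) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the real FFT stands in for the Fourier basis expansion, the risk is estimated by a simple holdout split, and the penalty form `pen * d / n` is a hypothetical choice made here for concreteness.

```python
import numpy as np
from numpy.fft import rfft

def fourier_coefficients(curves, d):
    """Project each discretized curve onto its first d Fourier features.

    `curves` is an (n, T) array of functions sampled on a regular grid;
    the real FFT is used here as a stand-in for the Fourier basis.
    """
    coeffs = rfft(curves, axis=1)
    # Stack real and imaginary parts, then keep the first d features.
    feats = np.concatenate([coeffs.real, coeffs.imag], axis=1)
    return feats[:, :d]

def knn_predict(train_x, train_y, test_x, k):
    """Plain k-nearest-neighbor majority vote (Euclidean distance)."""
    preds = []
    for x in test_x:
        dist = np.linalg.norm(train_x - x, axis=1)
        nn = np.argsort(dist)[:k]
        preds.append(int(train_y[nn].mean() >= 0.5))
    return np.array(preds)

def select_k_d(curves, labels, k_grid, d_grid, pen=0.0):
    """Choose (k, d) minimizing holdout risk plus a penalty pen * d / n.

    The paper's point is that a heavy penalty is suboptimal under margin
    assumptions; setting `pen` small (or zero) corresponds to the regime
    the authors advocate.
    """
    n = len(labels)
    half = n // 2
    best, best_score = None, np.inf
    for d in d_grid:
        feats = fourier_coefficients(curves, d)
        for k in k_grid:
            errs = (knn_predict(feats[:half], labels[:half],
                                feats[half:], k) != labels[half:]).mean()
            score = errs + pen * d / n
            if score < best_score:
                best, best_score = (k, d), score
    return best
```

With `pen=0.0` the criterion reduces to pure empirical risk minimization over the grid, while a small positive `pen` breaks ties toward lower-dimensional projections, which is the stabilizing effect reported in the experimental study.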