Learning with infinitely many features

  • Authors:
  • A. Rakotomamonjy; R. Flamary; F. Yger

  • Affiliations:
  • LITIS, EA 4108, Université/INSA de Rouen, Saint Etienne du Rouvray, France 76801; Laboratoire Lagrange, UMR CNRS 7293, Observatoire de la Côte d'Azur, Université de Nice Sophia-Antipolis, Nice, France; LITIS, EA 4108, Université/INSA de Rouen, Saint Etienne du Rouvray, France 76801

  • Venue:
  • Machine Learning
  • Year:
  • 2013

Abstract

We propose a principled framework for learning with infinitely many features, a situation that typically arises with continuously parametrized feature extraction methods. Such cases occur, for instance, when considering Gabor-based features in computer vision problems or when dealing with Fourier features for kernel approximations. We cast the problem as that of finding a finite subset of features that minimizes a regularized empirical risk. After analyzing the optimality conditions of this problem, we propose a simple algorithm with the flavour of a column-generation technique. We also show that, using Fourier-based features, it is possible to perform approximate infinite kernel learning. Our experimental results on several datasets show the benefits of the proposed approach in various situations, including texture classification and large-scale kernelized problems (involving about 100,000 examples).
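To make the general idea concrete, the following is a minimal sketch (not the authors' exact algorithm) of how a continuously parametrized feature family, here random Fourier features approximating an RBF kernel, can be explored with a column-generation-style loop: at each step, a pool of candidate feature parameters is sampled and the feature most correlated with the current residual of a ridge-regularized fit is added. All names, data, and settings below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (labels in {-1, +1}); purely illustrative.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = np.sign(X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=n))

def fourier_feature(X, w, b):
    """One cosine feature cos(X w + b), parametrized continuously by (w, b)."""
    return np.cos(X @ w + b)

def fit_ridge(Phi, y, lam=1e-2):
    """Ridge-regularized least squares on the current finite feature set."""
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

# Start from a single randomly drawn feature, then greedily add features.
gamma = 1.0  # bandwidth of the implicit RBF kernel being approximated
params = [(rng.normal(scale=np.sqrt(2 * gamma), size=d),
           rng.uniform(0, 2 * np.pi))]
for _ in range(30):
    Phi = np.column_stack([fourier_feature(X, w, b) for w, b in params])
    coef = fit_ridge(Phi, y)
    residual = y - Phi @ coef
    # Sample candidate parameters and keep the feature most correlated with
    # the residual: the column-generation flavour of the selection step.
    candidates = [(rng.normal(scale=np.sqrt(2 * gamma), size=d),
                   rng.uniform(0, 2 * np.pi)) for _ in range(50)]
    scores = [abs(fourier_feature(X, w, b) @ residual) for w, b in candidates]
    params.append(candidates[int(np.argmax(scores))])

Phi = np.column_stack([fourier_feature(X, w, b) for w, b in params])
coef = fit_ridge(Phi, y)
accuracy = np.mean(np.sign(Phi @ coef) == y)
print(f"training accuracy with {len(params)} selected features: {accuracy:.2f}")
```

Because each cosine feature is drawn from the spectral distribution of an RBF kernel, growing this feature set can be read as a crude form of approximate kernel learning; the paper's actual formulation, optimality analysis, and selection rule should be consulted for the principled version.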