Strong Minimax Lower Bounds for Learning

  • Authors:
  • András Antos; Gábor Lugosi

  • Affiliations:
  • Department of Mathematics and Computer Science, Faculty of Electrical Engineering, Technical University of Budapest, 1521 Budapest, Stoczek u. 2, Hungary. E-mail: antos@inf.bme.hu; Department of Economics, Pompeu Fabra University, Ramon Trias Fargas, 25-27, 08005 Barcelona, Spain. E-mail: lugosi@upf.es

  • Venue:
  • Machine Learning - Special issue on the ninth annual conference on computational learning theory (COLT '96)
  • Year:
  • 1998


Abstract

Minimax lower bounds for concept learning state, for example, that for each sample size n and learning rule g_n, there exists a distribution of the observation X and a concept C to be learnt such that the expected error of g_n is at least a constant times V/n, where V is the VC dimension of the concept class. However, these bounds say nothing about the rate of decrease of the error for a fixed distribution-concept pair. In this paper we investigate minimax lower bounds in such a (stronger) sense. We show that for several natural k-parameter concept classes, including the class of linear halfspaces, the class of balls, the class of polyhedra with a certain number of faces, and a class of neural networks, for any sequence of learning rules g_n there exists a fixed distribution of X and a fixed concept C such that the expected error is larger than a constant times k/n for infinitely many n. We also obtain such strong minimax lower bounds for the tail distribution of the probability of error, which extend the corresponding minimax lower bounds.
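To make the contrast in the abstract precise, the following is a rough formalization of the two kinds of bounds; the error notation P{g_n(X) ≠ C(X)}, the distribution symbol P, and the constants c and c' are illustrative assumptions introduced here, not notation taken from the paper (requires amsmath/amssymb):

```latex
% Classical minimax lower bound: the worst-case (distribution, concept)
% pair is allowed to change with the sample size n.
\[
  \inf_{g_n}\, \sup_{(P,\,C)} \mathbb{E}\bigl[ P\{ g_n(X) \neq C(X) \} \bigr]
    \;\ge\; c \, \frac{V}{n}
  \qquad \text{for every sample size } n .
\]
% Strong minimax lower bound (this paper): one fixed pair defeats any
% given sequence of learning rules at infinitely many sample sizes.
\[
  \forall\, (g_n)_{n \ge 1} \;\; \exists\, (P, C) : \quad
  \mathbb{E}\bigl[ P\{ g_n(X) \neq C(X) \} \bigr]
    \;\ge\; c' \, \frac{k}{n}
  \qquad \text{for infinitely many } n .
\]
```

The key difference is the order of the quantifiers: in the strong bound the pair (P, C) is fixed once and for all, yet it still forces an error of order k/n along an infinite subsequence of sample sizes, which is what rules out a faster rate for that fixed distribution-concept pair.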