Microchoice Bounds and Self Bounding Learning Algorithms

  • Authors:
  • John Langford; Avrim Blum

  • Affiliations:
  • Computer Science Department, Carnegie Mellon University, Pittsburgh, PA 15213, USA. jcl@cs.cmu.edu; Computer Science Department, Carnegie Mellon University, Pittsburgh, PA 15213, USA. avrim@cs.cmu.edu

  • Venue:
  • Machine Learning
  • Year:
  • 2003

Abstract

A major topic in machine learning is determining good upper bounds on the true error rates of learned hypotheses based on their empirical performance on training data. In this paper, we present new adaptive bounds designed for learning algorithms that operate by making a sequence of choices. These bounds, which we call Microchoice bounds, are similar to Occam-style bounds and can be used to make learning algorithms self-bounding in the style of Freund (1998). We then show how to combine these bounds with Freund's query-tree approach, yielding a version of Freund's query-tree structure that can be implemented much more efficiently.
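
To convey the flavor of the idea (this is an illustrative sketch, not the paper's exact statement or constants), an Occam-style bound charges each hypothesis a description length; the Microchoice view charges an algorithm that makes choices c_1, ..., c_d from finite sets C_1, ..., C_d only for the choices it actually makes, paying roughly the sum of ln|C_i|. The minimal Python sketch below assumes i.i.d. training examples, {0,1} losses, and a Hoeffding-style deviation term; the function name and arguments are hypothetical.

```python
import math

def microchoice_bound(empirical_error, m, choice_set_sizes, delta=0.05):
    """Illustrative Hoeffding-style upper bound on the true error of a
    hypothesis produced by a sequence of choices, where choice i was made
    from a set of choice_set_sizes[i] alternatives.

    This is only a sketch of the general form; the exact statement in
    Langford & Blum (2003) may differ.

    empirical_error  -- observed error rate on the m training examples
    m                -- number of i.i.d. training examples
    choice_set_sizes -- sizes |C_1|, ..., |C_d| of the choice spaces used
    delta            -- confidence parameter (bound holds w.p. >= 1 - delta)
    """
    # "Description length" paid only for the choices actually made.
    description_length = sum(math.log(k) for k in choice_set_sizes)
    # Hoeffding-style deviation term combining description length and confidence.
    deviation = math.sqrt((description_length + math.log(1.0 / delta)) / (2.0 * m))
    return min(1.0, empirical_error + deviation)

# Example: three decisions from sets of sizes 10, 20, and 5,
# with 1000 training examples and 8% empirical error.
print(microchoice_bound(0.08, 1000, [10, 20, 5], delta=0.05))
```

Because the penalty depends only on the sizes of the choice sets actually traversed, an algorithm can compute this bound on itself as it runs, which is the sense in which such bounds support self-bounding learning algorithms.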