Agnostically learning decision trees

  • Authors:
Parikshit Gopalan; Adam Tauman Kalai; Adam R. Klivans

  • Affiliations:
University of Washington, Seattle, WA, USA; Georgia Institute of Technology, Atlanta, GA, USA; UT-Austin, Austin, TX, USA

  • Venue:
STOC '08: Proceedings of the Fortieth Annual ACM Symposium on Theory of Computing
  • Year:
  • 2008


Abstract

We give a query algorithm for agnostically learning decision trees with respect to the uniform distribution on inputs. Given black-box access to an *arbitrary* binary function f on the n-dimensional hypercube, our algorithm finds a function that agrees with f on almost (within an ε fraction) as many inputs as the best size-t decision tree, in time poly(n, t, 1/ε). This is the first polynomial-time algorithm for learning decision trees in a harsh noise model. We also give a *proper* agnostic learning algorithm for juntas, a sub-class of decision trees, again using membership queries. Conceptually, the present paper parallels recent work towards agnostic learning of halfspaces (Kalai et al., 2005); algorithmically, it is more challenging. The core of our learning algorithm is a procedure to implicitly solve a convex optimization problem over the L1 ball in 2^n dimensions using an approximate gradient projection method.
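The abstract's central optimization primitive, gradient projection over an L1 ball, can be illustrated in the explicit (low-dimensional) setting. The sketch below is *not* the paper's algorithm, which works implicitly over 2^n coordinates; it is a standard projected-gradient loop using the sort-based L1 projection, with the function names (`project_l1_ball`, `projected_gradient`) and the toy quadratic objective being our own illustrative choices:

```python
import numpy as np

def project_l1_ball(v, z=1.0):
    """Euclidean projection of v onto the L1 ball of radius z (sort-based method)."""
    if np.abs(v).sum() <= z:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # sorted magnitudes, descending
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    rho = np.nonzero(u - (css - z) / idx > 0)[0][-1]
    theta = (css[rho] - z) / (rho + 1)    # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def projected_gradient(grad, x0, z=1.0, step=0.1, iters=200):
    """Minimize a convex function over the L1 ball of radius z:
    take a gradient step, then project back onto the feasible set."""
    x = project_l1_ball(np.asarray(x0, dtype=float), z)
    for _ in range(iters):
        x = project_l1_ball(x - step * grad(x), z)
    return x

# Toy objective: (1/2)||x - b||^2, whose constrained minimizer is the
# projection of b onto the ball, so the loop can be sanity-checked.
b = np.array([3.0, -1.0, 0.5])
x_star = projected_gradient(lambda x: x - b, np.zeros(3), z=1.0, step=0.5)
```

The paper's contribution is that, for the decision-tree objective, these same two steps (gradient and projection) can be carried out *implicitly* in time polynomial in n rather than in the ambient dimension 2^n.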