Efficient algorithms for general active learning

  • Authors: Claire Monteleoni
  • Affiliations: MIT
  • Venue: COLT'06 Proceedings of the 19th annual conference on Learning Theory
  • Year: 2006

Abstract

Selective sampling, a realistic active learning model, has received recent attention in the learning theory literature. While the analysis of selective sampling is still in its infancy, we focus here on one of the (seemingly) simplest problems that remain open. Given a pool of unlabeled examples, drawn i.i.d. from an arbitrary input distribution known to the learner, and oracle access to their labels, the objective is to achieve a target error rate with minimum label complexity, via an efficient algorithm. No prior distribution is assumed over the concept class; however, the problem remains open even under the realizability assumption: there exists a target hypothesis in the concept class that perfectly classifies all examples, and the labeling oracle is noiseless. As a precise variant of the problem, we consider the case of learning homogeneous half-spaces in the realizable setting: unlabeled examples x_t are drawn i.i.d. from a known distribution D over the surface of the unit ball in ℝ^d, and labels y_t are either –1 or +1. The target function is a half-space u·x ≥ 0, represented by a unit vector u ∈ ℝ^d such that y_t(u·x_t) > 0 for all t. We denote a hypothesis v's prediction as v(x) = SGN(v·x).
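
The abstract specifies the problem setting rather than an algorithm. As a rough illustration of that setting (examples drawn uniformly from the unit sphere, a noiseless half-space oracle, and label queries issued selectively), the following Python sketch may help. The margin-based query rule, the perceptron-style update, and parameters such as query_threshold are illustrative assumptions for this sketch only, not the algorithms analyzed in the paper.

```python
import numpy as np

def sample_unit_sphere(n, d, rng):
    """Draw n points i.i.d. from the uniform distribution on the unit sphere in R^d."""
    x = rng.normal(size=(n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def oracle_labels(u, X):
    """Noiseless labels y_t = SGN(u . x_t) from the target half-space u (realizable setting)."""
    return np.sign(X @ u)

# Illustrative selective-sampling loop: query only low-margin points,
# update with a perceptron-style step (an assumed rule, not the paper's method).
rng = np.random.default_rng(0)
d, n_stream, query_threshold = 10, 5000, 0.1

u = sample_unit_sphere(1, d, rng)[0]      # hidden target unit vector
X = sample_unit_sphere(n_stream, d, rng)  # unlabeled stream, i.i.d. from D
y = oracle_labels(u, X)                   # labels, revealed only when queried

v = np.zeros(d)                           # current hypothesis
queries = 0
for x_t, y_t in zip(X, y):
    margin = abs(v @ x_t)
    if np.linalg.norm(v) == 0 or margin < query_threshold:
        queries += 1                      # request the label from the oracle
        if np.sign(v @ x_t) != y_t:       # mistake: perceptron-style update
            v = v + y_t * x_t
            v = v / np.linalg.norm(v)

errors = np.mean(np.sign(X @ v) != y)
print(f"labels queried: {queries}/{n_stream}, empirical error: {errors:.3f}")
```

The point of the sketch is the trade-off the abstract refers to: only a fraction of the stream's labels are requested, and the goal is to reach a target error rate while keeping that label count (the label complexity) small.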