A Geometric Approach to Leveraging Weak Learners

  • Authors:
  • Nigel Duffy; David P. Helmbold

  • Venue:
  • EuroCOLT '99: Proceedings of the 4th European Conference on Computational Learning Theory
  • Year:
  • 1999

Abstract

AdaBoost is a popular and effective leveraging procedure for improving the hypotheses generated by weak learning algorithms. AdaBoost, like many other leveraging algorithms, can be viewed as performing constrained gradient descent over a potential function: at each iteration, the distribution over the sample given to the weak learner is the direction of steepest descent. We introduce a new leveraging algorithm based on a natural potential function. For this potential function, the direction of steepest descent can have negative components, so we provide two transformations that convert these directions into valid distributions. The resulting algorithms have bounds incomparable to AdaBoost's, and their empirical performance is similar to AdaBoost's.
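
The key step the abstract describes, converting a steepest-descent direction that may have negative components into a valid distribution over the training sample, can be illustrated with a short Python sketch. The two transformations below (truncating negative components, and shifting all components by the minimum) are hypothetical stand-ins for the idea; the paper's actual transformations are not given in this abstract, and all function names here are invented for the example.

```python
import numpy as np

# Illustrative sketch: a leveraging algorithm computes a steepest-descent
# direction of a potential over the per-example margins, then must turn
# that direction into a distribution for the weak learner. The two
# conversions below are hypothetical examples, not the paper's own.

def adaboost_direction(margins):
    # AdaBoost's exponential potential Phi = sum_i exp(-margin_i) has a
    # steepest-descent direction whose components are all positive, so
    # normalizing it directly yields a distribution.
    return np.exp(-np.asarray(margins, dtype=float))

def truncate_to_distribution(d):
    # Hypothetical transformation 1: zero out negative components,
    # then renormalize to sum to 1.
    d = np.clip(np.asarray(d, dtype=float), 0.0, None)
    s = d.sum()
    return d / s if s > 0 else np.full(d.shape, 1.0 / d.size)

def shift_to_distribution(d):
    # Hypothetical transformation 2: shift every component by the
    # minimum so all are nonnegative, then renormalize.
    d = np.asarray(d, dtype=float)
    d = d - d.min()
    s = d.sum()
    return d / s if s > 0 else np.full(d.shape, 1.0 / d.size)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    direction = rng.normal(size=8)  # a descent direction with mixed signs
    print(truncate_to_distribution(direction))
    print(shift_to_distribution(direction))
```

Note that truncation concentrates weight on the examples the descent direction actually favors, while shifting preserves the relative ordering of all components; either choice alters what can be proved about the resulting procedure, which is consistent with the abstract's remark that the bounds are incomparable to AdaBoost's.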