The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins

  • Authors:
  • Cynthia Rudin; Ingrid Daubechies; Robert E. Schapire

  • Venue:
  • The Journal of Machine Learning Research
  • Year:
  • 2004

Abstract

In order to study the convergence properties of the AdaBoost algorithm, we reduce AdaBoost to a nonlinear iterated map and study the evolution of its weight vectors. This dynamical systems approach allows us to understand AdaBoost's convergence properties completely in certain cases; for these cases we find stable cycles, allowing us to explicitly solve for AdaBoost's output. Using this unusual technique, we are able to show that AdaBoost does not always converge to a maximum margin combined classifier, answering an open question. In addition, we show that "non-optimal" AdaBoost (where the weak learning algorithm does not necessarily choose the best weak classifier at each iteration) may fail to converge to a maximum margin classifier even if "optimal" AdaBoost produces a maximum margin. Also, we show that if AdaBoost cycles, it cycles among "support vectors", i.e., examples that achieve the same smallest margin.
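The reduction the abstract describes can be sketched numerically. The following is a minimal illustration, not the authors' formulation: it treats one round of AdaBoost as a map on the example-weight vector, using a toy 3x3 matrix `M` (an assumption for illustration) whose entry `M[i, j]` is +1 when weak classifier `j` correctly labels example `i` and -1 otherwise. Iterating the map from uniform weights lets one watch the weight vectors settle toward a cycle.

```python
import numpy as np

# Illustrative toy problem (not a dataset from the paper): each weak
# classifier misclassifies exactly one of the three training examples.
# M[i, j] = y_i * h_j(x_i): +1 if classifier j labels example i correctly.
M = np.array([[+1, +1, -1],
              [+1, -1, +1],
              [-1, +1, +1]], dtype=float)

def adaboost_map(d, M):
    """One step of the iterated map: 'optimal' AdaBoost selects the weak
    classifier with the largest edge, then reweights and renormalizes."""
    edges = M.T @ d                       # edge r_j = sum_i d_i * M[i, j]
    j = int(np.argmax(edges))
    r = edges[j]
    alpha = 0.5 * np.log((1 + r) / (1 - r))
    d_new = d * np.exp(-alpha * M[:, j])  # downweight correct, upweight errors
    return d_new / d_new.sum()

d = np.ones(3) / 3                        # start from uniform weights
trajectory = []
for t in range(60):
    d = adaboost_map(d, M)
    trajectory.append(d.copy())

# On this toy problem the weight vectors converge to a stable 3-cycle,
# so late states repeat with period 3.
print(np.allclose(trajectory[-1], trajectory[-4], atol=1e-8))
```

Each iterate stays on the probability simplex, and the sequence of selected classifiers becomes periodic along with the weights, mirroring the cyclic behavior the paper analyzes.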