Upper Bounds for Error Rates of Linear Combinations of Classifiers

  • Authors: Alejandro Murua
  • Affiliations: Insightful Corporation, Seattle, WA
  • Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year: 2002

Abstract

A useful notion of weak dependence between many classifiers constructed with the same training data is introduced. It is shown that if this weak dependence is low and the expected margins are large, then decision rules based on linear combinations of these classifiers can achieve error rates that decrease exponentially fast. Empirical results with randomized trees and with trees constructed via boosting and bagging show that weak dependence is present in these types of trees. Furthermore, these results suggest a trade-off between weak dependence and expected margins: to compensate for low expected margins, there should be low mutual dependence between the classifiers involved in the linear combination.
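
As an informal illustration of the quantities discussed in the abstract (not the paper's formal definitions), the following Python sketch trains a small bagged ensemble of decision trees, computes empirical voting margins on held-out data, and uses the average pairwise correlation of the trees' error indicators as a rough proxy for mutual dependence. The dataset, ensemble size, and the correlation-based dependence proxy are all assumptions made for this example.

```python
# Illustrative sketch only: empirical margins and a crude pairwise-dependence
# proxy for a bagged ensemble. This does not implement the paper's formal
# notion of weak dependence or its error-rate bounds.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

n_trees = 50
preds = np.empty((n_trees, len(y_te)), dtype=int)
for t in range(n_trees):
    # Bagging: each tree is fit to a bootstrap resample of the same training data.
    idx = rng.integers(0, len(y_tr), size=len(y_tr))
    tree = DecisionTreeClassifier(random_state=t).fit(X_tr[idx], y_tr[idx])
    preds[t] = tree.predict(X_te)

# Margin of the unweighted vote (binary case): fraction of trees voting for the
# true class minus the fraction voting for the other class.
vote_true = (preds == y_te).mean(axis=0)
margins = 2.0 * vote_true - 1.0

# Crude dependence proxy: average pairwise correlation of the trees' error
# indicators on the held-out set (trees with constant errors give NaN, hence nanmean).
errors = (preds != y_te).astype(float)
corr = np.corrcoef(errors)
pairwise = corr[np.triu_indices(n_trees, k=1)]

print(f"empirical expected margin:        {margins.mean():.3f}")
print(f"mean pairwise error correlation:  {np.nanmean(pairwise):.3f}")
print(f"majority-vote error rate:         {(vote_true < 0.5).mean():.3f}")
```

Running this with more or fewer trees, or with weaker base learners (e.g., shallower trees), gives a hands-on feel for the trade-off the abstract describes: when margins shrink, keeping the ensemble error small requires the individual classifiers' errors to be less correlated.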