On selection and combination of weak learners in AdaBoost

  • Authors:
  • Changxin Gao, Nong Sang, Qiling Tang

  • Affiliations:
  • Institute for Pattern Recognition and Artificial Intelligence, Huazhong University of Science and Technology, Wuhan, 430074, China; Institute for Pattern Recognition and Artificial Intelligence, Huazhong University of Science and Technology, Wuhan, 430074, China, and Department of Mathematics and Physics, Wuhan Polytechnic Univ ...; Institute for Pattern Recognition and Artificial Intelligence, Huazhong University of Science and Technology, Wuhan, 430074, China

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2010

Abstract

Despite its great success, two key problems remain unresolved in AdaBoost algorithms: how to select the most discriminative weak learners and how to combine them optimally. In this paper, a new AdaBoost algorithm is proposed that improves on both aspects. First, we select the most discriminative weak learners by minimizing a novel distance-related criterion, namely an error-degree-weighted training error metric (ETEM) together with a generalization capability metric (GCM), rather than the training error rate alone. Second, starting from the empirically set coefficients, we combine the weak learners optimally by tuning these coefficients with a kernel-based perceptron. Experiments on synthetic and real scene data sets show that our algorithm outperforms conventional AdaBoost.
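For context, the conventional AdaBoost baseline the paper compares against can be sketched as below. Each round selects the decision stump that minimizes weighted training error — precisely the selection criterion the authors replace with their ETEM + GCM metric — and the round coefficients alpha are set by the standard closed-form rule, which the authors instead re-tune with a kernel-based perceptron. The ETEM, GCM, and perceptron details are not given in the abstract, so this is a minimal illustrative sketch of the baseline only; all names and data are hypothetical.

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    # One-feature threshold stump returning labels in {+1, -1}.
    return np.where(X[:, feat] >= thresh, 1, -1) * polarity

def adaboost_train(X, y, n_rounds=10):
    """Conventional AdaBoost with decision stumps.

    Each round picks the stump minimizing weighted training error
    (the step the paper replaces with its ETEM + GCM criterion) and
    sets its coefficient alpha by the standard closed-form rule
    (the coefficients the paper later re-tunes).
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    learners = []
    for _ in range(n_rounds):
        best = None
        # Exhaustive search over stumps: feature, threshold, polarity.
        for feat in range(X.shape[1]):
            for thresh in np.unique(X[:, feat]):
                for polarity in (1, -1):
                    pred = stump_predict(X, feat, thresh, polarity)
                    err = w[pred != y].sum()   # weighted training error
                    if best is None or err < best[0]:
                        best = (err, feat, thresh, polarity)
        err, feat, thresh, polarity = best
        err = max(err, 1e-12)                  # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # standard coefficient rule
        pred = stump_predict(X, feat, thresh, polarity)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified points
        w /= w.sum()
        learners.append((alpha, feat, thresh, polarity))
    return learners

def adaboost_predict(learners, X):
    # Sign of the coefficient-weighted vote of all weak learners.
    score = sum(a * stump_predict(X, f, t, p) for a, f, t, p in learners)
    return np.where(score >= 0, 1, -1)
```

On a toy 1-D set such as `X = [[1], [2], [3], [10], [11], [12]]` with labels `[-1, -1, -1, 1, 1, 1]`, a few rounds of `adaboost_train` recover the separating threshold exactly; the paper's contribution is to change how the stump in each round is chosen and how the final alphas are combined.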