Decision Committee Learning with Dynamic Integration of Classifiers

  • Authors:
  • Alexey Tsymbal

  • Affiliations:
  • -

  • Venue:
  • ADBIS-DASFAA '00 Proceedings of the East-European Conference on Advances in Databases and Information Systems Held Jointly with International Conference on Database Systems for Advanced Applications: Current Issues in Databases and Information Systems
  • Year:
  • 2000

Abstract

Decision committee learning has demonstrated spectacular success in reducing the classification error of learned classifiers. These techniques develop a classifier in the form of a committee of subsidiary classifiers, whose outputs are usually combined by majority vote. Voting, however, has a shortcoming: it is unable to take local expertise into account. When a new instance is difficult to classify, the average committee member is likely to give a wrong prediction, and the majority vote is then even more likely to be wrong. Instead of voting, dynamic integration of classifiers can be used, which is based on the assumption that each committee member is best inside certain subareas of the whole feature space. In this paper, the proposed dynamic integration technique is evaluated with AdaBoost and Bagging, two decision committee approaches that have recently received extensive attention. The comparison results show that boosting and bagging often achieve significantly better accuracy with dynamic integration of classifiers than with simple voting.
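
The sketch below illustrates the general idea contrasted in the abstract: simple majority voting over a bagged committee versus a dynamic-selection style of integration in which, for each test instance, the committee member with the lowest estimated error over that instance's nearest training neighbours makes the prediction. It is a minimal illustration under assumed settings (scikit-learn decision trees, synthetic data, committee size n_members and neighbourhood size k chosen arbitrarily), not the exact procedure of the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Build the committee by bagging: each member is trained on a bootstrap sample.
n_members, k = 15, 7                      # illustrative values, not from the paper
committee, member_errors = [], []
for _ in range(n_members):
    idx = rng.integers(0, len(X_train), len(X_train))
    tree = DecisionTreeClassifier().fit(X_train[idx], y_train[idx])
    committee.append(tree)
    # Per-instance errors on the training set serve as a crude
    # "local expertise" profile for dynamic integration.
    member_errors.append(tree.predict(X_train) != y_train)
errors = np.array(member_errors, dtype=float)        # shape: (members, n_train)

# Simple majority vote: every member has equal say everywhere.
votes = np.array([m.predict(X_test) for m in committee])   # (members, n_test)
majority = np.round(votes.mean(axis=0)).astype(int)

# Dynamic integration (dynamic selection flavour): estimate each member's
# local error over the k nearest training neighbours of each test instance
# and let only the locally best member classify it.
dists = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
neighbours = np.argsort(dists, axis=1)[:, :k]               # (n_test, k)
local_err = errors[:, neighbours].mean(axis=2)              # (members, n_test)
best = local_err.argmin(axis=0)                             # locally best member
dynamic = votes[best, np.arange(len(X_test))]

print("majority vote accuracy    :", (majority == y_test).mean())
print("dynamic selection accuracy:", (dynamic == y_test).mean())
```

The key design point is that the dynamic variant weights (here, selects) committee members per instance using locally estimated errors, whereas majority voting applies one global combination rule over the whole feature space.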