Classifier combining rules under independence assumptions

  • Authors:
  • Shoushan Li; Chengqing Zong

  • Affiliations:
  • National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China (both authors)

  • Venue:
  • MCS'07: Proceedings of the 7th International Conference on Multiple Classifier Systems
  • Year:
  • 2007

Abstract

Classifier combining rules are designed to fuse the results of the component classifiers in a multiple classifier system. In this paper, we first propose a theoretical explanation of one important classifier combining rule, the sum rule, from a Bayesian viewpoint under certain independence assumptions. Our explanation is more general than the previous one given by Kittler et al. [1]. We then present a new combining rule, named the SumPro rule, which combines the sum rule with the product rule in a weighted average. The weights for combining the two rules are tuned on development data using a genetic algorithm. We report an experimental evaluation and comparison of several combining rules on a biometric authentication data set. The results show that the SumPro rule holds a distinct advantage over both the sum rule and the product rule. Moreover, the new rule gradually outperforms the other popular trained combining rules as the number of classifiers increases.
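The three rules discussed in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each classifier outputs a posterior over classes, and it models SumPro as a weighted average of the (normalized) sum and product scores with a single weight w; the paper itself tunes the combination weights on development data with a genetic algorithm, and its exact weighting scheme may differ.

```python
import numpy as np

def sum_rule(posteriors):
    """Sum rule: pick the class with the largest summed posterior.

    posteriors: (n_classifiers, n_classes) array, each row a classifier's
    estimated class posteriors.
    """
    return int(np.argmax(posteriors.sum(axis=0)))

def product_rule(posteriors):
    """Product rule: pick the class with the largest product of posteriors.

    Sensitive to a single classifier assigning a near-zero posterior
    (one confident dissenter can veto a class).
    """
    return int(np.argmax(posteriors.prod(axis=0)))

def sumpro_rule(posteriors, w=0.5):
    """Hypothetical SumPro sketch: weighted average of the normalized
    sum and product scores. In the paper, the weights are tuned on
    development data by a genetic algorithm rather than fixed."""
    s = posteriors.sum(axis=0)
    p = posteriors.prod(axis=0)
    s = s / s.sum()          # normalize so the two scores are comparable
    p = p / p.sum()
    return int(np.argmax(w * s + (1.0 - w) * p))
```

For example, with two confident classifiers favoring class 0 and one strongly favoring class 1, the sum rule picks class 0 while the product rule is vetoed toward class 1; sweeping w between 0 and 1 interpolates between the two behaviors.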