Probabilistic Framework for Combining Multiple Classifiers at Abstract Level

  • Authors:
  • Hee-Joong Kang; Jin H. Kim

  • Venue:
  • ICDAR '97: Proceedings of the 4th International Conference on Document Analysis and Recognition
  • Year:
  • 1997

Abstract

Most previous studies have assumed that classifiers behave independently. This assumption degrades and biases classification performance when highly dependent classifiers are added. To overcome this weakness, multiple classifiers should be combined within a probabilistic framework, using a Bayesian formalism that does not rely on the independence assumption. The probabilistic combination of K classifiers requires a (K+1)st-order probability distribution. However, such a distribution is well known to be unmanageable to store and estimate, even for small K. Chow and Liu, as well as Lewis, proposed approximating a high-order distribution with a product of only first-order tree dependencies, i.e., second-order component distributions. However, when a classifier depends on more than one other classifier, such first-order dependencies cannot approximate the high-order distribution properly. In this paper, a probabilistic framework is proposed that identifies an optimal product set of kth-order dependencies, where 1 ≤ k ≤ K, to approximate the (K+1)st-order probability distribution, and that combines the K decisions through the identified product set using a Bayesian formalism. The framework was evaluated on the standard CENPARMI database and showed performance superior to that of other combination methods.
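As a concrete illustration of the kind of combination the abstract describes, the sketch below applies the Bayesian decision rule under a first-order (Chow-Liu style) product approximation, P(w, e1, ..., eK) ≈ P(w) P(e1 | w) ∏ P(ei | e(i-1), w), which is the baseline the paper generalizes to kth-order dependencies. This is a minimal sketch, not the authors' implementation: the chain structure, the two-class setting, and all probability tables are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Toy setting (assumed): 2 classes w in {0, 1}, K = 3 classifiers,
# each emitting an abstract-level decision e_i in {0, 1}.
prior = np.array([0.5, 0.5])  # P(w), assumed class prior

# First-order dependency chain assumed for the product approximation:
#   P(e1, e2, e3 | w) ~= P(e1 | w) * P(e2 | e1, w) * P(e3 | e2, w)
# All tables below are made-up numbers; each conditional row sums to 1.
p_e1_w = np.array([[0.9, 0.1],          # P(e1 | w): rows w, cols e1
                   [0.2, 0.8]])
p_e2_e1w = np.array([[[0.8, 0.2],       # P(e2 | e1, w): [w][e1][e2]
                      [0.4, 0.6]],
                     [[0.3, 0.7],
                      [0.1, 0.9]]])
p_e3_e2w = np.array([[[0.85, 0.15],     # P(e3 | e2, w): [w][e2][e3]
                      [0.30, 0.70]],
                     [[0.25, 0.75],
                      [0.05, 0.95]]])

def combine(e1, e2, e3):
    """Pick the class maximizing the approximated posterior P(w | e1, e2, e3)."""
    # Product approximation of the 4th-order joint, evaluated for every w.
    joint = prior * p_e1_w[:, e1] * p_e2_e1w[:, e1, e2] * p_e3_e2w[:, e2, e3]
    return int(np.argmax(joint)), joint / joint.sum()

label, posterior = combine(0, 1, 1)
print(label, posterior)
```

In practice the conditional tables would be estimated from a validation set of classifier decisions, and the paper's contribution is choosing which (possibly higher-order) dependencies to keep in the product rather than fixing a first-order chain as this sketch does.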