Maximum Contrast Classifiers

  • Authors:
  • Peter Meinicke, Thorsten Twellmann, Helge Ritter


  • Venue:
  • ICANN '02: Proceedings of the International Conference on Artificial Neural Networks
  • Year:
  • 2002

Abstract

Within the Bayesian setting of classification we present a method for classifier design based on constrained density modelling. The approach leads to the maximization of a contrast function which measures the discriminative power of the class-conditional densities used for classification. By imposing an upper bound on the density contrast, the sensitivity of the classifiers can be increased in regions with low density differences, which are usually most important for discrimination. We introduce a parametrization of the contrast in terms of modified kernel density estimators with variable mixing weights. In practice the approach shows some favourable properties: first, for fixed hyperparameters, training of the resulting Maximum Contrast Classifier (MCC) reduces to a linear program for the optimization of the mixing weights. Second, for a certain choice of the density contrast bound and the kernel bandwidth, the maximum contrast solutions lead to sparse representations of the classifiers with good generalization performance, similar to the maximum margin solutions of support vector machines. Third, the method readily extends to the general multi-class problem since training proceeds in the same way as in the binary case.
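
The abstract describes the training recipe only at a high level. The following is a minimal, hypothetical Python sketch (using numpy and scipy.optimize.linprog) of how a binary classifier with these ingredients could be set up: class-conditional densities modelled as kernel density estimators with variable mixing weights, a contrast clipped at an upper bound, and the weights obtained by linear programming. The specific objective, constraints, Gaussian kernel, and the names train_mcc, predict_mcc and contrast_bound are illustrative assumptions, not the exact formulation of Meinicke, Twellmann and Ritter.

# Hypothetical sketch of maximum-contrast training via linear programming.
# Assumed model: p(x | class c) = sum_{j: y_j = c} alpha_j K_h(x, x_j),
# contrast f(x) = p(x | +1) - p(x | -1), decision = sign(f(x)).
# LP: maximise sum_i t_i  s.t.  t_i <= y_i f(x_i),  t_i <= contrast_bound,
#     alpha >= 0, per-class mixing weights sum to one.
import numpy as np
from scipy.optimize import linprog


def gaussian_kernel(X, Z, bandwidth):
    """Unnormalised Gaussian kernel matrix K[i, j] = exp(-||X_i - Z_j||^2 / (2 h^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))


def train_mcc(X, y, bandwidth=1.0, contrast_bound=0.1):
    """Fit per-class kernel mixing weights alpha by maximising the clipped contrast."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    K = gaussian_kernel(X, X, bandwidth)
    # Variables z = [alpha_1..alpha_n, t_1..t_n]; linprog minimises, so negate t.
    c = np.concatenate([np.zeros(n), -np.ones(n)])
    # Inequality rows encode t_i - sum_j y_i y_j K_ij alpha_j <= 0.
    A_ub = np.hstack([-(y[:, None] * y[None, :]) * K, np.eye(n)])
    b_ub = np.zeros(n)
    # Per-class normalisation of the mixing weights.
    A_eq = np.zeros((2, 2 * n))
    A_eq[0, :n] = (y == +1)
    A_eq[1, :n] = (y == -1)
    b_eq = np.ones(2)
    # alpha_j >= 0; the contrast bound enters as an upper bound on each t_i.
    bounds = [(0, None)] * n + [(None, contrast_bound)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    if not res.success:
        raise RuntimeError("LP solver failed: " + res.message)
    return res.x[:n]


def predict_mcc(X_train, y_train, alpha, X_test, bandwidth=1.0):
    """Classify by the sign of the estimated density contrast."""
    y_train = np.asarray(y_train, dtype=float)
    K = gaussian_kernel(X_test, X_train, bandwidth)
    return np.sign(K @ (alpha * y_train))


# Example usage on toy 2-D data (hypothetical):
# rng = np.random.default_rng(0)
# X = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(+1, 1, (20, 2))])
# y = np.array([-1] * 20 + [+1] * 20)
# alpha = train_mcc(X, y, bandwidth=0.5, contrast_bound=0.05)
# print(predict_mcc(X, y, alpha, X, bandwidth=0.5))

In this sketch the clipping bound plays a role loosely analogous to the margin in support vector machines: training points whose contrast already reaches the bound contribute nothing further to the objective, so the linear program can drive many mixing weights to zero, which is consistent with the sparse representations the abstract reports for suitable choices of bound and bandwidth. A multi-class version could keep one kernel density estimate per class and reuse the same kind of program, as the abstract indicates.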