Within the Bayesian setting of classification, we present a method for classifier design based on constrained density modelling. The approach leads to the maximization of a contrast function that measures the discriminative power of the class-conditional densities used for classification. By imposing an upper bound on the density contrast, the sensitivity of the classifiers can be increased in regions with small density differences, which are usually the most important for discrimination. We introduce a parametrization of the contrast in terms of modified kernel density estimators with variable mixing weights. In practice the approach shows several favourable properties. First, for fixed hyperparameters, training of the resulting Maximum Contrast Classifier (MCC) is achieved by linear programming for the optimization of the mixing weights. Second, for a suitable choice of the density contrast bound and the kernel bandwidth, the maximum contrast solutions yield sparse representations of the classifiers with good generalization performance, similar to the maximum margin solutions of support vector machines. Third, the method extends readily to the general multi-class problem, since training proceeds in the same way as in the binary case.
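
The following is a minimal sketch of how the binary-case training step could look, not the authors' exact formulation: the normalised Gaussian kernel, the clipped per-sample contrast min(c, y_j (p_+(x_j) - p_-(x_j))), the equal-prior decision rule, and the function names train_mcc and predict_mcc are all illustrative assumptions. It casts the bounded contrast maximization as a linear programme over the mixing weights, with weights constrained to form valid class-conditional densities, solved here with scipy.optimize.linprog.

```python
# Hypothetical sketch of maximum-contrast training (binary case); the exact
# contrast function, the clipping form, and the decision rule are assumptions.
import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import cdist

def gaussian_kernel(X, Z, h):
    """Normalised Gaussian kernel, so each KDE integrates to one."""
    d = X.shape[1]
    sq = cdist(X, Z, "sqeuclidean")
    return np.exp(-sq / (2 * h**2)) / (2 * np.pi * h**2) ** (d / 2)

def train_mcc(X, y, h=0.5, c=0.1):
    """Train on X of shape (n, d) and labels y in {-1, +1}.

    Class-conditional densities are KDEs with variable mixing weights
    alpha_i >= 0 summing to one within each class.  We maximise the
    summed per-sample density contrast, clipped from above at c; the
    clipping keeps the optimum sensitive in regions with small density
    differences.  Variables are z = [alpha (n), t (n)], where
    t_j = min(c, y_j * f(x_j)) at the optimum and
    f(x) = sum_i alpha_i * y_i * K(x, x_i) is the density difference.
    """
    n = len(y)
    K = gaussian_kernel(X, X, h)
    # S[j, i] = y_j * y_i * K(x_j, x_i), so y_j * f(x_j) = S @ alpha.
    S = (y[:, None] * y[None, :]) * K
    # Maximise sum(t)  <=>  minimise -sum(t).
    obj = np.concatenate([np.zeros(n), -np.ones(n)])
    # t_j <= y_j * f(x_j)  <=>  -S @ alpha + t <= 0.
    A_ub = np.hstack([-S, np.eye(n)])
    b_ub = np.zeros(n)
    # Mixing weights of each class sum to one (valid densities).
    A_eq = np.zeros((2, 2 * n))
    A_eq[0, :n] = (y == 1)
    A_eq[1, :n] = (y == -1)
    b_eq = np.ones(2)
    # alpha_i >= 0; t_j clipped at the contrast bound c.
    bounds = [(0, None)] * n + [(None, c)] * n
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[:n]

def predict_mcc(X_new, X, y, alpha, h=0.5):
    """Sign of the weighted density difference (equal priors assumed)."""
    K = gaussian_kernel(X_new, X, h)
    return np.sign(K @ (alpha * y))
```

Under this formulation, the sparsity claimed in the abstract would show up as many mixing weights being driven exactly to zero at the LP optimum, leaving only a few kernel centres in the final classifier; a multi-class variant would solve the same programme with one weighted KDE per class.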