A common way of solving the multiclass categorization problem is to reformulate it as a set of binary classification problems. Discriminative binary classifiers such as Support Vector Machines (SVMs) directly optimize the decision boundary with respect to a certain cost function. In a pragmatic and computationally simple approach, Least Squares SVMs (LS-SVMs) are inferred by minimizing a related least squares regression cost function. The moderated outputs of the binary classifiers are obtained in a second step within the evidence framework. In this paper, Bayes' rule is repeatedly applied to infer the posterior multiclass probabilities from the moderated outputs of the binary plug-in classifiers and the prior multiclass probabilities. This Bayesian decoding motivates the use of loss-function-based decoding instead of Hamming decoding. For SVMs and LS-SVMs with a linear kernel, experimental evidence suggests the use of one-versus-one coding. With a Radial Basis Function kernel, one-versus-one and error-correcting output codes yield the best performance, but simpler codings may still give satisfactory results.
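The sketch below is not taken from the paper; it is a minimal illustration of the binary LS-SVM building block and one-versus-one coding described in the abstract. The LS-SVM classifier is trained by solving the usual dual linear system, and the multiclass decision is made with plain majority voting (a Hamming-style decode) rather than the paper's Bayesian combination of moderated outputs within the evidence framework, which is omitted here. The RBF kernel width sigma, the regularization constant gamma, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Radial Basis Function kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_lssvm(X, y, gamma=1.0, sigma=1.0):
    """Solve the LS-SVM dual system for binary targets y in {-1, +1}.

    Minimizing the regularized least squares cost leads to the linear system
        [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    with Omega_ij = y_i * y_j * K(x_i, x_j).
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_latent(X_train, y, alpha, b, X_test, sigma=1.0):
    """Latent output f(x) = sum_i alpha_i y_i K(x, x_i) + b."""
    return rbf_kernel(X_test, X_train, sigma) @ (alpha * y) + b

def one_vs_one_predict(X_train, y_train, X_test, classes, gamma=1.0, sigma=1.0):
    """One-versus-one coding with simple majority-vote (Hamming-style) decoding."""
    votes = np.zeros((len(X_test), len(classes)))
    for i, ci in enumerate(classes):
        for j, cj in enumerate(classes):
            if j <= i:
                continue
            mask = np.isin(y_train, [ci, cj])
            Xb = X_train[mask]
            yb = np.where(y_train[mask] == ci, 1.0, -1.0)
            alpha, b = train_lssvm(Xb, yb, gamma, sigma)
            f = lssvm_latent(Xb, yb, alpha, b, X_test, sigma)
            votes[f >= 0, i] += 1
            votes[f < 0, j] += 1
    return np.asarray(classes)[votes.argmax(axis=1)]
```

Replacing the hard votes with calibrated binary probabilities and combining them with the class priors, as the paper does via repeated application of Bayes' rule, is what turns this voting scheme into a loss-function-based decoding.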