In this paper, we consider the multiclass classification problem based on sets of independent binary classifiers. Each binary classifier represents the output of a quantized projection of the training data onto a randomly generated orthonormal basis vector, thus producing a binary label. The ensemble of all binary labels forms an analogue of a coding matrix. We analyze the properties of such matrices, and their impact on the maximum number of uniquely distinguishable classes, from an information-theoretic point of view. We also consider a notion of bit reliability for this kind of coding-matrix generation, which can serve as an alternative to other adaptive training techniques, and investigate its impact on the bit error probability. We demonstrate that, in terms of recognition rate, it is equivalent to the considered random coding matrix without any bit reliability information.
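The encoding described above can be sketched briefly. The following is a minimal illustration, not the authors' implementation: it assumes sign quantization of the projections, a random orthonormal basis obtained by QR decomposition of a Gaussian matrix, and minimum-Hamming-distance decoding against the rows of the resulting coding matrix. All function names and parameters are hypothetical.

```python
import numpy as np

def random_orthonormal_basis(dim, n_vectors, seed=0):
    # QR decomposition of a Gaussian matrix yields orthonormal columns.
    rng = np.random.default_rng(seed)
    g = rng.standard_normal((dim, n_vectors))
    q, _ = np.linalg.qr(g)
    return q  # shape (dim, n_vectors)

def binary_codes(X, basis):
    # Project each sample onto every basis vector and quantize by sign:
    # each column of the result plays the role of one binary classifier.
    return (X @ basis >= 0).astype(int)

def classify(x, basis, coding_matrix):
    # Decode by minimum Hamming distance to the rows of the coding matrix.
    code = binary_codes(x[None, :], basis)[0]
    return int(np.argmin((coding_matrix != code).sum(axis=1)))

# Example: 4 classes represented by centroids in a 16-dimensional space,
# encoded by 8 independent binary classifiers (assumed parameters).
rng = np.random.default_rng(1)
centroids = rng.standard_normal((4, 16))
basis = random_orthonormal_basis(16, 8)
M = binary_codes(centroids, basis)  # 4x8 analogue of a coding matrix
```

Here each row of `M` is the binary codeword assigned to one class, so the number of uniquely distinguishable classes is bounded by the number of distinct rows such a random matrix can produce.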