A new approach to multi-class linear dimensionality reduction
CIARP'06 Proceedings of the 11th Iberoamerican conference on Progress in Pattern Recognition, Image Analysis and Applications
Linear dimensionality reduction techniques have been well studied for the two-class problem, while the corresponding issues encountered when dealing with multiple classes are far from trivial. In this paper, we show that when dealing with multiple classes, it is not expedient to treat the task as a single multi-class problem; it is better to treat it as an ensemble of Chernoff-based two-class reductions onto different subspaces. The final decision is obtained by resorting to either a Voting, a Weighting, or a Decision Tree combination scheme. The ensemble methods were tested on benchmark datasets, demonstrating that the proposed approach is not only efficient but also yields an accuracy comparable to that obtained by the optimal Bayes classifier.
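To make the pairwise-ensemble idea concrete, the following minimal Python sketch learns one linear reduction and one two-class decision per pair of classes and combines the pairwise decisions by majority voting. It is not the authors' implementation: for brevity the pairwise projection uses Fisher's discriminant direction as a stand-in for the full Chernoff-criterion transform of Loog and Duin, and the class and function names are illustrative only.

# Sketch of a pairwise linear-reduction ensemble with majority voting.
# Assumption: Fisher's direction replaces the Chernoff-criterion transform.
import numpy as np
from itertools import combinations

def pairwise_direction(Xi, Xj):
    # Fisher direction for the two classes (stand-in for the Chernoff transform).
    mi, mj = Xi.mean(axis=0), Xj.mean(axis=0)
    Sw = np.cov(Xi, rowvar=False) + np.cov(Xj, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), mi - mj)
    return w / np.linalg.norm(w)

class PairwiseVotingEnsemble:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.pairs_ = []
        for i, j in combinations(self.classes_, 2):
            Xi, Xj = X[y == i], X[y == j]
            w = pairwise_direction(Xi, Xj)
            # Threshold halfway between the projected class means.
            t = 0.5 * ((Xi @ w).mean() + (Xj @ w).mean())
            sign = 1.0 if (Xi @ w).mean() > t else -1.0
            self.pairs_.append((i, j, w, t, sign))
        return self

    def predict(self, X):
        votes = np.zeros((X.shape[0], len(self.classes_)))
        index = {c: k for k, c in enumerate(self.classes_)}
        for i, j, w, t, sign in self.pairs_:
            decide_i = sign * (X @ w - t) > 0
            votes[decide_i, index[i]] += 1
            votes[~decide_i, index[j]] += 1
        return self.classes_[votes.argmax(axis=1)]

The Weighting and Decision Tree combination schemes mentioned above would replace the unweighted votes with, respectively, pairwise confidence weights or a sequence of pairwise eliminations; only the simplest Voting variant is sketched here.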