Linear dimensionality reduction (LDR) techniques have become increasingly important in pattern recognition (PR) because they permit a relatively simple mapping of the problem onto a lower-dimensional subspace, leading to simple and computationally efficient classification strategies. Although the field is well developed for the two-class problem, the corresponding issues encountered when dealing with multiple classes are far from trivial. In this paper, we argue that, as opposed to the traditional LDR multi-class schemes, when we are dealing with multiple classes it is not expedient to treat the problem as a multi-class problem per se. Rather, we show that it is better to treat it as an ensemble of Chernoff-based two-class reductions onto different subspaces, whence the overall solution is obtained by resorting to either a Voting, a Weighting, or a Decision Tree strategy. The experimental results obtained on benchmark datasets demonstrate that the proposed methods are not only efficient, but that they also yield accuracies comparable to those obtained by the optimal Bayes classifier.
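To make the ensemble idea concrete, below is a minimal sketch in Python of one way such a scheme can be realized; it is an illustration under stated assumptions, not the authors' exact algorithm. For each pair of classes it numerically maximizes the Chernoff distance between the two projected class-conditional Gaussians to obtain a one-dimensional reduction, classifies each pair in its own subspace with a two-class Gaussian Bayes rule, and combines the pairwise decisions by majority Voting (the Weighting and Decision Tree combiners are omitted). The function names, the equal-prior assumption, and the use of general-purpose numerical optimization (rather than any closed-form solution the paper may employ) are all assumptions of this sketch.

import numpy as np
from itertools import combinations
from scipy.optimize import minimize

def chernoff_1d(w, m1, S1, m2, S2, alpha=0.5):
    # Chernoff distance between two Gaussians projected onto direction w.
    w = w / np.linalg.norm(w)
    d = w @ (m1 - m2)
    v1, v2 = w @ S1 @ w, w @ S2 @ w
    v = alpha * v1 + (1.0 - alpha) * v2
    return (alpha * (1.0 - alpha) * d**2 / (2.0 * v)
            + 0.5 * np.log(v / (v1**alpha * v2**(1.0 - alpha))))

def pairwise_direction(X1, X2):
    # One-dimensional reduction maximizing the Chernoff distance for one
    # class pair, started from the Fisher direction and refined numerically.
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    Sw = 0.5 * (S1 + S2) + 1e-6 * np.eye(X1.shape[1])
    w0 = np.linalg.solve(Sw, m1 - m2)  # Fisher initialization
    res = minimize(lambda w: -chernoff_1d(w, m1, S1, m2, S2), w0)
    return res.x / np.linalg.norm(res.x)

def fit_ensemble(X, y):
    # One Chernoff-based 1-D reduction, plus the projected Gaussian
    # parameters, per class pair.
    classes = np.unique(y)
    models = {}
    for i, j in combinations(range(len(classes)), 2):
        Xi, Xj = X[y == classes[i]], X[y == classes[j]]
        w = pairwise_direction(Xi, Xj)
        zi, zj = Xi @ w, Xj @ w
        models[(i, j)] = (w, (zi.mean(), zi.var()), (zj.mean(), zj.var()))
    return classes, models

def predict(x, models, classes):
    # Two-class Gaussian Bayes rule (equal priors assumed) in each pairwise
    # subspace; the pairwise winners are combined by majority voting.
    def loglik(z, m, v):
        return -0.5 * np.log(2.0 * np.pi * v) - (z - m)**2 / (2.0 * v)
    votes = np.zeros(len(classes), dtype=int)
    for (i, j), (w, gi, gj) in models.items():
        z = x @ w
        votes[i if loglik(z, *gi) >= loglik(z, *gj) else j] += 1
    return classes[np.argmax(votes)]

For instance, given training data X_train, y_train and test points X_test, one would call classes, models = fit_ensemble(X_train, y_train) and then predict each test point via predict(x, models, classes). Ties in the vote are broken here in favor of the lowest class index, one of several reasonable choices.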