A new approach to multi-class linear dimensionality reduction
CIARP'06 Proceedings of the 11th Iberoamerican conference on Progress in Pattern Recognition, Image Analysis and Applications
A new linear dimensionality reduction (LDR) technique for pattern classification and machine learning is presented which, although linear, aims at maximizing the Chernoff distance in the transformed space. The corresponding two-class criterion, maximized via a gradient-based algorithm, is presented, and initialization procedures are also discussed. Empirical results on synthetic and real-life data, combining this and traditional LDR approaches with two well-known classifiers (linear and quadratic), show that the proposed criterion outperforms the traditional schemes.
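The two-class idea can be sketched as follows: project the class means and covariances through a linear transform A, evaluate the Chernoff distance between the two projected Gaussians, and adjust A to maximize it. This is a minimal illustrative sketch, not the paper's algorithm: the function names are invented here, the Chernoff parameter is fixed at beta = 0.5, and SciPy's general-purpose BFGS optimizer stands in for the paper's specific gradient procedure and initialization schemes.

```python
import numpy as np
from scipy.optimize import minimize


def chernoff_distance(A, mu1, mu2, S1, S2, beta=0.5):
    """Chernoff distance between two Gaussians after projection y = A x."""
    A = np.atleast_2d(A)
    m1, m2 = A @ mu1, A @ mu2                 # projected class means
    P1, P2 = A @ S1 @ A.T, A @ S2 @ A.T       # projected class covariances
    Pm = beta * P1 + (1.0 - beta) * P2        # beta-weighted covariance mixture
    diff = m1 - m2
    # Mean-separation term of the Chernoff distance
    term1 = 0.5 * beta * (1.0 - beta) * diff @ np.linalg.solve(Pm, diff)
    # Covariance-mismatch term (log-determinant ratio), via slogdet for stability
    _, logdet_m = np.linalg.slogdet(Pm)
    _, logdet1 = np.linalg.slogdet(P1)
    _, logdet2 = np.linalg.slogdet(P2)
    term2 = 0.5 * (logdet_m - beta * logdet1 - (1.0 - beta) * logdet2)
    return term1 + term2


def fit_chernoff_ldr(mu1, mu2, S1, S2, p=1, beta=0.5, seed=0):
    """Find a p x d transform maximizing the projected Chernoff distance.

    Hypothetical helper: uses a random start and numerical BFGS rather than
    the paper's analytic gradient and initialization procedures.
    """
    d = mu1.shape[0]
    A0 = np.random.default_rng(seed).standard_normal((p, d))
    neg_obj = lambda a: -chernoff_distance(a.reshape(p, d), mu1, mu2, S1, S2, beta)
    res = minimize(neg_obj, A0.ravel(), method="BFGS")
    return res.x.reshape(p, d)
```

As a sanity check, for two spherical Gaussians with equal covariance the best one-dimensional projection is the line through the means, where the projected Chernoff distance equals the full-space value.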