Non-iterative Heteroscedastic Linear Dimension Reduction for Two-Class Data. Proceedings of the Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition.
Linear Dimensionality Reduction via a Heteroscedastic Extension of LDA: The Chernoff Criterion. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Pattern Recognition, Third Edition.
Rapid and brief communication: Why direct LDA is not equivalent to LDA. Pattern Recognition.
We present a theoretical analysis comparing two linear dimensionality reduction (LDR) techniques for two-class data: a homoscedastic LDR scheme, Fisher's discriminant (FD), and a heteroscedastic LDR scheme, Loog-Duin (LD). We formalize the necessary and sufficient conditions under which the FD and LD criteria are maximized by the same linear transformation. To derive these conditions, we first show that the two criteria preserve the same maximum values after a diagonalization process is applied. We then derive the necessary and sufficient conditions for several cases, including coincident covariance matrices, coincident prior probabilities, and the case in which one of the covariance matrices is the identity. We show empirically that these conditions are statistically related to the classification error of a quadratic classifier applied in the one-dimensional transformed space, and to the Chernoff distance in that space.
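The contrast between the two criteria can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it computes Fisher's discriminant direction w ∝ S_W⁻¹(m₁ − m₂) for a two-class Gaussian problem, and then finds, by a coarse grid search over directions in 2-D, the projection maximizing the one-dimensional Chernoff distance (the quantity the heteroscedastic LD criterion targets). All class parameters (m1, m2, S1, S2, priors) are hypothetical values chosen for the example.

```python
import numpy as np

# Hypothetical two-class Gaussian parameters (assumed for illustration only).
m1, m2 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
S1 = np.array([[1.0, 0.0], [0.0, 1.0]])
S2 = np.array([[3.0, 0.5], [0.5, 0.5]])   # heteroscedastic case: S1 != S2
p1, p2 = 0.5, 0.5                          # coincident priors

def fisher_direction(m1, m2, S1, S2, p1, p2):
    """Fisher's discriminant: w proportional to S_W^{-1}(m1 - m2)."""
    Sw = p1 * S1 + p2 * S2                 # within-class (pooled) covariance
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

def chernoff_1d(w, m1, m2, S1, S2, s=0.5):
    """Chernoff distance between the two class densities projected onto w."""
    d = w @ (m1 - m2)
    v1, v2 = w @ S1 @ w, w @ S2 @ w        # projected variances
    vm = s * v1 + (1 - s) * v2
    return (s * (1 - s) * d**2 / (2 * vm)
            + 0.5 * np.log(vm / (v1**s * v2**(1 - s))))

w_fd = fisher_direction(m1, m2, S1, S2, p1, p2)

# Coarse search over 1-D projection directions (feasible only in this 2-D toy case).
angles = np.linspace(0.0, np.pi, 1801)
dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
scores = [chernoff_1d(w, m1, m2, S1, S2) for w in dirs]
w_ch = dirs[int(np.argmax(scores))]

print("Fisher direction:          ", w_fd)
print("Chernoff-optimal direction:", w_ch)
print("Chernoff distance (FD):    ", chernoff_1d(w_fd, m1, m2, S1, S2))
print("Chernoff distance (opt):   ", chernoff_1d(w_ch, m1, m2, S1, S2))
```

When S1 = S2, the two directions coincide (one of the conditions analyzed above); with the unequal covariances used here, the Chernoff-maximizing direction generally attains a larger Chernoff distance than the Fisher direction.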