A theoretical comparison of two-class Fisher's and heteroscedastic linear dimensionality reduction schemes

  • Authors:
  • Luis Rueda
  • Myriam Herrera

  • Affiliations:
  • Member of the IEEE, School of Computer Science, University of Windsor, 401 Sunset Avenue, Windsor, ON, Canada N9B 3P4
  • Department and Institute of Informatics, National University of San Juan, Cereceto y Meglioli, San Juan 5400, Argentina

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2008


Abstract

We present a theoretical analysis comparing two linear dimensionality reduction (LDR) techniques for two classes: Fisher's discriminant (FD), a homoscedastic scheme, and the Loog-Duin (LD) criterion, a heteroscedastic scheme. We formalize the necessary and sufficient conditions under which the FD and LD criteria are maximized by the same linear transformation. To derive these conditions, we first show that the two criteria preserve the same maximum values after a diagonalization process is applied. We then derive the necessary and sufficient conditions for several cases, including coincident covariance matrices, coincident prior probabilities, and the case in which one of the covariance matrices is the identity. We show empirically that these conditions are statistically related to the classification error of a post-processing one-dimensional quadratic classifier and to the Chernoff distance in the transformed space.
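As a rough illustration of the two-class setting discussed in the abstract, the sketch below computes the one-dimensional Fisher's discriminant projection and then evaluates the Chernoff distance between the two projected classes. The synthetic data, the equal priors, and all variable names are illustrative assumptions, not taken from the paper; the LD transformation itself (which maximizes the Chernoff criterion in the original space) is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic Gaussian classes (illustrative data, not from the paper).
X1 = rng.multivariate_normal([0.0, 0.0], [[2.0, 0.5], [0.5, 1.0]], size=200)
X2 = rng.multivariate_normal([3.0, 2.0], [[1.0, -0.3], [-0.3, 1.5]], size=200)

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
S1 = np.cov(X1, rowvar=False)
S2 = np.cov(X2, rowvar=False)

# Fisher's discriminant (homoscedastic LDR) for two classes:
# w is proportional to S_W^{-1} (m1 - m2), with S_W the pooled
# within-class scatter weighted by the class priors.
p1 = p2 = 0.5                       # equal priors (assumption)
S_W = p1 * S1 + p2 * S2
w = np.linalg.solve(S_W, m1 - m2)
w /= np.linalg.norm(w)

# Project both classes onto the resulting one-dimensional subspace.
y1, y2 = X1 @ w, X2 @ w

# Chernoff distance between the projected 1-D Gaussians, with the
# Chernoff parameter set to the prior p1 (as in the LD criterion).
alpha = p1
v1, v2 = y1.var(), y2.var()
va = alpha * v1 + (1 - alpha) * v2
d = y1.mean() - y2.mean()
chernoff = (alpha * (1 - alpha) * d**2 / (2.0 * va)
            + 0.5 * np.log(va / (v1**alpha * v2**(1 - alpha))))
```

A post-processing one-dimensional quadratic classifier, as mentioned in the abstract, would then be fit on the projected samples `y1` and `y2`; the Chernoff distance above is one way to gauge how separable the classes remain after the projection.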