A theoretical comparison of two linear dimensionality reduction techniques

  • Authors:
  • Luis Rueda; Myriam Herrera

  • Affiliations:
  • Department of Computer Science and Center for Biotechnology, University of Concepción, Concepción, Chile; Department and Institute of Informatics, National University of San Juan, San Juan, Argentina

  • Venue:
  • CIARP'06 Proceedings of the 11th Iberoamerican conference on Progress in Pattern Recognition, Image Analysis and Applications
  • Year:
  • 2006

Abstract

A theoretical analysis comparing two linear dimensionality reduction (LDR) techniques, namely Fisher's discriminant (FD) and Loog-Duin (LD) dimensionality reduction, is presented. The necessary and sufficient conditions under which FD and LD yield the same linear transformation are stated and proved. To derive these conditions, it is first shown that the two criteria attain the same maximum value after a diagonalization process is applied; the necessary and sufficient conditions are then derived for several cases, including coincident covariance matrices, coincident prior probabilities, and the case in which one of the covariance matrices is the identity. A measure for comparing the two criteria is derived from these conditions and used to show empirically that the conditions are statistically related to the classification error of a post-processing quadratic classifier and to the Chernoff distance in the transformed space.
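
To make the two criteria concrete, the following is a minimal sketch (not the authors' code) of the two-class versions of the FD and LD transformations. It assumes given class means m1, m2, covariances S1, S2, and priors p1, p2, and uses the standard trace-ratio formulations: FD maximizes tr[(A S_W Aᵀ)⁻¹ A S_B Aᵀ], while LD replaces S_B with a Chernoff-based matrix that also accounts for differences between the class covariances; consult the paper for the exact formulation used there.

```python
# Sketch of two-class FD and LD (Chernoff) linear dimensionality reduction.
# Assumptions: m1, m2 are class means; S1, S2 class covariances; p1, p2 priors.
import numpy as np
from scipy.linalg import eigh, sqrtm, logm, inv

def fisher_lda(m1, m2, S1, S2, p1, p2, d):
    """FD: maximize tr[(A S_W A^T)^{-1} A S_B A^T] over d x n matrices A."""
    S_W = p1 * S1 + p2 * S2                        # within-class scatter
    diff = (m1 - m2).reshape(-1, 1)
    S_B = p1 * p2 * diff @ diff.T                  # between-class scatter (2 classes)
    # Maximizer spanned by the top-d generalized eigenvectors of (S_B, S_W).
    _, V = eigh(S_B, S_W)
    return V[:, ::-1][:, :d].T

def loog_duin_lda(m1, m2, S1, S2, p1, p2, d):
    """LD: same trace-ratio form, with S_B replaced by a Chernoff-based matrix."""
    S_W = p1 * S1 + p2 * S2
    diff = (m1 - m2).reshape(-1, 1)
    S_B = p1 * p2 * diff @ diff.T
    W_half = np.real(sqrtm(S_W))
    W_ihalf = inv(W_half)
    inner = (W_ihalf @ S_B @ W_ihalf
             + (1.0 / (p1 * p2)) * (logm(W_ihalf @ S_W @ W_ihalf)
                                    - p1 * logm(W_ihalf @ S1 @ W_ihalf)
                                    - p2 * logm(W_ihalf @ S2 @ W_ihalf)))
    S_LD = W_half @ np.real(inner) @ W_half        # Chernoff "directed distance" matrix
    S_LD = (S_LD + S_LD.T) / 2.0                   # symmetrize against round-off
    _, V = eigh(S_LD, S_W)
    return V[:, ::-1][:, :d].T

# Toy usage: with coincident covariances the Chernoff terms vanish and the two
# transformations coincide, one of the cases the paper characterizes.
m1, m2 = np.zeros(3), np.array([2.0, 1.0, 0.5])
S = np.diag([1.0, 2.0, 0.5])
A_fd = fisher_lda(m1, m2, S, S, 0.5, 0.5, d=1)
A_ld = loog_duin_lda(m1, m2, S, S, 0.5, 0.5, d=1)
```

In both functions the criterion reduces to a generalized eigenvalue problem against S_W, which is why the comparison in the paper hinges on when the two scatter-like matrices induce the same leading eigenvectors.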