On the performance of Chernoff-distance-based linear dimensionality reduction techniques

  • Authors:
  • Mohammed Liakat Ali; Luis Rueda; Myriam Herrera

  • Affiliations:
  • School of Computer Science, University of Windsor, Windsor, ON, Canada; Department of Computer Science, University of Concepción, Concepción, Chile; Institute of Informatics, National University of San Juan, Cereceto y Meglioli, San Juan, Argentina

  • Venue:
  • AI'06: Proceedings of the 19th International Conference on Advances in Artificial Intelligence (Canadian Society for Computational Studies of Intelligence)
  • Year:
  • 2006


Abstract

We present a performance analysis of three linear dimensionality reduction techniques: Fisher's discriminant analysis (FDA) and two recently introduced methods based on the Chernoff distance between two distributions, namely the Loog and Duin (LD) method, which maximizes a criterion derived from the Chernoff distance in the original space, and the method of Rueda and Herrera (RH), which maximizes the Chernoff distance in the transformed space. A comprehensive evaluation of these methods combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data shows that LD and RH outperform FDA, especially with the quadratic classifier, which is strongly related to the Chernoff distance in the transformed space. For the linear classifier, RH is also shown to outperform the other two methods.
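
For context, the Chernoff distance referenced in the abstract has a well-known closed form for two Gaussian class-conditional densities N(μ₁, Σ₁) and N(μ₂, Σ₂) with a mixing parameter β ∈ (0, 1); the notation below is generic and not taken from the paper itself:

```latex
% Chernoff distance between two Gaussian distributions
% N(mu_1, Sigma_1) and N(mu_2, Sigma_2), with beta in (0, 1).
k(\beta) = \frac{\beta(1-\beta)}{2}\,
           (\mu_2 - \mu_1)^{\top}
           \left[\beta\Sigma_1 + (1-\beta)\Sigma_2\right]^{-1}
           (\mu_2 - \mu_1)
         + \frac{1}{2}\,
           \ln\frac{\left|\beta\Sigma_1 + (1-\beta)\Sigma_2\right|}
                   {\left|\Sigma_1\right|^{\beta}\,\left|\Sigma_2\right|^{1-\beta}}
```

In this form, the second (log-determinant) term captures differences between class covariances; FDA's criterion assumes equal covariances and so discards that information, which is one reason the Chernoff-based criteria of LD and RH can separate heteroscedastic classes more effectively.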