Approximate information discriminant analysis: A computationally simple heteroscedastic feature extraction technique

  • Authors:
  • Koel Das; Zoran Nenadic

  • Affiliations:
  • Department of Electrical Engineering and Computer Science, University of California, Irvine, CA 92697, USA; Department of Biomedical Engineering, University of California, Irvine, CA 92697, USA

  • Venue:
  • Pattern Recognition
  • Year:
  • 2008

Abstract

In this article we develop a novel linear dimensionality reduction technique for classification. The technique utilizes the first two statistical moments of the data and retains the computational simplicity characteristic of second-order techniques such as linear discriminant analysis. Formally, the technique maximizes a criterion that belongs to the class of probability dependence measures and is naturally defined for multiple classes. The criterion is based on an approximation of an information-theoretic measure and is capable of handling heteroscedastic data. The performance of our method, along with that of similar feature extraction approaches, is demonstrated through experimental results on real-world datasets. Our method compares favorably to similar second-order linear dimensionality reduction techniques.
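
To make the setting concrete, the sketch below illustrates the general idea of heteroscedastic, second-order feature extraction: both class means and class covariances enter the criterion, and a projection is found by numerically maximizing it. The objective used here is the Bhattacharyya distance between two Gaussian class models, a common information-theoretic stand-in; it is not the paper's approximate-information criterion, and the function names (`bhattacharyya_1d`, `extract_direction`) are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def bhattacharyya_1d(w, m1, S1, m2, S2):
    """Bhattacharyya distance between two Gaussians projected onto direction w.

    Uses only first and second moments (means and covariances), so it is a
    second-order criterion that remains sensitive to unequal class covariances.
    """
    w = w / np.linalg.norm(w)
    mu1, mu2 = w @ m1, w @ m2
    s1, s2 = w @ S1 @ w, w @ S2 @ w          # projected class variances
    s = 0.5 * (s1 + s2)
    return 0.125 * (mu1 - mu2) ** 2 / s + 0.5 * np.log(s / np.sqrt(s1 * s2))

def extract_direction(X, y):
    """Find a 1-D projection maximizing the Bhattacharyya distance (two classes)."""
    X1, X2 = X[y == 0], X[y == 1]
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    w0 = m1 - m2                              # LDA-like initial guess
    res = minimize(lambda w: -bhattacharyya_1d(w, m1, S1, m2, S2), w0)
    return res.x / np.linalg.norm(res.x)

# Example: two heteroscedastic Gaussian classes in 2-D (illustrative data only)
rng = np.random.default_rng(0)
X = np.vstack([rng.multivariate_normal([0, 0], [[1, 0], [0, 5]], 200),
               rng.multivariate_normal([1, 0], [[5, 0], [0, 1]], 200)])
y = np.repeat([0, 1], 200)
w = extract_direction(X, y)                   # discriminative direction
```

Extending this sketch toward the multi-class, multi-dimensional setting the abstract describes would replace the scalar objective with a criterion defined over all classes and optimize over a projection matrix rather than a single direction.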