Robust linearly optimized discriminant analysis

  • Authors:
  • Zhao Zhang; Tommy W. S. Chow

  • Affiliations:
  • Department of Electronic Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2012

Abstract

Supervised Fisher Linear Discriminant Analysis (LDA) is a classical dimensionality reduction approach. LDA assumes each class has a Gaussian density and may suffer from the singularity problem when handling high-dimensional data. In this work we consider more general class densities and show, through a geometrical interpretation, that optimizing the LDA criterion does not always achieve maximum class discrimination. By defining new marginal inter- and intra-class scatters, we develop a pairwise-criterion-based optimized LDA technique called robust linearly optimized discriminant analysis (LODA). A multimodal extension of LODA is also presented. Two effective solution schemes are proposed for extracting the informative features, and kernelized extensions of our methods are also detailed. Compared with LDA, LODA has four significant advantages. First, LODA requires no assumption on the intra-class distributions. Second, LODA characterizes inter-class separability with a marginal criterion. Third, LODA avoids the singularity problem and is robust to outliers. Fourth, the projection matrix delivered by LODA is orthogonal. These properties make LODA more general and more suitable for discriminant analysis than LDA. Results on the investigated cases demonstrate that our methods are highly competitive with, and even outperform, several widely used state-of-the-art techniques.
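For context, here is a minimal NumPy sketch of the classical Fisher LDA baseline that the abstract contrasts against (not the LODA method itself, whose construction is not given in the abstract). The small ridge term `reg` is our own assumption, added because the within-class scatter matrix becomes singular when the feature dimension exceeds the sample count, which is the singularity problem the abstract mentions.

```python
import numpy as np

def fisher_lda(X, y, n_components=1, reg=1e-6):
    """Classical Fisher LDA: find directions maximizing the ratio of
    between-class to within-class scatter.

    `reg` adds a small Tikhonov term to the within-class scatter S_w so the
    solve stays well-posed when S_w is rank-deficient (high-dimensional data).
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))  # within-class scatter
    S_b = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem S_b w = lambda S_w w, solved via S_w^{-1} S_b
    evals, evecs = np.linalg.eig(np.linalg.solve(S_w + reg * np.eye(d), S_b))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]]

# Toy example: two Gaussian classes separated along the first coordinate
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
               rng.normal([4.0, 0.0, 0.0], 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
W = fisher_lda(X, y)
# The leading discriminant direction should be dominated by coordinate 0
assert abs(W[0, 0]) > abs(W[1, 0]) and abs(W[0, 0]) > abs(W[2, 0])
```

Note that this eigenvector-based solution generally yields a non-orthogonal projection matrix when several components are kept, which is one of the shortcomings of LDA that the abstract says LODA addresses.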