A New and Fast Orthogonal Linear Discriminant Analysis on Undersampled Problems

  • Authors:
  • Delin Chu and Siong Thye Goh

  • Affiliations:
  • National University of Singapore (matchudl@nus.edu.sg and g0700501@nus.edu.sg)

  • Venue:
  • SIAM Journal on Scientific Computing
  • Year:
  • 2010

Abstract

Dimensionality reduction has become a ubiquitous preprocessing step in many applications. Linear discriminant analysis (LDA) is known to be one of the most effective dimensionality reduction methods for classification. However, a main disadvantage of classical LDA is that it requires the so-called total scatter matrix to be nonsingular. In many applications the scatter matrices are singular because the data points come from a very high-dimensional space, so the number of data samples is typically smaller than the data dimension; this is known as the undersampled problem. Many generalized LDA methods have been proposed to overcome this singularity problem. These methods share a commonality: they compute the optimal linear transformation via eigendecompositions and matrix inversions. Eigendecomposition is computationally expensive, however, and the involvement of matrix inverses can render a method numerically unstable when the associated matrices are ill-conditioned. Hence, many existing LDA methods have high computational cost and potential numerical instability problems. In this paper we present a new orthogonal LDA method for the undersampled problem. The main features of our proposed LDA method are the following: (i) the optimal transformation matrix is obtained easily using only orthogonal transformations, without computing any eigendecomposition or matrix inverse, so our LDA method is inverse-free and numerically stable; (ii) our LDA method is implemented using several QR factorizations and is therefore fast. The effectiveness of the new method is illustrated on several real-world data sets.
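
The abstract's central idea, an inverse-free, QR-only construction of an orthonormal transformation for undersampled data, can be illustrated with a short sketch. The code below is an illustration in the spirit of the paper, not the authors' exact algorithm: it forms the between-class and total scatter factors H_b and H_t, takes a pivoted (rank-revealing) QR of H_t to get an orthonormal basis of range(S_t), and then a second QR of the projected H_b to extract orthonormal discriminant directions. The function name `orthogonal_lda_sketch` and the tolerance-based rank estimates are hypothetical choices.

```python
import numpy as np
from scipy.linalg import qr

def orthogonal_lda_sketch(X, y, tol=1e-10):
    """Illustrative inverse-free orthogonal LDA for undersampled data.

    X : (n_samples, n_features) array with n_samples << n_features
    y : (n_samples,) array of class labels
    Returns G with orthonormal columns; project data with X @ G.

    NOTE: a sketch in the spirit of the paper (QR factorizations only,
    no eigendecomposition, no matrix inverse), not its exact algorithm.
    """
    classes = np.unique(y)
    c = X.mean(axis=0)                            # global centroid

    # Between-class factor: columns sqrt(n_i) * (c_i - c), so S_b = Hb @ Hb.T.
    Hb = np.column_stack([
        np.sqrt(np.sum(y == k)) * (X[y == k].mean(axis=0) - c)
        for k in classes
    ])
    # Total scatter factor: columns x_j - c, so S_t = Ht @ Ht.T.
    Ht = (X - c).T                                # (n_features, n_samples)

    # Step 1: orthonormal basis of range(S_t) via a pivoted QR of Ht --
    # no inverse and no eigendecomposition is needed.
    Q1, R1, _ = qr(Ht, mode='economic', pivoting=True)
    r = int(np.sum(np.abs(np.diag(R1)) > tol * abs(R1[0, 0])))
    Q1 = Q1[:, :r]

    # Step 2: project Hb into that subspace and take a second pivoted QR
    # to obtain orthonormal discriminant directions within range(S_t).
    B = Q1.T @ Hb                                 # (r, n_classes)
    Q2, R2, _ = qr(B, mode='economic', pivoting=True)
    q = int(np.sum(np.abs(np.diag(R2)) > tol * abs(R2[0, 0])))

    # Columns of G are orthonormal: G.T @ G = I.
    return Q1 @ Q2[:, :q]                         # (n_features, q)
```

Usage under these assumptions: `G = orthogonal_lda_sketch(X, y)` followed by `X_reduced = X @ G`. Because both factorizations are QR based, the whole pipeline stays orthogonal and inverse-free, which is the numerical-stability property the abstract emphasizes.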