Kernel Uncorrelated and Regularized Discriminant Analysis: A Theoretical and Computational Study

  • Authors:
  • Shuiwang Ji; Jieping Ye

  • Affiliations:
  • Arizona State University, Tempe; Arizona State University, Tempe

  • Venue:
  • IEEE Transactions on Knowledge and Data Engineering
  • Year:
  • 2008


Abstract

Linear and kernel discriminant analysis are popular approaches for supervised dimensionality reduction. Uncorrelated and regularized discriminant analysis have been proposed to overcome the singularity problem encountered by classical discriminant analysis. In this paper, we study the properties of kernel uncorrelated and regularized discriminant analysis, called KUDA and KRDA, respectively. In particular, we show that, under a mild condition, both linear and kernel uncorrelated discriminant analysis project all samples in the same class to a common vector in the dimensionality-reduced space. This implies that uncorrelated discriminant analysis may suffer from overfitting when each class contains a large number of samples. We then show that, as the regularization parameter in KRDA tends to zero, KRDA approaches KUDA. Thus KUDA is a limiting case of KRDA, and regularization can be applied to overcome the overfitting problem in uncorrelated discriminant analysis. Since the performance of KRDA depends on the value of the regularization parameter, we show that the matrix computations involved in KRDA can be simplified so that a large number of candidate values can be cross-validated efficiently. Finally, we conduct experiments to evaluate the proposed theories and algorithms.
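
To make the construction concrete, the following is a minimal NumPy sketch of kernel discriminant analysis with ridge regularization of the total scatter, assuming an RBF kernel. It follows the standard kernel discriminant formulation rather than the paper's exact derivation; the names rbf_kernel, krda_fit, lam, and gamma are illustrative, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def krda_fit(X, y, lam=1e-2, gamma=1.0):
    """Sketch of a KRDA-style transform: solve the regularized kernel
    discriminant eigenproblem (T + lam*I)^{-1} M alpha = rho * alpha,
    where M and T are the between-class and total scatter matrices
    expressed through the centered kernel matrix."""
    n = X.shape[0]
    classes = np.unique(y)
    K = rbf_kernel(X, X, gamma)
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = J @ K @ J
    # W is block-diagonal with 1/n_j in the block of class j, so that
    # Kc @ W @ Kc is the between-class scatter in kernel coordinates.
    W = np.zeros((n, n))
    for c in classes:
        idx = (y == c)
        W[np.ix_(idx, idx)] = 1.0 / idx.sum()
    M = Kc @ W @ Kc          # between-class scatter
    T = Kc @ Kc              # total scatter
    # Ridge regularization: lam > 0 keeps the system nonsingular;
    # lam -> 0 recovers the unregularized (KUDA-like) solution.
    evals, evecs = np.linalg.eig(np.linalg.solve(T + lam * np.eye(n), M))
    top = np.argsort(-evals.real)[: len(classes) - 1]
    return evecs[:, top].real             # expansion coefficients

def krda_transform(A, X_train, X_new, gamma=1.0):
    """Project new samples as rows of K(X_new, X_train) @ A.
    (Kernel centering is omitted here for brevity.)"""
    return rbf_kernel(X_new, X_train, gamma) @ A
```

Note that lam enters only through T + lam*I. A single symmetric eigendecomposition T = U diag(s) U^T therefore lets (T + lam*I)^{-1} be formed cheaply as U diag(1/(s + lam)) U^T for every candidate lam, so cross-validation over many regularization values costs little more than one decomposition. This reuse is in the spirit of the matrix-computation simplification the abstract describes, though the paper's exact derivation may differ.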