Kernel quadratic discriminant analysis for small sample size problem

  • Authors:
  • Jie Wang; K. N. Plataniotis; Juwei Lu; A. N. Venetsanopoulos

  • Affiliations:
  • Jie Wang and K. N. Plataniotis: Department of Electrical and Computer Engineering, University of Toronto, 10 King's College Road, Toronto, Canada M5A 3G4; Juwei Lu: Vidient Systems, Inc., 4000 Burton Dr., Santa Clara, CA 95054, USA; A. N. Venetsanopoulos: Ryerson University, 360 Victoria St, Toronto, Canada M5B 2K3

  • Venue:
  • Pattern Recognition
  • Year:
  • 2008

Abstract

It is generally believed that quadratic discriminant analysis (QDA) can fit the data in practical pattern recognition applications better than linear discriminant analysis (LDA). This is because QDA relaxes the assumption, made by LDA-based methods, that the covariance matrix of each class is identical. However, QDA still assumes that the class-conditional distribution is Gaussian, which is often not the case in real-world applications. In this paper, a novel kernel-based QDA method is proposed to further relax the Gaussian assumption by means of the kernel machine technique. The proposed method addresses complex pattern recognition problems by combining the QDA solution with the kernel machine technique, and at the same time tackles the so-called small sample size problem through a regularized estimation of the covariance matrix. Extensive experimental results indicate that the proposed method outperforms many traditional kernel-based learning algorithms.
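The abstract's two key ingredients, per-class covariances (QDA) and a regularized covariance estimate for the small-sample-size setting, can be illustrated with a minimal sketch. This is not the authors' kernel method; it is plain regularized QDA in input space, using the common shrinkage estimate Σ_reg = (1 − γ)Σ + γ·(tr(Σ)/d)·I, where the shrinkage weight γ is an assumed illustrative parameter:

```python
import numpy as np

def fit_qda(X, y, gamma=0.1):
    """Fit a Gaussian per class with a shrunk covariance estimate.

    Shrinkage toward a scaled identity keeps the covariance invertible
    when a class has fewer samples than dimensions (the small sample
    size problem mentioned in the abstract).
    """
    d = X.shape[1]
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        Sigma = np.cov(Xc, rowvar=False)
        # Regularized estimate: (1 - gamma) * Sigma + gamma * (tr(Sigma)/d) * I
        Sigma = (1 - gamma) * Sigma + gamma * (np.trace(Sigma) / d) * np.eye(d)
        params[c] = (mu,
                     np.linalg.inv(Sigma),
                     np.linalg.slogdet(Sigma)[1],
                     np.log(len(Xc) / len(X)))
    return params

def predict_qda(params, X):
    """Assign each row of X to the class maximizing the quadratic
    discriminant: -1/2 log|S_k| - 1/2 (x-mu_k)^T S_k^{-1} (x-mu_k) + log pi_k."""
    classes = list(params.keys())
    scores = []
    for c in classes:
        mu, Sinv, logdet, logprior = params[c]
        diff = X - mu
        maha = np.einsum('ij,jk,ik->i', diff, Sinv, diff)
        scores.append(-0.5 * logdet - 0.5 * maha + logprior)
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]
```

Unlike LDA, each class keeps its own (regularized) covariance, so the decision boundaries are quadratic; the paper goes further by applying the same idea in a kernel-induced feature space.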