A Gradient Linear Discriminant Analysis for Small Sample Sized Problem

  • Authors:
  • Alok Sharma; Kuldip K. Paliwal

  • Affiliations:
  • Signal Processing Lab, Griffith University, Brisbane, Australia; Signal Processing Lab, Griffith University, Brisbane, Australia

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2008

Abstract

The purpose of conventional linear discriminant analysis (LDA) is to find an orientation that projects high-dimensional feature vectors of different classes onto a more manageable low-dimensional space in the most discriminative way for classification. The LDA technique uses an eigenvalue decomposition (EVD) to find such an orientation, and this computation is usually adversely affected by the small sample size problem. In this paper we present a new direct LDA method (called gradient LDA) for computing the orientation, designed especially for the small sample size problem; a gradient descent based method is used for this purpose. It also avoids discarding the null space of the within-class scatter matrix and the between-class scatter matrix, which may contain discriminative information useful for classification.
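
The abstract gives no algorithmic details, so the sketch below is only a rough illustration of the general idea it describes: computing a discriminant orientation by gradient ascent on the Fisher criterion J(w) = (wᵀS_b w)/(wᵀS_w w) instead of by eigenvalue decomposition. It is not the authors' gradient LDA algorithm; the function names (e.g. gradient_lda_direction), learning rate, and iteration count are hypothetical choices made for this example.

```python
import numpy as np

def fisher_criterion_gradient(w, Sb, Sw):
    """Gradient of J(w) = (w' Sb w) / (w' Sw w) with respect to w."""
    num = w @ Sb @ w
    den = w @ Sw @ w
    # Quotient rule for the Rayleigh-quotient-like ratio
    return (2.0 / den) * (Sb @ w - (num / den) * (Sw @ w))

def gradient_lda_direction(X, y, lr=1e-3, n_iter=5000, seed=0):
    """Find one discriminant direction by gradient ascent on the Fisher
    criterion, rather than by eigenvalue decomposition (illustrative only)."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    rng = np.random.default_rng(seed)
    w = rng.normal(size=d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        w += lr * fisher_criterion_gradient(w, Sb, Sw)
        w /= np.linalg.norm(w)  # keep the direction unit length
    return w
```

A caller would pass a data matrix X of shape (n_samples, n_features) and a label vector y, and project samples with X @ w. Note that this plain gradient ascent does not by itself address the singularity of S_w in small sample size settings, which is the specific issue the paper's gradient LDA method targets.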