Discriminant Subspace Analysis: A Fukunaga-Koontz Approach

  • Authors:
  • Sheng Zhang; Terence Sim


  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 2007


Abstract

The Fisher Linear Discriminant (FLD) is commonly used in pattern recognition. It finds a linear subspace that maximally separates class patterns according to the Fisher Criterion. Several methods of computing the FLD have been proposed in the literature, most of which require the calculation of the so-called scatter matrices. In this paper, we bring a fresh perspective to FLD via the Fukunaga-Koontz Transform (FKT). We do this by decomposing the whole data space into four subspaces with different discriminability, as measured by eigenvalue ratios. By connecting the eigenvalue ratio with the generalized eigenvalue, we show where the Fisher Criterion is maximally satisfied. We prove the relationship between FLD and FKT analytically, and propose a unified framework for understanding some existing work. Furthermore, we extend our theory to Multiple Discriminant Analysis (MDA). This is done by transforming the data into intra- and extra-class spaces, followed by maximizing the Bhattacharyya distance. Based on our FKT analysis, we identify the discriminant subspaces of MDA/FKT and propose an efficient algorithm, which works even when the scatter matrices are singular or too large to be formed. Our method is general and may be applied to different pattern recognition problems. We validate our method by experimenting on synthetic and real data.
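To make the FKT decomposition sketched in the abstract concrete, the snippet below is a minimal NumPy illustration, not the authors' implementation. It assumes two class scatter matrices S1 and S2 are already available; the function name and the eps threshold are illustrative choices. The idea is standard FKT: whiten the summed scatter S1 + S2, then diagonalize the whitened S1, so that S1 and S2 share eigenvectors whose eigenvalues sum to 1.

```python
import numpy as np

def fukunaga_koontz(S1, S2, eps=1e-10):
    """Illustrative Fukunaga-Koontz Transform of two scatter matrices.

    After whitening S = S1 + S2, the whitened S1 and S2 share eigenvectors,
    and for each direction their eigenvalues sum to 1. Directions whose
    eigenvalue is near 1 (or 0) are dominated by one class and are the
    most discriminative; eigenvalues near 0.5 indicate shared structure.
    """
    S = S1 + S2
    # Eigendecomposition of the summed scatter; keep only its range
    # so the method still works when S is singular.
    d, U = np.linalg.eigh(S)
    keep = d > eps
    P = U[:, keep] / np.sqrt(d[keep])      # whitening transform
    # Whitened class-1 scatter; its eigenvalues lie in [0, 1].
    lam, V = np.linalg.eigh(P.T @ S1 @ P)
    # Columns of W are the FKT directions expressed in the original space.
    W = P @ V
    return W, lam                           # whitened S2 has eigenvalues 1 - lam
```

As a usage sketch, one would sort the columns of W by how far each eigenvalue lam lies from 0.5 and keep the extremes as the discriminant subspace. The eigenvalue ratio lam / (1 - lam) plays the role of the generalized eigenvalue referred to in the abstract, which is where the connection to the Fisher Criterion is made.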