An incremental dimensionality reduction method on discriminant information for pattern classification

  • Authors:
  • Xiaoqin Hu; Zhixia Yang; Ling Jing

  • Affiliations:
  • Xiaoqin Hu: College of Science, China Agricultural University, 100083 Beijing, PR China
  • Zhixia Yang: College of Mathematics and Systems Science, Xinjiang University, 830046 Urumqi, PR China, and Academy of Mathematics and Systems Science, CAS, 100190 Beijing, PR China
  • Ling Jing: College of Science, China Agricultural University, 100083 Beijing, PR China

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2009

Abstract

From the viewpoint of classification, linear discriminant analysis (LDA) is a well-suited dimensionality reduction method: it finds an optimal linear transformation that maximizes class separability. However, LDA is difficult to apply in undersampled problems, where the number of data samples is smaller than the dimensionality of the data space, because the high dimensionality makes the scatter matrices singular. To make LDA applicable, we propose a new dimensionality reduction algorithm called discriminant multidimensional mapping (DMM), which combines the advantages of multidimensional scaling (MDS) and LDA. DMM is effective for small, high-dimensional datasets, and its superiority is established from a theoretical point of view. We then extend DMM to large datasets and to datasets with non-linear manifold structure, obtaining two further algorithms: landmark DMM (LDMM) and geodesic-metric discriminant mapping (GDM). The performance of these algorithms is also demonstrated by preliminary numerical experiments.
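The singularity issue that motivates the paper can be seen directly: when the number of samples n is below the dimensionality d, the within-class scatter matrix cannot have full rank, so classical LDA (which inverts it) breaks down. A minimal sketch of this, using synthetic random data (the sample sizes and dimensions here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Undersampled setting: fewer samples (n) than dimensions (d).
n, d = 10, 50
X = rng.standard_normal((n, d))
y = np.array([0] * 5 + [1] * 5)  # two classes, five samples each

# Within-class scatter matrix S_w = sum over classes of (X_c - mu_c)^T (X_c - mu_c).
S_w = np.zeros((d, d))
for c in np.unique(y):
    Xc = X[y == c]
    diff = Xc - Xc.mean(axis=0)
    S_w += diff.T @ diff

# rank(S_w) is at most n - (number of classes) = 8, far below d = 50,
# so S_w is singular and cannot be inverted as classical LDA requires.
print(np.linalg.matrix_rank(S_w), "<", d)
```

Because centering removes one degree of freedom per class, the rank bound n − c holds for any data, which is why every undersampled problem hits this singularity regardless of the particular samples.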