A Fast Feature-based Dimension Reduction Algorithm for Kernel Classifiers

  • Authors:
  • Senjian An; Wanquan Liu; Svetha Venkatesh; Ronny Tjahyadi

  • Affiliation:
  • Department of Computing, Curtin University of Technology, Perth, Australia 6845 (all authors)

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2006


Abstract

This paper presents a novel dimension reduction algorithm for kernel-based classification. In the feature space, the proposed algorithm maximizes the ratio of the squared between-class distance to the sum of the within-class variances of the training samples for a given reduced dimension. This algorithm has lower complexity than the recently reported kernel dimension reduction (KDR) algorithm for supervised learning. We conducted several simulations with large training datasets, which demonstrate that the proposed algorithm performs comparably to, or marginally better than, KDR while being computationally more efficient. Further, we applied the proposed dimension reduction algorithm to face recognition, where the number of training samples is very small. The resulting face recognition approach outperforms the eigenface approach based on principal component analysis (PCA) when the training data is complete, that is, representative of the whole dataset.
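
The abstract's optimization criterion, the squared between-class distance divided by the sum of the within-class variances in the kernel-induced feature space, can be evaluated with the kernel trick alone. The sketch below is a minimal illustration of that criterion for two classes with an RBF kernel; it is not the paper's dimension reduction algorithm itself, and the function names, the choice of kernel, and the gamma parameter are assumptions made for illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of A and rows of B (assumed kernel choice).
    sq = np.sum(A**2, axis=1)[:, None] - 2 * A @ B.T + np.sum(B**2, axis=1)[None, :]
    return np.exp(-gamma * sq)

def class_separability(X1, X2, gamma=1.0):
    """Ratio of the squared between-class distance to the sum of within-class
    variances, evaluated in the kernel feature space via the kernel trick.
    (Illustrative criterion only, not the paper's reduction algorithm.)"""
    K11 = rbf_kernel(X1, X1, gamma)
    K22 = rbf_kernel(X2, X2, gamma)
    K12 = rbf_kernel(X1, X2, gamma)

    # Squared distance between the two class means in feature space:
    # ||mu1 - mu2||^2 = mean(K11) - 2*mean(K12) + mean(K22).
    between = K11.mean() - 2 * K12.mean() + K22.mean()

    # Within-class variance: average squared distance of samples to their class mean.
    var1 = np.mean(np.diag(K11)) - K11.mean()
    var2 = np.mean(np.diag(K22)) - K22.mean()

    return between / (var1 + var2)

# Example usage on two synthetic Gaussian classes.
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(50, 5))
X2 = rng.normal(2.0, 1.0, size=(50, 5))
print(class_separability(X1, X2, gamma=0.5))
```

A projection to a reduced dimension, as proposed in the paper, would then be chosen so that this separability measure is maximized in the projected feature space.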