Kernel Self-optimized Locality Preserving Discriminant Analysis for feature extraction and recognition

  • Authors: Jun-Bao Li; Jeng-Shyang Pan; Shyi-Ming Chen

  • Affiliations: Department of Automatic Test and Control, Harbin Institute of Technology, Harbin, China; Department of Electronics, National Kaohsiung University of Applied Sciences, Kaohsiung, Taiwan, ROC; Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan, ROC, and Graduate Institute of Educational Measurement and Statisti ...

  • Venue: Neurocomputing

  • Year: 2011

Abstract

We propose Kernel Self-optimized Locality Preserving Discriminant Analysis (KSLPDA) for feature extraction and recognition. The KSLPDA procedure consists of two stages: the first solves for the optimal expansion of a data-dependent kernel using the proposed kernel self-optimization method, and the second seeks the optimal projection matrix for dimensionality reduction. Because the optimal parameters of the data-dependent kernel are obtained automatically by solving a constrained optimization equation based on the maximum margin criterion and the Fisher criterion in the empirical feature space, KSLPDA performs well at extracting features for classification. Comparative experiments show that KSLPDA outperforms PCA, LDA, LPP, supervised LPP, and kernel supervised LPP.
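
The abstract only outlines the two-stage procedure, so the following is a minimal sketch of the general idea rather than the authors' algorithm. It rests on assumptions not stated in the abstract: the data-dependent kernel is taken to be a conformal transformation k'(x, y) = q(x) q(y) k0(x, y) of an RBF base kernel, stage 1 chooses the factor q by a small grid search on a Fisher-style separability score computed on the empirical kernel map (the paper instead solves a constrained optimization), and stage 2 is a generic kernel-LPP-style projection. All function names and parameters below are illustrative.

```python
# Sketch only -- NOT the published KSLPDA implementation.
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, Y, gamma=0.5):
    """Base (data-independent) RBF kernel matrix."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def conformal_kernel(K0, q):
    """Data-dependent kernel K'(i, j) = q_i * q_j * K0(i, j) (assumed form)."""
    return (q[:, None] * q[None, :]) * K0

def fisher_score(K, y):
    """Crude Fisher-style separability of the rows of K, i.e. of the
    empirical kernel maps: between-class vs. within-class scatter."""
    mu = K.mean(axis=0)
    sb = sw = 0.0
    for c in np.unique(y):
        Kc = K[y == c]
        mc = Kc.mean(axis=0)
        sb += len(Kc) * np.sum((mc - mu) ** 2)
        sw += np.sum((Kc - mc) ** 2)
    return sb / (sw + 1e-12)

def optimize_kernel(X, y, K0, candidates):
    """Stage 1 (sketch): pick q(x) = 1 + b * sum_c k0(x, centre_c) by grid
    search on the Fisher score; the paper solves a constrained optimization."""
    centres = np.stack([X[y == c].mean(axis=0) for c in np.unique(y)])
    Kc = rbf_kernel(X, centres)                 # similarity to class centres
    best_q, best_s = np.ones(len(X)), -np.inf
    for b in candidates:
        q = 1.0 + b * Kc.sum(axis=1)
        s = fisher_score(conformal_kernel(K0, q), y)
        if s > best_s:
            best_q, best_s = q, s
    return best_q

def klpp_projection(K, n_components=2, t=1.0):
    """Stage 2 (sketch): kernel LPP-style projection. Build an affinity from
    kernel-induced distances, then solve K L K a = lam K D K a (smallest lam)."""
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2 * K  # squared feature-space distances
    W = np.exp(-d2 / t)
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = K @ L @ K
    B = K @ D @ K + 1e-6 * np.eye(len(K))       # regularise for stability
    vals, vecs = eigh(A, B)
    return vecs[:, :n_components]               # expansion coefficients

# Toy usage on two Gaussian classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
K0 = rbf_kernel(X, X)
q = optimize_kernel(X, y, K0, candidates=np.linspace(0.0, 2.0, 11))
K = conformal_kernel(K0, q)
alphas = klpp_projection(K)
Z = K @ alphas                                  # low-dimensional embedding
print(Z.shape)                                  # (40, 2)
```

The toy run embeds forty five-dimensional points into two dimensions. The actual method described in the abstract derives the kernel expansion coefficients by solving the constrained optimization built from the maximum margin and Fisher criteria, rather than by the search used in this sketch.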