Subspace analysis is an effective technique for dimensionality reduction that seeks a low-dimensional representation of high-dimensional data. In this paper, a novel subspace method called robust kernel discriminant analysis is proposed for dimensionality reduction. First, an objective function is defined in terms of the distances between similar elements and the distances between dissimilar elements, so that the structure of the data is preserved in the mapped space. This objective is then transformed into an eigenvalue problem, and the projection vectors are obtained by solving that eigenvalue problem. Finally, experimental results on face images and handwritten numerical characters demonstrate the effectiveness and feasibility of the proposed method.
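The abstract does not give the paper's exact objective, but the overall recipe — a scatter-based criterion in a kernel-induced feature space, reduced to a generalized eigenvalue problem whose leading eigenvectors yield the projection — can be illustrated with a minimal sketch of classical kernel discriminant analysis. Everything here (the RBF kernel, the `gamma` and `reg` parameters, the function names) is an assumption for illustration, not the paper's method:

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared distances
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def kernel_discriminant(X, y, n_components=1, gamma=1.0, reg=1e-6):
    """Illustrative kernel discriminant analysis: maximize between-class
    scatter over within-class scatter in the kernel feature space, solved
    as a generalized eigenvalue problem M a = lambda N a."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    M = np.zeros((n, n))                    # between-class scatter (kernelized)
    N = np.zeros((n, n))                    # within-class scatter (kernelized)
    m_total = K.mean(axis=1)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                      # columns for class c
        d = (Kc.mean(axis=1) - m_total)[:, None]
        M += len(idx) * (d @ d.T)
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T
    N += reg * np.eye(n)                    # regularize for numerical stability
    w, A = eigh(M, N)                       # eigenvalues in ascending order
    return A[:, ::-1][:, :n_components], K  # leading eigenvectors first

# Toy usage: two Gaussian blobs projected onto one discriminant direction
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
A, K = kernel_discriminant(X, y, n_components=1)
proj = K @ A                                # 1-D embedding of the training points
```

The regularization term `reg * I` is one common way to keep the within-class matrix invertible when the problem is undersampled; the paper's robustness mechanism may differ.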