In this paper, we show how the support vector machine (SVM) can serve as a powerful feature-extraction tool for the k-nearest neighbor (kNN) classifier. A novel multi-class dimensionality reduction approach, discriminant analysis via support vectors (SVDA), is proposed. First, an SVM is trained to compute an optimal direction discriminating each pair of classes. Then, a class-separability criterion is constructed from these directions. Finally, the projection matrix is computed. The kernel mapping idea is used to derive the non-linear version, kernel discriminant analysis via support vectors (SVKD). In SVDA, only the support vectors are involved in computing the transformation matrix, so the computational complexity of kernel-based feature extraction is greatly reduced. Experiments carried out on several standard databases show a clear improvement over LDA-based recognition.
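The pipeline described above can be sketched as follows. This is a minimal illustrative approximation, not the authors' exact method: it trains a linear SVM per class pair to obtain the pairwise optimal directions, then simply orthonormalizes the stacked directions to form a projection matrix, whereas the paper builds an explicit class-separability criterion from the support vectors before solving for the projection. The function name `svda_projection` and the use of scikit-learn's `LinearSVC` are assumptions for illustration.

```python
# Hedged sketch of an SVDA-style pipeline (assumed names; not the
# authors' exact criterion): pairwise SVM directions -> projection.
from itertools import combinations

import numpy as np
from sklearn.svm import LinearSVC


def svda_projection(X, y, n_components=2):
    """Build a projection matrix from pairwise SVM weight vectors.

    For each pair of classes, a linear SVM gives the optimal
    discriminant direction (its weight vector). We stack these
    directions and orthonormalize them via QR; the leading columns
    form the projection matrix used before kNN classification.
    """
    directions = []
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        clf = LinearSVC(dual=False, max_iter=5000).fit(X[mask], y[mask])
        directions.append(clf.coef_.ravel())  # direction separating a vs. b
    W = np.vstack(directions).T               # shape: (n_features, n_pairs)
    Q, _ = np.linalg.qr(W)                    # orthonormal basis of the span
    return Q[:, :n_components]
```

After projecting the data with the returned matrix (`Z = X @ P`), a standard kNN classifier can be run in the reduced space; the kernel variant (SVKD) would replace the linear SVM with a kernelized one and operate in feature space.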