Introduction to statistical pattern recognition (2nd ed.)
Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation.
Fast computation of low rank matrix approximations. STOC '01: Proceedings of the Thirty-Third Annual ACM Symposium on Theory of Computing.
Artificial Intelligence
Sparse Greedy Matrix Approximation for Machine Learning. ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
ICANN '96: Proceedings of the 1996 International Conference on Artificial Neural Networks.
A formal analysis of why heuristic functions work. Artificial Intelligence.
Problem-Solving Methods in Artificial Intelligence
Multi-category classification by kernel based nonlinear subspace method. ICASSP '99: Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing, Volume 2.
Input space versus feature space in kernel-based methods. IEEE Transactions on Neural Networks.
An introduction to kernel-based learning algorithms. IEEE Transactions on Neural Networks.
Two-dimensional subspace classifiers for face recognition. Neurocomputing.
Subspace based linear programming support vector machines. IJCNN '09: Proceedings of the 2009 International Joint Conference on Neural Networks.
Automatic model selection for the optimization of SVM kernels. Pattern Recognition.
AI '04: Proceedings of the 17th Australian Joint Conference on Advances in Artificial Intelligence.
Improved kernel common vector method for face recognition varying in background conditions. CompIMAGE '10: Proceedings of the Second International Conference on Computational Modeling of Objects Represented in Images.
In Kernel-based Nonlinear Subspace (KNS) methods, the subspace dimensions strongly influence the performance of the subspace classifier. To achieve high classification accuracy, a large dimension is generally required. However, if the chosen subspace dimension is too large, performance degrades because the resultant subspaces overlap; if it is too small, the classification error increases because the resulting approximation is poor. The most common approach is ad hoc in nature: it selects the dimensions based on the so-called cumulative proportion [13] computed from the kernel matrix of each class. In this paper, we propose a new method for systematically and efficiently selecting optimal or near-optimal subspace dimensions for KNS classifiers, using a search strategy and a heuristic function termed the Overlapping criterion; the rationale for this function is motivated in the body of the paper. The task of selecting optimal subspace dimensions is thus reduced to finding the best dimensions in the problem-domain solution space, with the Overlapping criterion serving as the heuristic function, so the search space can be pruned and the best solution found very efficiently. Our experimental results demonstrate that the proposed mechanism selects the dimensions efficiently without sacrificing classification accuracy.
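For context, the conventional cumulative-proportion rule [13] that the paper uses as its baseline can be sketched as follows: for each class, pick the smallest dimension d whose leading kernel-matrix eigenvalues account for a fixed fraction of the total spectrum. This is a minimal illustration only; the kernel choice (RBF), the `gamma` value, and the 0.95 threshold below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def cumulative_proportion_dim(K, threshold=0.95):
    """Smallest d such that the top-d eigenvalues of the centered
    kernel matrix capture at least `threshold` of the total spectrum."""
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = J @ K @ J
    eig = np.sort(np.linalg.eigvalsh(Kc))[::-1]  # descending order
    eig = np.clip(eig, 0.0, None)            # drop tiny negative round-off
    cum = np.cumsum(eig) / eig.sum()
    return int(np.searchsorted(cum, threshold) + 1)

if __name__ == "__main__":
    # Per-class dimensions for a toy two-class problem (synthetic data)
    rng = np.random.default_rng(0)
    for label, X in {"class A": rng.normal(size=(40, 5)),
                     "class B": rng.normal(size=(40, 5)) + 1.0}.items():
        d = cumulative_proportion_dim(rbf_kernel(X, gamma=0.5))
        print(label, d)
```

Because a fixed threshold is applied per class independently of how the class subspaces interact, this rule can produce dimensions that make the subspaces overlap — the failure mode the Overlapping criterion is designed to avoid.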