We study the problem of evaluating the goodness of a kernel matrix for a classification task. Because kernel matrix evaluation is typically embedded in other expensive procedures such as feature selection and model selection, the goodness measure must be computable efficiently. Most previous approaches are not efficient, except for kernel target alignment (KTA), which can be computed in O(n^2) time. Although KTA is widely used, we show that it has serious drawbacks. We propose an efficient surrogate measure that evaluates the goodness of a kernel matrix based on the distributions of the classes in the feature space. The measure not only overcomes the limitations of KTA but also possesses other desirable properties, including invariance, efficiency, and an error bound guarantee. Comparative experiments show that the measure is a good indicator of the goodness of a kernel matrix.
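For reference, the KTA baseline mentioned above measures the alignment between a kernel matrix K and the ideal kernel yy^T built from the labels. A minimal sketch in Python (assuming labels in {-1, +1}; the function name is our own, not from the paper):

```python
import numpy as np

def kernel_target_alignment(K, y):
    """Kernel target alignment between K and the ideal kernel yy^T.

    KTA(K, y) = <K, yy^T>_F / (||K||_F * ||yy^T||_F),
    computable in O(n^2) time for an n x n kernel matrix.
    """
    Y = np.outer(y, y)  # ideal kernel: +1 within a class, -1 across classes
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))
```

By construction the value lies in [-1, 1], reaching 1 when K is proportional to the ideal kernel yy^T; the O(n^2) cost is just the elementwise products and Frobenius norms.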