We study the problem of evaluating the goodness of a kernel matrix for a classification task. Because kernel matrix evaluation is typically embedded in other expensive procedures, such as feature selection and model selection, the goodness measure must be computed efficiently. Most previous approaches are not efficient, with the exception of Kernel Target Alignment (KTA), which can be computed in O(n^2) time. Although KTA is widely used, we show that it has some serious drawbacks. We propose an efficient surrogate measure that evaluates the goodness of a kernel matrix based on the data distributions of the classes in the feature space. The measure not only overcomes the limitations of KTA but also possesses other desirable properties, including invariance, efficiency, and an error-bound guarantee. Comparative experiments show that the measure is a good indicator of the goodness of a kernel matrix.
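For reference, the KTA baseline mentioned above can be sketched in a few lines. The sketch below follows the standard definition of Kernel Target Alignment, A(K, yy^T) = ⟨K, yy^T⟩_F / (||K||_F · ||yy^T||_F); the function name and NumPy implementation are illustrative choices, not code from the paper:

```python
import numpy as np

def kernel_target_alignment(K, y):
    """Kernel Target Alignment of kernel matrix K with labels y in {-1, +1}.

    A(K, yy^T) = <K, yy^T>_F / (||K||_F * ||yy^T||_F), where
    <K, yy^T>_F = sum_ij y_i y_j K_ij and ||yy^T||_F = n.
    Runs in O(n^2) time, the complexity cited for KTA in the abstract.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    inner = y @ K @ y                      # <K, yy^T>_F as a quadratic form
    return inner / (n * np.linalg.norm(K, "fro"))
```

For the "ideal" kernel K = yy^T, the alignment reaches its maximum value of 1, which is a convenient sanity check for an implementation.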