The Gaussian kernel function implicitly defines the feature space of a kernel method and plays an essential role in its application. The kernel parameter is a scalar that strongly influences the final results, yet it remains unclear how to choose it optimally. In this paper, we propose a novel data-driven method for optimizing the Gaussian kernel parameter that depends only on the distribution of the original dataset and yields a simple solution to this complex problem. The proposed method is task-independent and can be used in any Gaussian kernel-based approach, both supervised and unsupervised. Simulation experiments demonstrate the efficacy of the method. A user-friendly online calculator is available at www.csbio.sjtu.edu.cn/bioinf/kernel/ for public use.
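To make the setting concrete, the sketch below defines the Gaussian (RBF) kernel and one well-known data-driven choice of its parameter, the median heuristic (sigma set to the median pairwise distance in the dataset). This heuristic is a standard baseline used here purely for illustration; it is not the optimization method proposed in the paper, and the function names are my own.

```python
import math

def gaussian_kernel(x, y, sigma):
    """Gaussian (RBF) kernel: k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

def median_heuristic_sigma(points):
    """Data-driven sigma: the median of all pairwise Euclidean distances.

    NOTE: this is the classic median heuristic, shown only as a baseline;
    it is NOT the parameter-selection method proposed in the paper.
    """
    dists = []
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.sqrt(sum((a - b) ** 2
                              for a, b in zip(points[i], points[j])))
            dists.append(d)
    dists.sort()
    m = len(dists)
    if m % 2:
        return dists[m // 2]
    return 0.5 * (dists[m // 2 - 1] + dists[m // 2])
```

Like the method described in the abstract, this choice of sigma depends only on the dataset itself, not on labels or a downstream task, so it can precede any supervised or unsupervised kernel method.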