An important step in constructing a support vector machine (SVM) is selecting optimal hyperparameters. This paper proposes a novel method for tuning the hyperparameters by maximizing the distance between two classes (DBTC) in the feature space. With a normalized kernel function, DBTC can serve as a class-separability criterion, since it implicitly accounts for both the between-class separation and the within-class data distribution. Employing DBTC as the objective function, we develop a gradient-based algorithm to search for the optimal kernel parameter. On the basis of geometric analysis and simulation results, we find that both the optimization and the initialization of the algorithm become very simple. Experimental results on synthetic and real-world data show that the proposed method consistently outperforms other existing hyperparameter-tuning methods.
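To make the idea concrete, here is a minimal sketch of the DBTC criterion for a normalized Gaussian (RBF) kernel, together with a simple numerical-gradient ascent over the kernel width. This is an illustrative assumption-laden reconstruction, not the paper's exact algorithm: the function names (`dbtc`, `tune_gamma`), the numerical gradient, and the step-size choices are all hypothetical, and the paper derives an analytic gradient instead.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Normalized Gaussian kernel: k(x, x) = 1 for every x
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def dbtc(X1, X2, gamma):
    # Squared distance between the two class means in feature space:
    # ||m1 - m2||^2 = mean(K11) + mean(K22) - 2 * mean(K12)
    k11 = rbf_kernel(X1, X1, gamma).mean()
    k22 = rbf_kernel(X2, X2, gamma).mean()
    k12 = rbf_kernel(X1, X2, gamma).mean()
    return k11 + k22 - 2.0 * k12

def tune_gamma(X1, X2, gamma0=1.0, lr=0.5, steps=200, eps=1e-4):
    # Gradient ascent on DBTC over the kernel width, using a
    # central-difference approximation of the gradient (hypothetical
    # choice; the paper uses an analytic gradient)
    g = gamma0
    for _ in range(steps):
        grad = (dbtc(X1, X2, g + eps) - dbtc(X1, X2, g - eps)) / (2 * eps)
        g = max(g + lr * grad, 1e-6)  # keep the width positive
    return g
```

With a normalized kernel, DBTC is bounded in [0, 2]: it vanishes as the width goes to zero (all kernel values approach 1) and decays again for very large widths (off-diagonal kernel values approach 0), so the criterion has an interior maximum that a gradient method can locate from a simple initialization.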