In many practical applications, the performance of a learning algorithm is not determined by a single factor but by several at once, such as the complexity of the hypothesis space, the stability of the algorithm, and the quality of the data. This paper studies the performance of the regularization algorithm associated with Gaussian kernels. Its main purpose is to provide a framework for evaluating the generalization performance of the algorithm jointly in terms of hypothesis space complexity, algorithmic stability, and data quality. New bounds on the generalization error of the algorithm, decomposed into regularization error and sample error, are established. It is shown that the regularization error decays polynomially under certain conditions, and that the new bounds depend simultaneously on the uniform stability of the algorithm, the covering number of the hypothesis space, and the information carried by the data. As an application, the results are specialized to several particular regularization algorithms, and some new results for these algorithms are deduced.
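The abstract does not spell out the regularization scheme; the following is a minimal sketch, assuming the algorithm is Tikhonov-regularized least squares in the reproducing kernel Hilbert space of a Gaussian kernel, the standard setting in which regularization-error and sample-error bounds of this kind are stated. The function names (`gaussian_kernel`, `fit_regularized_ls`) and the parameter values (`sigma`, `lam`) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

def fit_regularized_ls(X, y, lam=0.1, sigma=1.0):
    """Solve min_f (1/m) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.

    By the representer theorem the minimizer has the form
    f(x) = sum_i c_i K(x, x_i), with c = (K + lam * m * I)^{-1} y.
    """
    m = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    c = np.linalg.solve(K + lam * m * np.eye(m), y)
    return lambda X_new: gaussian_kernel(X_new, X, sigma) @ c

# Illustrative usage on synthetic data (all values here are arbitrary).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(100)
f = fit_regularized_ls(X, y, lam=0.01, sigma=0.5)
y_hat = f(np.linspace(-1, 1, 200)[:, None])
```

In this setting, the regularization parameter `lam` mediates the trade-off the abstract describes: larger values shrink the hypothesis space (reducing sample error at the cost of regularization error), while smaller values do the reverse.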