References
Machine Learning.
A sharp concentration inequality with application. Random Structures & Algorithms.
Estimating the Generalization Performance of an SVM Efficiently. ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
The Journal of Machine Learning Research.
Almost-everywhere algorithmic stability and generalization error. UAI '02: Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence.
The performance bounds of learning machines based on exponentially strongly mixing sequences. Computers & Mathematics with Applications.
Generalization Bounds of Regularization Algorithm with Gaussian Kernels. Neural Processing Letters.
Generalization performance is an important property of learning machines: a desirable learning machine should be stable with respect to its training samples. We consider empirical risk minimization over function sets from which noise has been eliminated. By applying Kutin's inequality, we establish bounds on the rate of uniform convergence of the empirical risks of learning machines to their expected risks, and compare these bounds with known results.
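As a minimal numeric sketch (not from the paper itself) of the kind of concentration statement involved, the snippet below uses the classical Hoeffding inequality as a stand-in for Kutin's extension of McDiarmid's inequality: for n i.i.d. losses bounded in [0, 1], the empirical risk deviates from the expected risk by more than sqrt(ln(2/delta) / (2n)) with probability at most delta. The 0-1 loss distribution and all parameter values are invented for illustration.

```python
import numpy as np

def hoeffding_deviation(n, delta):
    """Hoeffding-style deviation bound for the mean of n i.i.d. losses in [0, 1]:
    with probability >= 1 - delta, |empirical risk - expected risk| <= this value."""
    return np.sqrt(np.log(2.0 / delta) / (2.0 * n))

rng = np.random.default_rng(0)
n, delta = 2000, 0.05
p = 0.3          # hypothetical expected risk of a 0-1 loss (assumed for the demo)
trials = 200     # number of independent training samples drawn

losses = rng.binomial(1, p, size=(trials, n))        # 0-1 losses per sample
deviations = np.abs(losses.mean(axis=1) - p)          # |empirical - expected| risk
bound = hoeffding_deviation(n, delta)
frac_within = (deviations <= bound).mean()
print(f"bound = {bound:.4f}, fraction of trials within bound = {frac_within:.3f}")
```

Since the bound holds with probability at least 1 - delta, the observed fraction of trials whose deviation stays under the bound should be at least about 0.95 here; in practice Hoeffding is conservative and the fraction is typically higher.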