We consider the multi-class classification problem in learning theory and introduce a learning algorithm based on Parzen windows. Under regularity conditions on the conditional probability of each class and a decay condition on the marginal distribution near the boundary of the input space, we derive learning rates in terms of the sample size, the window width, and the decay of the basic window. The choice of window width follows from bounds on the sample error and the approximation error. Two tools play a key role: a newly defined splitting function for multi-class classification, and a comparison theorem that bounds the excess misclassification error by the norm of the difference of function vectors.
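The Parzen-window classifier underlying this setting can be sketched as follows. This is a generic multi-class Parzen-window rule with a Gaussian basic window, not the paper's exact algorithm; the window width `h` and the toy data are illustrative assumptions.

```python
import numpy as np

def parzen_classify(X_train, y_train, X_test, h=0.5):
    """Multi-class Parzen-window classifier.

    For each class, score a test point x by summing Gaussian basic
    windows centred at the training points of that class, then
    predict the class with the largest score.
    """
    classes = np.unique(y_train)
    # squared distances between every test point and every training point
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    windows = np.exp(-d2 / (2.0 * h ** 2))  # basic window values
    # sum the window contributions separately for each class
    scores = np.stack(
        [windows[:, y_train == c].sum(axis=1) for c in classes], axis=1
    )
    return classes[np.argmax(scores, axis=1)]

# illustrative toy data: three well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(m, 0.3, size=(20, 2)) for m in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 20)
preds = parzen_classify(X, y, np.array([[0.0, 0.0], [3.0, 3.0], [6.0, 6.0]]))
print(preds)
```

The window width `h` plays the role discussed above: it trades off sample error (small `h`, spiky estimates) against approximation error (large `h`, oversmoothing), which is why the learning rates depend on how `h` scales with the sample size.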