A misconception occasionally found in the literature holds that for nearest-neighbors algorithms there is no fixed hypothesis class of bounded Vapnik-Chervonenkis dimension. This paper presents a simple reformulation (not a modification) of the nearest-neighbors algorithm in which a percentage α ∈ (0, 1) of nearest neighbors is used in place of a natural number k. Owing to this reformulation, one can construct sets of approximating functions that we prove to have finite VC dimension. In a special (but practically relevant) case this dimension equals ⌈2/α⌉. It then also becomes possible to form a sequence of sets of functions with increasing VC dimension, and to perform complexity selection via cross-validation or in the manner of the structural risk minimization framework. Results of such experiments are also presented.
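The reformulation described above can be sketched in a few lines: instead of a fixed integer k, the neighborhood size is derived from a fraction α of the current training set, so k grows with the sample size n. The following is a minimal illustrative sketch (the function name, the regression setting, and the Euclidean metric are assumptions for the example, not the paper's exact notation):

```python
import numpy as np

def alpha_nn_predict(X_train, y_train, X_query, alpha=0.1):
    """Nearest-neighbors regression where the neighborhood size is a
    fraction alpha in (0, 1) of the training set: with n training
    points, k = ceil(alpha * n) rather than a fixed integer k."""
    X_train = np.atleast_2d(X_train)
    n = len(X_train)
    k = max(1, int(np.ceil(alpha * n)))  # neighborhood size scales with n
    preds = []
    for x in np.atleast_2d(X_query):
        # Euclidean distances from the query point to all training points
        d = np.linalg.norm(X_train - x, axis=1)
        nearest = np.argsort(d)[:k]      # indices of the k closest points
        preds.append(y_train[nearest].mean())
    return np.array(preds)
```

Because α (not k) is the complexity parameter, a decreasing sequence of α values yields a sequence of function sets with increasing VC dimension, over which α can be selected by cross-validation as the abstract describes.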