Statistical learning theory considers three main problems: pattern recognition, regression, and density estimation. This paper studies the solvability of these problems (concentrating mainly on pattern recognition and density estimation) in the "high-dimensional" case, where the patterns in the training and test sets are never repeated. We show that, assuming an i.i.d. data source but without any further assumptions, the problems of pattern recognition and regression can often be solved, and there are practically useful algorithms for solving them. On the other hand, the problem of density estimation, as we formalize it, cannot be solved under the general i.i.d. assumption alone; additional assumptions are required.
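One family of "practically useful algorithms" in this setting is conformal prediction, which produces p-values for candidate labels under only the i.i.d. (exchangeability) assumption. The sketch below is not the paper's own construction but a minimal illustrative conformal classifier using a standard 1-nearest-neighbour nonconformity score (ratio of the distance to the nearest same-class example to the distance to the nearest other-class example); all function names are ours.

```python
import numpy as np

def nn_nonconformity(X, y, i):
    """1-NN nonconformity score of example i: distance to the nearest
    same-class example divided by distance to the nearest other-class
    example (larger = stranger)."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf  # exclude the example itself
    same = d[y == y[i]].min()
    other = d[y != y[i]].min()
    return same / other

def conformal_p_value(X_train, y_train, x_new, y_new):
    """p-value for the hypothesis that (x_new, y_new) is exchangeable
    with the training examples: the fraction of all examples whose
    nonconformity is at least that of the new one."""
    X = np.vstack([X_train, x_new])
    y = np.append(y_train, y_new)
    scores = np.array([nn_nonconformity(X, y, i) for i in range(len(y))])
    return np.mean(scores >= scores[-1])
```

A candidate label whose p-value falls below a chosen significance level is excluded from the prediction set; the i.i.d. assumption alone guarantees the long-run error rate of such sets, which is the sense in which pattern recognition is solvable without repeated patterns.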