Statistical learning theory has emerged in the last few years as a solid and elegant framework for studying the problem of learning from examples. Unlike earlier "classical" learning techniques, this theory completely characterizes the necessary and sufficient conditions for a learning algorithm to be consistent. The key quantity is the capacity of the set of hypotheses employed in the learning algorithm, and the goal is to control this capacity depending on the given examples. Structural risk minimization (SRM) is the main theoretical algorithm that implements this idea. SRM is inspired by, and closely related to, regularization theory. For practical purposes, however, SRM is a very hard problem and is impossible to implement when dealing with a large number of examples. Techniques such as support vector machines and the older regularization networks offer a viable way to implement the idea of capacity control. The paper also discusses how these techniques can be formulated as a variational problem in a Hilbert space, and shows how SRM can be extended in order to implement both classical regularization networks and support vector machines.
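As a concrete illustration of the capacity-control idea described above, the following is a minimal NumPy sketch of a regularization network: kernel ridge regression with a Gaussian (RBF) kernel, where the regularization parameter `lam` plays the role of the capacity control (larger `lam` means a smoother, lower-capacity hypothesis). The function names, the toy sine data, and the chosen parameter values are all illustrative assumptions, not the paper's actual experiments.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix: k(a, b) = exp(-gamma * ||a - b||^2)
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def fit_regularization_network(X, y, lam=0.1, gamma=1.0):
    # Regularized least squares in the RKHS induced by the kernel:
    # solve (K + lam * n * I) c = y for the expansion coefficients c,
    # so that f(x) = sum_i c_i k(x, x_i). `lam` controls capacity.
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, c, X_new, gamma=1.0):
    # Evaluate the learned function at new points.
    return rbf_kernel(X_new, X_train, gamma) @ c

# Illustrative data: noisy samples of sin(x) on [0, 2*pi]
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

c = fit_regularization_network(X, y, lam=1e-3, gamma=1.0)
y_hat = predict(X, c, X, gamma=1.0)
print(np.sqrt(np.mean((y_hat - y) ** 2)))  # training RMSE
```

Increasing `lam` shrinks the coefficients and yields a smoother fit (lower capacity, possible underfitting); decreasing it lets the network interpolate the noise (higher capacity, possible overfitting), which is exactly the trade-off that SRM formalizes.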