Efficient distribution-free learning of probabilistic concepts
Journal of Computer and System Sciences - Special issue: 31st IEEE conference on foundations of computer science, Oct. 22–24, 1990
Regularization theory and neural networks architectures
Neural Computation
The nature of statistical learning theory
Machine Learning
Facial Analysis and Synthesis Using Image-Based Models
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
A Unified Framework for Regularization Networks and Support Vector Machines
A General Framework for Object Detection
ICCV '98 Proceedings of the Sixth International Conference on Computer Vision
Generalized Robust Conjoint Estimation
Marketing Science
Modelling of a SISO and MIMO non linear communication channel using two modelling techniques
WSEAS Transactions on Circuits and Systems
Support vector machine classifiers for asymmetric proximities
ICANN/ICONIP'03 Proceedings of the 2003 joint international conference on Artificial neural networks and neural information processing
A computer vision approach for weeds identification through Support Vector Machines
Applied Soft Computing
Radial basis function regularization for linear inverse problems with random noise
Journal of Multivariate Analysis
In this paper we first overview the main concepts of Statistical Learning Theory, a framework in which learning from examples can be studied in a principled way. We then briefly discuss well-known and emerging learning techniques, such as Regularization Networks and Support Vector Machines, which can be justified in terms of the same induction principle.
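The shared induction principle the abstract refers to is the minimization of a regularized empirical risk over a reproducing-kernel Hilbert space; Regularization Networks and Support Vector Machines differ only in the loss term. As a minimal illustrative sketch (not from the paper; the Gaussian kernel, toy data, and all names are assumptions), a Regularization Network with squared loss reduces to solving a single linear system for the expansion coefficients:

```python
# Sketch of a Regularization Network: f(x) = sum_i c_i K(x, x_i),
# where c solves (K + l*lambda*I) c = y, the minimizer of
#   (1/l) sum_i (y_i - f(x_i))^2 + lambda * ||f||_K^2.
# An SVM would keep the same penalty but swap in a different loss.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # K[i, j] = exp(-||A_i - B_j||^2 / (2 * sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularization_network(X, y, lam=1e-2, sigma=1.0):
    # Solve the regularized least-squares system for the coefficients c.
    l = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + l * lam * np.eye(l), y)

def predict(X_train, c, X_new, sigma=1.0):
    # Evaluate f(x) = sum_i c_i K(x, x_i) at the new points.
    return gaussian_kernel(X_new, X_train, sigma) @ c

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X).ravel()
c = fit_regularization_network(X, y)
train_err = np.abs(predict(X, c, X) - y).max()
print(train_err)
```

With a small regularization parameter the fit tracks the smooth target closely; increasing `lam` trades training error for a smaller RKHS norm, which is the trade-off the induction principle controls.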