We propose a novel algorithm, Terminated Ramp-Support Vector Machines (TR-SVM), for classification and feature ranking within the family of Support Vector Machines. The main improvement is that the kernel is determined automatically by the training examples: it is built as a function of simple classifiers, generalized terminated ramp functions, obtained by separating oppositely labeled pairs of training points. The algorithm has a meaningful geometrical interpretation and is derived in the framework of Tikhonov regularization theory. Its only free parameter is the regularization parameter, which represents a trade-off between empirical error and solution complexity. Exploiting the equivalence between the proposed algorithm and two-layer networks, a theoretical bound on the generalization error is derived, together with the Vapnik-Chervonenkis dimension. Performance is tested on a number of synthetic and real data sets.
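The abstract does not give the exact construction, but the idea of a kernel assembled from pairwise "terminated ramp" base classifiers can be sketched as follows. Everything here is an assumption for illustration: `terminated_ramp` is one plausible reading of a ramp clipped at ±1 along the segment joining an oppositely labeled pair, and `tr_kernel` sums the products of base-classifier responses over all such pairs.

```python
import numpy as np

def terminated_ramp(x, p_pos, p_neg):
    # Hypothetical base classifier for one oppositely labeled pair:
    # a linear ramp along the direction from p_neg to p_pos, clipped
    # ("terminated") at -1 and +1, so each pair yields a simple
    # two-class separator.
    d = p_pos - p_neg
    m = 0.5 * (p_pos + p_neg)                   # midpoint of the pair
    t = 2.0 * np.dot(x - m, d) / np.dot(d, d)   # scaled signed projection
    return np.clip(t, -1.0, 1.0)

def tr_kernel(x, z, X, y):
    # Data-dependent kernel: sum the base-classifier responses over
    # every (positive, negative) pair of training points. Being a sum
    # of feature products, the resulting Gram matrix is symmetric PSD.
    k = 0.0
    for i in range(len(X)):
        for j in range(len(X)):
            if y[i] == 1 and y[j] == -1:
                k += (terminated_ramp(x, X[i], X[j]) *
                      terminated_ramp(z, X[i], X[j]))
    return k

# Tiny usage example on a two-point training set.
X = np.array([[0.0, 0.0], [1.0, 1.0]])
y = np.array([-1, 1])
print(tr_kernel(X[0], X[1], X, y))
```

Because the kernel is built entirely from the training examples, no kernel family or bandwidth needs to be chosen by hand, which is consistent with the claim that the regularization constant is the method's only free parameter.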