The Perceptron with Dynamic Margin
ALT'11: Proceedings of the 22nd International Conference on Algorithmic Learning Theory
We present a new class of Perceptron-like algorithms with margin in which the "effective" learning rate η_eff, defined as the ratio of the learning rate to the length of the weight vector, remains constant. We prove that for η_eff sufficiently small the new algorithms converge in a finite number of steps and show that there exists a limit of the parameters involved in which convergence leads to classification with maximum margin. A soft margin extension for Perceptron-like large margin classifiers is also discussed.
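The idea of a constant effective learning rate can be illustrated with a minimal sketch. The following is not the paper's exact algorithm but a hedged illustration under assumed choices: the weight vector is initialized with the first example so its norm is nonzero, a mistake is declared whenever the normalized margin falls below an assumed threshold `beta`, and each update uses the step size `eta = eta_eff * ||w||`, so that the effective learning rate `eta / ||w||` stays fixed at `eta_eff`.

```python
import numpy as np

def margin_perceptron_const_eta_eff(X, y, eta_eff=0.05, beta=0.1, max_epochs=1000):
    """Sketch of a Perceptron with margin whose effective learning rate
    eta / ||w|| is held constant at eta_eff.

    X : (n, d) array of examples; y : (n,) array of labels in {-1, +1}.
    beta is an assumed normalized-margin threshold, not a parameter
    taken from the paper.
    """
    # Initialize with the first example so that ||w|| > 0 from the start.
    w = y[0] * X[0].astype(float)
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            norm_w = np.linalg.norm(w)
            # Margin mistake: normalized margin at or below beta.
            if yi * np.dot(w, xi) <= beta * norm_w * np.linalg.norm(xi):
                # Step size proportional to ||w|| keeps eta / ||w|| = eta_eff.
                eta = eta_eff * norm_w
                w = w + eta * yi * xi
                mistakes += 1
        if mistakes == 0:  # every example cleared the margin threshold
            break
    return w
```

For small `eta_eff` each update perturbs `w` only slightly relative to its current length, which is the regime in which the abstract's finite-step convergence guarantee applies.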