In this paper we propose and analyse a margin generalisation of Rosenblatt's perceptron learning algorithm. The original approach and the margin-based approach differ only in the update step. We study the behaviour of the modified algorithm in both the separable and the non-separable case, and also when the margin parameter is negative. We give a convergence proof for the modified algorithm, analogous to the classical proof by Novikoff. Moreover, we show how to adjust the margin of the update step as the algorithm progresses so as to obtain the maximal possible margin of separation. In the applications part, we show the connection between the maximal margin of separation and SVM methods.
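As a rough illustration of the idea described in the abstract, the sketch below (not the authors' actual algorithm; the function name, learning rate, and margin value are illustrative assumptions) modifies only the perceptron's update condition: instead of updating on a misclassification (functional margin at most zero), it updates whenever the functional margin falls below a chosen threshold.

```python
import numpy as np

def margin_perceptron(X, y, margin=1.0, lr=1.0, epochs=100):
    """Illustrative margin variant of Rosenblatt's perceptron.

    The classical perceptron updates only when y * <w, x> <= 0;
    this variant also updates whenever y * <w, x> <= margin,
    which is the only difference from the original algorithm.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            # Modified update step: require a functional margin, not
            # merely a correct sign.
            if yi * np.dot(w, xi) <= margin:
                w += lr * yi * xi
                updated = True
        if not updated:
            # Every point is classified with the required margin.
            break
    return w

# Toy linearly separable data: the label is the sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.5, -0.5], [-2.0, 0.5], [-1.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = margin_perceptron(X, y, margin=1.0)
```

On this toy data the loop terminates once every point attains a functional margin above the threshold, mirroring the separable case analysed in the paper; with a negative margin the update condition becomes strictly weaker than the classical one.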