To solve online nonlinear problems, a set of misclassified examples (the support set) must usually be stored in memory so that kernel values can be computed. As the scale of the training data grows, computing all kernel values becomes expensive and can exhaust memory. In this paper, a fusion strategy is proposed to compress the size of the support set for online learning: the fused kernel best represents the current instance together with its nearest neighbor in the support set from the previous time step. The proposed algorithm follows a Perceptron-like scheme and is therefore called Fuseptron. Unlike most recently proposed nonlinear online algorithms, Fuseptron keeps its internal memory bounded, and a mistake bound is also derived. Experiments on one synthetic and four real large-scale datasets validate the effectiveness and efficiency of Fuseptron compared to state-of-the-art algorithms.
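To make the idea concrete, the following is a minimal sketch of a budget-bounded kernel Perceptron with support-set fusion, in the spirit of the approach described in the abstract. The fusion rule used here (merging a misclassified example with its nearest support vector via a coefficient-weighted average) is an illustrative assumption, not the paper's exact update; the class and parameter names are likewise hypothetical.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel between two vectors."""
    d = x - z
    return np.exp(-gamma * np.dot(d, d))

class FusionKernelPerceptron:
    """Kernel Perceptron whose support set never exceeds a fixed budget.

    When the budget is full, a new misclassified example is fused with its
    nearest support vector instead of being appended, so memory stays bounded.
    (Illustrative sketch only; not the paper's exact Fuseptron update.)
    """

    def __init__(self, budget=10, gamma=1.0):
        self.budget = budget   # maximum support-set size
        self.gamma = gamma
        self.sv = []           # support vectors
        self.alpha = []        # signed coefficients (label y_t on insertion)

    def decision(self, x):
        return sum(a * rbf(x, z, self.gamma)
                   for a, z in zip(self.alpha, self.sv))

    def predict(self, x):
        return 1 if self.decision(x) >= 0 else -1

    def fit_one(self, x, y):
        """One online step: Perceptron update, fusing when over budget."""
        x = np.asarray(x, dtype=float)
        if self.predict(x) == y:
            return                      # no mistake, no update
        if len(self.sv) < self.budget:
            self.sv.append(x)           # room left: store as usual
            self.alpha.append(float(y))
            return
        # Budget full: fuse the new example with its nearest support vector.
        j = int(np.argmin([np.linalg.norm(x - z) for z in self.sv]))
        a_new = self.alpha[j] + y
        if a_new == 0:
            # Coefficients cancel: drop the neighbor, keep the new example.
            self.sv.pop(j)
            self.alpha.pop(j)
            self.sv.append(x)
            self.alpha.append(float(y))
        else:
            # Coefficient-weighted average of the pair replaces the neighbor.
            w = abs(self.alpha[j]) / (abs(self.alpha[j]) + abs(y))
            self.sv[j] = w * self.sv[j] + (1.0 - w) * x
            self.alpha[j] = a_new
```

The key property is that `fit_one` never grows the support set past `budget`, so per-example cost and memory stay constant regardless of stream length, at the price of representing merged examples approximately.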