In this work, we present a new framework for large-scale online kernel classification that makes kernel methods efficient and scalable for large-scale online learning tasks. Unlike conventional budget online kernel learning schemes, which use various strategies to bound the number of support vectors, our framework takes a functional approximation approach: it approximates the kernel function or kernel matrix so that the subsequent online learning task becomes efficient and scalable. Specifically, we present two online kernel machine learning algorithms: (i) the Fourier Online Gradient Descent (FOGD) algorithm, which applies random Fourier features to approximate kernel functions; and (ii) the Nyström Online Gradient Descent (NOGD) algorithm, which applies the Nyström method to approximate large kernel matrices. We offer theoretical analysis of the proposed algorithms and conduct experiments on large-scale online classification tasks, including datasets with over 1 million instances. The encouraging results validate the effectiveness and efficiency of the proposed algorithms, making them potentially more practical than the family of existing budget online kernel learning approaches.
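The FOGD idea described above can be sketched in a few lines: map each incoming example into a random Fourier feature space that approximates a shift-invariant (e.g. RBF) kernel, then run ordinary online gradient descent on a linear model in that space. The sketch below is a minimal illustration under assumed defaults, not the authors' exact algorithm; the function names and parameters (`fogd_train`, `D`, `gamma`, `eta`) are illustrative.

```python
import numpy as np

def fourier_features(x, W, b):
    # z(x) = sqrt(2/D) * cos(W x + b); inner products z(x).z(y)
    # approximate an RBF kernel k(x, y) (random Fourier features).
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

def fogd_train(X, y, D=200, gamma=0.5, eta=0.1, seed=0):
    """Minimal FOGD-style sketch: online gradient descent with the
    hinge loss on random Fourier features (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral samples for an RBF kernel k(x,y) = exp(-gamma ||x-y||^2).
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(D, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    w = np.zeros(D)          # linear model in the feature space
    mistakes = 0
    for x_t, y_t in zip(X, y):
        z = fourier_features(x_t, W, b)
        score = w @ z
        if np.sign(score) != y_t:
            mistakes += 1
        if y_t * score < 1.0:         # hinge-loss subgradient step
            w += eta * y_t * z
    return w, W, b, mistakes
```

NOGD would follow the same online loop, but with features obtained from a Nyström approximation of the kernel matrix (built from a small sample of landmark points) instead of random Fourier features. Either way, each update costs only a matrix-vector product of size `D`, independent of the number of support vectors.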