We consider a novel "online semi-supervised learning" setting where (mostly unlabeled) data arrives sequentially in large volume, and it is impractical to store it all before learning. We propose an online manifold regularization algorithm. It differs from standard online learning in that it learns even when the input point is unlabeled. Our algorithm is based on convex programming in kernel space with stochastic gradient descent, and inherits the theoretical guarantees of standard online algorithms. However, a naïve implementation of our algorithm does not scale well. This paper therefore focuses on efficient, practical approximations: two sparse approximations based on buffering and on online random projection trees. Experiments show our algorithm achieves risk and generalization accuracy comparable to standard batch manifold regularization, while each step runs quickly. Our online semi-supervised learning setting is an interesting direction for further theoretical development, paving the way for semi-supervised learning to work on real-world life-long learning tasks.
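The idea sketched in the abstract can be illustrated in code: keep a kernel expansion over recently seen points, and on each new point take a stochastic gradient step on an instantaneous risk that combines a supervised loss (when a label is present) with a manifold-smoothness penalty over a fixed-size buffer of past points. The following is a minimal sketch, not the paper's exact algorithm: the RBF kernel, squared loss, FIFO buffer eviction, and all hyperparameter names (`tau`, `lam1`, `lam2`, `gamma`, `lr`) are illustrative assumptions.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    # Gaussian (RBF) kernel; also reused below as the graph edge weight
    return float(np.exp(-gamma * np.sum((x - z) ** 2)))

class OnlineManifoldReg:
    """Sketch of buffered online manifold regularization.

    The learner is f(x) = sum_i alpha_i * k(x_i, x) over buffered points.
    Hyperparameter names are illustrative, not taken from the paper.
    """

    def __init__(self, tau=50, lam1=1e-2, lam2=1e-2, gamma=1.0, lr=0.1):
        self.buf = []      # buffered input points (at most tau of them)
        self.alpha = []    # kernel expansion coefficients, one per buffered point
        self.tau, self.lam1, self.lam2 = tau, lam1, lam2
        self.gamma, self.lr = gamma, lr

    def predict(self, x):
        return sum(a * rbf(x, z, self.gamma) for a, z in zip(self.alpha, self.buf))

    def partial_fit(self, x, y=None):
        """One online step; y=None means the point is unlabeled."""
        fx = self.predict(x)
        g = 0.0
        if y is not None:
            g += fx - y                        # gradient of squared loss (1/2)(f(x)-y)^2
        for z in self.buf:                     # manifold term: penalize (f(x)-f(z))^2
            w = rbf(x, z, self.gamma)          # edge weight to a buffered neighbor
            g += self.lam2 * w * (fx - self.predict(z))
        # shrink old coefficients (gradient of the ||f||^2 penalty),
        # then take a step on the new point's coefficient
        self.alpha = [(1 - self.lr * self.lam1) * a for a in self.alpha]
        self.buf.append(x)
        self.alpha.append(-self.lr * g)
        if len(self.buf) > self.tau:           # evict oldest point to stay sparse
            self.buf.pop(0)
            self.alpha.pop(0)
```

Note that `partial_fit` runs and updates the model even for unlabeled points, via the smoothness term alone; the buffer caps per-step cost at O(tau) kernel evaluations, which is the role buffering plays in the paper's sparse approximation.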