In the last few years, due to the growing ubiquity of unlabeled data, the machine learning community has devoted considerable effort to better understanding and improving classifiers that exploit unlabeled data. Following the manifold regularization approach, Laplacian Support Vector Machines (LapSVMs) have shown state-of-the-art performance in semi-supervised classification. In this paper we present two strategies for solving the primal LapSVM problem, overcoming some issues of the original dual formulation. In particular, training a LapSVM in the primal can be performed efficiently with preconditioned conjugate gradient. We further speed up training with an early stopping strategy based on the predictions on unlabeled data or, if available, on labeled validation examples. This allows the algorithm to quickly compute approximate solutions with roughly the same classification accuracy as the optimal ones, considerably reducing the training time. The computational complexity of the training algorithm drops from O(n³) to O(kn²), where n is the combined number of labeled and unlabeled examples and k is empirically evaluated to be significantly smaller than n. Thanks to its simplicity, training LapSVM in the primal can be the starting point for further enhancements of the original LapSVM formulation, such as those for handling large data sets. We present an extensive experimental evaluation on real-world data showing the benefits of the proposed approach.
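The primal training scheme described above can be illustrated with a minimal sketch. This is not the authors' implementation: it replaces the LapSVM hinge loss with a squared loss (Laplacian-regularized least squares) and a linear kernel so that the primal problem reduces to a symmetric positive-definite linear system, which is then solved by plain conjugate gradient. The early stopping criterion mirrors the idea in the abstract: iteration halts as soon as the predicted labels on the unlabeled points stop changing. The helper names, the k-NN graph construction, and the regularization values are all illustrative assumptions.

```python
import numpy as np

def knn_laplacian(X, k=5):
    # Unnormalized graph Laplacian L = D - W from a symmetrized
    # k-nearest-neighbor graph with binary weights (illustrative choice).
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d2[i])[1:k + 1]:  # skip the point itself
            W[i, j] = W[j, i] = 1.0
    return np.diag(W.sum(1)) - W

def train_laprls_cg(X, y, labeled, gamma_A=1e-2, gamma_I=1e-2,
                    max_iter=200, tol=1e-8):
    # Primal objective (squared-loss stand-in for the LapSVM hinge):
    #   sum_{i in labeled} (y_i - w.x_i)^2 + gamma_A ||w||^2
    #   + gamma_I (Xw)^T L (Xw)
    # Its gradient is linear in w, so we solve A w = b with CG, where
    #   A = X_l^T X_l + gamma_A I + gamma_I X^T L X,   b = X_l^T y_l.
    n, d = X.shape
    L = knn_laplacian(X)
    Xl, yl = X[labeled], y[labeled]
    unlabeled = np.setdiff1d(np.arange(n), labeled)

    def A(w):  # matrix-vector product, never forming A explicitly
        return Xl.T @ (Xl @ w) + gamma_A * w + gamma_I * X.T @ (L @ (X @ w))

    b = Xl.T @ yl
    w = np.zeros(d)
    r = b - A(w)
    p = r.copy()
    prev = np.sign(X[unlabeled] @ w)
    for it in range(max_iter):
        Ap = A(p)
        alpha = (r @ r) / (p @ Ap)
        w += alpha * p
        r_new = r - alpha * Ap
        # Early stopping: unlabeled-point labels unchanged since the
        # previous CG iteration, so the decision boundary has settled.
        pred = np.sign(X[unlabeled] @ w)
        if it > 0 and np.array_equal(pred, prev):
            break
        prev = pred
        if r_new @ r_new < tol:  # standard residual-based convergence
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return w, it + 1
```

Because each CG iteration only needs matrix-vector products, the per-iteration cost stays quadratic in n, and stopping after k iterations gives the O(kn²) behavior the abstract refers to; a preconditioner (as used in the paper) would further reduce k.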