A Generalized Representer Theorem
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
A robust minimax approach to classification
The Journal of Machine Learning Research
Optimal Inequalities in Probability Theory: A Convex Optimization Approach
SIAM Journal on Optimization
Beyond the point cloud: from transductive to semi-supervised learning
ICML '05 Proceedings of the 22nd International Conference on Machine Learning
Learning low-rank kernel matrices
ICML '06 Proceedings of the 23rd International Conference on Machine Learning
Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples
The Journal of Machine Learning Research
Information-theoretic metric learning
ICML '07 Proceedings of the 24th International Conference on Machine Learning
Pairwise constraint propagation by semidefinite programming for semi-supervised classification
ICML '08 Proceedings of the 25th International Conference on Machine Learning
Optimization Techniques for Semi-Supervised Support Vector Machines
The Journal of Machine Learning Research
Semi-supervised learning by mixed label propagation
AAAI '07 Proceedings of the 22nd National Conference on Artificial Intelligence - Volume 1
Laplacian Support Vector Machines Trained in the Primal
The Journal of Machine Learning Research
Towards a theoretical foundation for Laplacian-based manifold methods
COLT '05 Proceedings of the 18th Annual Conference on Learning Theory
Manifold-Regularized Minimax Probability Machine
PSL '11 Proceedings of the First IAPR TC3 Conference on Partially Supervised Learning
In this paper, we propose the Laplacian minimax probability machine, a semi-supervised extension of the minimax probability machine built on the manifold regularization framework. We also show that the proposed method can be kernelized for non-linear cases via a theorem analogous to the representer theorem. Experiments on publicly available datasets from the UCI machine learning repository confirm that the proposed methods achieve results competitive with existing graph-based learning methods such as the Laplacian support vector machine and Laplacian regularized least squares.
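The manifold regularization framework the abstract refers to penalizes classifiers that vary sharply across a neighborhood graph built from both labeled and unlabeled points. As a minimal sketch (not the paper's implementation), the following code builds the standard graph Laplacian L = D - W from a k-nearest-neighbor graph with heat-kernel weights; the quadratic form f^T L f then serves as the smoothness penalty. The function name, the choice k=1, and the example data are illustrative assumptions.

```python
import numpy as np

def graph_laplacian(X, k=2, sigma=1.0):
    """Build the unnormalized graph Laplacian L = D - W.

    W holds Gaussian (heat-kernel) weights on each point's k nearest
    neighbours; the graph is symmetrized by taking max(W, W^T).
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        # Indices of the k nearest neighbours, excluding i itself.
        nbrs = np.argsort(d2[i])[1:k + 1]
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)          # symmetrize the adjacency matrix
    D = np.diag(W.sum(axis=1))      # degree matrix
    return D - W

# Two well-separated clusters (hypothetical toy data): a label
# assignment that is constant within each cluster incurs zero penalty.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
L = graph_laplacian(X, k=1)
f = np.array([1.0, 1.0, -1.0, -1.0])
print(f @ L @ f)   # smoothness penalty f^T L f
```

In a semi-supervised objective of the kind the abstract describes, this penalty is added to the supervised loss with a weight controlling how strongly the decision function must respect the manifold structure of the unlabeled data.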