Side information is highly useful in the learning of a nonparametric kernel matrix. However, this often leads to an expensive semidefinite program (SDP). In recent years, a number of dedicated solvers have been proposed. Though much better than off-the-shelf SDP solvers, they still cannot scale to large data sets. In this paper, we propose a novel solver based on the alternating direction method of multipliers (ADMM). The key idea is to use a low-rank decomposition of the kernel matrix K = VᵀU, with the constraint that V = U. The resultant optimization problem, though non-convex, has favorable convergence properties and can be efficiently solved without requiring eigen-decomposition in each iteration. Experimental results on a number of real-world data sets demonstrate that the proposed method is as accurate as directly solving the SDP, but can be one to two orders of magnitude faster.
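The splitting idea in the abstract can be illustrated with a minimal ADMM sketch. The code below is not the paper's algorithm: it replaces the paper's side-information objective with a hypothetical squared Frobenius loss to a target Gram matrix `G`, but it shows the key structural trick — factor the kernel as K = VᵀU with a consensus constraint U = V, so each ADMM step solves only a small r×r linear system and no eigen-decomposition of the n×n kernel matrix is ever needed. The function name and all parameter choices (`rho`, `n_iter`) are illustrative assumptions.

```python
import numpy as np

def admm_lowrank_kernel(G, r=5, rho=1.0, n_iter=300, seed=0):
    """Sketch of low-rank kernel fitting via ADMM with the split K = V^T U, U = V.

    Minimizes ||V^T U - G||_F^2 (a hypothetical stand-in for the paper's
    side-information loss) over the augmented Lagrangian
        L(U, V, Lam) = ||V^T U - G||^2 + tr(Lam^T (U - V)) + (rho/2)||U - V||^2.
    Both primal updates are closed-form solves of r x r linear systems.
    """
    n = G.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((r, n))   # r x n low-rank factor
    V = U.copy()                      # consensus copy of U
    Lam = np.zeros((r, n))            # dual variable for U = V
    I = np.eye(r)
    for _ in range(n_iter):
        # U-step: grad = 2 V (V^T U - G) + Lam + rho (U - V) = 0
        #   => (2 V V^T + rho I) U = 2 V G - Lam + rho V
        U = np.linalg.solve(2 * V @ V.T + rho * I, 2 * V @ G - Lam + rho * V)
        # V-step: grad = 2 U (U^T V - G^T) - Lam - rho (U - V) = 0
        #   => (2 U U^T + rho I) V = 2 U G^T + Lam + rho U
        V = np.linalg.solve(2 * U @ U.T + rho * I, 2 * U @ G.T + Lam + rho * U)
        # dual ascent on the consensus constraint U = V
        Lam = Lam + rho * (U - V)
    return V.T @ U, U, V
```

Although the factored problem is non-convex, each subproblem is a convex quadratic, and at a fixed point the consensus U = V makes K = UᵀU positive semidefinite by construction — which is how the expensive SDP cone constraint is avoided.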