SIAM Review.
Normalized Cuts and Image Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Strengthened semidefinite relaxations via a second lifting for the Max-Cut problem. Discrete Applied Mathematics.
A Spectral Bundle Method for Semidefinite Programming. SIAM Journal on Optimization.
Learning from Labeled and Unlabeled Data using Graph Mincuts. ICML '01: Proceedings of the Eighteenth International Conference on Machine Learning.
Convex Optimization.
Local Minima and Convergence in Low-Rank Semidefinite Programming. Mathematical Programming, Series A and B.
IJCAI '03: Proceedings of the 18th International Joint Conference on Artificial Intelligence.
The Interplay of Optimization and Machine Learning Research. The Journal of Machine Learning Research.
A tutorial on spectral clustering. Statistics and Computing.
Spectral clustering with inconsistent advice. Proceedings of the 25th International Conference on Machine Learning.
Spectral clustering based on the graph p-Laplacian. ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning.
A novel transductive learning algorithm based on multi-agent-system. IITA '09: Proceedings of the 3rd International Conference on Intelligent Information Technology Application.
Semi-supervised Bayesian ARTMAP. Applied Intelligence.
ECODE: event-based community detection from social networks. DASFAA '11: Proceedings of the 16th International Conference on Database Systems for Advanced Applications, Volume Part I.
An efficient algorithm for maximal margin clustering. Journal of Global Optimization.
Influence of erroneous pairwise constraints in semi-supervised clustering. AMT '12: Proceedings of the 8th International Conference on Active Media Technology.
Maximum volume clustering: a new discriminative clustering approach. The Journal of Machine Learning Research.
The rise of convex programming has changed the face of many research fields in recent years, and machine learning is one of the fields that has benefited most. A relatively recent development, the relaxation of combinatorial problems to semidefinite programs (SDPs), has gained considerable attention over the last decade (Helmberg, 2000; De Bie and Cristianini, 2004a). Although SDPs can be solved in polynomial time, for many relaxations the exponent in the polynomial complexity bound is too high to scale to large problem sizes. This has hampered their uptake as a powerful new tool in machine learning. In this paper, we present a new and fast SDP relaxation of the normalized graph cut problem and investigate its usefulness in unsupervised and semi-supervised learning. In particular, this yields a convex algorithm for transduction, as well as approaches to clustering. We further propose a whole cascade of fast relaxations that sit between the older spectral relaxations and the new SDP relaxation, allowing one to trade off computational cost against relaxation accuracy. Finally, we discuss how the methodology developed in this paper can be applied to other combinatorial problems in machine learning, treating the max-cut problem as an example.
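As background for the spectral relaxations that the SDP approach refines, here is a minimal NumPy sketch of the classical spectral relaxation of the normalized cut: relax the discrete cluster indicator to a real vector, which turns the problem into an eigenproblem of the normalized graph Laplacian, then threshold the second eigenvector. The function name and the toy graph are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def normalized_cut_spectral(W):
    """Spectral relaxation of the two-way normalized cut.

    Relaxing the combinatorial indicator vector yields the generalized
    eigenproblem (D - W) y = lam * D * y; the eigenvector for the
    second-smallest eigenvalue is thresholded at zero to recover a cut.
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    # Symmetric normalized Laplacian: L_sym = D^{-1/2} (D - W) D^{-1/2}
    L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    eigvals, eigvecs = np.linalg.eigh(L_sym)  # ascending eigenvalues
    # Map the second eigenvector back through D^{-1/2} and threshold.
    y = D_inv_sqrt @ eigvecs[:, 1]
    return y >= 0  # boolean cluster membership per node

# Toy example: two 3-node cliques joined by a single weak edge.
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1  # weak bridge between the cliques
labels = normalized_cut_spectral(W)
```

On this toy graph the relaxation recovers the two cliques as the two clusters; the SDP relaxation discussed in the paper tightens this real-valued relaxation at a higher computational cost.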