Spectral clustering techniques are heuristic algorithms that seek approximate solutions to graph-cutting problems which are useful for clustering but are usually NP-complete. A fundamental working hypothesis of these techniques is that the optimal partition into K classes can be obtained from the first K eigenvectors of the normalized graph Laplacian matrix L_N, provided the gap between the K-th and the (K+1)-th eigenvalue of L_N is sufficiently large. If the gap is small, a perturbation may swap the corresponding eigenvectors, and the resulting partition can differ greatly from the optimal one. In this paper we suggest a weaker working hypothesis: the optimal partition into K classes can be obtained from a K-dimensional subspace of the first M eigenvectors, where M >= K is a parameter chosen by the user. We show that the validity of this hypothesis can be confirmed by the size of the gap between the K-th and the (M+1)-th eigenvalue of L_N. Finally, we present and analyse a simple probabilistic algorithm that generalizes current spectral techniques in this extended framework. On real-world graphs, this algorithm achieves results close to the state of the art by selecting correct K-dimensional subspaces of the linear span of the first M eigenvectors that are robust to small changes in the eigenvalues.
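The standard pipeline the abstract builds on can be illustrated on a toy graph. Below is a minimal sketch (not the paper's probabilistic algorithm) of conventional spectral bipartitioning with the normalized Laplacian, assuming NumPy: it builds L_N, measures the eigengap between the K-th and (K+1)-th eigenvalue for K = 2, and reads the partition off the sign pattern of the second eigenvector. The example graph and all variable names are illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy graph: two 3-node cliques joined by a single weak bridge edge.
W = np.zeros((6, 6))
for a, b in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[a, b] = W[b, a] = 1.0
W[2, 3] = W[3, 2] = 0.1  # weak bridge between the two clusters

# Normalized Laplacian: L_N = I - D^{-1/2} W D^{-1/2}
d = W.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_N = np.eye(6) - D_inv_sqrt @ W @ D_inv_sqrt

# eigh returns eigenvalues in ascending order for a symmetric matrix.
vals, vecs = np.linalg.eigh(L_N)

# Eigengap between the K-th and (K+1)-th eigenvalue, for K = 2.
# A large gap is the classical condition for the first K eigenvectors
# to determine a stable partition.
K = 2
gap = vals[K] - vals[K - 1]

# For K = 2 the sign pattern of the second eigenvector (the Fiedler
# vector) recovers the two clusters.
labels = (vecs[:, 1] > 0).astype(int)
print("labels:", labels)
print("eigengap:", round(gap, 3))
```

With a small eigengap, a perturbation of W can swap the K-th and (K+1)-th eigenvectors and change `labels` drastically; the paper's weaker hypothesis instead searches for a K-dimensional subspace inside the span of the first M eigenvectors, which is robust to such swaps.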