Nonlinear Dimensionality Reduction of Data Lying on the Multicluster Manifold
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Nonlinear dimensionality reduction of data lying on multi-cluster manifolds is a crucial issue in manifold learning research. An effective method, called the passage method, is proposed in this paper to alleviate the disconnectivity, short-circuit, and roughness problems commonly encountered by existing methods. The distinguishing characteristic of the proposed method is that it constructs a globally connected neighborhood graph over the data set by building smooth passages between separate clusters, rather than adding rough inter-cluster connections as some existing methods do. The neighborhood graph constructed in this way naturally forms a smooth manifold, and hence satisfies the effectiveness condition underlying manifold learning. This theoretical argument is supported by a series of experiments on synthetic and real data sets residing on multi-cluster manifolds.
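To make the problem concrete, the sketch below shows the baseline the abstract contrasts against: a k-NN neighborhood graph on multi-cluster data typically falls apart into disconnected components, and the simplest remedy is to add a single shortest "rough" bridge edge between components. This is not the paper's passage method (which builds smooth multi-point passages between clusters); it is only a minimal illustration of the disconnectivity issue and of the kind of crude inter-cluster connection the paper argues against. The function name, parameter k, and the bridging rule are assumptions for illustration.

```python
# Minimal sketch (assumption, not the paper's passage method): build a k-NN
# graph, detect disconnected components, and bridge the closest pair of
# components with one "rough" edge until the graph is globally connected.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import connected_components
from sklearn.neighbors import kneighbors_graph

def connect_knn_graph(X, k=6):
    """Return a symmetric k-NN graph made connected by naive shortest bridges."""
    # Symmetric k-NN adjacency with Euclidean edge weights.
    W = kneighbors_graph(X, n_neighbors=k, mode="distance")
    W = W.maximum(W.T).tolil()

    # Repeatedly merge the two closest components until only one remains.
    n_comp, labels = connected_components(W.tocsr(), directed=False)
    while n_comp > 1:
        best = (np.inf, None, None)
        for a in range(n_comp):
            for b in range(a + 1, n_comp):
                ia, ib = np.where(labels == a)[0], np.where(labels == b)[0]
                D = cdist(X[ia], X[ib])
                i, j = np.unravel_index(D.argmin(), D.shape)
                if D[i, j] < best[0]:
                    best = (D[i, j], ia[i], ib[j])
        d, u, v = best
        W[u, v] = W[v, u] = d  # a single rough inter-cluster bridge edge
        n_comp, labels = connected_components(W.tocsr(), directed=False)
    return W.tocsr()
```

In an Isomap-style pipeline, the resulting connected graph would feed a geodesic (shortest-path) distance computation before the low-dimensional embedding; the paper's contribution is to replace the single-edge bridges above with smooth passages so that the graph behaves like one smooth manifold.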