Isometric embedding approaches handle noiseless data sets well, but they are topologically unstable on sparse data sets or on data sets containing large amounts of noise and outliers, where the neighborhood structure is critically distorted. Inspired by cognitive relativity, this paper proposes a relative transformation that builds a relative space from the original data space. In the relative space, noise and outliers move farther away from the normal points, while nearby points become relatively closer. Accordingly, we determine the neighborhood in the relative space for isometric embedding, while the embedding itself is still performed in the original space. Experiments on both synthetic and real data sets validate the approach.
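The abstract does not spell out the relative transformation itself. A minimal sketch of the neighborhood-selection step is given below, under the assumption (common in the relative-transformation literature, but not stated here) that each point is mapped to the vector of its Euclidean distances to all points, so neighbors are chosen by distances between those vectors rather than in the original space:

```python
import numpy as np

def relative_neighbors(X, k):
    """Select each point's k nearest neighbors in the relative space.

    Assumption (not specified in the abstract): the relative
    transformation maps each point x_i to the vector of its Euclidean
    distances to every point, i.e. the i-th row of the pairwise
    distance matrix is the image of x_i in the relative space.
    """
    # Pairwise Euclidean distances in the original space.
    diff = X[:, None, :] - X[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=-1))

    # Rows of D are the points of the relative space; compute
    # pairwise distances between those rows.
    rdiff = D[:, None, :] - D[None, :, :]
    R = np.sqrt((rdiff ** 2).sum(axis=-1))

    # For each point, take the k nearest others (column 0 of the
    # sorted order is the point itself, at distance 0).
    order = np.argsort(R, axis=1)
    return order[:, 1:k + 1]
```

The returned neighbor indices would then feed the usual isometric-embedding pipeline (e.g. the neighborhood graph of Isomap), while geodesic distances and the final embedding are still computed from the original coordinates, as the abstract describes.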