Isometric feature mapping (ISOMAP) has two computational bottlenecks. The first is computing the N×N graph-distance matrix D_N: Floyd's algorithm takes O(N³), which can be reduced to O(kN² log N) by running Dijkstra's algorithm from each point on the k-nearest-neighbor graph. The second is the MDS eigenvalue calculation, which operates on a dense N×N matrix and also has O(N³) complexity. In this paper, we address both inefficiencies with a greedy approximation algorithm for minimum set coverage (MSC). The algorithm learns a minimum subset of overlapping neighborhoods for high-dimensional data that lies on or near a low-dimensional manifold. The new framework leads to order-of-magnitude reductions in computation time and makes it possible to study much larger problems in manifold learning.
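The core idea of selecting a minimum subset of overlapping neighborhoods can be sketched with the standard greedy heuristic for set cover: repeatedly pick the neighborhood that covers the most still-uncovered points. The sketch below is only an illustration of that greedy selection step (the function name `greedy_neighborhood_cover`, the brute-force k-NN construction, and the parameter `k` are assumptions for the example, not the paper's exact method):

```python
import numpy as np

def greedy_neighborhood_cover(X, k=10):
    """Greedy set-cover heuristic: choose landmark points whose k-NN
    neighborhoods together cover every point in X (illustrative sketch)."""
    # Pairwise squared distances; fine for small N (a k-d tree scales better).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # neighborhoods[i] = indices of the k nearest neighbors of point i
    # (each point's own neighborhood contains itself, so the loop terminates).
    idx = np.argsort(d2, axis=1)[:, :k]
    neighborhoods = [set(row) for row in idx]
    uncovered = set(range(len(X)))
    landmarks = []
    while uncovered:
        # Greedy step: pick the neighborhood covering the most uncovered points.
        best = max(range(len(X)),
                   key=lambda i: len(neighborhoods[i] & uncovered))
        landmarks.append(best)
        uncovered -= neighborhoods[best]
    return landmarks
```

Because each chosen landmark removes a whole neighborhood from further consideration, the subsequent shortest-path and eigenvalue computations need only involve the selected subset rather than all N points, which is the source of the reduction in computation time.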