Nonlinear component analysis as a kernel eigenvalue problem
Neural Computation
General Construction of Time-Domain Filters for Orientation Data
IEEE Transactions on Visualization and Computer Graphics
Transfer Discriminative Logmaps
PCM '09 Proceedings of the 10th Pacific Rim Conference on Multimedia: Advances in Multimedia Information Processing
Geodesic discriminant analysis on curved Riemannian manifold
FSKD'09 Proceedings of the 6th international conference on Fuzzy systems and knowledge discovery - Volume 5
Curvature analysis of frequency modulated manifolds in dimensionality reduction
Calcolo: a quarterly on numerical analysis and theory of computation
Riemannian manifold learning for nonlinear dimensionality reduction
ECCV'06 Proceedings of the 9th European conference on Computer Vision - Volume Part I
Geodesic Polar Coordinates on Polygonal Meshes
Computer Graphics Forum
Dimensionality reduction by low-rank embedding
IScIDE'12 Proceedings of the third Sino-foreign-interchange conference on Intelligent Science and Intelligent Data Engineering
We present a novel method for manifold learning, i.e., identification of the low-dimensional manifold-like structure present in a set of data points in a possibly high-dimensional space. The main idea is derived from the concept of Riemannian normal coordinates. This coordinate system is in a sense a generalization of Cartesian coordinates in Euclidean space. We translate this idea to a cloud of data points in order to perform dimensionality reduction. Our implementation currently uses Dijkstra's algorithm for shortest paths in graphs together with some basic concepts from differential geometry. We expect this approach to open up new possibilities for the analysis of, e.g., shape in medical imaging and signal processing of manifold-valued signals, where the coordinate system is "learned" from experimental high-dimensional data rather than defined analytically using, e.g., models based on Lie groups.
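The graph-shortest-path ingredient mentioned in the abstract can be illustrated with a minimal sketch: build a k-nearest-neighbor graph over a point cloud and run Dijkstra's algorithm from a chosen base point to approximate geodesic distances along the data manifold. This is only the distance-computation step, using SciPy; the toy data set (a noisy circle), the parameter choices, and the function name `knn_graph` are illustrative assumptions, not the authors' implementation of Riemannian normal coordinates.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def knn_graph(X, k=6):
    """Sparse k-nearest-neighbor graph weighted by Euclidean distance."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    n = len(X)
    rows, cols, vals = [], [], []
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]  # k nearest, skipping the point itself
        rows.extend([i] * k)
        cols.extend(nbrs)
        vals.extend(D[i, nbrs])
    return csr_matrix((vals, (rows, cols)), shape=(n, n))

# Toy data: noisy samples from a circle, a 1-D manifold embedded in 2-D.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 2 * np.pi, 200))
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(200, 2))

G = knn_graph(X, k=6)
# Graph-geodesic distances from a base point (index 0), treating edges
# as undirected so that asymmetric k-NN relations are symmetrized.
geo = dijkstra(G, directed=False, indices=0)
```

In the method sketched by the abstract, such graph distances would serve as radial coordinates around the base point; recovering the angular part of the normal coordinates requires the additional differential-geometric machinery the paper refers to.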