Dimensionality reduction involves mapping a set of high-dimensional input points onto a low-dimensional manifold so that "similar" points in input space are mapped to nearby points on the manifold. We present a method, called Dimensionality Reduction by Learning an Invariant Mapping (DrLIM), for learning a globally coherent nonlinear function that maps the data evenly to the output manifold. The learning relies solely on neighborhood relationships and does not require any distance measure in the input space. The method can learn mappings that are invariant to certain transformations of the inputs, as demonstrated in a number of experiments. Comparisons are made to other techniques, in particular LLE.
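The training signal described above (neighborhood relationships only, no input-space distance measure) is realized in DrLIM through a contrastive loss on pairs of mapped points: similar pairs are pulled together, dissimilar pairs are pushed apart until they exceed a margin. The following is a minimal sketch of that pairwise loss, assuming the mapped points have already been produced by some learned function; the function name and the margin value are illustrative choices, not part of the source text.

```python
import numpy as np

def contrastive_loss(d, is_similar, margin=1.0):
    """Pairwise contrastive loss on a distance d between two mapped points.

    Similar pairs incur a loss quadratic in their distance (pulling them
    together); dissimilar pairs incur a loss only while their distance is
    below the margin (pushing them apart until they clear it).
    """
    d = np.asarray(d, dtype=float)
    similar_term = 0.5 * d ** 2                           # attract neighbors
    dissimilar_term = 0.5 * np.maximum(0.0, margin - d) ** 2  # repel non-neighbors
    return np.where(is_similar, similar_term, dissimilar_term)

# A similar pair at distance 0.5 is penalized; a dissimilar pair already
# beyond the margin of 1.0 contributes nothing.
losses = contrastive_loss(np.array([0.5, 2.0]), np.array([True, False]))
```

Because the loss depends only on pair labels (neighbor / non-neighbor) and on distances in the *output* space, no metric on the raw inputs is ever needed, which is what lets the pair labels encode invariance to chosen input transformations.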