SIAM Review
Many problems in AI are simplified by clever representations of sensory or symbolic input. How to discover such representations automatically, from large amounts of unlabeled data, remains a fundamental challenge. The goal of statistical methods for dimensionality reduction is to discover low dimensional structure in high dimensional data. In this paper, we review a recently proposed algorithm, maximum variance unfolding, for learning faithful low dimensional representations of high dimensional data. The algorithm relies on modern tools in convex optimization that are proving increasingly useful in many areas of machine learning.
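To make the pipeline concrete, the sketch below walks through the three stages of maximum variance unfolding: build a k-nearest-neighbor graph, learn a centered positive semidefinite kernel matrix K that preserves neighbor distances while maximizing trace(K), and read the low dimensional coordinates off the top eigenvectors of K. This is a minimal numpy-only sketch, not the authors' implementation: the toy data already lie on a flat 2D plane inside 3D, so the centered Gram matrix itself is an optimal K and the semidefinite program is skipped; a real application would hand the stated constraints to an SDP solver. The dataset, the choice k=4, and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 20 points on a 2D plane embedded in 3D by an orthonormal map,
# so neighbor distances are exactly recoverable in two dimensions.
planar = rng.standard_normal((20, 2))
basis, _ = np.linalg.qr(rng.standard_normal((3, 2)))  # orthonormal 3x2 basis
X = planar @ basis.T  # shape (20, 3)

# Step 1: k-nearest-neighbor graph (k is a free parameter of MVU).
def knn_pairs(X, k=4):
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    pairs = set()
    for i in range(len(X)):
        for j in np.argsort(d[i])[1:k + 1]:
            pairs.add((min(i, j), max(i, j)))
    return pairs, d

pairs, dists = knn_pairs(X)

# Step 2 (sketched): MVU solves an SDP for a centered PSD matrix K that
# maximizes trace(K) subject to preserving every neighbor distance.
# Because this toy data is already flat, the centered Gram matrix is a
# feasible (and optimal) K; normally an SDP solver would produce it.
Xc = X - X.mean(axis=0)
K = Xc @ Xc.T

# Verify the constraints MVU imposes on K:
assert abs(K.sum()) < 1e-8  # centering: sum_ij K_ij = 0
for i, j in pairs:
    squared_dist_in_K = K[i, i] - 2 * K[i, j] + K[j, j]
    assert abs(squared_dist_in_K - dists[i, j] ** 2) < 1e-8

# Step 3: embed via the top eigenvectors of K, scaled by sqrt(eigenvalue).
# np.linalg.eigh returns eigenvalues in ascending order, so take the last two.
w, V = np.linalg.eigh(K)
Y = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0.0))  # 2D output coordinates

# Neighbor distances survive the reduction from 3D to 2D.
for i, j in pairs:
    assert abs(np.linalg.norm(Y[i] - Y[j]) - dists[i, j]) < 1e-6
print("embedding shape:", Y.shape)
```

On data with genuine curvature (e.g. a Swiss roll), step 2 cannot be shortcut this way: the learned K differs from the input Gram matrix precisely because "unfolding" the manifold maximizes variance while keeping only local distances fixed.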