The paper presents an empirical comparison of the most prominent nonlinear manifold learning techniques for dimensionality reduction in the context of high-dimensional microarray data classification. In particular, we assessed the performance of six methods: isometric feature mapping (Isomap), locally linear embedding (LLE), Laplacian eigenmaps (LE), Hessian eigenmaps (HE), local tangent space alignment (LTSA) and maximum variance unfolding (MVU). Unlike previous studies on the subject, the experimental framework adopted in this work properly extends the supervised learning paradigm to dimensionality reduction by regarding the test set as an out-of-sample set of new points which are excluded from the manifold learning process. This avoids a possible overestimate of the classification accuracy, which could yield misleading comparative results. This empirical approach requires a fast and effective out-of-sample embedding method for mapping new high-dimensional data points into an existing reduced space. To this aim, we propose to apply multi-output kernel ridge regression, an extension of linear ridge regression based on kernel functions, which has recently been presented as a powerful method for out-of-sample projection when combined with a variant of isometric feature mapping. Computational experiments on a wide collection of cancer microarray data sets show that classifiers based on Isomap, LLE and LE were consistently more accurate than those relying on HE, LTSA and MVU. In particular, under different experimental conditions the LLE-based classifier emerged as the most effective method, whereas the Isomap algorithm turned out to be the second-best alternative for dimensionality reduction.
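The out-of-sample strategy described above can be sketched in a few lines. The sketch below is an illustrative assumption, not the authors' actual pipeline: it uses scikit-learn's `Isomap` and `KernelRidge` (which handles multiple output dimensions natively) on synthetic swiss-roll data, fitting the manifold embedding on the training set only and then regressing held-out points into the existing reduced space.

```python
# Hedged sketch of out-of-sample embedding via multi-output kernel
# ridge regression; Isomap, kernel choice and hyperparameters are
# illustrative assumptions, not the paper's exact configuration.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.kernel_ridge import KernelRidge
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split

# Synthetic high-dimensional-style data split into train and test.
X, _ = make_swiss_roll(n_samples=500, random_state=0)
X_train, X_test = train_test_split(X, test_size=0.2, random_state=0)

# Learn the low-dimensional embedding on the training set only,
# so the test points stay out-of-sample.
embedding = Isomap(n_neighbors=10, n_components=2)
Y_train = embedding.fit_transform(X_train)

# Fit a kernel ridge regressor from input space to the reduced
# space; KernelRidge supports multi-output targets out of the box.
krr = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1e-2)
krr.fit(X_train, Y_train)

# Map the held-out points into the existing reduced space.
Y_test = krr.predict(X_test)
print(Y_test.shape)
```

Any downstream classifier (e.g. a nearest-neighbor rule) can then be trained on `Y_train` and evaluated on `Y_test`, keeping the test set fully excluded from the manifold learning step.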