Manifold learning methods for unsupervised nonlinear dimensionality reduction have proven effective in the visualization of high-dimensional data sets. When dealing with classification tasks, supervised extensions of manifold learning techniques, in which class labels are used to improve the embedding of the training points, require an appropriate method for out-of-sample mapping. In this paper, we propose multi-output kernel ridge regression (KRR) for out-of-sample mapping in supervised manifold learning, in place of the general regression neural networks (GRNN) adopted by previous studies on the subject. Specifically, we consider a supervised agglomerative variant of Isomap and compare the performance of classification methods when the out-of-sample embedding is based on KRR and GRNN, respectively. Extensive computational experiments, using support vector machines and k-nearest neighbors as base classifiers, provide statistical evidence that out-of-sample mapping based on KRR consistently dominates its GRNN counterpart, and that supervised agglomerative Isomap with KRR achieves higher accuracy than direct classification methods on most data sets.
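To illustrate the out-of-sample idea, the following is a minimal sketch (not the authors' implementation) of multi-output kernel ridge regression: a kernel regressor is fit from the original inputs to the precomputed embedding coordinates of the training points, and new points are mapped into the embedding by evaluating that regressor. The function names, the RBF kernel, and the regularization parameter `lam` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)

def fit_krr(X, Y, lam=1e-2, gamma=1.0):
    # Multi-output KRR: one shared kernel, all embedding
    # dimensions (columns of Y) solved jointly.
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), Y)
    return alpha

def map_out_of_sample(X_train, alpha, X_new, gamma=1.0):
    # Embed unseen points by evaluating the fitted regressor.
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Once new points are mapped into the low-dimensional space this way, any base classifier (e.g. SVM or k-nearest neighbors, as in the experiments above) can be applied in the embedding space.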