The nature of statistical learning theory
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Laplacian Eigenmaps for dimensionality reduction and data representation
Neural Computation
Building k-edge-connected neighborhood graph for distance-based data projection
Pattern Recognition Letters
Selection of the optimal parameter value for the Isomap algorithm
Pattern Recognition Letters
Text Classification on Embedded Manifolds
IBERAMIA '08 Proceedings of the 11th Ibero-American conference on AI: Advances in Artificial Intelligence
Adaptive Neighborhood Select Based on Local Linearity for Nonlinear Dimensionality Reduction
ISICA '09 Proceedings of the 4th International Symposium on Advances in Computation and Intelligence
ISBRA'07 Proceedings of the 3rd international conference on Bioinformatics research and applications
Parameterless isomap with adaptive neighborhood selection
DAGM'06 Proceedings of the 28th conference on Pattern Recognition
Selection of the optimal parameter value for the ISOMAP algorithm
MICAI'05 Proceedings of the 4th Mexican international conference on Advances in Artificial Intelligence
Supervised nonlinear dimensionality reduction for visualization and classification
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
A comparative study of nonlinear manifold learning methods for cancer microarray data classification
Expert Systems with Applications: An International Journal
EvoBIO'13 Proceedings of the 11th European conference on Evolutionary Computation, Machine Learning and Data Mining in Bioinformatics
Isometric mapping (Isomap) is a popular nonlinear dimensionality reduction technique that has shown high potential for visualization and classification. However, it is sensitive to noise and to scarcity of observations. This inadequacy may hinder its application to the classification of microarray data, in which the expression levels of thousands of genes are measured in only a few normal and tumor tissue samples. In this paper we propose a double-bounded tree-connected variant of Isomap, aimed at being more robust to noise and outliers when used for classification, and also computationally more efficient. It differs from the original Isomap in the way the neighborhood graph is generated: in the first stage, a double-bounding rule confines the search to at most the k nearest neighbors contained within an ε-radius hypersphere; the resulting subgraphs are then joined by computing a minimum spanning tree among the connected components. We thereby obtain a connected graph without unnaturally inflating the values of k and ε. Computational experiments on seven microarray datasets show that the new method is consistently and significantly more accurate than Isomap, k-edge-connected Isomap, and the direct application of support vector machines to the data in the input space.
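The two-stage graph construction sketched in the abstract can be illustrated in code. This is our own hedged re-implementation, not the authors' software: the function names, the toy data, and the Kruskal-style joining step are assumptions drawn only from the abstract's description (k-NN search bounded by an ε-radius, then a minimum spanning tree over the connected components).

```python
# Illustrative sketch of the double-bounded, tree-connected neighborhood
# graph (hypothetical re-implementation based solely on the abstract).
import numpy as np

def double_bounded_graph(X, k, eps):
    """Link each point to at most its k nearest neighbors that also lie
    within an eps-radius hypersphere (the double-bounding rule)."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    A = np.full((n, n), np.inf)               # inf = no edge
    for i in range(n):
        for j in np.argsort(D[i])[1:k + 1]:   # k nearest, skipping self
            if D[i, j] <= eps:                # second bound: the radius
                A[i, j] = A[j, i] = D[i, j]
    return A, D

def join_components(A, D):
    """Join the connected components of A with minimum-weight bridging
    edges (a spanning tree over components, via Kruskal + union-find)."""
    n = len(A)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]     # path halving
            x = parent[x]
        return x
    for i in range(n):                        # union along existing edges
        for j in range(i + 1, n):
            if np.isfinite(A[i, j]):
                parent[find(i)] = find(j)
    for d, i, j in sorted((D[a, b], a, b)
                          for a in range(n) for b in range(a + 1, n)):
        if find(i) != find(j):                # edge bridges two components
            parent[find(i)] = find(j)
            A[i, j] = A[j, i] = d
    return A

# Two well-separated clusters: the double bound leaves them disconnected,
# and the spanning-tree step then adds exactly one bridging edge.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
A, D = double_bounded_graph(X, k=2, eps=0.5)
A = join_components(A, D)
```

The point of the construction is visible in the toy example: neither bound has to be inflated to span the gap between the clusters, because the spanning-tree step supplies the single bridging edge that connectivity requires.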