Sliced inverse regression (SIR) was developed to find effective linear dimension-reduction directions for exploring the intrinsic structure of high-dimensional data. In this study, we present isometric SIR (ISOSIR) for nonlinear dimension reduction, a hybrid that combines the SIR method with a geodesic-distance approximation. The proposed method first computes the isometric (geodesic) distances between data points; the resulting distance matrix is then sliced according to K-means clustering results, and the classical SIR algorithm is applied. We show that ISOSIR can reveal the geometric structure of a nonlinear manifold dataset (e.g., the Swiss roll). We compare this novel method with several existing dimension-reduction techniques on data visualization and classification problems. The results show that ISOSIR is a promising nonlinear feature extractor for classification applications.
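The three steps described above can be sketched in Python. This is a minimal illustration assembled from the abstract's description, not the authors' implementation: the function name `isosir`, the parameter choices, and the use of a k-NN shortest-path graph for the geodesic approximation and of a generalized eigenproblem for classical SIR are all assumptions.

```python
# Hypothetical sketch of isometric SIR (ISOSIR) as described in the abstract:
#   1) approximate geodesic (isometric) distances between data points,
#   2) slice the data by K-means clustering of the distance matrix,
#   3) apply the classical SIR eigen-decomposition.
# Names and defaults are illustrative, not taken from the paper.
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import shortest_path
from sklearn.cluster import KMeans
from sklearn.neighbors import kneighbors_graph

def isosir(X, n_slices=5, n_neighbors=10, n_components=2):
    n, p = X.shape
    # Step 1: geodesic distances via shortest paths on a k-NN graph (as in Isomap)
    graph = kneighbors_graph(X, n_neighbors, mode="distance")
    D = shortest_path(graph, directed=False)
    # Guard against disconnected graph components (infinite distances)
    D[np.isinf(D)] = D[np.isfinite(D)].max()
    # Step 2: slice the data by K-means on the rows of the geodesic-distance matrix
    labels = KMeans(n_clusters=n_slices, n_init=10, random_state=0).fit_predict(D)
    # Step 3: classical SIR on the original features with these slices, solving the
    # generalized eigenproblem  Cov(E[X | slice]) v = lambda Cov(X) v
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n                 # total covariance
    M = np.zeros((p, p))                  # between-slice covariance
    for s in range(n_slices):
        idx = labels == s
        m = Xc[idx].mean(axis=0)
        M += idx.mean() * np.outer(m, m)
    # Small ridge keeps Sigma positive definite for the generalized solver
    evals, evecs = eigh(M, Sigma + 1e-8 * np.eye(p))
    # eigh returns ascending eigenvalues; keep the leading directions
    return evecs[:, ::-1][:, :n_components]
```

Applied to a noisy spiral (a 2-D analogue of the Swiss roll), the routine returns the leading effective dimension-reduction directions as columns of a `(p, n_components)` matrix.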