Nonlinear dimensionality reduction is essential for the analysis and interpretation of high-dimensional data sets. In this manuscript, we propose a distance-order-preserving manifold learning algorithm that extends the basic mean-squared-error cost function used mainly in multidimensional scaling (MDS)-based methods. We formulate a constrained optimization problem by imposing explicit constraints on the order of distances in the low-dimensional space. In this formulation, as a generalization of MDS, instead of forcing a linear relationship between the distances in the original high-dimensional space and the low-dimensional projection space, we learn a non-decreasing relation approximated by radial basis functions. We compare the proposed method with existing manifold learning algorithms on synthetic datasets, using both the commonly used residual variance metric and our proposed metric, the percentage of violated distance orders. We also perform experiments on a retinal image dataset used in Retinopathy of Prematurity (ROP) diagnosis.
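The percentage of violated distance orders mentioned above can be illustrated with a minimal sketch: for every pair of pairwise distances, count how often their relative order in the high-dimensional space is flipped in the low-dimensional projection. The function names and the exact counting convention below are assumptions for illustration; the abstract does not specify the formula.

```python
import numpy as np
from itertools import combinations

def pairwise_distances(X):
    """Condensed vector of pairwise Euclidean distances."""
    X = np.asarray(X, dtype=float)
    return np.array([np.linalg.norm(X[i] - X[j])
                     for i, j in combinations(range(len(X)), 2)])

def violated_distance_orders(X_high, X_low):
    """Hypothetical metric sketch: percentage of distance pairs whose
    order in the high-dimensional space is reversed in the projection."""
    d_hi = pairwise_distances(X_high)
    d_lo = pairwise_distances(X_low)
    violations = total = 0
    for a, b in combinations(range(len(d_hi)), 2):
        total += 1
        # Violation: one space says d_a < d_b, the other says d_a > d_b.
        if (d_hi[a] - d_hi[b]) * (d_lo[a] - d_lo[b]) < 0:
            violations += 1
    return 100.0 * violations / total
```

For example, an embedding that preserves all distance orders scores 0, while any swap in the ranking of pairwise distances raises the percentage; unlike residual variance, this metric is invariant to any monotone rescaling of the distances.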