In this paper, we propose a novel linear dimensionality reduction algorithm, called Orthogonal Projection Analysis (OPA), from a gradient field perspective. Our approach is based on two criteria. First, the linear map should preserve the metric of the ambient space, on the assumption that this metric is reliable. Second is the well-known smoothness criterion, which is critical for clustering. Interestingly, the gradient field is a natural tool for connecting these two requirements. We give a continuous objective function based on gradient fields and discuss how to discretize it using the tangent spaces. We also show the geometric meaning of our approach: it requires the gradient field to be as orthogonal as possible to the tangent spaces. Experimental results demonstrate the effectiveness of the proposed approach.
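For concreteness, here is a minimal, hypothetical sketch of the gradient-field idea for a linear map f(x) = A^T x, whose columns are the constant gradient fields of the embedding coordinates. It estimates tangent spaces by local PCA and then picks orthonormal directions least aligned with them (the smallest eigenvectors of the summed tangent projectors). The function names and the specific eigen-formulation are our own illustrative assumptions, not the paper's exact objective or discretization.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_tangent_projectors(X, k=10, d=2):
    """Estimate a rank-d tangent-space projector at each sample via local PCA."""
    n, D = X.shape
    tree = cKDTree(X)
    projectors = np.zeros((n, D, D))
    for i in range(n):
        _, idx = tree.query(X[i], k=k + 1)   # neighbourhood incl. the point itself
        nbrs = X[idx] - X[idx].mean(axis=0)  # centre the neighbourhood
        # The leading d right-singular vectors span the estimated tangent space.
        _, _, Vt = np.linalg.svd(nbrs, full_matrices=False)
        T = Vt[:d].T                         # D x d orthonormal tangent basis
        projectors[i] = T @ T.T
    return projectors

def orthogonal_directions(X, r=1, k=10, d=2):
    """Orthonormal directions whose (constant) gradient fields are, on average,
    as orthogonal as possible to the estimated tangent spaces."""
    M = estimate_tangent_projectors(X, k=k, d=d).sum(axis=0)
    # Smallest eigenvectors of the summed projectors minimise the average
    # squared tangential component of the gradient field; orthonormality of
    # the eigenvectors preserves the metric of the ambient space.
    eigvals, eigvecs = np.linalg.eigh(M)     # eigenvalues in ascending order
    return eigvecs[:, :r]                    # D x r projection matrix A

# Usage: points near the z = 0 plane in R^3; the recovered direction is close
# to the z-axis, so the linear function is nearly constant along the manifold,
# i.e. maximally smooth in the sense sketched above.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(size=(300, 2)), 0.01 * rng.normal(size=300)])
A = orthogonal_directions(X, r=1)
Y = X @ A                                    # smooth low-dimensional coordinates
```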