Dimensionality reduction plays a vital role in pattern recognition. However, for normalized vector data, existing methods do not exploit the fact that the data are normalized. In this paper, we propose an Angular Decomposition of the normalized vector data, which corresponds to embedding the data on the unit sphere. For graph data with similarity/kernel matrices whose diagonal elements are constant, we propose an Angular Decomposition of the similarity matrices, which likewise corresponds to embedding the objects on a unit sphere. In these angular embeddings, the Euclidean distance is equivalent to the cosine similarity. Thus, data structures best described by the cosine similarity and data structures best captured by the Euclidean distance can both be effectively detected in our angular embedding. We provide a theoretical analysis, derive the computational algorithm, and evaluate the angular embedding on several datasets. Experiments on data clustering demonstrate that our method provides a more discriminative subspace.
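The claimed equivalence between Euclidean distance and cosine similarity on the unit sphere follows from the identity ||x − y||² = 2 − 2·cos(x, y) for unit-norm x and y. The sketch below is a minimal numerical illustration of that identity (not the paper's decomposition algorithm); the data and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
# Angular embedding in the sense used here: project each row onto the unit sphere.
X = X / np.linalg.norm(X, axis=1, keepdims=True)

x, y = X[0], X[1]
sq_euclid = np.sum((x - y) ** 2)   # squared Euclidean distance
cosine = np.dot(x, y)              # cosine similarity (unit-norm dot product)

# For unit-norm vectors: ||x - y||^2 = 2 - 2 * cos(x, y)
assert np.isclose(sq_euclid, 2.0 - 2.0 * cosine)
```

Because the two quantities are in monotone correspondence on the sphere, a clustering method driven by Euclidean distance in the angular embedding behaves consistently with one driven by cosine similarity.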