Canonical correlation analysis (CCA) is a well-known technique for finding correlations between two sets of multi-dimensional variables. It projects both sets into a lower-dimensional space in which they are maximally correlated, and one popular use of CCA is dimensionality reduction. CCA can be regarded as a linear subspace approach in which one view of an object set (e.g., X) is guided by another view (e.g., Y). However, if the correlations between X and Y are nonlinear, CCA may fail to reveal the latent structure of X. In this paper, we propose a new nonlinear dimensionality reduction algorithm, called local canonical correlation analysis alignment (LCCA). In LCCA, CCA is applied to patches of the object set to obtain the local low-dimensional coordinates of each patch X_p of X; these local coordinates are then aligned to obtain a global low-dimensional embedding of X. Furthermore, to handle the out-of-sample problem, a linear version of LCCA (LLCCA) is also developed. Unlike LCCA, LLCCA applies not only to training samples but also to test samples. Experiments on data visualization and pose estimation show that LCCA and LLCCA outperform the related algorithms.
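The classical CCA step that LCCA applies on each patch can be sketched as follows. This is a minimal NumPy illustration of standard CCA (not the authors' LCCA implementation): it finds projection directions Wx and Wy that maximize the correlation between the two projected views, via the usual generalized eigenproblem on the (regularized) covariance matrices. The function name, the toy data, and the small ridge term `reg` are assumptions for illustration.

```python
import numpy as np

def cca(X, Y, d, reg=1e-6):
    """Classical CCA: find Wx, Wy maximizing the correlation between
    X @ Wx and Y @ Wy. X: (n, p), Y: (n, q); returns d component pairs."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Covariance and cross-covariance, with a small ridge for stability.
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    # Eigenvectors of Cxx^{-1} Cxy Cyy^{-1} Cyx give the X-side directions;
    # the eigenvalues are the squared canonical correlations.
    A = np.linalg.solve(Cxx, Cxy) @ np.linalg.solve(Cyy, Cxy.T)
    vals, vecs = np.linalg.eig(A)
    order = np.argsort(-vals.real)[:d]
    Wx = vecs[:, order].real
    # Corresponding Y-side directions: Wy ∝ Cyy^{-1} Cyx Wx.
    Wy = np.linalg.solve(Cyy, Cxy.T) @ Wx
    return Wx, Wy

# Toy data: Y is a noisy linear function of X, so the two views are
# strongly correlated and CCA should recover that shared structure.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y = X @ rng.normal(size=(5, 4)) + 0.1 * rng.normal(size=(200, 4))
Wx, Wy = cca(X, Y, d=2)
u = (X - X.mean(axis=0)) @ Wx[:, 0]
v = (Y - Y.mean(axis=0)) @ Wy[:, 0]
corr = np.corrcoef(u, v)[0, 1]  # leading canonical correlation, near 1 here
```

In LCCA this step would be run on each local patch X_p together with the corresponding patch of Y, after which the per-patch coordinates X @ Wx are stitched into a single global embedding by the alignment stage described above.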