Canonical correlation analysis (CCA) is a classical dimensionality reduction technique for two sets of variables that iteratively finds projection directions of maximum correlation. Although CCA is still widely used in practical applications, real-world data increasingly contain complicated nonlinear correlations that classical CCA cannot properly capture. In this paper, we therefore propose an extension of CCA that effectively captures such nonlinear correlations through statistical dependency maximization. The proposed method, which we call least-squares canonical dependency analysis (LSCDA), is based on a squared-loss variant of mutual information, and it has several useful properties beyond its ability to capture higher-order correlations: it can simultaneously find multiple projection directions (i.e., subspaces), it does not involve density estimation, and it is equipped with a model selection strategy. We demonstrate the usefulness of LSCDA through experiments on artificial and real-world datasets.
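For concreteness, the squared-loss mutual information (SMI) on which LSCDA is based is conventionally defined as the Pearson divergence between the joint density and the product of the marginals; assuming the standard definition from the least-squares estimation literature:

\mathrm{SMI}(X,Y) \;=\; \frac{1}{2}\iint p_x(x)\,p_y(y)\left(\frac{p_{xy}(x,y)}{p_x(x)\,p_y(y)} - 1\right)^{2} dx\,dy,

which is zero if and only if X and Y are statistically independent, so maximizing an estimate of it over projection directions can detect dependencies that carry no linear correlation.

The toy sketch below is not the authors' implementation; it uses scikit-learn's CCA on synthetic data purely to illustrate the limitation that motivates LSCDA: classical CCA reports essentially no correlation for a purely nonlinear (quadratic) dependency.

```python
# Illustrative sketch (not the authors' code): classical CCA fails to
# detect a purely nonlinear dependency between two one-dimensional views.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 1000
x = rng.uniform(-1.0, 1.0, size=(n, 1))
y = x ** 2 + 0.05 * rng.standard_normal((n, 1))  # quadratic link plus noise

cca = CCA(n_components=1)
x_proj, y_proj = cca.fit_transform(x, y)

# The canonical correlation is near zero even though y is (up to noise)
# a deterministic function of x, because E[x * x^2] = 0 for symmetric x.
r = np.corrcoef(x_proj.ravel(), y_proj.ravel())[0, 1]
print(f"canonical correlation: {r:.3f}")
```

A dependency-maximization criterion such as SMI would register this relation, since the joint density of (x, y) is far from the product of its marginals even though their linear correlation vanishes.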