Canonical correlation analysis (CCA) is a classical tool in statistical analysis for finding the projections that maximize the correlation between two data sets. In this work we propose a generalization of CCA to several data sets, which is shown to be equivalent to the classical maximum variance (MAXVAR) generalization proposed by Kettenring. The reformulation of this generalization as a set of coupled least squares regression problems is exploited to develop a neural structure for CCA. In particular, the proposed CCA model is a two-layer feedforward neural network with lateral connections in the output layer that implement deflation, allowing the simultaneous extraction of all the CCA eigenvectors. The CCA neural model is trained using a recursive least squares (RLS) algorithm. Finally, the convergence of the proposed learning rule is proved by means of stochastic approximation techniques, and its performance is analyzed through simulations.
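As background for the two-set case the abstract builds on, a minimal batch sketch of classical CCA can be written as an SVD of the whitened cross-covariance matrix. This is not the paper's adaptive RLS neural algorithm; it is only the standard closed-form solution, with a small regularization term (`reg`, an assumption added here for numerical stability) on the block covariances:

```python
import numpy as np

def cca(X, Y, k=1, reg=1e-8):
    """Classical two-set CCA via SVD of the whitened cross-covariance.

    X: (n, p) and Y: (n, q) data matrices (rows are samples).
    Returns projections Wx (p, k), Wy (q, k) and the top-k canonical
    correlations. `reg` is a small ridge term (illustrative choice).
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Block covariances and cross-covariance.
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    # Whiten each block with a Cholesky factor, then take the SVD of
    # the whitened cross-covariance: M = Lx^{-1} Cxy Ly^{-T}.
    Lx = np.linalg.cholesky(Cxx)
    Ly = np.linalg.cholesky(Cyy)
    M = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T
    U, s, Vt = np.linalg.svd(M)
    # Map the singular vectors back to the original coordinates.
    Wx = np.linalg.solve(Lx.T, U[:, :k])
    Wy = np.linalg.solve(Ly.T, Vt[:k].T)
    return Wx, Wy, s[:k]
```

The adaptive scheme in the paper recovers these same directions recursively, extracting all components at once rather than one per deflation pass.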