A new hybrid information maximization (HIM) algorithm is derived. The algorithm performs subspace mapping of multi-channel signals, linearly transforming the input (feature) vector of each channel into an output vector. It is based on maximizing the mutual information (MI) between the input and output sets within each channel, and between the output sets across channels. This formulation leads to substantial redundancy reduction in the output sets and to the extraction of higher-order features that exhibit coherence across time and/or space. In this paper, we develop the proposed algorithm and show that it efficiently combines the strengths of two well-known subspace mapping techniques, namely principal component analysis (PCA) and canonical correlation analysis (CCA). Unlike CCA, which is limited to two channels, the HIM algorithm extends easily to multiple channels. A number of simulations and real experiments compare the performance of HIM to that of PCA and CCA.
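The HIM algorithm itself is not specified in the abstract, but the two techniques it is said to combine are standard. As a rough illustration of the connection, under Gaussian assumptions maximizing input-output MI within a single channel reduces to variance maximization (PCA), while maximizing the dependence between two channels' outputs reduces to CCA. The sketch below (an assumption-laden illustration, not the authors' method; all variable names are hypothetical) runs both classical techniques on two synthetic channels that share a common latent source.

```python
import numpy as np

# Illustrative sketch only: the two subspace methods HIM is said to
# combine, applied to synthetic two-channel data. This is NOT the HIM
# algorithm from the paper, whose update rules are not given here.

rng = np.random.default_rng(0)

# Two channels sharing one latent source, embedded in different
# coordinates of each channel, plus independent noise.
n = 500
source = rng.standard_normal(n)
x = np.column_stack([source + 0.1 * rng.standard_normal(n),
                     rng.standard_normal(n)])
y = np.column_stack([rng.standard_normal(n),
                     source + 0.1 * rng.standard_normal(n)])

def pca(data, k=1):
    """Top-k principal directions via eigendecomposition of the covariance."""
    c = np.cov(data, rowvar=False)
    vals, vecs = np.linalg.eigh(c)          # eigenvalues in ascending order
    return vecs[:, np.argsort(vals)[::-1][:k]]

def cca(a, b, k=1):
    """Top-k canonical directions and correlations for two channels."""
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    caa = np.cov(a, rowvar=False)
    cbb = np.cov(b, rowvar=False)
    cab = a.T @ b / (len(a) - 1)
    # Whiten each channel (Cholesky), then SVD of the cross-covariance;
    # the singular values are the sample canonical correlations.
    wa = np.linalg.inv(np.linalg.cholesky(caa)).T
    wb = np.linalg.inv(np.linalg.cholesky(cbb)).T
    u, s, vt = np.linalg.svd(wa.T @ cab @ wb)
    return wa @ u[:, :k], wb @ vt.T[:, :k], s[:k]

pc = pca(x)                 # dominant variance direction of channel x alone
wa, wb, corr = cca(x, y)    # directions maximizing cross-channel correlation
print(pc.shape, wa.shape, corr.shape)
```

Because both channels contain the shared source, the leading canonical correlation comes out close to 1, whereas PCA on either channel in isolation knows nothing about the cross-channel structure; a multi-channel MI criterion, as described in the abstract, would capture both effects at once.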