Minimum output mutual information is regarded as a natural criterion for independent component analysis (ICA) and is used as the performance measure in many ICA algorithms. Information-theoretic ICA algorithms commonly follow one of two approaches: minimum mutual information or maximum output entropy. In the former approach, we substitute some form of probability density function (pdf) estimate into the mutual information expression, and in the latter we incorporate the source pdf assumption into the algorithm through nonlinearities matched to the corresponding cumulative distribution functions (cdf). Alternative solutions to ICA use higher-order cumulant-based optimization criteria, which relate to either of these approaches through truncated series approximations for densities. In this article, we propose a new ICA algorithm motivated by the maximum entropy principle for estimating signal distributions. The optimality criterion is the minimum output mutual information, where the estimated pdfs belong to the exponential family and are approximate solutions to a constrained entropy maximization problem. This approach yields an upper bound on the actual mutual information of the output signals; hence the name minimax mutual information ICA algorithm. In addition, we demonstrate that for a specific selection of the constraint functions in the maximum entropy density estimation procedure, the algorithm relates strongly to ICA methods using higher-order cumulants.
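To make the bound described above concrete, the following is a brief sketch in LaTeX of the quantities involved, assuming the standard linear ICA model y = Wx with an invertible demixing matrix W; the constraint functions f_k and multipliers \lambda_k are generic placeholders, not necessarily the paper's specific choices.

% The maximum entropy density under moment constraints E[f_k(y_i)] = alpha_k
% takes the exponential-family form
\hat{p}_i(y) = \exp\!\Big(\lambda_0 + \sum_{k=1}^{K} \lambda_k f_k(y)\Big)

% Because \hat{p}_i maximizes entropy among all densities satisfying the same
% constraints, its entropy upper-bounds the true marginal entropy:
\hat{H}(y_i) \ge H(y_i)

% For the invertible linear map y = Wx, the output mutual information
% decomposes as
I(\mathbf{y}) = \sum_i H(y_i) - H(\mathbf{x}) - \log\lvert\det \mathbf{W}\rvert

% Replacing each H(y_i) by its maximum entropy estimate therefore yields an
% upper bound on the true mutual information, which is minimized over W:
\hat{I}(\mathbf{y}) = \sum_i \hat{H}(y_i) - H(\mathbf{x}) - \log\lvert\det \mathbf{W}\rvert \;\ge\; I(\mathbf{y})

Minimizing this maximum-entropy-based upper bound over the demixing matrix is the minimax structure the algorithm's name refers to.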