In this paper, a nonparametric "gradient" of the mutual information is first introduced and used to show that mutual information has no local minima. Building on this "gradient", two general gradient-based approaches for minimizing mutual information in a parametric model are then presented. These approaches are quite general and can, in principle, be applied to any mutual-information minimization problem. In blind source separation, they provide powerful tools for separating any complicated (yet separable) mixing model. Here they are used to develop algorithms for separating four separable mixing models: linear instantaneous, linear convolutive, post-nonlinear (PNL), and convolutive post-nonlinear (CPNL) mixtures.
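The simplest of the four cases, the linear instantaneous mixture, can be illustrated with a minimal sketch. Note the hedges: this is not the paper's nonparametric-gradient algorithm; as a stand-in it uses the well-known natural-gradient ICA update, and the `tanh` score function and Laplacian demo sources are assumptions (the `tanh` score is only appropriate for super-Gaussian sources).

```python
import numpy as np

def natural_gradient_ica(X, n_iter=3000, lr=0.01):
    """Separate linearly mixed sources by gradient descent on a
    mutual-information contrast, using the standard natural-gradient
    update W += lr * (I - E[g(y) y^T]) W.

    Assumption: g(y) = tanh(y), which suits super-Gaussian sources."""
    n, T = X.shape
    W = np.eye(n)                       # demixing matrix estimate
    for _ in range(n_iter):
        Y = W @ X                       # current source estimates
        G = np.tanh(Y)                  # assumed score function
        W += lr * (np.eye(n) - (G @ Y.T) / T) @ W
    return W

# Demo: two independent Laplacian (super-Gaussian) sources mixed by a
# fixed linear instantaneous matrix A (both chosen for illustration).
rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 5000))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S
W = natural_gradient_ica(X)
C = W @ A  # on success, C approaches a scaled permutation matrix
```

The global system `C = W @ A` being close to a scaled permutation matrix is the usual success criterion in blind source separation, since the sources are only recoverable up to permutation and scaling.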