Estimation of entropy and mutual information
Neural Computation
We demonstrate that the mutual information can be approximated arbitrarily closely in probability by calculating relative frequencies on appropriate partitions and achieving conditional independence on the rectangles of which the partitions are made. Empirical results, including a comparison with maximum-likelihood estimators, are presented.
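The core idea — estimating mutual information from relative frequencies on a rectangular partition of the observation space — can be illustrated with a minimal plug-in estimator. This sketch uses a fixed equal-width partition rather than the adaptive refinement described in the abstract, and the function name and bin count are illustrative choices, not the paper's method:

```python
import numpy as np

def mutual_information_plugin(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in nats from relative frequencies
    on a fixed grid of rectangles (equal-width bins per axis).

    Simplified illustration: the adaptive scheme would instead refine
    the partition until the cells achieve (approximate) conditional
    independence, rather than fixing the grid in advance.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()                 # relative frequencies on rectangles
    p_x = p_xy.sum(axis=1, keepdims=True)      # marginal of X, shape (bins, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)      # marginal of Y, shape (1, bins)
    nz = p_xy > 0                              # skip empty cells to avoid log(0)
    return float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=20000)
y = x + rng.normal(size=20000)                 # dependent on x: I(X;Y) > 0
z = rng.normal(size=20000)                     # independent of x: I(X;Z) near 0
print(mutual_information_plugin(x, y))
print(mutual_information_plugin(x, z))
```

Because the plug-in estimate is the exact mutual information of the empirical cell distribution, it is always nonnegative; its bias depends on how the partition is chosen, which is precisely what adaptive partitioning addresses.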