We present some new results on the nonparametric estimation of entropy and mutual information. First, we use an exact local expansion of the entropy function to prove almost sure consistency and central limit theorems for three of the most commonly used discretized information estimators. The setup is related to Grenander's method of sieves and places no assumptions on the underlying probability measure generating the data. Second, we prove a converse to these consistency theorems, demonstrating that a misapplication of the most common estimation techniques leads to an arbitrarily poor estimate of the true information, even given unlimited data. This "inconsistency" theorem leads to an analytical approximation of the bias, valid in surprisingly small sample regimes and more accurate than the usual 1/N formula of Miller and Madow over a large region of parameter space. The two most practical implications of these results are negative: (1) information estimates in a certain data regime are likely contaminated by bias, even if "bias-corrected" estimators are used, and (2) confidence intervals calculated by standard techniques drastically underestimate the error of the most common estimation methods.

Finally, we note a very useful connection between the bias of entropy estimators and a certain polynomial approximation problem. By casting bias calculation problems in this approximation-theory framework, we obtain the best possible generalization of known asymptotic bias results. More interestingly, this framework leads to an estimator with some nice properties: the estimator comes equipped with rigorous bounds on the maximum error over all possible underlying probability distributions, and this maximum error turns out to be surprisingly small. We demonstrate the application of this new estimator on both real and simulated data.
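For context, the 1/N correction mentioned above is conventionally stated as follows; this is the standard textbook form rather than notation taken from the paper, and m̂ here denotes the number of bins with nonzero counts:

$$
\hat{H}_{\mathrm{MLE}} = -\sum_{i} \hat{p}_i \log \hat{p}_i,
\qquad
\mathbb{E}\big[\hat{H}_{\mathrm{MLE}}\big] - H = -\frac{m-1}{2N} + O\!\left(\frac{1}{N^2}\right),
\qquad
\hat{H}_{\mathrm{MM}} = \hat{H}_{\mathrm{MLE}} + \frac{\hat{m}-1}{2N}.
$$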
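A minimal numerical sketch of the regime the abstract warns about (sample size N comparable to or smaller than the number of bins m) is easy to put together; everything below (function names, the uniform test distribution, the particular sample sizes) is illustrative and not taken from the paper:

```python
import numpy as np

def plugin_entropy(counts):
    """Maximum-likelihood ('plugin') entropy estimate, in nats."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    """Plugin estimate plus the first-order (m-1)/(2N) correction,
    where m is the number of bins with at least one observation."""
    n = counts.sum()
    m = np.count_nonzero(counts)
    return plugin_entropy(counts) + (m - 1) / (2 * n)

# Uniform distribution on m_bins bins, sampled n_samples at a time,
# averaged over many repetitions to expose the bias.
rng = np.random.default_rng(0)
m_bins, n_samples, n_reps = 100, 50, 1000  # N < m: the regime where bias dominates
true_H = np.log(m_bins)

plugin_est, mm_est = [], []
for _ in range(n_reps):
    samples = rng.integers(0, m_bins, size=n_samples)
    counts = np.bincount(samples, minlength=m_bins)
    plugin_est.append(plugin_entropy(counts))
    mm_est.append(miller_madow_entropy(counts))

print(f"true H       = {true_H:.3f} nats")
print(f"plugin mean  = {np.mean(plugin_est):.3f}  (bias {np.mean(plugin_est) - true_H:+.3f})")
print(f"Miller-Madow = {np.mean(mm_est):.3f}  (bias {np.mean(mm_est) - true_H:+.3f})")
```

Running this shows the plugin estimate falling far below the true entropy and the Miller-Madow correction recovering only part of the gap, consistent with implication (1) above: in the undersampled regime, even "bias-corrected" estimates remain substantially biased.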