We define the predictive information $I_{\rm pred}(T)$ as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times $T$: $I_{\rm pred}(T)$ can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then $I_{\rm pred}(T)$ grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite-parameter (or nonparametric) models, such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and in the analysis of physical systems through statistical mechanics and dynamical systems theory. Furthermore, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of $I_{\rm pred}(T)$ provides the unique measure of complexity for the dynamics underlying a time series. Finally, we discuss how these ideas may be useful in problems in physics, statistics, and biology.
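To make the regimes quoted above concrete, here is a sketch of the decomposition the definition rests on; it is consistent with the abstract but condensed, not a reproduction of the paper's derivation. For a stationary process, the entropy of a window of duration $T$ separates into an extensive part and a subextensive part,

\[ S(T) = S_0\,T + S_1(T), \qquad \frac{S_1(T)}{T} \to 0, \]

where $S_0$ is the entropy rate. The mutual information between a past of duration $T$ and a future of duration $T'$ is $S(T) + S(T') - S(T + T')$, in which the extensive terms cancel; provided $S_1$ grows sublinearly, letting the future become infinitely long leaves

\[ I_{\rm pred}(T) = S_1(T). \]

For a model with $K$ independent parameters, the logarithmic regime then takes the form $I_{\rm pred}(T) \simeq (K/2)\log_2 T$ bits.

A minimal numerical sketch follows, assuming only NumPy and a plug-in (maximum-likelihood) entropy estimator; the function names and the choice of a binary Markov chain are illustrative, not from the paper. By stationarity, the mutual information between a length-$T$ past and a length-$T$ future is $2S(T) - S(2T)$, so a crude estimate needs only block entropies:

import numpy as np
from collections import Counter

def block_entropy(seq, L):
    # Plug-in estimate, in bits, of the entropy of length-L blocks
    # of a stationary symbol sequence.
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    p = np.array(list(counts.values())) / total
    return float(-np.sum(p * np.log2(p)))

def predictive_information(seq, T):
    # By stationarity, I(past_T; future_T) = 2*S(T) - S(2T).
    return 2 * block_entropy(seq, T) - block_entropy(seq, 2 * T)

# Illustrative data: a "sticky" binary Markov chain. All dependence is
# carried across the past/future boundary by a single symbol, so the
# estimate should saturate at a finite value -- the first regime above.
rng = np.random.default_rng(0)
p_flip = 0.1
x = [0]
for _ in range(200_000):
    x.append(x[-1] if rng.random() > p_flip else 1 - x[-1])

for T in (1, 2, 4, 6):
    print(T, round(predictive_information(x, T), 4))

For this first-order Markov chain the exact value is $1 - H_2(0.1) \approx 0.531$ bits at every $T$, so the printed estimates should sit near that constant. The window lengths are kept short deliberately: the plug-in estimate of $S(2T)$ is biased downward once the number of possible length-$2T$ blocks is comparable to the amount of data, which is the practical obstacle to probing the logarithmic and power-law regimes this way.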