Directed information theory deals with communication channels with feedback. When applied to networks, it requires a natural extension based on causal conditioning. We show here that measures built from directed information theory can be used to assess Granger causality graphs of stochastic processes, that directed information theory subsumes measures such as the transfer entropy, and that it provides the adequate information-theoretic framework for neuroscience applications such as connectivity inference.
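For orientation, these are the standard definitions the abstract builds on: Massey's directed information, its causally conditioned extension due to Kramer, and Schreiber's transfer entropy (one-lag form). The notation below is ours, not necessarily the paper's:

    I(X^N \to Y^N) = \sum_{n=1}^{N} I(X^n ; Y_n \mid Y^{n-1})                (directed information)
    I(X^N \to Y^N \| Z^N) = \sum_{n=1}^{N} I(X^n ; Y_n \mid Y^{n-1}, Z^n)    (causally conditioned version)
    T_{X \to Y} = I(X^{n-1} ; Y_n \mid Y^{n-1})                              (transfer entropy, one lag)

Unlike mutual information, directed information is asymmetric: I(X^N \to Y^N) and I(Y^N \to X^N) generally differ, which is what makes it suitable for inferring directed (Granger-causal) edges in a network.

Below is a minimal sketch of how transfer entropy can be estimated from discrete time series with a plug-in (maximum-likelihood) entropy estimator. The function names are ours, a single lag is assumed for brevity, and this is an illustration of the measure rather than the paper's estimation method:

import numpy as np
from collections import Counter

def plug_in_entropy(symbols):
    # Plug-in (maximum-likelihood) entropy estimate, in bits.
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def transfer_entropy(x, y):
    # T_{X->Y} = H(Y_n | Y_{n-1}) - H(Y_n | Y_{n-1}, X_{n-1}),
    # with each conditional entropy computed as H(A, B) - H(B).
    yp, yn, xp = y[:-1], y[1:], x[:-1]
    h_y  = plug_in_entropy(list(zip(yn, yp))) - plug_in_entropy(list(yp))
    h_yx = plug_in_entropy(list(zip(yn, yp, xp))) - plug_in_entropy(list(zip(yp, xp)))
    return h_y - h_yx

# Toy check: Y copies X with a one-step delay, so information flows X -> Y only.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1)
print(transfer_entropy(x, y))  # close to 1 bit
print(transfer_entropy(y, x))  # close to 0 bits

In practice the plug-in estimator is biased for small samples, and continuous-valued series call for other estimators (e.g., k-nearest-neighbor based); this sketch only illustrates the asymmetry that directed measures exploit.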