Detecting and characterizing causal interdependencies and couplings between different activated brain areas from functional neuroimage time series measurements of their activity constitutes a significant step toward understanding brain function. In this letter, we make the simple point that all current statistics used to make inferences about directed influences in functional neuroimage time series are variants of the same underlying quantity. This includes directed transfer entropy, transinformation, Kullback-Leibler formulations, conditional mutual information, and Granger causality. Crucially, in the case of autoregressive modeling, the underlying quantity is the likelihood ratio that compares models with and without directed influences from the past when modeling the influence of one time series on another. This framework is also used to derive the relation between these measures of directed influence and the complexity, or order, of the directed influence. These results provide a framework for unifying the Kullback-Leibler divergence, Granger causality, and the complexity of directed influence.
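As a concrete illustration of the central point (a sketch, not the letter's own implementation), Granger causality from one series to another can be computed as a log-likelihood ratio between two least-squares autoregressive fits: a restricted model that predicts `x` from its own past only, and a full model that also includes the past of `y`. The function name, lag order `p`, and simulation below are illustrative assumptions.

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Estimate Granger causality from y to x as a log-likelihood ratio.

    Fits two autoregressive models for x by ordinary least squares:
      restricted: x[t] regressed on the past p values of x only
      full:       x[t] regressed on the past p values of x and of y
    For Gaussian residuals, (n - p)/2 * ln(sigma2_restricted / sigma2_full)
    equals the log-likelihood ratio comparing the two models.
    """
    n = len(x)
    # Lagged design matrices for t = p .. n-1 (most recent lag first)
    X_r = np.array([x[t - p:t][::-1] for t in range(p, n)])
    X_f = np.array([np.concatenate((x[t - p:t][::-1], y[t - p:t][::-1]))
                    for t in range(p, n)])
    target = x[p:]

    def resid_var(design):
        # OLS fit and mean squared residual
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        r = target - design @ beta
        return np.mean(r ** 2)

    s2_r = resid_var(X_r)
    s2_f = resid_var(X_f)
    return 0.5 * (n - p) * np.log(s2_r / s2_f)
```

When `y` genuinely drives `x`, the full model's residual variance drops well below the restricted model's, so the statistic is large; when there is no directed influence, the ratio stays near one and the statistic near zero.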