Adaptive Entropy Rates for fMRI Time-Series Analysis

  • Authors:
  • John W. Fisher, III; Eric R. Cosman, Jr.; Cindy Wible; William M. Wells, III

  • Venue:
  • MICCAI '01 Proceedings of the 4th International Conference on Medical Image Computing and Computer-Assisted Intervention
  • Year:
  • 2001

Abstract

In previous work [Tsai et al., 1999] we introduced an information-theoretic approach to the analysis of fMRI time-series data. Subsequently [Kim et al., 2000], we established a relationship between that approach and a simple non-parametric hypothesis test. In this work, we describe an adaptive method for incorporating the temporal structure that relates the fMRI time-series to both the current and past values of the experimental protocol. This is achieved by extending our previous approach with the information-theoretic concept of entropy rate. It can be shown that, despite a differing implementation, our prior method is a special case of the new approach. The entropy rate of a random process quantifies future uncertainty conditioned on the past and on side information (e.g., the experimental protocol or confounding signals) without strong assumptions about the nature of that uncertainty (e.g., Gaussianity). Furthermore, we allow the form of the temporal dependency to vary from voxel to voxel in an adaptive fashion. The combination of information-theoretic principles and adaptive estimation of the temporal dependency yields a more powerful and flexible approach to fMRI analysis. Empirical results are presented on three fMRI datasets measuring motor, auditory, and visual cortex activation, comparing the new approach to the previous one and to a variation on the general linear model. Particular attention is paid to differences in the type of phenomenology detected by the respective approaches.
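The core idea of the abstract — measuring how much conditioning on the experimental protocol reduces a voxel's entropy rate, estimated without Gaussian assumptions — can be sketched in code. This is an illustrative reconstruction, not the authors' implementation: the leave-one-out Parzen-window entropy estimator, the rule-of-thumb bandwidth, the single-lag past, and the synthetic block protocol are all assumptions made for the example.

```python
import numpy as np

def loo_entropy(X):
    """Leave-one-out Parzen-window (Gaussian-kernel) estimate of the
    differential entropy of samples X with shape (N, d), in nats."""
    X = np.asarray(X, dtype=float).reshape(len(X), -1)
    N, d = X.shape
    h = X.std() * N ** (-1.0 / (d + 4))        # rule-of-thumb bandwidth (an assumption)
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-D2 / (2.0 * h * h)) / (2.0 * np.pi * h * h) ** (d / 2.0)
    np.fill_diagonal(K, 0.0)                   # leave each sample out of its own estimate
    p = K.sum(axis=1) / (N - 1)                # leave-one-out density at each sample
    return -np.log(p).mean()

def cond_entropy(y, Z):
    """H(y | Z) = H(y, Z) - H(Z), each term estimated non-parametrically."""
    Z = np.asarray(Z, dtype=float).reshape(len(Z), -1)
    return loo_entropy(np.column_stack([y, Z])) - loo_entropy(Z)

def protocol_score(x, u):
    """Entropy-rate reduction from conditioning on the protocol:
    H(x_t | x_{t-1}) - H(x_t | x_{t-1}, u_t).  Large values suggest the
    protocol carries information about the voxel beyond the voxel's own past."""
    xt, xp, ut = x[1:], x[:-1], u[1:]
    return cond_entropy(xt, xp) - cond_entropy(xt, np.column_stack([xp, ut]))

# Synthetic illustration (not real fMRI data): a block-design protocol,
# one voxel driven by it, and one voxel of unrelated noise.
rng = np.random.default_rng(0)
T = 400
u = np.where(np.sin(2.0 * np.pi * np.arange(T) / 40.0) >= 0, 1.0, -1.0)
active = u + 0.5 * rng.standard_normal(T)      # protocol-driven voxel
inactive = rng.standard_normal(T)              # unrelated voxel

print(f"active voxel:   {protocol_score(active, u):+.3f} nats")
print(f"inactive voxel: {protocol_score(inactive, u):+.3f} nats")
```

In this sketch the per-voxel score is the drop in conditional entropy when the protocol is added to the conditioning set, mirroring how an entropy rate conditions future values on past values and side information; a per-voxel choice of which lags to include would give the adaptive dependency structure the abstract describes.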