Information storage is a key component of intrinsic distributed computation. Despite the existence of appropriate measures for it (e.g. excess entropy), its role in interacting with information transfer and modification to give rise to distributed computation is not yet well-established. We explore how to quantify information storage on a local scale in space and time, so as to understand its role in the dynamics of distributed computation. To assist these explorations, we introduce the active information storage, which quantifies the information storage component that is directly in use in the computation of the next state of a process. We present the first profiles of local excess entropy and local active information storage in cellular automata, providing evidence that blinkers and background domains are dominant information storage processes in these systems. This application also demonstrates the manner in which these two measures of information storage are distinct but complementary. It also reveals other information storage phenomena, including the misinformative nature of local storage when information transfer dominates the computation, and demonstrates that the local entropy rate is a useful spatiotemporal filter for information transfer structure.
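The active information storage described above can be illustrated with a minimal plug-in estimator. The sketch below is an assumption-laden illustration, not the authors' implementation: it estimates, for a discrete time series, the local value log2 p(past_k, next) / (p(past_k) p(next)) at each step, where `k` is the history length. Positive local values indicate the past is informative about the next state; negative values capture the misinformative case mentioned in the abstract.

```python
import math
from collections import Counter

def local_active_info_storage(series, k=3):
    """Plug-in estimate of local active information storage:
    a(n+1) = log2 [ p(past_k, next) / (p(past_k) * p(next)) ].
    Probabilities are simple frequency counts over the series,
    so this is only a rough sketch for short discrete sequences."""
    pairs = [(tuple(series[n - k:n]), series[n])
             for n in range(k, len(series))]
    total = len(pairs)
    joint = Counter(pairs)                    # p(past_k, next)
    past = Counter(p for p, _ in pairs)       # p(past_k)
    nxt = Counter(x for _, x in pairs)        # p(next)
    return [
        math.log2((joint[(p, x)] / total)
                  / ((past[p] / total) * (nxt[x] / total)))
        for p, x in pairs
    ]

# A period-2 "blinker"-like sequence is fully predictable from its
# past, so every local value equals the 1 bit of next-state entropy.
vals = local_active_info_storage([0, 1] * 20, k=2)
```

Averaging these local values over the series recovers the (average) active information storage; examining them point by point is what yields the local spatiotemporal profiles discussed in the abstract.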