Discriminating among competing statistical models is a pressing issue for many experimentalists in the field of cognitive science. Resolving this issue begins with designing maximally informative experiments. To this end, the problem to be solved in adaptive design optimization is identifying experimental designs under which one can infer the underlying model in the fewest possible steps. When the models under consideration are nonlinear, as is often the case in cognitive science, this problem can be impossible to solve analytically without simplifying assumptions. However, as we show in this letter, a full solution can be found numerically with the help of a Bayesian computational trick derived from the statistics literature, which recasts the problem as a probability density simulation in which the optimal design is the mode of the density. We use a utility function based on mutual information and give three intuitive interpretations of it in terms of Bayesian posterior estimates. As a proof of concept, we offer a simple example application to an experiment on memory retention.
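To make the mutual-information utility concrete, the sketch below scores each candidate design d by I(M; Y | d), the mutual information between the model indicator M and the experimental outcome Y, and selects the highest-scoring design. This is a minimal illustration assuming discrete outcomes and a brute-force search over a small design set, not the density-simulation method described above; the function name and the toy predictive tables are hypothetical, not taken from the letter.

```python
import numpy as np

def mutual_information_utility(model_priors, predictive_probs):
    """U(d) = I(M; Y | d): mutual information between the model indicator M
    and the outcome Y of an experiment run under one candidate design d.

    model_priors     : shape (n_models,), prior model probabilities p(m)
    predictive_probs : shape (n_models, n_outcomes), p(y | m, d) for this design
    """
    p_m = np.asarray(model_priors, dtype=float)
    p_y_m = np.asarray(predictive_probs, dtype=float)
    p_y = p_m @ p_y_m  # marginal predictive p(y | d)
    # I(M; Y | d) = sum_{m,y} p(m) p(y|m,d) log( p(y|m,d) / p(y|d) )
    ratio = np.where(p_y_m > 0, p_y_m / p_y, 1.0)  # convention: 0 log 0 = 0
    return float(np.sum(p_m[:, None] * p_y_m * np.log(ratio)))

# Toy comparison: two models, a binary outcome, two candidate designs.
priors = np.array([0.5, 0.5])
designs = {
    "d1": np.array([[0.9, 0.1],    # model 1 strongly predicts outcome 0
                    [0.2, 0.8]]),  # model 2 strongly predicts outcome 1
    "d2": np.array([[0.6, 0.4],
                    [0.5, 0.5]]),  # models nearly agree: uninformative design
}
best = max(designs, key=lambda d: mutual_information_utility(priors, designs[d]))
print(best)  # -> d1, the design under which the models are most discriminable
```

In a sequential setting, the posterior over models computed after each observation would serve as the prior for the next stage, so the utility of every candidate design is recomputed as evidence accumulates and the procedure homes in on the true model in as few steps as possible.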