We discuss an idea for collecting data in a relatively efficient manner. Our point of view is Bayesian and information-theoretic: on any given trial, we want to adaptively choose the input in such a way that the mutual information between the (unknown) state of the system and the (stochastic) output is maximal, given any prior information (including data collected on any previous trials). We prove a theorem that quantifies the effectiveness of this strategy and give a few illustrative examples comparing the performance of this adaptive technique to that of the more usual nonadaptive experimental design. In particular, we calculate the asymptotic efficiency of the information-maximization strategy and demonstrate that this method is in a well-defined sense never less efficient—and is generically more efficient—than the nonadaptive strategy. For example, we are able to explicitly calculate the asymptotic relative efficiency of the staircase method widely employed in psychophysics research and to demonstrate the dependence of this efficiency on the form of the psychometric function underlying the output responses.
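The adaptive strategy described above — on each trial, choose the input that maximizes the mutual information between the unknown system state and the stochastic response, given the posterior from previous trials — can be sketched concretely. The example below is an illustrative toy, not the paper's implementation: it assumes a hypothetical logistic psychometric function with an unknown threshold `theta`, a discrete grid prior, and binary (correct/incorrect) responses, and it greedily picks the stimulus maximizing the expected information gain before doing a Bayesian update.

```python
import numpy as np

def bernoulli_entropy(p):
    # Binary entropy in nats, clipped for numerical safety at p = 0 or 1.
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def infomax_stimulus(prior, thetas, candidates):
    # Return the candidate stimulus x maximizing the mutual information
    # I(theta; y | x) = H(y | x) - E_theta[ H(y | theta, x) ].
    best_x, best_mi = None, -np.inf
    for x in candidates:
        p_correct = 1.0 / (1.0 + np.exp(-(x - thetas)))  # assumed psychometric fn
        p_marginal = np.dot(prior, p_correct)             # predictive P(y = 1 | x)
        mi = bernoulli_entropy(p_marginal) - np.dot(prior, bernoulli_entropy(p_correct))
        if mi > best_mi:
            best_x, best_mi = x, mi
    return best_x

def posterior_update(prior, thetas, x, y):
    # Standard Bayesian update of the grid posterior after observing response y.
    p1 = 1.0 / (1.0 + np.exp(-(x - thetas)))
    likelihood = p1 if y == 1 else 1.0 - p1
    post = prior * likelihood
    return post / post.sum()

# Toy run: adaptively estimate a (simulated) true threshold of 1.2.
rng = np.random.default_rng(0)
thetas = np.linspace(-3.0, 3.0, 61)
prior = np.full(len(thetas), 1.0 / len(thetas))
true_theta = 1.2
for _ in range(40):
    x = infomax_stimulus(prior, thetas, thetas)
    y = int(rng.random() < 1.0 / (1.0 + np.exp(-(x - true_theta))))
    prior = posterior_update(prior, thetas, x, y)
estimate = thetas[np.argmax(prior)]
```

In this sketch the infomax rule plays the role that the staircase method plays in classical psychophysics: both place trials near the threshold, but the infomax rule does so by explicitly scoring each candidate stimulus by its expected information gain under the current posterior.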