Living organisms function in accordance with complex mechanisms that operate in different ways depending on conditions. Darwin's theory of evolution suggests that such mechanisms evolved through variation guided by natural selection. However, there has existed no theory that would explain quantitatively which mechanisms can so evolve in realistic population sizes within realistic time periods, and which are too complex. In this article, we suggest such a theory. We treat Darwinian evolution as a form of computational learning from examples in which the course of learning is influenced only by the aggregate fitness of the hypotheses on the examples, and not otherwise by specific examples. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. We show that in a single stage of evolution monotone Boolean conjunctions and disjunctions are evolvable over the uniform distribution, while Boolean parity functions are not. We suggest that the mechanism that underlies biological evolution overall is “evolvable target pursuit”, which consists of a series of evolutionary stages, each one inexorably pursuing an evolvable target in the technical sense suggested above, each such target being rendered evolvable by the serendipitous combination of the environment and the outcomes of previous evolutionary stages.
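The distinction the abstract draws — learning guided only by aggregate fitness, never by individual examples — can be illustrated with a toy sketch (not Valiant's formal model, which bounds tolerances and population sizes). Here a monotone Boolean conjunction is recovered by hill-climbing: each candidate mutation adds or drops one variable, and selection sees only the candidate's overall agreement with the target under the uniform distribution. For clarity the expected fitness is computed exactly by enumerating all `2^n` inputs, where the formal model would estimate it from random examples; the function names and parameters are illustrative, not from the paper.

```python
import itertools

def evolve_conjunction(target_vars, n=6, generations=50):
    """Toy hill-climb toward a monotone conjunction, guided only by
    aggregate fitness (fraction of agreement with the target under the
    uniform distribution), never by the outcomes on specific examples.
    """
    points = list(itertools.product([0, 1], repeat=n))

    def conj(vars_, x):
        # A monotone conjunction: true iff all its variables are 1.
        return all(x[i] for i in vars_)

    def fitness(vars_):
        # Exact expected agreement with the target over the uniform
        # distribution (enumeration stands in for random sampling).
        return sum(conj(vars_, x) == conj(target_vars, x)
                   for x in points) / len(points)

    current = frozenset(range(n))   # start from the all-variables conjunction
    best = fitness(current)
    for _ in range(generations):
        # Candidate mutations: drop or add a single variable.
        neighbours = [current - {i} for i in current] + \
                     [current | {i} for i in range(n) if i not in current]
        cand = max(neighbours, key=fitness)
        f = fitness(cand)
        if f <= best:               # no beneficial mutation remains: stop
            break
        current, best = cand, f     # selection keeps the fitter variant
    return set(current), best
```

Under this fitness measure, dropping a spurious variable always raises agreement while dropping a needed one lowers it, so the search converges to the target conjunction — consistent with the abstract's claim that monotone conjunctions are evolvable over the uniform distribution. For parity functions no such gradient exists: every proper sub-hypothesis agrees with the target on exactly half the inputs, which is why aggregate fitness gives the search nothing to climb.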