Genetic programming (GP) is a very successful type of learning algorithm, yet it remains hard to understand from a theoretical point of view. With this paper we contribute to the computational complexity analysis of genetic programming, which has started only recently. We analyze GP in the well-known PAC learning framework and point out how it can observe quality changes in the evolution of functions by random sampling. This leads to computational complexity bounds for a linear GP algorithm that perfectly learns any member of a simple class of linear pseudo-Boolean functions. Furthermore, we show that on functions from the same class, the same algorithm finds good approximations of the target function in even less time.
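The setting described in the abstract can be illustrated with a minimal sketch: a (1+1)-style linear GP hill climber that learns a linear pseudo-Boolean function, judging each mutation by an error estimate obtained from random samples (the PAC-style "observation by sampling"). All concrete choices below (the target subset of variables, sample size, step budget, toggle mutation) are illustrative assumptions, not the paper's actual algorithm.

```python
import random

n = 10  # number of Boolean variables (illustrative choice)
random.seed(0)

# Hypothetical target: a linear pseudo-Boolean function summing a subset of bits.
target_vars = {1, 3, 4, 7}

def f(x):
    return sum(x[i] for i in target_vars)

def h(prog, x):
    # A "linear GP program" is modeled here simply as a set of variable
    # indices whose bits are summed.
    return sum(x[i] for i in prog)

def sampled_error(prog, samples=200):
    # Estimate the disagreement with the target on uniformly random inputs,
    # mimicking PAC-style evaluation by random sampling.
    err = 0
    for _ in range(samples):
        x = [random.randint(0, 1) for _ in range(n)]
        err += abs(f(x) - h(prog, x))
    return err / samples

prog = set()                 # start from the empty program
best = sampled_error(prog)
for step in range(2000):
    child = set(prog)
    i = random.randrange(n)  # mutation: toggle one variable in or out
    if i in child:
        child.remove(i)
    else:
        child.add(i)
    e = sampled_error(child)
    if e <= best:            # accept if the sampled error does not increase
        prog, best = child, e

print(sorted(prog), round(best, 3))
```

Because the acceptance rule only compares sampled error estimates, the sketch also hints at the approximation result: long before the program matches the target exactly, its estimated error has typically already dropped close to zero.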