Communications of the ACM
Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
Boolean Feature Discovery in Empirical Learning
Machine Learning
Computational learning theory: an introduction
The nature of statistical learning theory
Efficient and Accurate Parallel Genetic Algorithms
A Theory of Learning and Generalization: With Applications to Neural Networks and Control Systems
Computational Complexity of Machine Learning
Machine Learning
Principles in the Evolutionary Design of Digital Circuits—Part I
Genetic Programming and Evolvable Machines
Sizing Populations for Serial and Parallel Genetic Algorithms
Proceedings of the 3rd International Conference on Genetic Algorithms
Using Genetic Algorithms with Small Populations
Proceedings of the 5th International Conference on Genetic Algorithms
Machine Learning Approach to Gate-Level Evolvable Hardware
ICES '96 Proceedings of the First International Conference on Evolvable Systems: From Biology to Hardware
The probably approximately correct (PAC) population size of a genetic algorithm
ICTAI '00 Proceedings of the 12th IEEE International Conference on Tools with Artificial Intelligence
Predictive Models for the Breeder Genetic Algorithm I. Continuous Parameter Optimization
Evolutionary Computation
The Science of Breeding and Its Application to the Breeder Genetic Algorithm (BGA)
Evolutionary Computation
Explorations in Design Space: Unconventional Electronics Design Through Artificial Evolution
IEEE Transactions on Evolutionary Computation
Generalization and PAC learning: some new results for the class of generalized single-layer networks
IEEE Transactions on Neural Networks
Evolutionary synthesis of logic circuits using information theory
Artificial intelligence in logic design
The PAC framework predicts, as a function of tolerance parameters, the number of samples needed to learn an instance of the representation class kDNFns of Boolean formulas. When the learning machine is a simple genetic algorithm, the choice of initial population is an issue. Using PAC learning, we derive the population size that guarantees, with high probability, at least one individual within a given Hamming distance of the optimum. We then show that the population need not start close to the optimum in order to learn the concept.
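The population-size idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact derivation: it assumes individuals are uniform random n-bit strings, computes the probability that one string falls within Hamming distance d of a fixed optimum, and then sizes the population so that at least one such string appears with probability at least 1 - delta. The function names and the standard finite-class PAC sample bound (m >= (1/eps)(ln|H| + ln(1/delta))) shown alongside are assumptions for illustration.

```python
import math

def prob_within_hamming(n, d):
    """Probability that a uniform random n-bit string lies within
    Hamming distance d of a fixed optimum string."""
    return sum(math.comb(n, k) for k in range(d + 1)) / 2 ** n

def population_size(n, d, delta):
    """Smallest N such that a population of N uniform random n-bit
    strings contains, with probability >= 1 - delta, at least one
    string within Hamming distance d of the optimum.

    From 1 - (1 - p)^N >= 1 - delta, solve N >= ln(delta)/ln(1 - p).
    """
    p = prob_within_hamming(n, d)
    if p >= 1.0:           # d >= n: every string qualifies
        return 1
    return math.ceil(math.log(delta) / math.log(1.0 - p))

def pac_sample_size(eps, delta, log_hyp_count):
    """Standard PAC sample bound for a finite hypothesis class H:
    m >= (1/eps) * (ln|H| + ln(1/delta)), given ln|H|."""
    return math.ceil((log_hyp_count + math.log(1.0 / delta)) / eps)
```

For example, `population_size(10, 2, 0.05)` sizes a population of 10-bit individuals so that some member is within Hamming distance 2 of the optimum with 95% probability; larger d makes each individual more likely to qualify, so the required population shrinks.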