Valiant recently introduced a learning-theoretic framework for evolution, and showed that his swapping algorithm efficiently evolves monotone conjunctions over the uniform distribution. We continue the study of the swapping algorithm for monotone conjunctions. We give a modified presentation for the uniform distribution, leading to a characterization of best approximations, a simplified analysis, and improved complexity bounds. For product distributions we show that a similar characterization does not hold, and that the fitness function may have local optima. However, the characterization is restored if the correlation fitness function is replaced by covariance. Using the covariance fitness function, we give evolvability results for product distributions, assuming either arbitrary tolerances, or a non-degeneracy condition on the distribution together with a size bound on the target.