Sparse Approximate Solutions to Linear Systems. SIAM Journal on Computing.
Efficient noise-tolerant learning from statistical queries. Journal of the ACM (JACM).
Fitting Equations to Data: Computer Analysis of Multifactor Data.
Smoothed analysis of algorithms: Why the simplex algorithm usually takes polynomial time. Journal of the ACM (JACM).
Evolvability from learning algorithms. STOC '08: Proceedings of the Fortieth Annual ACM Symposium on Theory of Computing; Journal of the ACM (JACM).
A Wavelet Tour of Signal Processing, Third Edition: The Sparse Way.
Learning and Smoothed Analysis. FOCS '09: Proceedings of the 50th Annual IEEE Symposium on Foundations of Computer Science.
Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing.
Optimization with Sparsity-Inducing Penalties. Foundations and Trends® in Machine Learning.
Greed is good: algorithmic results for sparse approximation. IEEE Transactions on Information Theory.
Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Transactions on Information Theory.
Computational questions in evolution.
In a seminal paper, Valiant (2006) introduced a computational model of evolution to address the question of how complexity can arise through Darwinian mechanisms. Valiant views evolution as a restricted form of computational learning, where the goal is to evolve a hypothesis that is close to the ideal function. Feldman (2008) showed that (correlational) statistical query learning algorithms can be framed as evolutionary mechanisms in Valiant's model. P. Valiant (2012) considered the evolvability of real-valued functions and showed that weak-optimization algorithms using weak-evaluation oracles can be converted into evolutionary mechanisms. In this work, we focus on the complexity of the representations used by evolutionary mechanisms. In general, the reductions of Feldman and P. Valiant may produce intermediate representations that are arbitrarily complex (polynomial-sized circuits). We argue that biological constraints often dictate that representations have low complexity, such as circuits of constant depth and fan-in. We give mechanisms for evolving sparse linear functions under a large class of smooth distributions. These evolutionary algorithms are attribute-efficient in the sense that the size of the representations and the number of generations required depend only on the sparsity of the target function and the accuracy parameter, and have no dependence on the total number of attributes.
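To make the mutation-and-selection flavor of such mechanisms concrete, here is a toy sketch of evolving a sparse linear function. It is not the paper's actual construction: the neighborhood (nudge a single coordinate by a fixed step), the Gaussian sample distribution, and all parameter values (`n`, `k`, `step`, `m`, `pool`) are illustrative assumptions. The representation is a dictionary holding only nonzero coefficients, so its size tracks the sparsity of the hypothesis rather than the total number of attributes.

```python
import random

def make_target(n, k, rng):
    """Sparse target: k nonzero coefficients (each +/-1) among n attributes."""
    support = rng.sample(range(n), k)
    return {i: rng.choice([-1.0, 1.0]) for i in support}

def evaluate(coeffs, x):
    """Value of a sparse linear function (dict index -> coefficient) at point x."""
    return sum(c * x[i] for i, c in coeffs.items())

def loss(hyp, target, samples):
    """Empirical squared distance between hypothesis and ideal function."""
    return sum((evaluate(hyp, x) - evaluate(target, x)) ** 2
               for x in samples) / len(samples)

def mutate(hyp, n, step, rng):
    """Neighborhood of a representation: nudge one coordinate by +/-step."""
    child = dict(hyp)
    i = rng.randrange(n)
    child[i] = child.get(i, 0.0) + rng.choice([-step, step])
    if child[i] == 0.0:
        del child[i]  # drop zeroed coordinates to keep the representation sparse
    return child

def evolve(target, n, generations, rng, step=0.5, m=200, pool=20):
    """Mutation-and-selection loop: in each generation, draw fresh samples and
    keep a mutant only if it improves the empirical loss."""
    hyp = {}  # start from the empty (all-zero) function
    for _ in range(generations):
        samples = [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(m)]
        best, best_loss = hyp, loss(hyp, target, samples)
        for _ in range(pool):
            cand = mutate(hyp, n, step, rng)
            cand_loss = loss(cand, target, samples)
            if cand_loss < best_loss:
                best, best_loss = cand, cand_loss
        hyp = best
    return hyp
```

Because beneficial mutations touch one coordinate at a time and only support coordinates of the target yield lasting improvement, the number of generations needed in this sketch scales with the sparsity `k` and the desired accuracy, not with `n`, which is the sense of attribute efficiency described in the abstract.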