An introduction to computational learning theory
The nature of statistical learning theory
Introduction to data compression
Introduction to Automata Theory, Languages, and Computation
A minimum description length approach to grammar inference
Connectionist, Statistical, and Symbolic Approaches to Learning for Natural Language Processing
The Logical Problem of Language Change
ECAL '01 Proceedings of the 6th European Conference on Advances in Artificial Life
What permits some systems to evolve and adapt more effectively than others? Gell-Mann [3] has stressed the importance of "compression" for adaptive complex systems: information about the environment is not simply recorded as a look-up table, but is instead compressed into a theory or schema. Several conjectures are proposed: (I) compression aids generalization; (II) compression occurs more easily in a "smooth", as opposed to a "rugged", string space; and (III) constraints imposed by compression make it likely that natural languages evolve towards smooth string spaces. We have been examining the role of such compression in the learning and evolution of formal languages by artificial agents. Our system does seem to conform broadly to these expectations, but the trade-offs between compression and the errors that sometimes accompany it require careful consideration.
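The contrast between a look-up table and a compressed schema can be made concrete with a toy example. The sketch below is illustrative only (it is not the paper's system): it takes the formal language {aⁿbⁿ : 1 ≤ n ≤ N} as a hypothetical environment, compares the description length of an explicit table of observed strings with that of a single generative rule, and shows that the rule also generalizes to unseen strings, while the table cannot.

```python
# Toy comparison: look-up table vs. compressed schema for {a^n b^n}.
# The language, the cost measures, and the rule are all illustrative
# assumptions, not taken from the paper.

def lookup_table(n_max):
    """Record every observed string explicitly, as a look-up table would."""
    return ["a" * n + "b" * n for n in range(1, n_max + 1)]

def schema(n):
    """Compressed description: one rule covering the whole language."""
    return "a" * n + "b" * n

N = 10
table = lookup_table(N)
table_cost = sum(len(s) for s in table)       # grows as O(N^2) characters
rule_cost = len('lambda n: "a"*n + "b"*n')    # crude proxy: rule length is constant

print(f"table cost: {table_cost} chars, rule cost: {rule_cost} chars")

# The schema generalizes beyond the observed data; the table does not.
unseen = schema(N + 1)
print(unseen in table)  # False: a^11 b^11 was never recorded in the table
```

The point of the toy is conjecture (I) in miniature: because the rule's description length does not grow with the data, the agent that compresses pays a fixed cost and gains predictions about strings it has never seen, whereas the table's cost grows without yielding any generalization.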