Population dynamics of grammar acquisition
Simulating the evolution of language
The covering number in learning theory
Journal of Complexity
A word-order database for testing computational models of language acquisition
ACL '03 Proceedings of the 41st Annual Meeting on Association for Computational Linguistics - Volume 1
SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
Neural Computation
Prediction of Creole Emergence in Spatial Language Dynamics
LATA '09 Proceedings of the 3rd International Conference on Language and Automata Theory and Applications
Journal of Logic, Language and Information
From the Publisher: Among other topics, The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar brings together two important but very different learning problems within the same analytical framework. The first concerns learning functional mappings using neural networks; the second, learning natural language grammars in the principles-and-parameters tradition of Chomsky. These two learning problems seem very different: neural networks are real-valued, infinite-dimensional, continuous mappings, whereas grammars are boolean-valued, finite-dimensional, discrete (symbolic) mappings. Furthermore, the research communities that work in the two areas almost never overlap. The book's objective is to bridge this gap. It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question of both problems, namely how much information it takes to learn, it highlights their similarities and differences. Specific results include model selection in neural networks, active learning, language learning, and evolutionary models of language change.