This paper presents an incremental probabilistic learner that models the acquisition of syntax and semantics from a corpus of child-directed utterances paired with possible representations of their meanings. These meaning representations approximate the contextual input available to the child; they do not specify the meanings of individual words or syntactic derivations. The learner must therefore infer the meanings and syntactic properties of the words in the input along with a parsing model. We use the CCG grammatical framework and train a non-parametric Bayesian model of parse structure with online variational Bayesian expectation maximization. When tested on utterances from the CHILDES corpus, our learner outperforms a state-of-the-art semantic parser. In addition, it models such aspects of child acquisition as "fast mapping," while also countering previous criticisms of statistical syntactic learners.
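To make the learning setting concrete, here is a deliberately minimal sketch (not the authors' model) of online variational-Bayes learning from ambiguous supervision: each utterance arrives with a set of candidate meaning symbols but no word-level alignment, and the learner maintains Dirichlet pseudo-counts over meanings for every word. The class name, the hyperparameter value, and the permutation-based E-step are all illustrative assumptions; CCG parsing and the non-parametric prior of the actual model are omitted.

```python
import itertools
from collections import defaultdict

ALPHA0 = 0.1  # symmetric Dirichlet prior pseudo-count (assumed value)


class OnlineVBLearner:
    """Toy online VB-EM learner for word -> meaning mappings.

    A simplified sketch of the abstract's setting: utterances are paired
    with candidate meanings, and the learner infers which word carries
    which meaning, updating its posterior one utterance at a time.
    """

    def __init__(self):
        # variational pseudo-counts: counts[word][meaning]
        self.counts = defaultdict(lambda: defaultdict(float))
        self.meanings = set()  # meaning symbols seen so far

    def _ep(self, word, meaning):
        # Posterior-mean estimate of p(meaning | word) under the Dirichlet
        # (exact VB would use digamma weights; omitted for brevity).
        k = max(len(self.meanings), 1)
        num = self.counts[word][meaning] + ALPHA0
        den = sum(self.counts[word].values()) + ALPHA0 * k
        return num / den

    def observe(self, words, meanings, lr=0.5):
        """One online E/M sweep on a single utterance."""
        self.meanings.update(meanings)
        # E-step: weight every one-to-one word/meaning alignment.
        resp, total = defaultdict(float), 0.0
        for perm in itertools.permutations(meanings, len(words)):
            w = 1.0
            for word, m in zip(words, perm):
                w *= self._ep(word, m)
            total += w
            for word, m in zip(words, perm):
                resp[(word, m)] += w
        # M-step: stochastic interpolation of pseudo-counts (step size lr).
        for (word, m), r in resp.items():
            old = self.counts[word][m]
            self.counts[word][m] = (1 - lr) * old + lr * (r / total)
```

With this sketch, "fast mapping" falls out of the joint E-step: after a few exposures to "dog" paired with DOG, a single utterance pairing the novel word "dax" with candidates {DOG, DAX} already pushes "dax" toward DAX, because the alignment that assigns DOG to "dog" dominates.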