Linear concepts and hidden variables
Machine Learning
The necessity of syntactic parsing for semantic role labeling
IJCAI'05 Proceedings of the 19th International Joint Conference on Artificial Intelligence
Minimally supervised model of early language acquisition
CoNLL '09 Proceedings of the Thirteenth Conference on Computational Natural Language Learning
Starting from scratch in semantic role labeling
ACL '10 Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics
Online latent structure training for language acquisition
IJCAI'11 Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume Three
A fundamental task in sentence comprehension is to assign semantic roles to sentence constituents. The structure-mapping account proposes that children start with a shallow structural analysis of sentences: children treat the number of nouns in the sentence as a cue to its semantic predicate-argument structure, and represent language experience in an abstract format that permits rapid generalization to new verbs. In this paper, we tested the consequences of these representational assumptions via experiments with a system for automatic semantic role labeling (SRL), trained on a sample of child-directed speech. When the SRL system was presented with representations of sentence structure consisting simply of an ordered set of nouns, it mimicked experimental findings with toddlers, including a striking error found in children. Adding features representing the position of the verb increased accuracy and eliminated the error. We show that the SRL system can exploit incrementally acquired knowledge to switch from error-prone noun-order features to a more accurate verb-position representation, demonstrating a possible mechanism for this transition in child development.
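The contrast between the two representations described in the abstract can be illustrated with a minimal sketch. The feature names, the simplified part-of-speech tags, and the example sentence below are illustrative assumptions, not the actual features or data of the SRL system described in the paper: noun-order features encode only each noun's position among the nouns of the sentence, while verb-position features additionally encode where each noun falls relative to the verb.

```python
def noun_order_features(tagged):
    """Noun-order representation: each noun is described only by its
    position among the nouns of the sentence (e.g. 'first of two').
    tagged: list of (word, pos) pairs with simplified tags N/V/D."""
    nouns = [w for w, p in tagged if p == "N"]
    n = len(nouns)
    return [f"{w}:noun_{i + 1}_of_{n}" for i, w in enumerate(nouns)]

def verb_position_features(tagged):
    """Verb-position representation: each noun is described by whether
    it precedes or follows the (first) verb in the sentence."""
    verb_idx = next(i for i, (w, p) in enumerate(tagged) if p == "V")
    return [
        f"{w}:{'before' if i < verb_idx else 'after'}_verb"
        for i, (w, p) in enumerate(tagged)
        if p == "N"
    ]

# A transitive sentence with a novel verb, as in the toddler experiments.
sentence = [("she", "N"), ("kradded", "V"), ("the", "D"), ("ball", "N")]
print(noun_order_features(sentence))     # ['she:noun_1_of_2', 'ball:noun_2_of_2']
print(verb_position_features(sentence))  # ['she:before_verb', 'ball:after_verb']
```

Under the noun-order scheme, a two-noun intransitive such as "she and he kradded" produces the same "first of two / second of two" features as a transitive, which is one way a learner could mislabel the second noun as a patient; verb-position features distinguish the two cases.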