Online latent structure training for language acquisition
IJCAI'11 Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume Three
Theories of human language acquisition assume that learning to understand sentences is, at best, a partially supervised task. Instead of relying on gold-standard feedback, we train a simplified "Baby" Semantic Role Labeling system (BabySRL) by combining world knowledge with simple grammatical constraints to form a potentially noisy training signal. This combination of knowledge sources is vital for learning; a training signal derived from either component alone leads the learner astray. When this largely unsupervised training approach is applied to a corpus of child-directed speech, the BabySRL learns shallow structural cues that allow it to mimic striking behaviors observed in experiments with children and to begin correctly identifying agents in a sentence.
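The approach described in the abstract can be illustrated with a minimal sketch. Everything below is a hypothetical toy construction, not the paper's actual model: a structured-perceptron-style online learner over tiny two-word "sentences", where the training signal is not a gold label but the best role assignment that satisfies a combination of a grammatical constraint (exactly one agent) and a stand-in for world knowledge (agents are animate). The feature set, role inventory, and corpus are invented for illustration.

```python
from itertools import product

# Roles: A0 = agent, A1 = patient (PropBank-style labels).
ROLES = ["A0", "A1"]

def features(words, assignment):
    # Shallow structural cues: each role conjoined with word position
    # and with the word's animacy flag.
    feats = []
    for i, ((w, animate), role) in enumerate(zip(words, assignment)):
        feats.append((role, "pos", i))
        feats.append((role, "animate", animate))
    return feats

def score(weights, words, assignment):
    return sum(weights.get(f, 0.0) for f in features(words, assignment))

def candidates(words):
    # Enumerate all role assignments (fine for toy-length sentences).
    return list(product(ROLES, repeat=len(words)))

def feasible(words, assignment):
    # The noisy training signal: a grammatical constraint (exactly one
    # agent) combined with world knowledge (the agent must be animate).
    agents = [i for i, r in enumerate(assignment) if r == "A0"]
    return len(agents) == 1 and all(words[i][1] for i in agents)

def train_step(weights, words):
    # Online latent-structure update: treat the best *feasible*
    # assignment as the latent target, and update toward it and away
    # from the current unconstrained argmax when they disagree.
    cands = candidates(words)
    pred = max(cands, key=lambda a: score(weights, words, a))
    feas = [a for a in cands if feasible(words, a)]
    if not feas:
        return
    target = max(feas, key=lambda a: score(weights, words, a))
    if pred != target:
        for f in features(words, target):
            weights[f] = weights.get(f, 0.0) + 1.0
        for f in features(words, pred):
            weights[f] = weights.get(f, 0.0) - 1.0
```

After a few passes over sentences whose first word is animate, the learner's weights come to favor labeling an early animate word as the agent, echoing the shallow positional and animacy cues the abstract describes, without ever seeing a gold role label.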