This paper describes a system for the joint learning of syntactic and semantic dependencies. A directed graphical model is put forward to integrate dependency relation classification and semantic role labeling, and a bilayer directed graph is presented to express the probabilistic relationships between syntactic and semantic relations. Maximum Entropy Markov Models are used to estimate the conditional probability distributions and to perform inference. The submitted model yields 76.28% macro-average F1 on the joint task, with 85.75% labeled attachment score (LAS) for syntactic dependencies and 66.61% F1 for semantic dependencies.
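As a rough illustration of the MEMM-style inference the abstract describes, the sketch below runs Viterbi decoding over locally normalized conditional distributions P(label | previous label, observation). The labels, words, and probability tables are invented for the example and are not from the paper; a real system would obtain these distributions from trained maximum-entropy classifiers over rich features.

```python
import math

# Toy conditional distributions P(label | prev_label, word).
# All entries here are illustrative placeholders, not the paper's model.
COND = {
    ("START", "the"): {"DET": 0.9, "ARG": 0.1},
    ("DET", "cat"):   {"ARG": 0.8, "DET": 0.2},
    ("ARG", "cat"):   {"ARG": 0.6, "DET": 0.4},
}

def viterbi(words):
    """Most probable label sequence under the toy MEMM."""
    # scores[label] = best log-probability of any path ending in `label`
    scores = {"START": 0.0}
    back = []  # one backpointer table per input word
    for w in words:
        new_scores, pointers = {}, {}
        for prev, s in scores.items():
            for lab, p in COND.get((prev, w), {}).items():
                cand = s + math.log(p)
                if lab not in new_scores or cand > new_scores[lab]:
                    new_scores[lab] = cand
                    pointers[lab] = prev
        back.append(pointers)
        scores = new_scores
    # Backtrace from the best final label.
    best = max(scores, key=scores.get)
    seq = [best]
    for pointers in reversed(back[1:]):
        seq.append(pointers[seq[-1]])
    seq.reverse()
    return seq

print(viterbi(["the", "cat"]))  # ['DET', 'ARG']
```

Because each step conditions on the previous label and the current observation, the whole sequence model is a product of locally normalized maximum-entropy distributions, which is what makes exact Viterbi decoding straightforward here.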