A coherently related group of sentences may be referred to as a discourse. In this paper we address the problem of parsing coherence relations as defined in the Penn Discourse TreeBank (PDTB). A good model for discourse structure analysis needs to account both for local dependencies at the token level and for global, sentence-level dependencies and statistics. We present techniques for using inter-sentential, sentence-level (global), data-driven, non-grammatical features in the task of parsing discourse. The parser builds on a previous approach that uses token-level (local) features with conditional random fields for shallow discourse parsing, an approach that lacks structural knowledge of the discourse. Our parser adopts a two-stage design: local constraints are applied first, and global constraints are then applied to a reduced, weighted search space (the n-best list). In the second stage we experiment with different rerankers trained on the first stage's n-best parses, which are generated using lexico-syntactic local features. The two-stage parser yields significant improvements over the best-performing discourse parser on the PDTB corpus.
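The two-stage scheme above can be sketched as a generic n-best reranking step: a first-stage local model supplies scored candidates, and a linear model over global features re-scores them. This is a minimal illustration, not the paper's model; the feature names, weights, and interpolation constant below are hypothetical.

```python
# Sketch of two-stage parsing via reranking: stage 1 yields an n-best list of
# (candidate, local_score) pairs; stage 2 re-scores each candidate with a
# linear combination of global (sentence-level) feature weights.

def rerank(nbest, global_features, weights, alpha=1.0):
    """Return the candidate maximizing local score + weighted global features.

    nbest: list of (candidate, local_score) pairs from the first stage.
    global_features: function mapping a candidate to {feature_name: value}.
    weights: learned reranker weights, keyed by feature name (hypothetical).
    alpha: interpolation weight on the first-stage (local) score.
    """
    def score(candidate, local_score):
        feats = global_features(candidate)
        return alpha * local_score + sum(
            weights.get(name, 0.0) * value for name, value in feats.items()
        )
    return max(nbest, key=lambda pair: score(*pair))[0]

# Toy usage: the global feature flips the first stage's preferred choice.
nbest = [("parse_A", 1.2), ("parse_B", 1.0)]
feats = {"parse_A": {"crosses_sentence": 0.0},
         "parse_B": {"crosses_sentence": 1.0}}
weights = {"crosses_sentence": 0.5}
best = rerank(nbest, lambda c: feats[c], weights)
# best == "parse_B": 1.0 + 0.5 = 1.5 beats 1.2
```

In practice the reranker weights would be trained on gold-annotated n-best lists (e.g. with a perceptron or SVM-style learner), but the decision rule at test time reduces to this argmax.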