C4.5: programs for machine learning
A maximum entropy approach to natural language processing
Computational Linguistics
Summarization beyond sentence extraction: a probabilistic approach to sentence compression
Artificial Intelligence
Sentence reduction for automatic text summarization
ANLC '00 Proceedings of the sixth conference on Applied natural language processing
The automated acquisition of topic signatures for text summarization
COLING '00 Proceedings of the 18th conference on Computational linguistics - Volume 1
Optimization, maxent models, and conditional estimation without magic
NAACL-Tutorials '03 Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology: Tutorials - Volume 5
Discriminative Reranking for Natural Language Parsing
Computational Linguistics
Empirically-based control of natural language generation
ACL '05 Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics
ACL-44 Proceedings of the 21st International Conference on Computational Linguistics and the 44th annual meeting of the Association for Computational Linguistics
Design of a multi-lingual, parallel-processing statistical parsing engine
HLT '02 Proceedings of the second international conference on Human Language Technology Research
Measuring importance and query relevance in topic-focused multi-document summarization
ACL '07 Proceedings of the 45th Annual Meeting of the ACL on Interactive Poster and Demonstration Sessions
Sentence compression beyond word deletion
COLING '08 Proceedings of the 22nd International Conference on Computational Linguistics - Volume 1
An open-source natural language generator for OWL ontologies and its use in Protégé and Second Life
EACL '09 Proceedings of the 12th Conference of the European Chapter of the Association for Computational Linguistics: Demonstrations Session
Global inference for sentence compression: an integer linear programming approach
Journal of Artificial Intelligence Research
Sentence compression as tree transduction
Journal of Artificial Intelligence Research
A comparison of model free versus model intensive approaches to sentence compression
EMNLP '09 Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: Volume 1 - Volume 1
Unsupervised induction of sentence compression rules
UCNLG+Sum '09 Proceedings of the 2009 Workshop on Language Generation and Summarisation
A survey of paraphrasing and textual entailment methods
Journal of Artificial Intelligence Research
Paraphrastic sentence compression with a character-based metric: tightening without deletion
MTTG '11 Proceedings of the Workshop on Monolingual Text-To-Text Generation
Evaluating sentence compression: pitfalls and suggested remedies
MTTG '11 Proceedings of the Workshop on Monolingual Text-To-Text Generation
UCNLG+EVAL '11 Proceedings of the UCNLG+Eval: Language Generation and Evaluation Workshop
Sentence compression with semantic role constraints
ACL '12 Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics: Short Papers - Volume 2
We present a new two-stage method that compresses sentences by removing words. In the first stage, it generates candidate compressions by removing branches from the source sentence's dependency tree, guided by a Maximum Entropy classifier. In the second stage, it selects the best candidate compression using a Support Vector Machine Regression model. Experimental results show that our method achieves state-of-the-art performance without requiring any manually written rules.
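The two-stage pipeline in the abstract can be sketched in miniature. The sketch below is only illustrative: the toy dependency tree, the content-word heuristic, and the hand-set weights are assumptions standing in for the paper's trained Maximum Entropy classifier (stage 1) and SVR ranker (stage 2).

```python
def prune(tree, root, removed):
    """Return the set of words kept after removing the subtrees rooted at `removed`."""
    kept = set()
    def walk(node):
        if node in removed:
            return
        kept.add(node)
        for child in tree.get(node, []):
            walk(child)
    walk(root)
    return kept

def candidates(tree, root, words):
    """Stage 1 stand-in: in the paper a MaxEnt classifier decides which
    dependency branches may be dropped; here every single-branch removal
    (plus the full sentence) is admitted as a candidate."""
    cands = [list(words)]
    for node in tree:
        for child in tree[node]:
            kept = prune(tree, root, {child})
            cands.append([w for w in words if w in kept])
    return cands

def score(cand, full):
    """Stage 2 stand-in: a toy linear scorer instead of the trained SVR,
    trading content-word coverage against brevity (weights are arbitrary)."""
    content = {w for w in full if len(w) > 3}
    coverage = len(content & set(cand)) / max(1, len(content))
    brevity = 1 - len(cand) / len(full)
    return 0.7 * coverage + 0.3 * brevity

# Toy example: "the old dog barked loudly", head word -> dependents.
words = ["the", "old", "dog", "barked", "loudly"]
tree = {"barked": ["dog", "loudly"], "dog": ["the", "old"]}
best = max(candidates(tree, "barked", words), key=lambda c: score(c, words))
print(" ".join(best))
```

In the toy run, dropping the subtree headed by "dog" yields the shortest candidate that still covers the content words, so the scorer picks "barked loudly"; in the actual method both the candidate generation and the ranking are learned from data rather than hand-set.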