References:
1. The decomposition of human-written summary sentences. Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.
2. Summarization beyond sentence extraction: a probabilistic approach to sentence compression. Artificial Intelligence.
3. Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data. Proceedings of the Eighteenth International Conference on Machine Learning (ICML '01).
4. Proceedings of the 5th International Conference on Text, Speech and Dialogue (TSD '02).
5. BLEU: a method for automatic evaluation of machine translation. Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics (ACL '02).
6. Online large-margin training of dependency parsers. Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL '05).
7. Supervised and unsupervised learning for sentence compression. Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL '05).
8. Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics (COLING/ACL '06).
9. Trimming CFG parse trees for sentence compression using machine learning approaches. Proceedings of the COLING/ACL 2006 Main Conference Poster Sessions.
10. Discriminative sentence compression with conditional random fields. Information Processing and Management.
11. A new approach to automatic speech summarization. IEEE Transactions on Multimedia.
An abstractive approach to sentence compression
ACM Transactions on Intelligent Systems and Technology (TIST) - Special Sections on Paraphrasing; Intelligent Systems for Socially Aware Computing; Social Computing, Behavioral-Cultural Modeling, and Prediction
Conventional sentence compression methods employ a syntactic parser to compress a sentence without changing its meaning. However, reference compressions produced by humans do not always preserve the syntactic structure of the original sentence. Moreover, for on-demand sentence compression, the time spent on parsing is not negligible. As an alternative to syntactic parsing, we propose a novel term weighting technique based on positional information within the original sentence, and a novel language model that combines statistics from the original sentence with those from a general corpus. Experiments involving both human subjective evaluations and automatic evaluations show that our method outperforms Hori's method, a state-of-the-art conventional technique. Because our method does not use a syntactic parser, it is 4.3 times faster than Hori's method.
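The abstract does not give the exact formulas, but the two ingredients it names can be sketched. The snippet below is a minimal illustration under stated assumptions: the positional weight is assumed to favor terms near the start of the sentence (the paper's actual form may differ), and the combined language model is assumed to be a linear interpolation of an add-one-smoothed bigram model estimated from the original sentence with one estimated from a general corpus. All function names, the `alpha` and `lam` parameters, and the vocabulary size are hypothetical.

```python
import math
from collections import Counter

def make_counts(sentences):
    """Build unigram and bigram counts from a list of token lists."""
    uni, bi = Counter(), Counter()
    for toks in sentences:
        uni.update(toks)
        bi.update(zip(toks, toks[1:]))
    return {"uni": uni, "bi": bi}

def positional_weight(i, n, alpha=0.5):
    """Hypothetical positional term weight for the i-th of n tokens:
    earlier tokens receive a higher weight (assumed form)."""
    return 1.0 + alpha * (n - i) / n

def bigram_prob(counts, w_prev, w, vocab=10000):
    """Add-one smoothed conditional bigram probability P(w | w_prev)."""
    return (counts["bi"][(w_prev, w)] + 1) / (counts["uni"][w_prev] + vocab)

def combined_logprob(w_prev, w, sent_counts, corpus_counts, lam=0.7):
    """Interpolate the sentence-level model with the general-corpus model,
    mirroring the abstract's 'combines statistics from the original
    sentence and a general corpus' (interpolation is an assumption)."""
    p = (lam * bigram_prob(sent_counts, w_prev, w)
         + (1 - lam) * bigram_prob(corpus_counts, w_prev, w))
    return math.log(p)
```

A candidate compression could then be scored by summing the positional weights of the retained terms plus the combined bigram log-probabilities of consecutive retained tokens; neither quantity requires a syntactic parse, which is the source of the reported speedup.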