A scalable global model for summarization

  • Authors:
  • Dan Gillick, Benoit Favre

  • Affiliations:
  • University of California, Berkeley and International Computer Science Institute, Berkeley; International Computer Science Institute, Berkeley

  • Venue:
  • ILP '09: Proceedings of the Workshop on Integer Linear Programming for Natural Language Processing
  • Year:
  • 2009

Abstract

We present an Integer Linear Program for exact inference under a maximum coverage model for automatic summarization. We compare our model, which operates at the sub-sentence or "concept" level, to a sentence-level model previously solved with an ILP. Our model scales more efficiently to larger problems because it does not require a quadratic number of variables to address redundancy in pairs of selected sentences. We also show how to include sentence compression in the ILP formulation, which has the desirable property of performing compression and sentence selection simultaneously. The resulting system performs at least as well as the best systems participating in the recent Text Analysis Conference, as judged by a variety of automatic and manual content-based metrics.
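As a rough illustration of the maximum-coverage objective described above (not the authors' ILP formulation or solver), the following brute-force sketch scores subsets of sentences by the total weight of the *distinct* concepts they cover, subject to a length budget. Because a concept counts only once no matter how many selected sentences contain it, redundancy is handled without pairwise sentence variables. All data here (sentence lengths, concept sets, weights, budget) are invented toy values.

```python
from itertools import combinations

def best_summary(sentences, weights, budget):
    """Exhaustively find the sentence subset maximizing the total weight
    of distinct covered concepts within a length budget.
    Toy stand-in for an exact ILP solver; exponential in #sentences."""
    best, best_score = (), 0
    for r in range(len(sentences) + 1):
        for subset in combinations(range(len(sentences)), r):
            length = sum(sentences[i][0] for i in subset)
            if length > budget:
                continue
            # Union of concept sets: overlapping concepts get no double credit.
            covered = set().union(*(sentences[i][1] for i in subset))
            score = sum(weights[c] for c in covered)
            if score > best_score:
                best, best_score = subset, score
    return best, best_score

# Toy data: each sentence is (length, set of concepts it contains).
weights = {"a": 3, "b": 2, "c": 1}
sentences = [(5, {"a"}), (6, {"a", "b"}), (4, {"c"})]
print(best_summary(sentences, weights, budget=10))  # → ((1, 2), 6)
```

Note that sentences 0 and 1 both contain concept "a", so selecting both would waste budget; the coverage objective prefers the pair (1, 2), which covers all three concepts within the budget. In the actual ILP this subset search is replaced by binary variables for sentences and concepts linked by linear constraints, allowing exact solutions at scale.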