How many bits are needed to store probabilities for phrase-based translation?

  • Authors:
  • Marcello Federico; Nicola Bertoldi

  • Affiliations:
  • ITC-irst -- Centro per la Ricerca Scientifica e Tecnologica, Povo -- Trento, Italy (both authors)

  • Venue:
  • StatMT '06: Proceedings of the Workshop on Statistical Machine Translation
  • Year:
  • 2006

Abstract

The state of the art in statistical machine translation is currently represented by phrase-based models, which typically incorporate a large number of probabilities of phrase pairs and word n-grams. In this work, we investigate data compression methods for efficiently encoding n-gram and phrase-pair probabilities, which are usually stored as 32-bit floating point numbers. We measured the impact of compression on translation quality through a phrase-based decoder trained on two distinct tasks: the translation of European Parliament speeches from Spanish to English, and the translation of news agency texts from Chinese to English. We show that with a very simple quantization scheme all probabilities can be encoded in just 4 bits, with a relative loss in BLEU score of only 1.0% and 1.6% on the two tasks, respectively.
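The abstract does not spell out the quantization scheme itself. As an illustration only, the sketch below assumes one plausible "very simple" instance: equal-population binning of log-probabilities into 2^4 = 16 levels, with each probability replaced by a 4-bit index into a small codebook of bin centroids. The function names (build_codebook, quantize, dequantize) and the binning choice are hypothetical, not taken from the paper.

```python
# Minimal sketch of 4-bit probability quantization, assuming an
# equal-population binning scheme over log-probabilities. This is an
# illustrative stand-in, not the authors' actual method.
import numpy as np

def build_codebook(probs, bits=4):
    """Split sorted log-probabilities into 2**bits equal-population bins;
    each bin's mean log-probability becomes one codebook entry."""
    levels = 2 ** bits
    logp = np.sort(np.log(probs))
    bins = np.array_split(logp, levels)
    return np.array([b.mean() for b in bins])

def quantize(probs, codebook):
    """Map each probability to the index (0..15) of the nearest
    codebook entry; each index fits in 4 bits."""
    logp = np.log(probs)[:, None]
    return np.abs(logp - codebook[None, :]).argmin(axis=1).astype(np.uint8)

def dequantize(codes, codebook):
    """Recover approximate probabilities from the 4-bit codes."""
    return np.exp(codebook[codes])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    probs = rng.dirichlet(np.ones(10_000))   # toy stand-in for a phrase table
    codebook = build_codebook(probs)         # 16 float centroids
    codes = quantize(probs, codebook)        # one 4-bit code per probability
    approx = dequantize(codes, codebook)
    err = np.abs(np.log(approx) - np.log(probs)).mean()
    print(f"mean |delta log p| = {err:.4f} using {len(codebook)} levels")
```

Under this kind of scheme, only the 16-entry codebook is stored in full precision; every probability in the model shrinks from 32 bits to 4, an 8x reduction, at the cost of the small rounding error measured above.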