Efficient text summarization using lexical chains. Proceedings of the 5th International Conference on Intelligent User Interfaces.
Lexical semantics and automatic hypertext construction. ACM Computing Surveys (CSUR).
Lexical cohesion computed by thesaural relations as an indicator of the structure of text. Computational Linguistics.
Word sense disambiguation and text segmentation based on lexical cohesion. COLING '94: Proceedings of the 15th Conference on Computational Linguistics, Volume 2.
Evaluation of a sentence ranker for text summarization based on Roget's thesaurus. TSD '10: Proceedings of the 13th International Conference on Text, Speech and Dialogue.
Morris and Hirst [10] present a method of linking significant words that are about the same topic. The resulting lexical chains are a means of identifying cohesive regions in a text, with applications in many natural language processing tasks, including text summarization. The first lexical chains were constructed manually using Roget's International Thesaurus. Morris and Hirst wrote that automation would be straightforward given an electronic thesaurus. All applications so far have used WordNet to produce lexical chains, perhaps because adequate electronic versions of Roget's were not available until recently. We discuss the building of lexical chains using an electronic version of Roget's Thesaurus. We implement a variant of the original algorithm, and explain the necessary design decisions. We include a comparison with other implementations.
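The core idea of lexical chaining can be illustrated with a minimal sketch. The code below is not the authors' algorithm; it is a toy greedy chainer under the assumption that a thesaurus maps each word to a set of category heads (a stand-in for Roget's numbered heads), and that two words are related when their head sets intersect. The `THESAURUS` dictionary and the `build_chains` function are hypothetical names introduced here for illustration.

```python
# Hypothetical toy thesaurus: word -> set of category heads
# (a stand-in for the ~1000 numbered heads in Roget's Thesaurus).
THESAURUS = {
    "car": {"vehicle"}, "truck": {"vehicle"},
    "wheel": {"vehicle", "part"}, "engine": {"vehicle", "machine"},
    "banana": {"fruit"}, "apple": {"fruit"},
}

def build_chains(words, thesaurus):
    """Greedily append each word to the first chain that shares a
    thesaurus head with it; otherwise start a new chain.
    Each chain is a pair [member_words, shared_heads]."""
    chains = []
    for w in words:
        heads = thesaurus.get(w, set())
        for chain in chains:
            common = chain[1] & heads
            if common:
                chain[0].append(w)
                chain[1] = common  # narrow the chain to the heads all members share
                break
        else:
            chains.append([[w], set(heads)])
    return chains

chains = build_chains(
    ["car", "banana", "truck", "apple", "engine", "wheel"], THESAURUS
)
# Two chains emerge: the "vehicle" words and the "fruit" words,
# i.e. two cohesive regions of the word sequence.
```

A real implementation must additionally decide which words are "significant" (e.g. skipping function words), how far apart chain members may be, and which thesaurus relation counts as a link, which are exactly the design decisions the paper discusses.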