Splitting-merging model of Chinese word tokenization and segmentation

  • Authors: Yuan Yao; Kim Ten Lua

  • Affiliations: Department of Information Systems & Computer Science, National University of Singapore, Lower Kent Ridge Road, Singapore 119260 (both authors); e-mail: yaoyuan@iscs.nw.edu.sg, luakt@iscs.nw.edu.sg

  • Venue: Natural Language Engineering
  • Year: 1998


Abstract

This paper presents a number of linguistic and computational issues identified during the implementation of a general use grammar checker for contemporary Brazilian Portuguese, ReGra, that has been incorporated in the word processor REDATOR by Itautec/Philco ...