LEXTER is a software package for terminology extraction. A corpus of French-language texts on any subject field is fed in, and LEXTER produces a list of likely terminological units to be submitted to an expert for validation. To identify the terminological units, LEXTER takes their form into account and proceeds in two main stages: analysis and parsing. In the first stage, LEXTER uses a base of rules designed to identify frontier markers in order to analyse the texts and extract maximal-length noun phrases. In the second stage, LEXTER parses these maximal-length noun phrases to extract subgroups which, by virtue of their grammatical structure and their place within the maximal-length noun phrases, are likely to be terminological units. This article highlights both the type of analysis used (surface grammatical analysis) and the methodological approach adopted to refine the rules (an experimental approach).
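The two-stage process described above can be sketched in miniature. This is not LEXTER's actual rule base: the frontier-marker list below is a tiny illustrative set of French function words and punctuation, and the subgroup step simply enumerates contiguous spans, whereas the real parser selects subgroups by grammatical structure.

```python
# A minimal sketch of the two LEXTER stages, under stated assumptions:
# the marker set and the token-level splitting are illustrative, not
# the system's actual rules.

# Hypothetical frontier markers: words and punctuation assumed unable
# to occur inside a noun phrase, so they bound maximal-length NPs.
FRONTIER_MARKERS = {"est", "sont", "qui", "que", "dans", "pour",
                    "par", ",", ".", ";", ":"}

def extract_maximal_nps(tokens):
    """Stage 1 (analysis): split the token stream at frontier markers,
    keeping the maximal-length noun-phrase candidates in between."""
    phrases, phrase = [], []
    for tok in tokens:
        if tok.lower() in FRONTIER_MARKERS:
            if phrase:
                phrases.append(phrase)
                phrase = []
        else:
            phrase.append(tok)
    if phrase:
        phrases.append(phrase)
    return phrases

def extract_subgroups(phrase):
    """Stage 2 (parsing): enumerate contiguous subgroups of a maximal
    noun phrase as candidate terminological units (the real parser
    keeps only grammatically well-formed subgroups)."""
    return [phrase[i:j]
            for i in range(len(phrase))
            for j in range(i + 1, len(phrase) + 1)]

tokens = "le réseau de distribution est alimenté".split()
nps = extract_maximal_nps(tokens)
candidates = extract_subgroups(nps[0])
```

In this toy run, "est" acts as a frontier marker, so "le réseau de distribution" is kept as a maximal-length noun phrase, and stage 2 then yields its contiguous subgroups as candidate units for expert validation.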