Foundations of statistical natural language processing
CICLing '02 Proceedings of the Third International Conference on Computational Linguistics and Intelligent Text Processing
Boosting Applied to Word Sense Disambiguation
ECML '00 Proceedings of the 11th European Conference on Machine Learning
An Iterative Approach to Word Sense Disambiguation
Proceedings of the Thirteenth International Florida Artificial Intelligence Research Society Conference
A Baseline Methodology for Word Sense Disambiguation
CICLing '02 Proceedings of the Third International Conference on Computational Linguistics and Intelligent Text Processing
Feature Selection Analysis for Maximum Entropy-Based WSD
CICLing '02 Proceedings of the Third International Conference on Computational Linguistics and Intelligent Text Processing
Maximum entropy models for natural language ambiguity resolution
A non-projective dependency parser
ANLC '97 Proceedings of the fifth conference on Applied natural language processing
Integrating multiple knowledge sources to disambiguate word sense: an exemplar-based approach
ACL '96 Proceedings of the 34th annual meeting on Association for Computational Linguistics
Multidimensional transformation-based learning
ConLL '01 Proceedings of the 2001 workshop on Computational Natural Language Learning - Volume 7
Experiments in word domain disambiguation for parallel texts
WorkSense '00 Proceedings of the ACL-2000 Workshop on Word Senses and Multi-Linguality
This paper presents an evaluation of several feature selections for word sense disambiguation. The method used to classify each linguistic context into its correct sense is based on maximum entropy probability models. In order to study their relevance for each word, several types of features have been analyzed for a selection of words from the DSO corpus. An improved definition of the features, which increases efficiency, is presented as well.
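The classification approach the abstract describes can be illustrated with a minimal sketch: a maximum-entropy (multinomial logistic) sense classifier over binary context features, trained by gradient ascent on the conditional log-likelihood. The senses, contexts, and features below are invented toy data, not from the DSO corpus, and the training setup (fixed step size, no regularization) is an assumption for brevity.

```python
import math

# Hypothetical toy data: contexts of the ambiguous word "bank",
# each represented as a set of binary context-word features.
SENSES = ["finance", "river"]
TRAIN = [
    ({"money", "deposit"}, "finance"),
    ({"loan", "money"}, "finance"),
    ({"water", "shore"}, "river"),
    ({"fish", "water"}, "river"),
]
FEATURES = sorted({f for ctx, _ in TRAIN for f in ctx})

# One weight per (sense, feature) pair.
w = {(s, f): 0.0 for s in SENSES for f in FEATURES}

def probs(ctx):
    # Conditional maximum-entropy model: p(sense | context) is a
    # normalized exponential of the summed active-feature weights.
    scores = {s: math.exp(sum(w[(s, f)] for f in ctx if f in FEATURES))
              for s in SENSES}
    z = sum(scores.values())
    return {s: scores[s] / z for s in SENSES}

# Gradient ascent on the log-likelihood: the gradient for each
# (sense, feature) is observed count minus expected count.
for _ in range(200):
    for ctx, gold in TRAIN:
        p = probs(ctx)
        for s in SENSES:
            grad = (1.0 if s == gold else 0.0) - p[s]
            for f in ctx:
                if f in FEATURES:
                    w[(s, f)] += 0.5 * grad

def classify(ctx):
    p = probs(ctx)
    return max(p, key=p.get)
```

Feature selection, as studied in the paper, then amounts to choosing which context attributes enter `FEATURES`; in practice these would include collocations, part-of-speech tags, and surrounding words rather than the bare word sets used here.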