Introduction to the special issue on evaluating word sense disambiguation systems
Natural Language Engineering
Evaluating sense disambiguation across diverse parameter spaces
Natural Language Engineering
Parameter optimization for machine-learning of word sense disambiguation
Natural Language Engineering
Word sense disambiguation with pattern learning and automatic feature selection
Natural Language Engineering
Learning from little: comparison of classifiers given little training
PKDD '04 Proceedings of the 8th European Conference on Principles and Practice of Knowledge Discovery in Databases
YALE: rapid prototyping for complex data mining tasks
Proceedings of the 12th ACM SIGKDD international conference on Knowledge discovery and data mining
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition (Morgan Kaufmann Series in Data Management Systems)
Building an optimal WSD ensemble using per-word selection of best system
CIARP'06 Proceedings of the 11th Iberoamerican conference on Progress in Pattern Recognition, Image Analysis and Applications
Combining heterogeneous classifiers for word-sense disambiguation
SENSEVAL '01 The Proceedings of the Second International Workshop on Evaluating Word Sense Disambiguation Systems
Machine learning with lexical features: the Duluth approach to Senseval-2
SENSEVAL '01 The Proceedings of the Second International Workshop on Evaluating Word Sense Disambiguation Systems
Enhancing Cross-Language Question Answering by Combining Multiple Question Translations
CICLing '07 Proceedings of the 8th International Conference on Computational Linguistics and Intelligent Text Processing
How context and semantic information can help a machine learning system?
MICAI'07 Proceedings of the artificial intelligence 6th Mexican international conference on Advances in artificial intelligence
Based on a recent evaluation of word sense disambiguation (WSD) systems [10], disambiguation methods have reached a standstill. In [10] we showed that it is possible to predict the best system for a target word using word features, and that with this 'optimal ensembling' method more accurate WSD ensembles can be built (3-5% over Senseval state-of-the-art systems, with a comparable amount of potential still remaining). In the interest of developing still more accurate ensembles, we here define the strong regions of three popular and effective classifiers used for the WSD task (Naive Bayes – NB, Support Vector Machine – SVM, Decision Rules – D) in terms of word features (word grain, number of positive and negative training examples, dominant sense ratio). We also discuss the effect of the remaining, feature-based factors.
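The per-word selection idea above can be sketched in a few lines. The sketch below is purely illustrative and hypothetical: the thresholds and the `select_classifier` rule are invented for the example, whereas the actual strong regions are derived empirically from Senseval data in the paper.

```python
# Hypothetical sketch of per-word "optimal ensembling": a meta-selector
# chooses one base WSD classifier (NB, SVM, or decision rules D) for each
# target word from simple word-level features. All thresholds are made up
# for illustration only.

def select_classifier(grain, n_pos, n_neg, dominant_ratio):
    """Pick a base classifier for one target word.

    grain          -- number of senses (granularity) of the word
    n_pos, n_neg   -- counts of positive/negative training examples
    dominant_ratio -- share of instances tagged with the dominant sense
    """
    if dominant_ratio > 0.9:      # near-monosemous word: simple rules suffice
        return "D"
    if n_pos + n_neg < 50:        # very little training data
        return "NB"
    return "SVM"                  # otherwise favour the margin-based learner

def ensemble_predict(word_features, base_predictions):
    """Return (chosen classifier, its prediction) for one test instance."""
    chosen = select_classifier(**word_features)
    return chosen, base_predictions[chosen]

if __name__ == "__main__":
    feats = {"grain": 4, "n_pos": 30, "n_neg": 10, "dominant_ratio": 0.6}
    preds = {"NB": "sense_2", "SVM": "sense_1", "D": "sense_2"}
    print(ensemble_predict(feats, preds))  # -> ('NB', 'sense_2')
```

The point of the design is that the ensemble never mixes votes: for each word it delegates entirely to the classifier predicted to be strongest in that word's feature region.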