This paper presents a new application of the recently proposed machine learning method Alternating Structure Optimization (ASO) to word sense disambiguation (WSD). Given a set of WSD problems and their respective labeled examples, we seek to improve overall performance on that set by using all the labeled examples for the entire set, irrespective of target words, when learning a disambiguator for each individual problem. Thus, in effect, on each individual problem (e.g., disambiguation of "art") we benefit from the training examples for other problems (e.g., disambiguation of "bar", "canal", and so forth). We empirically study the effective use of ASO for this purpose in both the multitask and semi-supervised learning configurations. Our results rival or exceed those of the previous best systems on several Senseval lexical sample task data sets.
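The core idea — extracting a structure shared across many per-word classifiers and reusing it for each individual problem — can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation: the full ASO algorithm alternates between fitting task predictors and re-estimating the shared subspace, whereas this sketch makes a single pass with hypothetical ridge-regression predictors and recovers the shared projection via SVD of the stacked weight vectors.

```python
import numpy as np

def aso_shared_structure(X_per_task, y_per_task, h=2, reg=1e-2):
    """Single-pass sketch of the ASO shared-structure step (assumption:
    ridge regression stands in for the per-task linear predictors).

    Fit one linear predictor per task, stack the weight vectors into a
    d x m matrix, and take the top-h left singular vectors as a shared
    projection Theta (h x d)."""
    weights = []
    for X, y in zip(X_per_task, y_per_task):
        d = X.shape[1]
        # closed-form ridge solution: w = (X^T X + reg*I)^{-1} X^T y
        w = np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ y)
        weights.append(w)
    W = np.stack(weights, axis=1)            # d x m matrix of task predictors
    U, _, _ = np.linalg.svd(W, full_matrices=False)
    return U[:, :h].T                        # Theta: shared h x d projection

def augment(X, theta):
    """Append the shared low-dimensional features Theta*x to each example,
    so a disambiguator for one word can exploit structure learned from
    the labeled examples of all the other words."""
    return np.hstack([X, X @ theta.T])
```

In the multitask configuration described in the abstract, each WSD problem (one target word) would contribute one column to `W`; the augmented features then feed an ordinary supervised learner for each individual problem.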