Dimensionality reduction has been shown to improve processing and information extraction from high-dimensional data. Word space algorithms typically employ linear reduction techniques that assume the space is Euclidean. We investigate the effects of extracting nonlinear structure in the word space using Locality Preserving Projections, a reduction algorithm that performs manifold learning. We apply this reduction to two common word space models and show improved performance over the original models on benchmarks.
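As a rough illustration of the kind of reduction described above, the following is a minimal sketch of Locality Preserving Projections applied to a toy word-by-context count matrix. It is not the paper's implementation; the function name, neighborhood size, binary edge weights, and ridge term are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, n_components=2, n_neighbors=5, reg=1e-6):
    """Locality Preserving Projections (illustrative sketch).

    X: (n_words, n_features) word-context matrix, one row per word.
    Returns the projection matrix A and the embedded rows X @ A.
    """
    n = X.shape[0]
    # Build a symmetric k-nearest-neighbor adjacency graph over the rows
    # (binary weights here for simplicity; heat-kernel weights also common).
    dist = cdist(X, X)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist[i])[1:n_neighbors + 1]  # skip self at index 0
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)
    D = np.diag(W.sum(axis=1))
    L = D - W  # graph Laplacian

    # Generalized eigenproblem  X^T L X a = lambda X^T D X a;
    # the eigenvectors with smallest eigenvalues give the projection.
    A_mat = X.T @ L @ X
    B_mat = X.T @ D @ X + reg * np.eye(X.shape[1])  # ridge keeps B positive definite
    _, vecs = eigh(A_mat, B_mat)  # eigenvalues returned in ascending order
    A = vecs[:, :n_components]
    return A, X @ A

# Toy data: 20 "words" described by 10 co-occurrence context counts.
rng = np.random.default_rng(0)
X = rng.poisson(2.0, size=(20, 10)).astype(float)
A, Y = lpp(X, n_components=2)
print(Y.shape)
```

Unlike the linear techniques (e.g. SVD) the abstract contrasts against, the projection here is chosen to preserve the local neighborhood graph of the data, which is what lets it follow nonlinear (manifold) structure while still yielding a linear map that can embed unseen words.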