Latent Semantic Analysis (LSA) is a well-known method for information retrieval. It has also been applied as a model of cognitive processing and word-meaning acquisition. This dual importance of LSA derives from its capacity to modulate the meaning of words by context, dealing successfully with polysemy and synonymy. The underlying reasons that make the method work, however, remain poorly understood. We propose that the method works because it detects an underlying block structure (the blocks corresponding to topics) in the term-by-document matrix; in real cases this block structure is hidden by perturbations. We further propose that the correct explanation for LSA should be sought in the structure of the singular vectors rather than in the profile of the singular values. Using Perron-Frobenius theory, we show that the presence of disjoint blocks of documents is marked by sign-homogeneous entries in the singular vectors over the documents of one block and zeros elsewhere. In the case of nearly disjoint blocks, perturbation theory shows that if the perturbations are small, the zeros in the leading vectors are replaced by small numbers (pseudo-zeros). Since the singular values of the individual blocks may differ greatly in magnitude, their order does not mirror the order of the blocks. When the norms of the blocks are similar, LSA works well; but when the topics have different sizes, we propose that the usual procedure of selecting the first k singular triplets (k being the number of blocks) should be replaced by a method that selects the perturbed Perron vector of each block.
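The block-structure argument above can be illustrated numerically. The following sketch (not the authors' code; the block sizes, value ranges, and perturbation level are arbitrary choices for illustration) builds a block-diagonal term-by-document matrix with two disjoint topics, computes its SVD, and checks that each leading right singular vector is sign-homogeneous on one topic's documents and zero on the other's; a small cross-topic perturbation then turns those zeros into pseudo-zeros:

```python
import numpy as np

rng = np.random.default_rng(0)

# Term-by-document matrix with two disjoint topic blocks:
# terms 0-3 appear only in documents 0-2, terms 4-7 only in documents 3-5.
A = np.zeros((8, 6))
A[:4, :3] = rng.uniform(1, 2, size=(4, 3))  # topic 1 block (positive entries)
A[4:, 3:] = rng.uniform(1, 2, size=(4, 3))  # topic 2 block (positive entries)

# For a block-diagonal matrix the SVD decouples: each block contributes its
# own singular triplets, and by Perron-Frobenius the leading right singular
# vector of a positive block is sign-homogeneous on that block's documents
# and exactly zero on the other block's documents.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.round(Vt[0], 3))  # nonzero on one block of documents only
print(np.round(Vt[1], 3))  # nonzero on the other block only

# A small cross-topic perturbation (shared vocabulary, noise) replaces the
# exact zeros with small pseudo-zeros, hiding but not destroying the blocks.
E = 0.05 * rng.uniform(0, 1, size=A.shape)
Up, sp, Vpt = np.linalg.svd(A + E, full_matrices=False)
print(np.round(Vpt[0], 3))  # pseudo-zeros where the exact zeros were
```

Note that which block owns the first triplet depends on the blocks' norms, not on any intrinsic ordering — this is exactly why, in the abstract's argument, picking the first k triplets can miss a small topic.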