Since its introduction into the NLP community, pointwise mutual information has proven to be a useful association measure in numerous natural language processing applications, such as collocation extraction and word space models. In its original form, however, it is restricted to the analysis of two-way co-occurrences. Many NLP problems need not be restricted to two-way co-occurrences; often, a particular problem can be tackled more naturally when formulated as a multi-way problem. In this paper, we explore two multivariate generalizations of pointwise mutual information, and examine their nature and usefulness in the extraction of subject-verb-object triples.
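To make the idea concrete, here is a minimal sketch of two standard three-way generalizations of PMI, estimated by maximum-likelihood counts over a toy corpus of subject-verb-object triples. These particular definitions (joint over product of marginals, and a pointwise analogue of interaction information that also involves the pairwise joints) are common candidates in the literature; they are an assumption here, not necessarily the exact formulations the paper compares, and the function names are illustrative.

```python
import math
from collections import Counter

# Toy corpus of (subject, verb, object) triples; in practice these would
# be extracted from a parsed corpus.
triples = [
    ("cat", "eat", "fish"), ("cat", "eat", "mouse"),
    ("dog", "eat", "bone"), ("dog", "chase", "cat"),
    ("cat", "chase", "mouse"), ("man", "eat", "fish"),
]

n = len(triples)
joint = Counter(triples)                          # p(s, v, o)
subj = Counter(t[0] for t in triples)             # p(s)
verb = Counter(t[1] for t in triples)             # p(v)
obj = Counter(t[2] for t in triples)              # p(o)
sv = Counter((t[0], t[1]) for t in triples)       # p(s, v)
so = Counter((t[0], t[2]) for t in triples)       # p(s, o)
vo = Counter((t[1], t[2]) for t in triples)       # p(v, o)


def specific_correlation(s, v, o):
    """Three-way PMI: log of the joint over the product of the marginals."""
    return math.log2((joint[(s, v, o)] / n) /
                     ((subj[s] / n) * (verb[v] / n) * (obj[o] / n)))


def specific_interaction(s, v, o):
    """Pointwise analogue of interaction information: contrasts the
    three-way joint and the marginals with the pairwise joints."""
    num = (joint[(s, v, o)] / n) * (subj[s] / n) * (verb[v] / n) * (obj[o] / n)
    den = (sv[(s, v)] / n) * (so[(s, o)] / n) * (vo[(v, o)] / n)
    return math.log2(num / den)
```

The first measure rewards any deviation of the triple from full independence of its three parts, while the second asks whether the triple carries association beyond what its three pairwise co-occurrences already explain; the two can therefore rank the same candidate triples quite differently.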