Artificial Intelligence - Special volume on natural language processing
Machine Learning
Parsing the WSJ using CCG and log-linear models
ACL '04 Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics
Wide-coverage semantic representations from a CCG parser
COLING '04 Proceedings of the 20th International Conference on Computational Linguistics
Recognising textual entailment with logical inference
HLT '05 Proceedings of the Conference on Human Language Technology and Empirical Methods in Natural Language Processing
Introduction to Statistical Relational Learning (Adaptive Computation and Machine Learning)
A structured vector space model for word meaning in context
EMNLP '08 Proceedings of the Conference on Empirical Methods in Natural Language Processing
NAACL-Short '06 Proceedings of the Human Language Technology Conference of the NAACL, Companion Volume: Short Papers
An extended model of natural logic
IWCS-8 '09 Proceedings of the Eighth International Conference on Computational Semantics
EMNLP '09 Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: Volume 1
Contextualizing semantic representations using syntactically enriched vector models
ACL '10 Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics
Exemplar-based models for word meaning in context
ACLShort '10 Proceedings of the ACL 2010 Conference Short Papers
Recognizing Inference in Texts with Markov Logic Networks
ACM Transactions on Asian Language Information Processing (TALIP) - Special Issue on RITE
Leveraging Diverse Lexical Resources for Textual Entailment Recognition
ACM Transactions on Asian Language Information Processing (TALIP) - Special Issue on RITE
Semantic compositionality through recursive matrix-vector spaces
EMNLP-CoNLL '12 Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning
First-order vs. higher-order modification in distributional semantics
EMNLP-CoNLL '12 Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning
First-order logic provides a powerful and flexible mechanism for representing natural language semantics. However, it remains an open question how best to integrate it with uncertain, probabilistic knowledge, for example regarding word meaning. This paper describes the first steps of an approach to recasting first-order semantics into the probabilistic models of Statistical Relational AI. Specifically, we show how Discourse Representation Structures can be combined with distributional models of word meaning inside a Markov Logic Network and used to successfully perform inferences that take advantage of logical concepts such as factivity as well as probabilistic information on word meaning in context.
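The core idea of the abstract, attaching weights derived from distributional word similarity to soft inference rules in a Markov Logic Network, can be sketched as follows. This is an illustrative toy, not the paper's implementation: the vectors, words, and the log-odds weight mapping are all assumptions made for the example.

```python
import math

# Toy distributional vectors (illustrative values, not real corpus counts)
vectors = {
    "fix":    [0.9, 0.1, 0.3],
    "repair": [0.8, 0.2, 0.3],
    "eat":    [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def rule_weight(w1, w2):
    """Map similarity in (0, 1) to a log-odds weight for a soft MLN rule,
    so high-similarity word pairs yield strongly positive weights and
    dissimilar pairs yield negative ones (one possible mapping, assumed here)."""
    sim = min(max(cosine(vectors[w1], vectors[w2]), 1e-9), 1 - 1e-9)
    return math.log(sim / (1 - sim))

# Weighted soft rules of the form: forall x. w1(x) => w2(x)
print(f"{rule_weight('fix', 'repair'):+.2f}  fix(x) => repair(x)")
print(f"{rule_weight('fix', 'eat'):+.2f}  fix(x) => eat(x)")
```

In a full system the weighted clauses would be handed to an MLN inference engine alongside the hard first-order clauses obtained from the Discourse Representation Structures; here only the weight computation is shown.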