Bayesian learning of probabilistic language models
Moving right along: a computational model of metaphoric reasoning about events
AAAI '99/IAAI '99: Proceedings of the Sixteenth National Conference on Artificial Intelligence and the Eleventh Innovative Applications of Artificial Intelligence Conference
Introduction to Bayesian Networks
Semantics and Inference for Recursive Probability Models
Proceedings of the Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence
COLING '02: Proceedings of the 19th International Conference on Computational Linguistics - Volume 1
Question answering based on semantic structures
COLING '04: Proceedings of the 20th International Conference on Computational Linguistics
Constructing grammar: a computational model of the emergence of early constructions
Best-fit constructional analysis
Toward the formal verification of a unification system
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics - Special Issue on Cybernetics and Cognitive Informatics
A statistical test for grammar
CMCL '11: Proceedings of the 2nd Workshop on Cognitive Modeling and Computational Linguistics
The talk will describe an ongoing project (modestly named the Neural Theory of Language) that aims to model language behavior in a way that is both neurally plausible and computationally practical. The cornerstone of the effort is a formalism called Embodied Construction Grammar (ECG). I will describe the formalism, a robust semantic parser based on it, and a variety of moderate-scale applications, including a system for understanding the probabilistic and metaphorical implications of news stories, and the first cognitively plausible model of how children learn grammar.