Foundations of logic programming
Fundamentals of speech recognition
Context-sensitive statistics for improved grammatical language models
AAAI '94 Proceedings of the Twelfth National Conference on Artificial Intelligence (Vol. 1)
An efficient probabilistic context-free parsing algorithm that computes prefix probabilities
Computational Linguistics
Foundations of statistical natural language processing
Expert Systems and Probabilistic Network Models
Computing extended abduction through transaction programs
Annals of Mathematics and Artificial Intelligence
OLD Resolution with Tabulation
Proceedings of the Third International Conference on Logic Programming
Parameterized Logic Programs where Computing Meets Learning
FLOPS '01 Proceedings of the 5th International Symposium on Functional and Logic Programming
Efficient EM Learning with Tabulation for Parameterized Logic Programs
CL '00 Proceedings of the First International Conference on Computational Logic
PRISM: a language for symbolic-statistical modeling
IJCAI '97 Proceedings of the Fifteenth International Joint Conference on Artificial Intelligence - Volume 2
We first review a logical-statistical framework called statistical abduction and identify its three computational tasks, one of which is learning parameters from observations by ML (maximum likelihood) estimation. Traditionally, in the presence of missing values, the EM algorithm has been used for ML estimation. We report that the graphical EM algorithm, a new EM algorithm developed for statistical abduction, achieves the same time complexity as specialized EM algorithms developed in individual disciplines, such as the Inside-Outside algorithm for PCFGs (probabilistic context-free grammars). Furthermore, learning experiments with two corpora revealed that it can outperform the Inside-Outside algorithm by orders of magnitude. We then look specifically into a family of extensions of PCFGs that incorporate context sensitivity into PCFGs. Experiments show that these extensions are learnable by the graphical EM algorithm using at most twice as much time as plain PCFGs, even though they have higher time complexity.
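The abstract's central idea, ML estimation by EM in the presence of hidden (missing) variables, can be illustrated on a toy problem. The sketch below is not the graphical EM algorithm of the paper; it is a minimal generic EM loop for a classic two-coin mixture, where each trial uses one of two biased coins but the choice of coin is unobserved. All names and numbers are illustrative assumptions.

```python
def em_two_coins(flips, n_iter=50, theta=(0.6, 0.4)):
    """Generic EM for ML estimation with a hidden variable (toy example,
    not the paper's graphical EM algorithm).

    flips: list of (heads, tails) counts, one pair per trial; which coin
    produced each trial is the missing value. Returns the estimated head
    probabilities (theta_A, theta_B)."""
    tA, tB = theta
    for _ in range(n_iter):
        sA_h = sA_t = sB_h = sB_t = 0.0
        for h, t in flips:
            # E-step: posterior responsibility of coin A for this trial,
            # assuming a uniform prior over the two coins
            lA = (tA ** h) * ((1 - tA) ** t)
            lB = (tB ** h) * ((1 - tB) ** t)
            rA = lA / (lA + lB)
            sA_h += rA * h
            sA_t += rA * t
            sB_h += (1 - rA) * h
            sB_t += (1 - rA) * t
        # M-step: re-estimate each coin's bias from expected counts
        tA = sA_h / (sA_h + sA_t)
        tB = sB_h / (sB_h + sB_t)
    return tA, tB
```

Each iteration increases the likelihood of the observed counts, which is the same guarantee the EM-style algorithms mentioned in the abstract (graphical EM, Inside-Outside) provide for their respective models.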