Blocking Gibbs sampling in very large probabilistic expert systems. International Journal of Human-Computer Studies, special issue: real-world applications of uncertain reasoning.
A maximum entropy approach to natural language processing. Computational Linguistics.
Document Image Decoding Using Markov Source Models. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data. ICML '01, Proceedings of the Eighteenth International Conference on Machine Learning.
Structuring documents according to their table of contents. Proceedings of the 2005 ACM Symposium on Document Engineering.
Efficient Geometric Algorithms for Parsing in Two Dimensions. ICDAR '05, Proceedings of the Eighth International Conference on Document Analysis and Recognition.
DIAL '06, Proceedings of the Second International Conference on Document Image Analysis for Libraries.
Pattern Recognition and Machine Learning (Information Science and Statistics).
Relational Dependency Networks. The Journal of Machine Learning Research.
Simulated Iterative Classification: A New Learning Procedure for Graph Labeling. ECML PKDD '09, Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases, Part II.
We address the problems of structuring and annotating layout-oriented documents. We model annotation as collective classification on graph-like structures whose typed instances and links capture domain-specific knowledge. We use relational dependency networks (RDNs) for collective inference on these multi-typed graphs, and then describe a variant of RDNs in which a stacked approximation replaces Gibbs sampling in order to accelerate inference. We report evaluation results for both Gibbs sampling and stacked inference on two document structuring tasks.
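The contrast drawn in the abstract can be illustrated with a small sketch. This is not the authors' implementation; it only shows the general shape of stacked inference for collective graph labeling: each stage re-classifies every node from its own features plus the labels its neighbors received at the previous stage, so a fixed number of feed-forward passes replaces the long sampling loop of Gibbs inference. All names, the toy voting classifier, and the example graph below are illustrative assumptions.

```python
# Hypothetical sketch of stacked inference for graph labeling;
# not taken from the paper. A real system would plug in a learned
# classifier (one per stage) instead of the toy `vote` rule.

def stacked_inference(nodes, neighbors, base_predict, n_stages=3):
    """nodes: {id: feature}, neighbors: {id: [neighbor ids]},
    base_predict(feature, neighbor_labels) -> label."""
    # Stage 0: predict each label from local features alone.
    labels = {v: base_predict(f, []) for v, f in nodes.items()}
    for _ in range(n_stages):
        # Each later stage re-predicts using the previous stage's
        # neighbor labels as extra (relational) features.
        labels = {
            v: base_predict(nodes[v], [labels[u] for u in neighbors[v]])
            for v in nodes
        }
    return labels

def vote(feature, neighbor_labels):
    # Toy classifier: override the local feature only when at least
    # two neighbors unanimously disagree with it.
    if len(neighbor_labels) >= 2 and len(set(neighbor_labels)) == 1:
        return neighbor_labels[0]
    return feature

# A 4-node chain in which one noisy node (3) is smoothed by its neighbors.
nodes = {1: "a", 2: "a", 3: "b", 4: "a"}
neighbors = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(stacked_inference(nodes, neighbors, vote))
# → {1: 'a', 2: 'a', 3: 'a', 4: 'a'}
```

Unlike Gibbs sampling, which resamples labels until the chain mixes, the stacked pass runs a fixed, small number of stages, which is the source of the speed-up the abstract refers to.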