Probabilistic reasoning in intelligent systems: networks of plausible inference
On the limited memory BFGS method for large scale optimization
Mathematical Programming: Series A and B
A model for reasoning about persistence and causation
Computational Intelligence
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
Maximum Entropy Markov Models for Information Extraction and Segmentation
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Table extraction using conditional random fields
Proceedings of the 26th annual international ACM SIGIR conference on Research and development in information retrieval
Dynamic Bayesian networks: representation, inference and learning
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Shallow parsing with conditional random fields
NAACL '03 Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology - Volume 1
2D Conditional Random Fields for Web information extraction
ICML '05 Proceedings of the 22nd international conference on Machine learning
Introduction to the CoNLL-2000 shared task: chunking
CoNLL '00 Proceedings of the 2nd workshop on Learning language in logic and the 4th conference on Computational natural language learning - Volume 7
The relationship between Precision-Recall and ROC curves
ICML '06 Proceedings of the 23rd international conference on Machine learning
Hidden-variable models for discriminative reranking
HLT '05 Proceedings of the conference on Human Language Technology and Empirical Methods in Natural Language Processing
Interactive information extraction with constrained conditional random fields
AAAI '04 Proceedings of the 19th national conference on Artificial intelligence
We propose a new discriminative framework, Hidden Dynamic Conditional Random Fields (HDCRFs), for building probabilistic models that capture both the internal and external class dynamics when labeling sequence data. We introduce a small number of hidden state variables to model the sub-structure of an observation sequence and to learn the dynamics between different class labels. An HDCRF offers several advantages over previous discriminative models and is attractive both conceptually and computationally. We performed experiments on three well-established sequence labeling tasks in natural language processing: part-of-speech tagging, noun phrase chunking, and named entity recognition. The results demonstrate the validity and competitiveness of our model. In addition, our model compares favorably with the current state-of-the-art sequence labeling approach, Conditional Random Fields (CRFs), which can model only the external dynamics.
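The hidden-state construction described in the abstract can be illustrated with a forward pass over the joint (label, hidden-state) space: each class label owns a few hidden states that model sub-structure, and normalization sums over all hidden-state paths. The sketch below is an assumption-laden illustration, not the authors' implementation: the state layout, the random scores, and the name `log_partition` are ours.

```python
import numpy as np

def log_partition(emit, trans):
    """Forward algorithm over S joint states (here S = n_labels * n_hidden).

    emit:  (T, S) per-position scores for each joint (label, hidden) state.
    trans: (S, S) transition scores between joint states.
    Returns log Z, the normalizer summing over all joint-state paths,
    as needed to turn the scores into a conditional probability.
    """
    alpha = emit[0].copy()                      # log-scores after position 0
    for t in range(1, emit.shape[0]):
        m = alpha.max()                         # shift for numerical stability
        # alpha'[s'] = emit[t][s'] + logsumexp_s(alpha[s] + trans[s, s'])
        alpha = emit[t] + m + np.log(np.exp(alpha - m) @ np.exp(trans))
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())

# Toy usage: 2 labels with 2 hidden states each -> 4 joint states,
# scores drawn at random purely for illustration.
rng = np.random.default_rng(0)
T, n_labels, n_hidden = 5, 2, 2
S = n_labels * n_hidden
log_Z = log_partition(rng.normal(size=(T, S)), rng.normal(size=(S, S)))
```

Marginalizing over the hidden states like this is what distinguishes the hidden-state family from a plain linear-chain CRF, whose forward pass runs over the label space alone.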