Artificial intelligence: a modern approach
Coarse-to-Fine Dynamic Programming
IEEE Transactions on Pattern Analysis and Machine Intelligence
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
A* parsing: fast exact Viterbi parse selection
NAACL '03 Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology - Volume 1
Shallow parsing with conditional random fields
NAACL '03 Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology - Volume 1
Fast inference and learning in large-state-space HMMs
ICML '05 Proceedings of the 22nd international conference on Machine learning
EMNLP '02 Proceedings of the ACL-02 conference on Empirical methods in natural language processing - Volume 10
Bidirectional inference with the easiest-first strategy for tagging sequence data
HLT '05 Proceedings of the conference on Human Language Technology and Empirical Methods in Natural Language Processing
Multilevel coarse-to-fine PCFG parsing
HLT-NAACL '06 Proceedings of the main conference on Human Language Technology Conference of the North American Chapter of the Association of Computational Linguistics
The generalized A* architecture
Journal of Artificial Intelligence Research
Efficient HPSG parsing with supertagging and CFG-filtering
IJCAI '07 Proceedings of the 20th International Joint Conference on Artificial Intelligence
Parsing '05 Proceedings of the Ninth International Workshop on Parsing Technology
Efficient inference of CRFs for large-scale natural language data
ACLShort '09 Proceedings of the ACL-IJCNLP 2009 Conference Short Papers
ACL '09 Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP: Volume 2 - Volume 2
CarpeDiem: Optimizing the Viterbi Algorithm and Applications to Supervised Sequential Learning
The Journal of Machine Learning Research
Efficient staggered decoding for sequence labeling
ACL '10 Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics
Efficient inference in large conditional random fields
ECML'06 Proceedings of the 17th European conference on Machine Learning
Error bounds for convolutional codes and an asymptotically optimum decoding algorithm
IEEE Transactions on Information Theory
Sequential modeling has been widely used in a variety of important applications, including named entity recognition and shallow parsing. However, as more real-time, large-scale tagging applications arise, decoding speed has become a bottleneck for existing sequential tagging algorithms. In this paper we propose 1-best A*, 1-best iterative A*, k-best A*, and k-best iterative Viterbi A* algorithms for sequential decoding. We demonstrate the efficiency of these algorithms on five NLP tagging tasks. In particular, we show that iterative Viterbi A* decoding can be several times to orders of magnitude faster than the state-of-the-art algorithm on tagging tasks with a large number of labels. This algorithm makes real-time, large-scale tagging applications with thousands of labels feasible.
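For context, the baseline that the proposed A* variants accelerate is standard 1-best Viterbi decoding, which costs O(T·L²) per sequence of length T with L labels — the term that becomes prohibitive when L reaches thousands. The following is a minimal sketch of that baseline in log space; the score matrices are illustrative placeholders, not the models evaluated in the paper.

```python
import math

def viterbi(emissions, transitions):
    """Return the highest-scoring label sequence (1-best Viterbi).

    emissions:   length-T list; emissions[t][l] is the log-score of
                 label l at position t
    transitions: transitions[i][j] is the log-score of label i -> label j
    """
    T, L = len(emissions), len(emissions[0])
    score = list(emissions[0])             # best score ending in each label
    backptr = [[0] * L for _ in range(T)]  # best previous label per cell

    for t in range(1, T):
        new_score = [0.0] * L
        for j in range(L):
            # maximize score[i] + transitions[i][j] over previous labels i
            best_i = max(range(L), key=lambda i: score[i] + transitions[i][j])
            backptr[t][j] = best_i
            new_score[j] = score[best_i] + transitions[best_i][j] + emissions[t][j]
        score = new_score

    # follow back-pointers from the best final label
    path = [max(range(L), key=lambda l: score[l])]
    for t in range(T - 1, 0, -1):
        path.append(backptr[t][path[-1]])
    return path[::-1]

# Toy usage: with zero transition scores, decoding reduces to the
# per-position argmax of the emission scores.
em = [[math.log(0.9), math.log(0.1)],
      [math.log(0.1), math.log(0.9)],
      [math.log(0.8), math.log(0.2)]]
tr = [[0.0, 0.0], [0.0, 0.0]]
print(viterbi(em, tr))  # -> [0, 1, 0]
```

The inner double loop over label pairs is exactly the cost that A*-style and iterative coarse-to-fine decoders avoid touching in full: they expand only the label hypotheses whose heuristic upper bound can still beat the current best path.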