Learning Information Extraction Rules for Semi-Structured and Free Text
Machine Learning - Special issue on natural language learning
Foundations of statistical natural language processing
Optimized Substructure Discovery for Semi-structured Data
PKDD '02 Proceedings of the 6th European Conference on Principles of Data Mining and Knowledge Discovery
Efficiently mining frequent trees in a forest
Proceedings of the eighth ACM SIGKDD international conference on Knowledge discovery and data mining
Unsupervised discovery of scenario-level patterns for Information Extraction
ANLC '00 Proceedings of the sixth conference on Applied natural language processing
Automatic pattern acquisition for Japanese information extraction
HLT '01 Proceedings of the first international conference on Human language technology research
An improved extraction pattern representation model for automatic IE pattern acquisition
ACL '03 Proceedings of the 41st Annual Meeting on Association for Computational Linguistics - Volume 1
Counter-training in discovery of semantic patterns
ACL '03 Proceedings of the 41st Annual Meeting on Association for Computational Linguistics - Volume 1
Boosting-based parse reranking with subtree features
ACL '05 Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics
A semantic approach to IE pattern induction
ACL '05 Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics
Comparing information extraction pattern models
IEBeyondDoc '06 Proceedings of the Workshop on Information Extraction Beyond The Document
Automatically generating extraction patterns from untagged text
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 2
Several recent approaches to Information Extraction (IE) have used dependency trees as the basis for an extraction pattern representation. These approaches have used a variety of pattern models (schemes which define the parts of the dependency tree that can be used to form extraction patterns). Previous comparisons of these pattern models are limited by the fact that they have used indirect tasks to evaluate each model. This limitation is addressed here in an experiment which compares four pattern models using an unsupervised learning algorithm and a standard IE scenario. The results show wide variation in performance across the models and suggest that one model is the most useful for IE.
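To illustrate what a dependency-tree pattern model looks like in practice, here is a minimal sketch (not code from the paper) of one widely used model: forming patterns from subject-verb-object triples in a dependency parse. The 4-tuple token format and relation labels (`nsubj`, `obj`, `root`) are assumptions for this example.

```python
# Hypothetical token format: (index, word, head_index, dependency_relation),
# with head_index == -1 marking the root of the parse.

def svo_patterns(tokens):
    """Extract (subject, verb, object) patterns from a toy dependency parse."""
    patterns = []
    for idx, word, head, rel in tokens:
        if rel == "root":  # treat the root verb as the pattern's anchor
            subj = next((w for _, w, h, r in tokens
                         if h == idx and r == "nsubj"), None)
            obj = next((w for _, w, h, r in tokens
                        if h == idx and r == "obj"), None)
            if subj and obj:
                patterns.append((subj, word, obj))
    return patterns

# Toy parse of the sentence "Acme appointed Smith":
parse = [
    (0, "Acme", 1, "nsubj"),
    (1, "appointed", -1, "root"),
    (2, "Smith", 1, "obj"),
]
print(svo_patterns(parse))  # [('Acme', 'appointed', 'Smith')]
```

Richer pattern models differ mainly in how much of the tree they keep: some use only the verb and its direct arguments as above, while others admit chains or arbitrary linked subtrees, which is the design dimension the compared models vary along.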