With advances in natural language processing (NLP) and the growing need to deliver fine-grained information or answers rather than sets of documents, a variety of QA techniques have been developed for different question and answer types. A comprehensive QA system must be able to incorporate individual QA techniques as they are developed and integrate their functionality to maximize the system's overall capability to handle increasingly diverse types of questions. To this end, a new QA method was developed that learns strategies for determining module invocation sequences and for boosting answer weights for different types of questions. In this article, we examine the roles and effects of the answer verification and weight boosting method, the core of the automatically generated strategy-driven QA framework, in comparison with both a strategy-less, straightforward answer-merging approach and a strategy-driven approach that uses manually constructed strategies.
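To make the contrast concrete, the following is a minimal sketch of the two approaches compared above: a strategy-driven pipeline that picks a module invocation sequence and per-module weight boosts by question type, versus a strategy-less baseline that simply merges all module outputs. All module names, strategies, and boost values here are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical QA modules: each returns (answer, base_weight) candidates.
# Their names and outputs are placeholders, not real components.
def ner_module(question):
    return [("Seoul", 0.6)]

def lf_module(question):
    return [("Seoul", 0.5), ("Busan", 0.3)]

# A "strategy" here maps a question type to an ordered module sequence,
# each paired with an answer-weight boost factor (assumed values).
STRATEGIES = {
    "location": [(ner_module, 1.2), (lf_module, 1.0)],
    "definition": [(lf_module, 1.1)],
}

def strategy_driven_qa(question, qtype):
    """Invoke modules in the strategy's order, boost each module's
    candidate weights, then merge by summing weights per answer."""
    merged = {}
    for module, boost in STRATEGIES.get(qtype, []):
        for answer, weight in module(question):
            merged[answer] = merged.get(answer, 0.0) + weight * boost
    return sorted(merged.items(), key=lambda kv: kv[1], reverse=True)

def merge_only_qa(question, modules):
    """Strategy-less baseline: run every module, merge raw weights
    with no per-question-type sequencing or boosting."""
    merged = {}
    for module in modules:
        for answer, weight in module(question):
            merged[answer] = merged.get(answer, 0.0) + weight
    return sorted(merged.items(), key=lambda kv: kv[1], reverse=True)
```

In this toy setting, the strategy-driven ranking boosts the NER module's candidate for location questions (Seoul: 0.6 * 1.2 + 0.5 = 1.22), whereas the baseline sums raw weights only; the paper's framework additionally learns such strategies automatically rather than hand-coding them as above.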