The German question answering (QA) system IRSAW (formerly: InSicht) participated in QA@CLEF for the fifth time. IRSAW was introduced in 2007 by integrating the deep answer producer InSicht, several shallow answer producers, and a logical validator. InSicht follows a deep QA approach: it transforms documents into semantic representations using a parser, draws inferences on these representations with rules, and matches the semantic representations derived from questions against those derived from documents. For QA@CLEF 2008, InSicht was improved mainly in two areas. First, the coreference resolver was trained on question series instead of newspaper texts to make it better suited for follow-up questions. Second, questions are decomposed by several methods at the level of semantic representations. On the shallow processing side, the number of answer producers was increased from two to four by adding FACT, a fact index, and SHASE, a shallow semantic network matcher. The answer validator introduced in 2007 was replaced by the faster RAVE validator, which is designed for logic-based answer validation under time constraints. Using RAVE to merge the results of the answer producers, monolingual German runs and bilingual runs with English and Spanish as source languages were produced, the latter with the help of the machine translation web service Promt. An error analysis shows the main problems for the precision-oriented deep answer producer InSicht and the potential offered by the recall-oriented shallow answer producers.