Answer validation through textual entailment
NLDB'11 Proceedings of the 16th international conference on Natural language processing and information systems
We present an Answer Validation (AV) system based on Textual Entailment and Question Answering. The main features used to develop the system are lexical textual entailment, named entity recognition, question-answer type analysis, a chunk boundary module, and a syntactic similarity module. The proposed AV system is rule based. We first combine the question and the answer into a Hypothesis (H) and treat the Supporting Text as the Text (T), then identify the entailment relation as either "VALIDATED" or "REJECTED". The features used in the lexical textual entailment module are WordNet-based unigram match, bigram match, and skip-gram match. The features used in the syntactic similarity module are subject-subject comparison, subject-verb comparison, object-verb comparison, and cross subject-verb comparison. The decisions obtained from the individual answer validation modules are combined using a voting technique. For training we used the AVE 2008 development set. Evaluation on the AVE 2008 test set yields 66% precision and a 65% F-score for the "VALIDATED" decision.
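The lexical entailment features described above (unigram, bigram, and skip-gram match between T and H) can be sketched as simple overlap ratios. This is a minimal illustration assuming whitespace tokenization and 1-skip bigrams; the function names are hypothetical and the paper's actual module additionally uses WordNet to match synonyms.

```python
# Illustrative lexical-overlap scores between Text (T) and Hypothesis (H).
# Assumption: plain string-equality matching; the paper's module also
# consults WordNet so that synonymous tokens count as matches.

def unigram_match(text_tokens, hyp_tokens):
    """Fraction of hypothesis unigrams that occur in the text."""
    text_set = set(text_tokens)
    hits = sum(1 for tok in hyp_tokens if tok in text_set)
    return hits / len(hyp_tokens) if hyp_tokens else 0.0

def bigram_match(text_tokens, hyp_tokens):
    """Fraction of hypothesis bigrams that occur in the text."""
    bigrams = lambda toks: set(zip(toks, toks[1:]))
    text_bi, hyp_bi = bigrams(text_tokens), bigrams(hyp_tokens)
    return len(hyp_bi & text_bi) / len(hyp_bi) if hyp_bi else 0.0

def skip_gram_match(text_tokens, hyp_tokens):
    """Fraction of hypothesis 1-skip bigrams (one word skipped) in the text."""
    skips = lambda toks: set(zip(toks, toks[2:]))
    text_sk, hyp_sk = skips(text_tokens), skips(hyp_tokens)
    return len(hyp_sk & text_sk) / len(hyp_sk) if hyp_sk else 0.0

t = "the capital of France is Paris".split()
h = "Paris is the capital of France".split()
print(unigram_match(t, h))  # 1.0 -- every hypothesis word occurs in the text
print(bigram_match(t, h))   # 0.6 -- 3 of 5 hypothesis bigrams occur in the text
```

Each score can then be thresholded to produce a per-module "VALIDATED"/"REJECTED" decision.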
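The abstract states that the per-module results are integrated by voting. A minimal sketch of one plausible combination rule, assuming each module emits a binary "VALIDATED"/"REJECTED" decision; the tie-breaking choice here is an assumption, not the paper's specified rule.

```python
# Majority voting over per-module answer-validation decisions.
# Assumption: ties default to "REJECTED" (a conservative, illustrative choice).
from collections import Counter

def vote(decisions):
    """Return the majority decision among module outputs."""
    counts = Counter(decisions)
    if counts["VALIDATED"] > counts["REJECTED"]:
        return "VALIDATED"
    return "REJECTED"

# Example: lexical and syntactic modules validate, NER module rejects.
print(vote(["VALIDATED", "REJECTED", "VALIDATED"]))  # VALIDATED
```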