Probabilistic model for definitional question answering
SIGIR '06 Proceedings of the 29th annual international ACM SIGIR conference on Research and development in information retrieval
This paper presents an integrated system for the task of definitional question answering. First, we extract as much question-related knowledge as possible, organized into three categories. The first is based on language models, which we train on four different corpora. The second is syntactic dependency relations, extracted by Minipar. The third contains a single feature: the document score provided by the Information Retrieval (IR) engine. We then use a novel Adaptive Optical Back-Propagation (AOBP) neural network to score candidate sentences using the extracted knowledge, and the top-k candidates are selected as the final answers. We evaluate our system on the definitional question answering tasks of TREC 2005 and TREC 2006. The experimental results show that our method greatly improves performance.
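The scoring step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature layout, network shape, and all names here are assumptions, and a plain feedforward pass stands in for the AOBP training variant.

```python
# Sketch: score candidate sentences with a small neural network and
# keep the top-k. The three feature groups (language-model scores,
# dependency-relation features, IR document score) are assumed to be
# flattened into one feature vector per candidate sentence.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score_sentences(features, w1, b1, w2, b2):
    """One forward pass: features -> hidden layer -> scalar score in (0, 1)."""
    hidden = sigmoid(features @ w1 + b1)
    return sigmoid(hidden @ w2 + b2).ravel()

def top_k(features, k, w1, b1, w2, b2):
    """Return indices of the k highest-scoring candidate sentences."""
    scores = score_sentences(features, w1, b1, w2, b2)
    return np.argsort(scores)[::-1][:k]

# Toy example: 5 candidate sentences, 6 features each (illustrative sizes).
X = rng.random((5, 6))
w1, b1 = rng.standard_normal((6, 4)), np.zeros(4)
w2, b2 = rng.standard_normal((4, 1)), np.zeros(1)

best = top_k(X, 3, w1, b1, w2, b2)
print(best)  # indices of the 3 best-scoring candidates
```

In the full system the weights would be learned by the AOBP training procedure rather than drawn at random, and k would be chosen per question.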