A computer-aided environment for generating multiple-choice test items
Natural Language Engineering
We describe a methodology for improving the generation of multiple-choice test items using language technologies. We apply common natural language processing techniques, such as constituency parsing and automatic term extraction, together with additional morpho-syntactic rules, to raw instructional material in order to identify its key terms. These key terms are then used to create fill-in-the-blank test items and to select distractors. Our work aims to demonstrate the availability and compatibility of language resources and technologies for Bulgarian, and to assess how ready these techniques are for deployment in real-world applications.
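The pipeline sketched in the abstract (key-term extraction from instructional text, blanking a term to form the item stem, and drawing distractors from the remaining terms) can be illustrated with a minimal toy sketch. Note the assumptions: the paper's actual system uses constituency parsing and morpho-syntactic rules over Bulgarian text, whereas this illustration substitutes a simple frequency-based term ranker; all function names here are hypothetical.

```python
import re
from collections import Counter

def extract_key_terms(text, top_n=3):
    """Toy stand-in for term extraction: rank frequent content words.
    (The described system uses constituency parsing and automatic term
    extraction with morpho-syntactic rules instead of raw frequency.)"""
    words = re.findall(r"[A-Za-z][A-Za-z-]+", text.lower())
    stopwords = {"the", "a", "an", "is", "are", "of", "and",
                 "in", "to", "that", "it", "as", "for"}
    counts = Counter(w for w in words if w not in stopwords and len(w) > 3)
    return [w for w, _ in counts.most_common(top_n)]

def make_cloze_item(sentence, key_term, distractor_pool, n_distractors=3):
    """Blank out the key term in the sentence and pick distractors
    from the pool of other extracted terms."""
    stem = re.sub(re.escape(key_term), "_____", sentence,
                  count=1, flags=re.IGNORECASE)
    distractors = [d for d in distractor_pool
                   if d.lower() != key_term.lower()][:n_distractors]
    return {"stem": stem, "answer": key_term, "distractors": distractors}

# Example: build one fill-in-the-blank item from a short instructional text.
text = ("A parser assigns structure to a sentence. The parser output feeds "
        "term extraction. Term extraction selects candidate terms.")
terms = extract_key_terms(text)
item = make_cloze_item("A parser assigns structure to a sentence.",
                       terms[0], terms)
```

In the real system, distractor selection also exploits lexical information so that the wrong answers are plausible (e.g. terms of the same morpho-syntactic category), rather than arbitrary leftover terms as in this sketch.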