This paper presents a system for generating user-tailored evaluative arguments, the Generator of Evaluative Arguments (GEA). GEA's design is based on the pipelined architecture commonly used in natural language generation. After an overview of GEA's main components, we focus on how GEA performs its microplanning tasks. Details are provided by examining the generation of a sample argument.
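To make the pipelined architecture concrete, here is a minimal sketch of the standard three-stage NLG pipeline (document planning, microplanning, surface realization) applied to a user-tailored evaluative argument. All names, the user model, and the scoring thresholds are illustrative assumptions, not GEA's actual implementation.

```python
# Hypothetical sketch of a three-stage NLG pipeline for evaluative
# arguments; data structures and thresholds are invented for illustration.

def document_planner(user_model, item):
    """Content selection: keep only features the user weights highly."""
    return [f for f in item["features"]
            if user_model["weights"].get(f["name"], 0) > 0.5]

def microplanner(content):
    """Lexicalization: map feature values to evaluative adjectives."""
    specs = []
    for fact in content:
        polarity = "excellent" if fact["value"] > 0.7 else "reasonable"
        specs.append({"subject": fact["name"], "adjective": polarity})
    return specs

def surface_realizer(specs):
    """Realization: render sentence specifications as text."""
    return " ".join(f"The {s['subject']} is {s['adjective']}."
                    for s in specs)

# Example: a user who cares about location and price, but not the garden.
user = {"weights": {"location": 0.9, "price": 0.8, "garden": 0.2}}
house = {"features": [
    {"name": "location", "value": 0.9},
    {"name": "price", "value": 0.6},
    {"name": "garden", "value": 0.3},
]}

argument = surface_realizer(microplanner(document_planner(user, house)))
print(argument)
# → The location is excellent. The price is reasonable.
```

Each stage consumes the previous stage's output, so stages can be developed and evaluated independently, which is the main appeal of the pipelined design.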