The “GENERATION GAP”: the problem of expressibility in text planning
A fundamental assumption underlying candidate ranking in corpus-based approaches to Natural Language Generation is that, to be fluent, the output should be as similar as possible to a (human-authored) corpus. However, the goal of maximizing fluency can conflict with other goals, such as conveying as much of the input as possible and remaining faithful to it. We employ an instance-based sentence generation system to investigate how the right balance between these goals can be struck, and we present empirical results supporting our proposals.
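The trade-off the abstract describes, balancing corpus-based fluency against coverage of the input, can be sketched with a toy ranking function. This is not the authors' system; the slot representation, the fluency scores, and the weight `alpha` are all hypothetical, standing in for whatever similarity measure and content representation a real generator would use.

```python
# Illustrative sketch (not the paper's actual system): rank candidate
# realisations by a weighted combination of a corpus-similarity
# "fluency" score and coverage of the input content.

def coverage(candidate_slots, input_slots):
    """Fraction of the input content slots the candidate expresses."""
    return len(candidate_slots & input_slots) / len(input_slots)

def rank(candidates, input_slots, alpha=0.5):
    """Return the candidate text maximising
    alpha * fluency + (1 - alpha) * coverage.

    `candidates` is a list of (text, slots, fluency) triples, where
    `fluency` stands in for a corpus-similarity score in [0, 1].
    """
    return max(
        candidates,
        key=lambda c: alpha * c[2] + (1 - alpha) * coverage(c[1], input_slots),
    )[0]

# Toy example: the most fluent candidate omits most of the input, so a
# balanced alpha prefers the fuller, slightly less fluent realisation.
inp = {"dep_city", "arr_city", "dep_time"}
cands = [
    ("You leave at nine.", {"dep_time"}, 0.95),
    ("You leave Boston for Denver at nine.",
     {"dep_city", "arr_city", "dep_time"}, 0.80),
]
print(rank(cands, inp, alpha=0.5))
# With alpha = 1.0 (fluency only), the shorter candidate wins instead.
```

Setting `alpha` close to 1 recovers pure fluency-based ranking; lowering it trades fluency for content coverage, which is the tension the paper investigates empirically.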