This paper reports on progress in applying partially observable Markov decision processes (POMDPs) to a commercial dialog domain: troubleshooting. In the troubleshooting domain, a spoken dialog system helps a user fix a product, such as a failed DSL connection. Past work has argued that POMDPs provide a principled approach to building spoken dialog systems in the simpler slot-filling domain; this paper explains how the POMDP formulation can be extended to the more complex troubleshooting domain. Results from dialog simulation confirm that a POMDP outperforms a handcrafted baseline.
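The core mechanism behind a POMDP dialog manager is maintaining a belief state, a probability distribution over hidden dialog states, updated after each system action and noisy observation. The sketch below illustrates the standard belief update for a toy troubleshooting scenario; the states, actions, and probabilities are illustrative assumptions, not the model or values from the paper.

```python
# Toy belief-state update for a troubleshooting-style POMDP (illustrative only).
# Hidden state: which fault the user's DSL product actually has.
STATES = ["modem_fault", "line_fault", "no_fault"]

def transition(s_next, s, action):
    """P(s'|s,a): assume faults persist unchanged during a diagnostic check."""
    return 1.0 if s_next == s else 0.0

def observation(obs, s_next, action):
    """P(o|s',a): a modem check fails 80% of the time under a real modem
    fault, and 10% of the time otherwise (false positive). Assumed values."""
    if action == "check_modem":
        p_fail = 0.8 if s_next == "modem_fault" else 0.1
        return p_fail if obs == "test_failed" else 1.0 - p_fail
    return 0.5  # uninformative for other actions

def belief_update(belief, action, obs):
    """Standard POMDP update: b'(s') ∝ O(o|s',a) * sum_s T(s'|s,a) * b(s)."""
    new_belief = {}
    for s_next in STATES:
        predicted = sum(transition(s_next, s, action) * belief[s] for s in STATES)
        new_belief[s_next] = observation(obs, s_next, action) * predicted
    norm = sum(new_belief.values())
    return {s: p / norm for s, p in new_belief.items()}

# Start from a uniform belief, then observe a failed modem test.
b0 = {s: 1.0 / 3 for s in STATES}
b1 = belief_update(b0, "check_modem", "test_failed")
# b1["modem_fault"] → 0.8: the failed test shifts mass toward a modem fault.
```

A policy (handcrafted or optimized, e.g. with point-based value iteration) then maps this belief to the next system action, which is what lets the POMDP trade off further diagnosis against attempting a repair.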