Plan-based interfaces: keeping track of user tasks and acting to cooperate

  • Authors:
  • David Franklin; Jay Budzik; Kristian Hammond

  • Affiliations:
  • Northwestern University, Evanston, IL (all authors)

  • Venue:
  • Proceedings of the 7th International Conference on Intelligent User Interfaces
  • Year:
  • 2002

Abstract

The ability to reason about the activity of a user is crucial to the implementation of any intelligent user interface. If it can recognize what a user is doing, a computer can act to cooperate. Most computer systems limit themselves to command-response interactions; their shallow understanding of their users cannot support a more sophisticated interaction. However, by looking at the tasks its users are performing and reasoning about sequences of actions, a computer system can provide a more interesting level of interaction, one that is more efficient and demands less of its users. Furthermore, this understanding of the user's activity provides a context within which to better interpret future actions and to tune the sensing systems to look and listen for the actions the user is most likely to take next. Finally, in many domains, such systems can recognize user tasks and act to cooperate without requiring a deep, goal-oriented understanding. In this paper, we look at the plan-based interface used in the Intelligent Classroom, focusing on how a human lecturer can control it simply by going about her presentation. We also look at how the general ideas have been adapted to Jabberwocky, a speech-based interface to Microsoft PowerPoint that automatically switches slides, and how they are being applied to extend the functionality of Watson, an autonomous web research tool that uses the document a user is viewing as a search context.
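
The core idea in the abstract, matching an observed sequence of user actions against a library of known plans, using the surviving hypotheses to anticipate the next action, and acting cooperatively once the task is unambiguous, can be illustrated with a toy sketch. The plan library, action names, and cooperative responses below are invented for illustration and are not taken from the paper; the Intelligent Classroom's actual task representations and sensing machinery are substantially richer.

```python
from dataclasses import dataclass, field

# Hypothetical plan library: each plan is an ordered sequence of observable
# actions, paired with a cooperative response the system could take on its own.
PLAN_LIBRARY = {
    "show-video":      (["walk-to-vcr", "insert-tape", "press-play"], "dim-lights"),
    "give-slide-talk": (["open-slides", "stand-at-podium", "advance-slide"], "lower-screen"),
    "write-on-board":  (["walk-to-board", "pick-up-marker"], "raise-screen"),
}

@dataclass
class PlanTracker:
    """Tracks which known plans remain consistent with the actions seen so far."""
    observed: list = field(default_factory=list)

    def observe(self, action):
        self.observed.append(action)

    def candidates(self):
        # A plan is still a candidate if the observed actions form a prefix of its steps.
        n = len(self.observed)
        return {name for name, (steps, _) in PLAN_LIBRARY.items()
                if steps[:n] == self.observed}

    def expected_next_actions(self):
        """Actions to prime the sensing system for, given the surviving candidates."""
        expected = set()
        for name in self.candidates():
            steps, _ = PLAN_LIBRARY[name]
            if len(self.observed) < len(steps):
                expected.add(steps[len(self.observed)])
        return expected

    def cooperative_action(self):
        """If exactly one plan remains consistent, return its cooperative response."""
        live = self.candidates()
        if len(live) == 1:
            return PLAN_LIBRARY[live.pop()][1]
        return None


if __name__ == "__main__":
    tracker = PlanTracker()
    tracker.observe("walk-to-vcr")
    print(tracker.expected_next_actions())  # {'insert-tape'}
    print(tracker.cooperative_action())     # 'dim-lights' (only one plan remains)
```

The sketch captures the two uses of task context mentioned in the abstract: `expected_next_actions` narrows what the sensors should look and listen for, and `cooperative_action` lets the system act as soon as the user's task is recognized, without a deep goal-oriented model.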