Guided gesture support in the paper PDA

  • Authors:
  • Daniel Avrahami; Scott E. Hudson; Thomas P. Moran; Brian D. Williams

  • Affiliations:
  • Carnegie Mellon University, Pittsburgh, PA; Carnegie Mellon University, Pittsburgh, PA; IBM Almaden Research Center, San Jose, CA; Carnegie Mellon University, Pittsburgh, PA

  • Venue:
  • Proceedings of the 14th annual ACM symposium on User interface software and technology
  • Year:
  • 2001

Abstract

Ordinary paper offers properties of readability, fluidity, flexibility, cost, and portability that current electronic devices are often hard pressed to match. In fact, a lofty goal for many interactive systems is to be "as easy to use as pencil and paper". However, the static nature of paper does not support a number of capabilities, such as search and hyperlinking, that an electronic device can provide. The Paper PDA project explores ways in which hybrid paper-electronic interfaces can bring some of the capabilities of the electronic medium to interactions occurring on real paper. Key to this effort is the invention of on-paper interaction techniques that retain the flexibility and fluidity of normal pen and paper, but that are structured enough to allow robust interpretation and processing in the digital world. This paper considers the design of a class of simple printed templates that allow users to make common marks in a fluid fashion, allow additional gestures to be invented by users to meet their needs, and at the same time encourage marks that are quite easy to recognize.
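
The abstract does not describe the recognition mechanism in detail, but the core idea of a printed template guiding marks into interpretable regions can be illustrated with a minimal sketch. The region names, coordinates, and majority-containment rule below are assumptions made for illustration, not the paper's actual method.

```python
from dataclasses import dataclass

@dataclass
class TemplateRegion:
    """A printed region on a paper form (coordinates in hypothetical page units)."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Hypothetical layout: a checkbox column and a free-form note area on a printed to-do template.
REGIONS = [
    TemplateRegion("done_checkbox", 0, 0, 10, 10),
    TemplateRegion("note_area", 15, 0, 200, 10),
]

def classify_stroke(points: list[tuple[float, float]]) -> str:
    """Assign a pen stroke to the template region containing most of its points.

    Because the printed template guides where users make their marks, a simple
    majority-containment test can often interpret a mark robustly, while leaving
    users free to draw whatever gesture they like inside a region.
    """
    counts = {r.name: 0 for r in REGIONS}
    for x, y in points:
        for r in REGIONS:
            if r.contains(x, y):
                counts[r.name] += 1
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else "unrecognized"

if __name__ == "__main__":
    check_mark = [(2.0, 6.0), (4.0, 3.0), (8.0, 9.0)]  # a small check drawn inside the box
    print(classify_stroke(check_mark))  # -> "done_checkbox"
```

In this sketch the template, rather than a sophisticated recognizer, carries most of the interpretive burden: any mark landing in the checkbox region is treated as "done", which mirrors the abstract's point that structure on the paper side keeps digital processing simple and robust.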